January 1, 2009 at 11:18 pm - by devsnapshot
We're in a time of technological shifts of epic proportions. It wasn't long ago that people were still feeding room-sized chunks of metal simple commands in the form of flimsy, porous pieces of paper. Now we have keyboards for the same task. The only place I still see diabolical trickery like that is on standardized-test answer sheets.
I believe the same kinds of advancements have been made in software development and programming. Consider low-level assembly language (in AT&T syntax, say) and the high-level Python language. Assembly was once the universal language because it was the only language. It spelled out, step by step, which values to load into which registers and which system calls to make just to print output. How often do you hear of assembly in frequent use today? About as often as you hear of punched cards still being fed to computers. Most schools don't teach it anymore.

In assembly, a 'clean' "Hello World" program takes many lines of source code, and each instruction is among the simplest operations the processor offers. An example can be seen under section 2.3 here: http://asm.sourceforge.net/intro/hello.html Why so verbose? Because there was no other way than to tell the computer every little instruction to execute. The advantage I see in writing this kind of code today is understanding, at a deeper level, how software interacts with hardware.
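To make the contrast concrete, here's a rough sketch of that program in AT&T syntax for 32-bit Linux, assembled with GNU as and linked with ld. Treat it as an approximation of the tutorial's example rather than its exact code; the details vary by assembler and platform:

.data
msg:    .ascii "Hello, World!\n"
len =   . - msg                 # length of the string, computed by the assembler

.text
.globl _start
_start:
        movl $4, %eax           # system call 4: sys_write
        movl $1, %ebx           # file descriptor 1: stdout
        movl $msg, %ecx         # address of the string
        movl $len, %edx         # number of bytes to write
        int  $0x80              # trap into the kernel
        movl $1, %eax           # system call 1: sys_exit
        xorl %ebx, %ebx         # exit status 0
        int  $0x80

Fifteen-odd lines, two kernel calls, and every register filled by hand, just to print one string.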
Meanwhile, a clean "Hello World" in Python is simply:
print "Hello, World!"
This is easy to understand, and I like it for actual development because of that simplicity. It masks all the processor calls and data definitions you saw in the assembly version -- but the calls and definitions are still there.
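You can even peek one layer down from inside Python itself. Here's a small sketch using the standard dis module (Python 2, to match the print statement above); bytecode is the interpreter's instruction set rather than the CPU's, but it makes the hidden layering visible:

import dis

# Compile the one-liner, then disassemble it to list the bytecode
# instructions the interpreter actually executes for that single line.
code = compile('print "Hello, World!"', '<string>', 'exec')
dis.dis(code)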
All code is eventually converted to machine language. If you open an executable file with a hex editor, that's what you'll see: machine language. It isn't meant for humans; it's meant for the CPU. Low-level languages like assembly and C provide the least abstraction from it, while high-level languages like Python and Ruby heavily disguise and cover it.
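You don't even need a hex editor; a few lines of Python will do the same job. A quick sketch (the path is just an example -- point it at any compiled binary on your machine):

import binascii

# Read the first 64 bytes of an executable and print them as hex.
# On Linux you'll see the ELF header's magic bytes (7f 45 4c 46) up front.
with open('/bin/ls', 'rb') as f:
    raw = f.read(64)
print binascii.hexlify(raw)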
There is a certain speed tradeoff in all that masking. A high-level language can't always produce the most efficient sequence of instructions for a given program. Most of the time, the power of modern computers lets this be neglected.
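One crude way to see the cost from inside Python (a sketch; the exact numbers will vary by machine): compare a hand-written accumulation loop, which pays interpreter overhead on every iteration, against the built-in sum, which does the same work in C underneath.

import timeit

# Interpreted loop: every iteration goes through the bytecode dispatcher.
looped = timeit.timeit('total = 0\nfor n in xrange(1000): total += n',
                       number=10000)
# Built-in sum: the same loop runs in compiled C inside the interpreter.
builtin = timeit.timeit('sum(xrange(1000))', number=10000)

print "loop:    %.3f seconds" % looped
print "builtin: %.3f seconds" % builtin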
The number of people adept in low-level languages steadily diminishes, and code becomes easier and easier to write as languages do more and more of the masking. This greatly expedites the development process, but it may cause a shortage of knowledge about software-hardware interactions.
In my opinion, low-level languages should be limited to scholarly study and cut-throat optimization, while high-level languages should be used for real-world development. I take into consideration the power of modern computers.