Added by: RJS
Added on: 1/6/2011 20:11

    Programmers Today Don't Know They're Born

    So much has changed in the computing world since I first got my ZX81 at the age of what I think was 9. The power of PCs nowadays is just insane in comparison; in fact, even some of the cheapest mobile phones make the glory days of home computing look like relics of a bygone era.

    We have graphical and CPU horsepower that can produce amazing images, and whilst they are not quite photo-realistic yet, that step is only a matter of time. The Internet has brought specialist information into anybody's home and created communities of like-minded people that cross boundaries of nationality, race, sexuality and politics. We regularly talk to people we might not have given a second glance had we met them in a supermarket.

    I hereby state that I love all these sides of progress; they are truly wonderful and I have no wish for us to go back. However, I feel something has been lost in this whole transformation.


    Machine Code Beckons


    When you got a ZX Spectrum, a Commodore 64, Oric 1, BBC Micro, Amstrad CPC, Dragon 32, or whatever... they all did one thing that no computer does these days: they encouraged you to learn how to program them. Admittedly they did so with one of the world's most awful languages, BASIC, but they did so nonetheless. More than that, once they had hooked you into coding on them, they encouraged a vital step further; they almost demanded you begin speaking their language.

    Yes, anybody who spent a certain amount of time with their old computer would almost inevitably want to write code in machine language, the binary instructions that the CPU spoke at the heart of all systems. They encouraged this because the technical limitations meant that BASIC was just too slow for anything other than a Football Management game.

    After some study of machine code, you'd soon start to understand and appreciate many things about the system you were using, from how the chip itself actually worked to how it spoke to other components. You understood how every programming task had to be broken down into ever simpler elements, because that was the only way to get things done. There was none of this grabbing a library here or downloading and altering something off the Internet there; you wrote it all yourself.
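    To give a flavour of what "ever simpler elements" meant: the Z80 inside the Spectrum had no multiply instruction at all, so even multiplying two numbers had to be built by hand out of shifts and adds. Here's my own rough rendering of that classic routine in C, purely as an illustration and not anybody's actual listing:

        /* Shift-and-add multiplication: the standard trick on a CPU with
           no MUL instruction. Illustrative sketch only. */
        unsigned int multiply(unsigned char a, unsigned char b)
        {
            unsigned int result = 0;
            unsigned int shifted = a;

            while (b) {
                if (b & 1)              /* lowest bit of b set? */
                    result += shifted;
                shifted <<= 1;          /* double a */
                b >>= 1;                /* halve b */
            }
            return result;
        }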


    But Speed Was a Huge Problem


    The limitations of CPU power made it all the harder, because these machines could only execute so many millions of cycles a second. Oh, but millions is loads, isn't it? No, actually it isn't. Take the ZX Spectrum for example: it had a CPU clocked at 3.5MHz, which means the CPU ran at 3,500,000 clock cycles per second. But it takes between three and six of those clock cycles to complete a single machine cycle (the unit used to measure how long an instruction takes), and instructions themselves can take between one and six machine cycles.

    Suddenly that big-sounding number is reduced radically, and exactly how many instructions it can manage per second becomes a lot harder to calculate. But let's take a best-case scenario and say an instruction takes two machine cycles on average, each using a minimum of three clock cycles: that's 3.5 million divided by 2 and then by 3, which is roughly 583,333 instructions per second.

    But even that number is misleading, because if you are coding a game, then frames per second come into play. In the olden days, before LCDs, we all used CRTs for our televisions; you might even know somebody who still has one. PAL TV in the UK ran at 50Hz, which is fifty hertz, and hertz means times per second. Because of historical limitations that I won't go into here, but which involved a lack of transmission bandwidth, each frame of video was divided into two fields, sent as alternating lines. Therefore 50Hz was more like 25Hz if you de-interlaced the v... OK, I'm getting off the point a bit here.

    So where were we? Oh yes: if you are aiming for a 25Hz image, not a bad aim for fluid, fast-moving graphics, that earlier figure of instructions per second becomes roughly 23,000 instructions per frame. That's pretty much all you get. If you take longer, you miss the next frame, or even worse you start drawing over a frame whilst it's being sent to the TV set, causing what is these days known as tearing: part of your image is from the previous frame, other parts from the next one, a distracting mess.
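    If you want to check my sums, here they are written out as a tiny C program; the per-instruction costs are just the rough averages I assumed above, not measured figures:

        #include <stdio.h>

        int main(void)
        {
            /* Rough ZX Spectrum budget, using the averages assumed above. */
            long clock_hz = 3500000;            /* 3.5MHz Z80                  */
            int clocks_per_machine_cycle = 3;   /* best case                   */
            int machine_cycles_per_instr = 2;   /* assumed average             */
            int frames_per_second = 25;         /* one full de-interlaced frame */

            long instr_per_second = clock_hz /
                (clocks_per_machine_cycle * machine_cycles_per_instr);
            long instr_per_frame = instr_per_second / frames_per_second;

            printf("%ld instructions per second\n", instr_per_second); /* ~583,333 */
            printf("%ld instructions per frame\n", instr_per_frame);   /* ~23,333  */
            return 0;
        }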

    I'm not even going to get into how much harder things were on the ZX81 or the Atari 2600, both machines that lacked proper display hardware and actually required the CPU to do specific things every update just to get an image on the screen at all. When some of your instructions per frame are spent sending that frame to your telly, there's even less left to play with.


    The Thinking Lost to a Programming Generation


    Yes, I'm approaching my point here; be still, mon ami, we are nearly there. Coding well for these old machines required a level of skill and ingenuity which seems to be missing from so many coders these days. So what if your program runs 20% slower than it would have done if you'd known enough to write it more efficiently in the first place? Next year's line of CPUs will run it 30% faster.

    Who cares if your stuff uses twice the RAM it needs to? In twelve months' time it'll be cheaper and more abundant. These days not only is it not worth the time to hand-craft things in machine code for optimum performance, it isn't even worth writing your C code so that it compiles into faster machine language.

    I remember programming in the days of DOS and 16-bit Windows, with horrendous segment:offset memory models that would drive you insane. Back then, when you stepped through your code there was no right-clicking in a window and inspecting variables; your view was a line of C code and then the assembly language it translated into. It made you see directly how you wrote your code and what came out the other side.
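    For anyone who never had that pleasure, it looked something like this: one line of C, and underneath it the handful of instructions it turned into. The assembly in the comment is my own illustration of the general shape, not the output of any particular compiler:

        #include <stdio.h>

        int price = 100, tax = 15, total;   /* plain 16-bit ints back then */

        int main(void)
        {
            total = price + tax;    /* the debugger showed something like:
                                         mov  ax, word ptr [price]
                                         add  ax, word ptr [tax]
                                         mov  word ptr [total], ax        */
            printf("total = %d\n", total);
            return 0;
        }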

    And a basic knowledge of what looked good and what looked bad in machine code told you how just doing simple things in C would make your program run noticeably faster. That was a good skill to have, one which you might think is redundant now. However, I'm not sure it is, and the most noticeable area I see it in is, ironically, an interpreted language quite far removed from the days of the ZX Spectrum, yet one which can cause huge performance issues when written badly.
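    Before I get to that, one concrete example of the sort of simple thing I mean, purely as my own illustration (and bear in mind a modern compiler will do this for you; the barely-optimising compilers of that era generally wouldn't):

        /* Multiplying by a power of two: on an 8086-class CPU a 16-bit MUL
           cost well over a hundred clock cycles, whilst a few shifts cost a
           handful. Old compilers tended to take you literally. */
        unsigned int slow_scale(unsigned int x) { return x * 8;  }  /* emitted a MUL        */
        unsigned int fast_scale(unsigned int x) { return x << 3; }  /* a couple of shifts   */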


    Old Skills with New SQLs


    Enter the modern relational database management system, aka the RDBMS, at the heart of almost every modern website and the biggest danger to its performance. Anyone can download free software and design their own database pretty easily; however, the pitfalls in making a bad one are many. Do a poor job of it and you end up with a bad legacy that will bite you over and over again. You can even have the best-designed database in the world and still bring it to its knees by coding awful queries.

    Here I find that old classic-computer knowledge, and the skills I developed all those years ago, apply rather well. I know what sort of things a CPU can do fast and what it does slowly. I know what eats RAM and why, and I know what things are quick to compare and why.
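    To pick one of those: comparing two fixed-size integers is a single instruction to the CPU, whilst comparing two strings means looping over every character until they differ, which is a big part of why, say, an integer key tends to be kinder to a database than a long text one. A quick sketch of the difference, again purely my own illustration:

        #include <string.h>

        /* One compare instruction, whatever the values happen to be. */
        int ids_equal(int a, int b)
        {
            return a == b;
        }

        /* A loop over the characters: longer strings cost more, and similar
           strings cost the most, because the work only stops at the first
           difference (or the terminating zero). */
        int names_equal(const char *a, const char *b)
        {
            return strcmp(a, b) == 0;
        }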

    Now, I'm not saying my first forays into SQL were glorious; in the beginning I was as blind as the next blind person, but as I grew to see how databases worked I discovered the pitfalls and the peaks. Can you design a super-fast DB without any of this knowledge? Sure you can, but knowing why it finds some things harder than others certainly increases your hit rate.

    Coding these days seems far more about pulling in off-the-shelf libraries and over-complicated interfaces designed to make things easier, but ultimately far too much of it just hides you from complexity you really should know about. Sure, some fancy interface might connect to a database on your behalf and write the SQL query for you; heck, some of them will even create a whole database without you knowing much about SQL.

    But their users will almost always come across plenty of situations where complicated queries don't scale, and I guess for most companies that isn't an issue, since if the system becomes unwieldy they'll just upgrade the hardware. But so often I wonder how much CPU time and power the world could save if we just wrote it all properly in the first place.

    Why, with all this modern-day horsepower, does Microsoft Windows 7 still seem to hang around doing nothing just as often as Windows 95 did? Why does my Spectrum always seem more responsive than kit with millions of times its power? How can Flash and a web browser eat so much of my CPU time and memory? Unless it's working out the cure for cancer in the background, I can only conclude it's nothing more than badly written.

    Now, if you'll excuse me, I'm off to play with a 64-bit, hand-coded machine language Linux that is only 16k. You can write a "Hello World" app in just 31 bytes; try doing that in Windows!

    Your Opinions and Comments

    Do you think there are just too many short-cuts? I note that many web designers just download templates that inevitably make everything start to look and perform the same way. That's OK from a user's point of view, but it gets incredibly boring.

    From a programming perspective, with so many communities on the web trading tips and code, there is no longer any imperative to work things from scratch.
    posted by Stuart McLean on 4/6/2011 19:48
    In some respects, yes. :/ I have nothing against things that make developing faster. But when they make things easier at the expense of knowledge, bad things happen. As an example, C is not a difficult programming language, and whilst it can be used to create bad code, it doesn't encourage it.

    VisualBasic, on the other hand, lets people who shouldn't be let near code write it, and the nature of the language encourages bad coding techniques. Cue the HUGE explosion a few years back of awful shareware and freeware. Thanks, Microsoft.

    I wish that were their only contribution to bad development. However, they aren't the only ones; PHP is just as nasty.
    posted by RJS on 4/6/2011 20:25
    ...and it will just get worse. So what's the solution?!
    posted by Stuart McLean on 4/6/2011 20:44
    I'm not sure there is one, really. In some respects there are still people around who get down and dirty with the hardware, producing amazing things like a working 64-bit version of Linux in just 16k of code.

    The good side of things is that VisualBasic as an app development tool seems long since dead, killed off in part by the open source movement and everything around it, which provides a collaborative environment with many headstrong leaders who keep coding standards amazingly high. Whenever I scour the source of some projects, I often doubt that the equivalent code bases of major software producers are that well maintained.

    The bad side, I think, is web development. All those have-a-go programmers who many years ago wrote awful VB apps now write websites. Although this seems a harmless place for them to play, websites getting hacked regularly demonstrates holes that shouldn't be there.
    posted by RJS on 4/6/2011 20:50
    Sony being a prime example. Incredible. And the rise of mobility and tablets will just make this worse.
    posted by Stuart McLean on 4/6/2011 21:25
    Heh, I'm not sure if Sony is or isn't a prime example, since they may be more of a demonstration of what happens when you get cheap about your internet presence as a multi-national company.

    WordPress and certain bulletin board packages, both of which seem to be a constant stream of vulnerabilities, are maybe better ones. The number of times I read about a SQL injection flaw that brought down a website never ceases to amaze me. No excuse for it, really. :/

    On a related note, this week's Click on BBC News 24 had a nice piece about David Braben's tiny computer, and also spoke about how kids at school these days are taught how to use applications rather than how to program computers.
    posted by RJS on 5/6/2011 11:38
    I could never get into programming, although I once spent a little too much time translating a lunar lander program from Research Machines BASIC to ZX Spectrum BASIC (and drove myself bananas in the process, as I didn't really follow what I was doing). I never really got any further than:

    10 Print "Hello Chunky"
    20 Goto 10
    posted by Mark Oates on 5/6/2011 23:53