Why?

13 Comments

  • edited December 2013
    I'm just wondering why people are here?

    Why are we working on or using old hardware? Why are some of you putting a lot of effort in upgrading the hardware when it will never be as good in terms of speed or graphics quality as a modern PC?

    I'm not saying for a microsecond that we shouldn't, I'm just curious about your individual reasons.

    Probably because it brings me back to my youth and the great feeling of learning & accomplishing something with my own hands/tinkering, without some magical function doing things for me that would require thousands of lines worth of code. I always loved designing things slowly on paper, reinventing the wheel and writing assembly :razz: I cannot justify such "inefficient luxury" in professional life.

    The other thing (after the youth flashback) is that I can understand the system. The last SoC I worked with had a manual of 4200+ pages just to explain the IO and memory map. I cannot justify spending endless hours digging into some modern system like that just for fun.. because it is no fun. My spare time is limited, especially since I am no longer a programmer by profession, so all larger programming tasks are a challenge by definition and limited to spare-time bedroom coding :cry:

    Then there's the challenge of beating the limits of some sad dead hardware.. like the Speccy :D I am nowhere near skilled enough in the art of programming to accomplish a "wow" effect on modern hardware, even to myself. Actually not on anything beyond Speccy specs, heh.

    Last but not least is the community. The Speccy was popular, and the folks still hanging around are mostly just excellent. Helpful and ingenious.
  • edited December 2013
    It's got to be the WOW! factor. Back in the day it was getting a Multiface 1, an Opus Discovery... Now it's the Interface 1bis, a DivIDE, a Harlequin (hopefully soon)...

    Also, I loved programming - something, anything. Even if I never got my head round machine code, I could get something on screen that worked. Today's computers require way too much effort just to get "hello world" displayed.

    I've recently been buying the computers that I never had back in the '80s but always wanted, as the Spectrum was "just a toy". How wrong I was: the VIC-20, C64, Amiga and BBC may have been technically better, but the Spectrum had a quality all of them lacked. Only the ST came close, although I still have to get a TRS-80 and a Sharp Z80 to be sure.

    With all the kit I have now, it's the Spectrum that gets used 90% of the time. I think that says it all.
  • edited December 2013
    Well you should try modern languages? Basic really sucks arse ;)

    I normally recommend Python for programmers that like Basic.
  • edited December 2013
    Well you should try modern languages? Basic really sucks arse ;)

    I normally recommend Python for programmers that like Basic.

    I've done a lot with VB, Ada, Fortran & C - across a lot of platforms. The last company I worked for had some very good applications in VB/ASP - they got an expert in to modernise them with Java, and he bankrupted us. The directors got caught up in the myth that anything newer must be better.

    Basic is slow, but if it does the job, why use anything else? GFA Basic was very good/fast.

    Python is interesting and worth a look. Got a RasPi to check that out.
  • edited December 2013
    Yeah Java is a bit bobbins. C# is a lot better but still Java like.

    I wouldn't call Fortran and C modern though ;)
  • edited December 2013
    Why are we working on or using old hardware?
    The Spectrum in its day was affordable and welcoming. It offered comprehensible challenges - to all and everyone, and to me, or so it seemed.
    Nowadays I compare the Spectrum with the Willys WWII jeep. Hardly any capabilities, but still an icon. A one-time happening with a positive flavour.
    I myself still find challenges in system (ROM) related items.
    Why are some of you putting a lot of effort in upgrading the hardware when it will never be as good in terms of speed or graphics quality as a modern PC?
    Objectives like 'the best' or 'the fastest' and the like have, other than in academic sense, little or no significance in my hobby activities.
    I do not see any sense in a comparison with the powers of modern PCs. Are there still people that believe there is a race going on?
  • edited December 2013
    OOP is not good for the cache, which is the most important thing for performance.

    Depends on the application. Most applications are I/O bound rather than CPU bound so most of the time cache misses aren't worth worrying about.
  • edited December 2013
    Yeah Java is a bit bobbins. C# is a lot better but still Java like.

    I wouldn't call Fortran and C modern though ;)

    It is if you think music stopped being good in the '70s :)
  • edited December 2013
    Yeah I agree. Obviously profiling is your friend here.

    If you really need to go down to the knuckle (e.g. games) then squeezing out the last bit of performance is important, and OOP isn't going to cut it; you need to plan your data layout to be cache friendly.
  • edited December 2013
    Well that still just depends on how you design your objects. Behind the scenes there is really very little difference between OOP code and non-OOP code. The only real issue is people who go too far and over-engineer hideous object hierarchies. But those same people are just as likely to write poorly thought-out procedural code too.
  • edited December 2013
    There is though. OOP has objects with their data attached, and that's not cache friendly. What you want is a lot of data laid out sequentially which you process all at once, with your objects referencing the data. You can do that with OOP of course, it's just not the natural way to think about things.
  • edited December 2013
    I'm not sure what you mean by "an object with data", as if it's somehow different. Under the hood there really is little difference between an object and a 'struct'. For really performance critical code you might opt to organise objects in a very specific way, just as you would structs in C (you may even break them up entirely or combine them into a single object) but OOP itself isn't inherently bad the way you seem to suggest - in reality it's little more than a specific way of naming function calls.
  • edited December 2013
    I'm not saying it's bad.

    If you have 1000 objects with position, health, colour of shirt etc., I'm saying it's going to cause cache misses to update all the positions sequentially.

    If you have 1000 positions in an array of plain old data and iterate over that you'll see a remarkable performance increase.
  • edited December 2013
    Well yes, but that's just a poor choice of data structure for the job at hand; it's nothing to do with OOP per se. It's one of those things you tend to see mostly because universities still teach programming practices based far too much on conceptual thinking (things like big-O notation) rather than being grounded in the realities of how computers really work and how to approach problems in a way that works better in practice. That's nothing new though - I remember being taught on a C course that using pointers was "better" because there was "less copying" going on, whereas in reality dereferencing pointers is probably one of the most expensive operations available.
  • edited December 2013
    Agreed. Why anyone would do a computer programming course at university is beyond me though ;) Do something that might be useful and/or interesting and/or relevant for the rest of your life rather than next 10 years max (I chose maths).
  • edited December 2013
    Why anyone would do a computer programming course at university is beyond me though ;)

    I did an MSc in Informatics (comp sci for arts grads) at Birkbeck and found it a very valuable experience. I don't code for a living and I had a lot of gaps in my knowledge, with my only previous computing qualification being an A-Level in comp sci I got in 1993. After the course I feel like I have a much more robust understanding of the fundamentals, a good handle on RISC (to the point that I wrote a virtual 64-bit CPU for the Z80), a good handle on OOP and design patterns (including when not to use them), and if I really had to I could write a database application using SQL (I also have no intention of coding for a living). It helps when people are snotty to me because I'm "just the writer".
  • edited December 2013
    Well an arts mix is probably good. Working in the computer industry we tend to see a lot of propeller heads who can't debug and can't communicate, and if they do they are arrogant (and usually wrong). As I said, I did maths; it teaches you how to analyse problems and split them up into smaller parts. Computer programming was a piece of piss after that.
  • edited December 2013
    Well an arts mix is probably good. Working in the computer industry we tend to see a lot of propeller heads who can't debug and can't communicate, and if they do they are arrogant (and usually wrong). As I said, I did maths; it teaches you how to analyse problems and split them up into smaller parts. Computer programming was a piece of piss after that.

    That reminds me, I never really could get my head around equations until I did extra classes in statistics at Birkbeck. My biggest gripe about developers is that most wouldn't know a good user interface if they saw one. On the other hand in my experience some people that do UX for a living are even worse.
  • edited December 2013
    Well the art of writing a good user interface is asking the users what they want.
  • edited December 2013
    Well the art of writing a good user interface is asking the users what they want.

    That only works if the users a) know what they want and b) are articulate enough to express it accurately.
  • edited December 2013
    Indeed, which is why you need a mediator (i.e. the programmer) to tell them, or coach them into good user interface practice.

    Making them draw a diagram helps as a first step.

    EDIT: Proper user interface design involves watching guinea pigs using it, of course. That's expensive though.
  • edited December 2013
    WHERE'S MY ****ING GOTO? I DEMAND YOU RETURN MY GOTO!
  • edited December 2013
    Proper user interface design involves watching guinea pigs using it, of course. That's expensive though.

    Even better, and more expensive, is showing the guinea pigs the video of them using it and asking them to explain why they did what they did.
  • edited December 2013
    The main reason - a huge amount of nostalgia.
    And this:
    It’s an escape to a simpler time. And the simplicity of the Spectrum, from a technological point of view, is pure beauty to me.
  • edited December 2013
    guesser wrote: »
    WHERE'S MY ****ING GOTO? I DEMAND YOU RETURN MY GOTO!

    It's GOSUBs that are RETURNed. GOTOs just wander off somewhere and get lost.
  • edited December 2013
    rune wrote: »
    Last company I worked for had some very good applications in VB/ASP - got an expert in to modernise with Java and he bankrupted us.

    Would it have been different if they got an expert to rewrite everything in any other language (even Assembly) instead?

    Programming languages are not magical. They are just a tool, that can be improperly used just like any other tool.

    It's like complaining a shelf is somewhat unstable because it was put together using only nails instead of screws. Therefore hammers are terrible tools and should be abolished, screwdrivers are much better tools!
    Creator of ZXDB, BIFROST/NIRVANA, ZX7/RCS, etc. I don't frequent this forum anymore, please look for me elsewhere.
  • edited December 2013
    Coming from somebody who doesn't know any coding, who used to gaze in awe at the games in the arcades (I remember how amazing those little coloured moving dots of light looked to me.. it was Bosconian, and you could see and count the pixels in each starship) and who still remembers the comment my dad made looking at the Psion logo in the HORIZON loading screen on my Speccy ("computer graphics has made giant steps!"), I'll tell you why, Mr Jones (very nice to meet you btw :))

    BECAUSE it's fun, it's clean and (thanks to AGD and CGD) it lets me put tons of movable coloured pixels on the screen :D and play with them

    :D
    Gab
    PS and I am looking forward to the new game designing utility that will use Nirvana (I am sure somebody is working on that ...well I really hope so)!
    Find my (mostly unfinished) games here
    https://www.facebook.com/groups/1694367894143130/
  • edited December 2013
    Would it have been different if they got an expert to rewrite everything in any other language (even Assembly) instead?

    Programming languages are not magical. They are just a tool, that can be improperly used just like any other tool.

    It's like complaining a shelf is somewhat unstable because it was put together using only nails instead of screws. Therefore hammers are terrible tools and should be abolished, screwdrivers are much better tools!

    The fact that we had a great application that did everything needed was neither here nor there. The owner retired, his son took over and believed that "old" was bad and Java was the future...

    In the end the project manager spent all the money and killed the company. Directors didn't want to listen to the people who were saying that Java was wrong for the job.

    When customers complained that the new apps were too clunky his answer was to tell them they needed better hardware, more memory etc.
  • edited December 2013
    Would it have been different if they got an expert to rewrite everything in any other language (even Assembly) instead?

    Programming languages are not magical. They are just a tool, that can be improperly used just like any other tool.

    It's like complaining a shelf is somewhat unstable because it was put together using only nails instead of screws. Therefore hammers are terrible tools and should be abolished, screwdrivers are much better tools!

    A shelf designed to be built with a hammer and nails will work just as well as one designed around screws and a screwdriver.

    The problem comes when the old-school hammer and nails guy tries to fix a screwed together shelf, or the young fresh-out-of-shelf-college guy gets a job at Nailed Together Shelves Я Us and tries to fix stuff with a screwdriver
  • edited December 2013
    rune wrote: »
    The fact that we had a great application that did everything needed was neither here nor there. The owner retired, his son took over and believed that "old" was bad and Java was the future...

    You also have to think about the future. An application written in an out-of-fashion language will eventually be expensive to maintain because as time passes, fewer people will have the skills to maintain it. Not to mention your users may complain about how old fashioned the UI is and you can lose to the competition based solely on that, no matter how good the back end is. If you're worried about customers being able to run your software on new gadgets, keep your fingers crossed that your out-of-fashion language is supported or legacy binaries can run on them.

    I don't know if any of that applies at all but nearly bankrupting a company sounds much worse :D
    When customers complained that the new apps were too clunky his answer was to tell them they needed better hardware, more memory etc.

    This is the way things are. Modern software engineering is expensive in terms of memory and CPU cycles, and the technology has been growing into it to make it possible. Java languished for almost 10 years because the technology wasn't there to make it economical in terms of performance. There are two features of modern OOP languages that are so important that they trump everything else: maintainability and the ability to handle complexity. The likelihood of project success is increased, OOP design requires the programmer to modularise code, which helps with future-proofing, and development times drop.

    But OOP languages themselves and OOP design result in bigger and slower code that requires faster processors and more memory. Where memory and CPU speed are constrained, modern OOP languages are still not a serious consideration. This may change as 32-bit processors and memory become cheaper and occupy less die area - I saw one ad for a keyboard, a keyboard, with a 32-bit processor and 2MB of flash memory which as far as I could tell was used to flash LEDs. This sort of thing should normally die a Darwinian death, but technology is so cheap now that the material costs are less than the engineering costs. As far as semiconductors are concerned, the one thing that may stop 32-bit processors from completely supplanting 16- or 8-bit parts on the die is heat and power requirements.

    As for the unsuitability of modern OOP languages for memory- and speed-constrained systems, I am not even talking about large-footprint languages like Java and C# that require a virtual machine, parameter validation at runtime, and a JIT compiler to get reasonable performance.

    I am also talking about C++, the OOP language closest to the processor and the best performer. The reason is that the OOP programming style results in very frequent creation and destruction of temporary objects (C++11's move semantics were introduced to mitigate this). You also lose control of execution time, as the programmer is not always directly aware of what code sequence may execute; a lot of it can be hidden. In real-time systems you need to know that a certain task takes a maximum of x cycles to execute. That is near impossible with an OOP compiled language. And then you have to be very aware of what sort of code is generated by an OOP compiler, and there are very few programmers who know this. Even something as simple as templated code can result in very large programs if the programmer does not have some minimum amount of skill. If programmers can screw things up, they will find a way, and C++ provides many, many ways to screw up.