How did people write large assembler programs on real Spectrums?

edited May 2007 in Development
I have been wondering how people did their development back on real Spectrums. If you wanted to write a large program, there would first be the assembler itself in memory, then the assembly source written by the programmer (which would take at least as much space as the actual code), and then some memory left over for the produced machine code. To me it seems like the actual program code could only occupy half the memory, or am I thinking wrong?

Comments

  • edited April 2007
    I think most used a cross-assembler running on a PC. 640K was a lot back then!

    I've always wondered about the various mastering utilities though - who has the Speedlock SAVE code, or Powerload etc. Somebody somewhere must have the authoring software.
    Oh bugger!
  • edited April 2007
    When I wrote Virgin Atlantic Challenge I developed it in small sections. When the project neared completion I assembled the code to a microdrive. I then loaded in the graphics data and the code from the microdrive and ran it. A lot of the time the code would crash so I would then have to reload the assembler followed by the source. This took ages, especially when the microdrive tape failed :(

    Writing Iron Sphere was a dream being able to use Spin and Tommygun.
  • edited April 2007
    AFAIK most developers had several linked computers, or a "more powerful" computer which injected the compiled code to a speccy, or even a minicomputer (Ocean et al).

    SirFred was developed using two speccies linked together. They used one to code and compile, and the other one to test. If the game crashed, no re-loading was necessary.

    Most homebrew companies in Spain used an Amstrad CPC linked to a Speccy, for example, or simply developed on the CPC and then ported for the Speccy. A CPC 6128 was a powerful enough machine to develop with, as the floppies saved lots of time.
  • obo
    edited April 2007
    Raffaele Cecco mentioned using a PC setup in Cecco's Log:
    As you can see from the photographs, a lot more equipment is used in developing Spectrum games than a single Spectrum and cassette-based assembler! The Spectrum is actually developed on an IBM compatible PC which runs a fast Z80 cross-assembler that can compile a 200K source file in a few seconds. After the program has been assembled, it can be downloaded to the Spectrum via a parallel link, ready for testing.

    Hisoft's DevPac supported multi-stage assembling from tape, but it seemed to be a lot more trouble than it was worth. I remember the untokenised source code being very bloated, leaving room for only 2-3K of object code if you were lucky. I used to just piece everything together by hand - something that was much easier when I finally got a +D :-)

    I've switched to doing all SAM and Speccy code in PC-based assemblers, as it's so much quicker to knock things up. I did spend a while using native editors inside an emulator, but it's much nicer to be able to use your normal editor, and the development cycle is much faster too.

    Si
  • edited April 2007
    aowen wrote: »
    Sinclair itself had a network of Vaxen running CP/M. When Amstrad bought the company this was replaced with a network of PCWs.

    I'm still rather dubious about that claim - the VAX didn't have a Z80 and could not run CP/M (except under emulation). Any VAX cluster of that period would have been running VAX/VMS or as an outside chance, Ultrix.
  • edited April 2007
    obo wrote: »
    Raffaele Cecco mentioned using a PC setup in Cecco's Log:

    Starting to read his log, does anyone know which game he was working on? (Still unnamed in the article.)

    EDIT: In the next issue of Crash I found the name: Stormlord.
  • edited April 2007
    obo wrote: »
    I remember the untokenised source code being very bloated, leaving room for only 2-3K of object code if you were lucky. I used to just piece everything together by hand - something that was much easier when I finally got a +D

    Yes, untokenised source can be quite large, tho' the resulting code is usually quite small.

    Personally I wrote my earlier code using an assembler that actually allowed me to use the 128k editor (the code was written into REM statements - to compile the code you would load the assembler and MERGE your source, then RUN). I was lucky I guess in that my development machine was a +3, so everything was stored to disc. Like obo I would piece the results together by hand - the first "draft" of my software was a BASIC program that loaded all of the various files. Once I was happy with it I'd do the same, but instead of running it I'd load up Turbo Imploder and save the result to disc.

    Disc systems are definitely useful - this is evident with assemblers such as (thinks...) Tornado, which I think allowed the equivalent of #include. Or was that something else?

    These days... makefiles, some custom tools what I wrote, and SjASMPlus.
  • TMR
    edited April 2007
    Winston wrote: »
    I'm still rather dubious about that claim - the VAX didn't have a Z80 and could not run CP/M (except under emulation). Any VAX cluster of that period would have been running VAX/VMS or as an outside chance, Ultrix.

    You don't need a Z80 on the source machine to assemble code from it; Graftgold had two pretty much identical Opus 386 machines, for example, one used by Steve Turner for Spectrum and Amstrad code and the other by Andrew Braybrook for C64 (mentioned at length in Braybrook's Morpheus diary). If memory serves, Atari were using VAXen to assemble coin-op code (the source being Jed Margolin's VAXMail logs).

    There's footage of the hardware Imagine used to cross-assemble in Commercial Breaks; Ocean used a mixture of hardware, including bespoke ST-based cross-assemblers that took their source code cues from Zeus. And didn't the Olivers use a CPC to write stuff and ship the code over...?
    Rickard wrote: »
    Starting to read his log, does anyone know which game he was working on? (Still unnamed in the article.)

    Stormlord, I believe...?
  • edited April 2007
    TMR wrote: »
    You don't need a Z80 on the source machine to assemble code from it

    But you do if you want to run CP/M on the machine, which was only ever available for Z80, 8080 and later 8086 based systems. CP/M never ran natively on a VAX. That was the aim of my comment, not that you couldn't develop Spectrum software on a VAX (there's no reason why not - I've done it, I have sjasm plus working on a VAX running BSD) but that the development was done on CP/M running on a VAX which would make very little sense - when doing cross development using a native, non-emulated OS like VMS or Ultrix would have been much better.
  • edited April 2007
    Winston wrote: »
    But you do if you want to run CP/M on the machine, which was only ever available for Z80, 8080 and later 8086 based systems. CP/M never ran natively on a VAX. That was the aim of my comment, not that you couldn't develop Spectrum software on a VAX (there's no reason why not - I've done it, I have sjasm plus working on a VAX running BSD) but that the development was done on CP/M running on a VAX which would make very little sense - when doing cross development using a native, non-emulated OS like VMS or Ultrix would have been much better.

    I think I may be missing something here ... I just got up and haven't really got the sleepygunk outta me eyes ... erm ... why didn't we see more games for the VAXes?
  • edited April 2007
    I never ran CP/M on a Vax, tho' I did get it running on a Henry Hoover, and I've had Ubuntu Linux running on a Dyson.
  • edited April 2007
    icabod wrote: »
    I never ran CP/M on a Vax, tho' I did get it running on a Henry Hoover, and I've had Ubuntu Linux running on a Dyson.

    Don't tell us what you got from the goblin. Some things are better left unsaid.
    My test signature
  • edited April 2007
    Rickard wrote: »
    Starting to read his log, does anyone know which game he was working on? (Still unnamed in the article.)

    EDIT: In the next issue of Crash I found the name: Stormlord.
    He was actually working on two, Stormlord and Cybernoid 2.
    I wanna tell you a story 'bout a woman I know...
  • edited April 2007
    The big companies used PDS (Programmers Development System) to write the games; this could assemble code for different machines from a PC. I believe Ocean used it.
    I wanna tell you a story 'bout a woman I know...
  • edited April 2007
    Rickard wrote: »
    I have been wondering how people did their development back on real Spectrums. If you wanted to write a large program, there would first be the assembler itself in memory, then the assembly source written by the programmer (which would take at least as much space as the actual code), and then some memory left over for the produced machine code. To me it seems like the actual program code could only occupy half the memory, or am I thinking wrong?

    Well, we had to use lots of tapes back then :)

    Seriously, most Czech stuff I know about was developed without the use of another computer. We all seemed to use Tomas Rylek's PikAsm assembler exclusively, which stored the source in precompiled form, giving huge savings in both space and compile time. The memory footprint of the programs was not that large; after all, the data took a lot of space as well. One usually used some form of editor to create the data separately, saved it to tape in separate blocks and wrote down the sizes. Once this was all ready, you compiled the program once again, took its size, then created a memory map on a piece of paper - font goes here, sprites here, background maps here, tiles here, code here, space for structures here, stack here, interrupt handler here, etc. Then you used this map, defined the addresses via EQU in the program, recompiled it one last time and saved it to tape (a minimal sketch of this EQU memory-map idea is below). Finally, you would run a screen-based monitor (most people here used V.A.S.T., another ingenious creation of Tomas Rylek), clear the entire memory from 23296 to 65535, and then load the corresponding blocks to the appropriate locations, eventually filling in some pieces directly from the monitor. Once done, you saved the whole memory block and were almost finished. The rest was just running it through a compressor, then running the result through a loader encrypter, and that was about it.

    Patrik
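    A minimal sketch of the EQU memory-map idea described above (the labels, addresses and block sizes here are purely illustrative, not taken from the post): each data block gets a fixed address agreed on paper, so code and data can be assembled, saved and loaded independently.

        ; illustrative paper memory map - every block lives at a fixed address
        FONT     EQU 49152       ; character set, loaded here from tape
        SPRITES  EQU 50176       ; sprite graphics block
        MAPS     EQU 53248       ; background maps and tiles
        VARS     EQU 61440       ; game variables / structures
        STACKTOP EQU 61952       ; machine stack grows down from here

                 ORG 32768       ; the code block itself
        start:   LD  SP,STACKTOP ; stack goes where the map says
        main:    CALL draw       ; code can be reassembled freely...
                 JR  main
        draw:    LD  HL,SPRITES  ; ...because data is always at its mapped address
                 RET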
  • edited April 2007
    Here is a timely reminder of Sinclair's VAX.

    http://opinion.zdnet.co.uk/leader/0,1000002208,39286815,00.htm

    I visited Amstrad two years before they bought Sinclair and they had a Vax then (for payroll), although it was the CPC developers who managed it and kicked it in the right place when it went wrong.

    I eventually obtained two Spectrums and assembled on one and passed the object code over the network to the other for testing. Devpac had a system that would save source code to microdrive and assemble large object code from microdrive.
  • edited April 2007
    I used to write my games on an old +2 using the LERM Z80 Toolkit. Putting the symbol table into screen RAM allowed about 30K of source code and 5K of object code. Most of my games consisted of data for graphics, screens etc., so 5K was usually enough, though sometimes I'd separate a few routines into a second file. Any unrolled loops could be poked into memory with Sinclair BASIC.

    Since I started to develop on the PC things have gone the opposite way, and the data gets included with the program code in one big source file. The source for Gamex, for example, is a fairly compact 533,120 bytes of WordStar text file.
    Still supporting Multi-Platform Arcade Game Designer, currently working on AGD 5. I am NOT on Twitter.
    Egghead Website
    Arcade Game Designer
    My itch.io page
  • edited April 2007
    icabod wrote: »
    Personally I wrote my earlier code using an assembler that actually allowed me to use the 128k editor (the code was written into REM statements - to compile the code you would load the assembler and MERGE your source, then RUN).

    Completely unrelated (well, not really), but I see that Roy Longbottom has today granted distribution of this very same package. I picked it up at a car boot sale, and it's quite lovely. Well, I liked it, anyway.
  • edited April 2007
    ...and hopefully it will get TZX'd soon... provided I can get the tape I bought off eBay to load. :)

    EDIT: I recall coding assembler in REM statements as well; I wrote a turboload save routine this way - must have been 1984? Maybe I used the same program, I don't remember.
  • edited April 2007
    aowen wrote: »
    My information that Sinclair's Vaxen were running CP/M comes from the author of that article who is a former Sinclair (and Amstrad) employee, and I have no reason to disbelieve him. If anyone could get CP/M running on a Vax it would be the boffins at Sinclair.

    He or you must remember incorrectly, I suspect - unless they used CP/M under emulation (and I just can't understand why you'd want to do that, unless you just wanted to use the Z80 emulator). CP/M was never available for the VAX (it was an extremely limited 8 bit OS, the VAX is a 32 bit system) and I can't think of any rational reason why you'd want to cripple a VAX by porting it, which would be a monumental task for a small company, considering that CP/M is closed source and you'd be porting to a completely foreign architecture. If you wanted to be cheap, you'd run BSD on the VAX instead of VMS.

    Incidentally, the ZDNet article is pretty inaccurate too - when Amstrad took over Sinclair, the VAX was very much not obsolete, and very much supported by DEC (and DEC would have happily supplied you an 8 inch disk drive if you really needed one). Even the earliest VAX (the VAX 11/780) was only a few years old at the time, and minicomputers like that were expected to be in service for over a decade at most places. DEC made the VAX into the 1990s, and VMS was kept up on the machine until a couple of years ago. The article is also inaccurate about it being impossible to reconstruct the original device (i.e. a hardware Speccy) - most parts are still manufactured, and the ULA could easily be made in an FPGA.
  • edited April 2007
    Originally I developed on a 48K with a Dk'tronics keyboard and a Waferdrive system. My code was split into a large number of small BASIC programs, as my compiler used code in REM statements which were compiled to set locations in memory, all of which would finally be gathered together into a single piece of code. This was a nightmare: if one section grew too large and started overrunning the next block, which was often the case, I'd have to rearrange all the code!

    These days I still split my code amongst different files, logically grouped by content (graphics, AY, maths, keyboard, sprites etc.), but obviously I can now just include them all into a master game file.
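    A minimal sketch of that master-game-file idea with a modern cross-assembler such as sjasmplus (the file and routine names below are invented for illustration):

                DEVICE ZXSPECTRUM48      ; sjasmplus: build a 48K memory image
                ORG 32768
        start:  EI                       ; run the main loop off the frame interrupt
        loop:   HALT
                CALL play_music_frame    ; routines are defined in the included modules
                CALL read_keyboard
                JR   loop

                INCLUDE "sprites.asm"    ; each logical group kept in its own file,
                INCLUDE "ay.asm"         ; pulled into the master file at the end
                INCLUDE "keyboard.asm"
                INCLUDE "maths.asm"

                SAVESNA "game.sna",start ; write a snapshot ready for an emulator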
  • zub
    edited April 2007
    Winston wrote: »
    But you do if you want to run CP/M on the machine, which was only ever available for Z80, 8080 and later 8086 based systems.

    While I'm sure you're right about VAXen, I was quite surprised when I found out that CP/M also ran on the M68000 and the Zilog Z8000. There's also GEMDOS, but exactly how much of a difference there is between CP/M-68K and GEMDOS, I don't know. :)
    FUSE: the Free Unix Spectrum Emulator, also for Windows, OS X and more!
    http://fuse-emulator.sourceforge.net/
  • edited May 2007
    DEATH wrote: »
    I've always wondered about the various mastering utilities though - who has the Speedlock SAVE code, or Powerload etc. Somebody somewhere must have the authoring software.

    It's possible to create new tapes using those systems without the original SAVE routines - provided you know enough about how they work. I've made a few new Speedlock TZXs myself... for example, Amaurote:

    http://www.markboyd.myby.co.uk/amaurote.tzx

    I made one using Speedlock 2 as well (or the later revision of Speedlock 1 if you use the old numbering system like wot I do) but I can't release that as it's a denied title :)

    Marko
  • edited May 2007
    There is actually no need to keep the whole program in RAM while developing it. Since nobody can program/debug 10KB or even 1KB of code in half an hour, only a small part of the source needs to be in RAM at any one time. All this leads us directly to jump tables, which were used in a lot of programs.
    The most used subroutines and locations are 'listed' in a jump table, so address changes during programming will not affect calls from other, already finished parts.
    Such jump tables do little harm once the program is finished, so they are often left in the final product.
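    A minimal sketch of the jump-table idea (addresses and routine names are illustrative): separately assembled parts always call the fixed table entries, so the real routines can move each time a module is reassembled without breaking the parts that are already finished.

                ORG 32768              ; the table sits at a fixed, agreed address
        jptab:  JP  print_text         ; entry at 32768 (each JP takes 3 bytes)
                JP  draw_sprite        ; entry at 32771
                JP  scan_keys          ; entry at 32774

        ; code assembled elsewhere just calls the fixed slots, e.g. CALL 32771

                ORG 33000              ; the routines themselves can live anywhere
        print_text:  RET               ; stubs standing in for the real routines
        draw_sprite: RET
        scan_keys:   RET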
  • edited May 2007
    x86 is very similar to Z80 (indeed, you can directly translate Z80 code to 8086 code). In any case, QDOS (which became MS-DOS) was not a port of CP/M (illegal or otherwise) any more than Linux is a port of Unix.

    The early IBM PC and previous CP/M machines were architecturally pretty similar. The same thing cannot be said about the VAX platform. The 8086 is a segmented, 64K-address-space style of architecture (and so not that alien compared to a Z80 with paging). Finally, it would be insanity to spend large sums of money on a VAX cluster and then cripple it with something like CP/M (to do which you'd need to totally rewrite CP/M to translate it to the VAX platform, as well as add the things that VMS already has, like clustering - after all, it was a network of VAXen). If you actually wanted to run CP/M you'd spend 1/10th of the cost of a VAX and buy a real CP/M based personal computer. The VAX is a multi-user, multi-tasking minicomputer; CP/M is a single-user, single-tasking operating system for personal computers.

    If they did use CP/M on a VAX, it would only have been under emulation and presumably to use the Z80 emulation to test code with decent debugging tools that you otherwise don't get when running on a real Z80, and they'd have been using cross development tools under VAX/VMS (or possibly BSD or Ultrix). The VAXen of the early 1980s were certainly man enough to emulate the Z80 (but not at full speed). Later VAXen could probably emulate the Z80 at full speed - by the mid-80s they were up to a few MIPS.
  • zub
    edited May 2007
    Winston wrote: »
    you'd need to totally rewrite CP/M to translate it to the VAX platform

    Large chunks of CP/M 3 were written in a high-level language, PL/M. There was a compiler for PL/M on the VAX. While it seems unlikely that anyone would have wanted to port CP/M to the VAX, I doubt that it would have taken as much work as you think.

    FWIW, in CP/M 3, wc -l *.ASM gives 30957 lines of 8080 assembly, whereas wc -l *.PLM shows 24983 lines in PL/M. Judging from the copyright messages, the code seems to date from 1982.
    FUSE: the Free Unix Spectrum Emulator, also for Windows, OS X and more!
    http://fuse-emulator.sourceforge.net/
  • edited May 2007
    ghbearman wrote: »
    ...and hopefully it will get TZX'd soon... provided I can get the tape I bought off eBay to load. :)

    EDIT: I recall coding assembler in REM statements as well; I wrote a turboload save routine this way - must have been 1984? Maybe I used the same program, I don't remember.

    Yes, my Currah MicroSource would compile from REM ! statements. Having a hardware assembler kept memory free for development (it only required about 2K of stack space during compilation) and you didn't have to load it as it was there from switch on.

    I used to draw out a memory map and decide where I was going to place graphics, buffers, maps and various source code routines. These would be saved on different tapes (after updating a source code module, I would save it as a NEW version on the tape dedicated to that module).

    I used to hate it when your code crashed: you had to SAVE the source code before running it in case it did crash, and then you had to load it all up again, knowing it had a bug that you then had to fix.

    Oh, how much easier it is nowadays, cross-compiling using TASM on the PC onto an emulator.
  • edited May 2007
    When I was 15 or 16 years old I wrote a letter (by hand) to Andy Severn from Players Software, asking him about coding for the Spectrum and what it was like, how they did it, where to learn more about Z80 and so on - we didn't have the internet way back then. What I do remember from his letter was that he said they programmed on a CP/M machine and cross-assembled the code for the Speccy. I also remember him saying that the hardware was a CPC running CP/M. I don't recall VAX or VMS in the mix, but this was Players Software, not Ocean and Imagine :grin:
  • edited May 2007
    icabod wrote: »
    I never ran CP/M on a Vax, tho' I did get it running on a Henry Hoover, and I've had Ubuntu Linux running on a Dyson.


    <pedant>

    What the hell's a Henry Hoover? That's a bit like a Commodore Macintosh, or a Vauxhall Mondeo. Those "Henry" vacuum cleaners with the comedy faces on them are made by a company called Numatic.

    </pedant>
  • edited May 2007
    aowen wrote: »
    They're also the most rugged vacuum cleaners you can buy. Most ships I've sailed in have them. They can be used without a bag for wet materials, but there's another cleaner (a blue one) designed for dealing with liquids. Can't remember the name of that one though.

    George? Charles? James?