If you replace the ULA in the original machine with one that supports more graphics modes does it cease to be a Spectrum?
Most definitely Yes!
It is the simplicity of the original 48K machine that is so endearing and attracted, and still attracts, so many add-on manufacturers.
When Alan Turing saw what the National Physical Laboratory had done with his designs for the ACE in 1950 he said it was "more in the American tradition of solving one's difficulties by means of much equipment rather than by thought."
This is the trap that Timex and Amstrad fell into.
The other endearing feature is the BASIC - so deceptively simple, and the only implementation to place all the user's memory at the disposal of the programmer at the start of every statement.
To take such trouble marked the author as a genius and an original thinker then. That the UK government has just given him more than £400,000 to apply his mathematical analysis to problems in quantum physics confirms his Turing-like standing. With the possible exception of some of the Acorn crew no other 80s system guys have so endured.
It is a pleasure and a privilege to use the BASIC and read his books.
If you replace the ULA in the original machine with one that supports more graphics modes does it cease to be a Spectrum?
Well, if it passes the "Cobra Test", I would consider it a true Spectrum.
The Cobra Test is like the Turing Test:
"A human judge engages in a Cobra game with one Speccy and one clone, each of which tries to appear Speccy. All participants are placed in isolated locations. If the judge cannot reliably tell the clone from the Speccy, the clone is said to have passed the test."
NB: 128K models actually DO NOT pass the test, so they are not COBRA-COMPLETE.
I think someone did a COBRA 'fix' that replaced the "in a,(0xff)" attribute read, or whatever it was, with a big delay loop, for the +2 et al - but it just caused everything to jerk and flicker.
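For anyone wondering what that read actually does: Cobra polls the "floating bus" - on a 48K machine a read from an unattached port returns whatever byte the ULA happens to be fetching from screen memory at that instant, which the game uses to sync with the raster. A much-simplified sketch of how an emulator might model it, assuming the usual 48K timings (224 T-states per line, 64 lines of top border) and ignoring the exact pixel/attribute/idle interleaving and memory contention:

```c
#include <stdint.h>

/* Simplified 48K Spectrum floating-bus model (illustrative only).
 * Real hardware interleaves pixel and attribute fetches in an
 * 8 T-state pattern; here we just return the attribute byte for
 * the character cell the ULA is currently displaying. */

#define TSTATES_PER_LINE   224
#define FIRST_DISPLAY_LINE  64     /* border/retrace lines before the bitmap */
#define DISPLAY_LINES      192
#define FETCH_TSTATES      128     /* T-states per line spent fetching */

extern uint8_t memory[65536];      /* emulated RAM, display file at 0x4000 */

uint8_t floating_bus_read(unsigned frame_tstate)
{
    unsigned line = frame_tstate / TSTATES_PER_LINE;
    unsigned t    = frame_tstate % TSTATES_PER_LINE;

    if (line < FIRST_DISPLAY_LINE ||
        line >= FIRST_DISPLAY_LINE + DISPLAY_LINES ||
        t >= FETCH_TSTATES)
        return 0xFF;               /* border/idle: bus floats high */

    unsigned y   = line - FIRST_DISPLAY_LINE;
    unsigned col = t / (FETCH_TSTATES / 32);          /* 0..31 */
    uint16_t attr = 0x5800 + (y / 8) * 32 + col;      /* attribute file */
    return memory[attr];
}
```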
"A human judge engages in a Cobra game with one Speccy and one clone, each of which tries to appear Speccy. All participants are placed in isolated locations. If the judge cannot reliably tell the clone from the Speccy, the clone is said to have passed the test."
NB: 128K models actually DO NOT pass the test, so they are not COBRA-COMPLETE.
A reasonably good emulator running on a PC could probably pass that test (although depending on how attentive the judge was, you might need to put some effort into getting a monitor to run at 50Hz, and/or throw in some fake dot crawl). Would that count as a true Spectrum?
Only when that PC would take input from a Videoface too.
A 'Spectrum' that acts as one but does not allow the use of all existing hardware interfaces can't be a Spectrum.
OK, let's assume the perfect emulator with all interfaces included: Would such an emulator allow me to continue the hardware experiments which I started on a real Spectrum with a hot soldering iron at hand? This thread is becoming more interesting than I expected!
A reasonably good emulator running on a PC could probably pass that test (although depending on how attentive the judge was, you might need to put some effort into getting a monitor to run at 50Hz, and/or throw in some fake dot crawl). Would that count as a true Spectrum?
I still haven't found an emulator good enough to pass that test. And I don't take into account dot crawl, PAL ghosting, and other analog TV artifacts. IMHO, there are too many people obsessed with those details instead of simply making the emulator perfect at the logical/digital level.
We didn't like those artifacts back in the 80s, we fine-tuned our TVs to display as few artifacts as possible, so I don't see the point of perfectly emulating that other than for nostalgic reasons.
Some artifacts such as scan-lines, aperture grilles and filtering are necessary, because a crystal-clean, scaled-up rendition of a Spectrum screen is simply AWFUL, and CRTs have specific filtering circuitry that TFTs don't, but should.
So I still haven't played Cobra in an emulator that keeps a steady, non-jerky frame rate, without spurious freezes, frame drops, quirks, delayed audio, etc. for a full game. Simply because neither Windows, linux, nor Mac OS is a real-time operating system.
I wrote a ULA kernel some months ago in C, which I later rewrote in pure x86 and PowerPC assembler, but even though I managed to achieve a steady 50 Hz frame rate on the PC, Windows and Mac OS X still conspired to drop spurious frames every once in a while.
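The fiddly part of any such ULA kernel is the bit-shuffled display file. For reference, a minimal sketch of the address maths for the standard 48K layout (nothing here is specific to any particular emulator):

```c
#include <stdint.h>

/* Convert a (column, scanline) position to its addresses in the
 * 48K Spectrum display file. y is 0..191, col is 0..31. */

static inline uint16_t pixel_address(unsigned col, unsigned y)
{
    /* Address bits: 010 y7 y6 y2 y1 y0 y5 y4 y3 c4 c3 c2 c1 c0 */
    return 0x4000
         | ((y & 0xC0) << 5)   /* y7..y6 -> bits 12..11 */
         | ((y & 0x07) << 8)   /* y2..y0 -> bits 10..8  */
         | ((y & 0x38) << 2)   /* y5..y3 -> bits 7..5   */
         | (col & 0x1F);
}

static inline uint16_t attr_address(unsigned col, unsigned y)
{
    /* Attributes are a plain linear 32x24 map at 0x5800. */
    return 0x5800 + (y >> 3) * 32 + (col & 0x1F);
}
```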
I think someone did a COBRA 'fix' that replaced the "in a,(0xff)" attribute read, or whatever it was, with a big delay loop, for the +2 et al - but it just caused everything to jerk and flicker.
To cause Cobra to jerk and flicker is a crime, and "crime is a disease". That's why I always play that game on a real 48K Speccy.
Most definitely Yes!
It is the simplicity of the original 48K machine that is so endearing and attracted, and still attracts, so many add-on manufacturers.
There is an add-on that allows the Spectrum to use the MSX graphics modes. It just plugs in the back and uses the TI graphics chip. I think the SAM started out as a design for a Spectrum graphics add-on as well.
However, your argument that 90% of software will run on the 48K machine is sound. Yet most people around here still seem to prefer the Spectrum 128. There's no accounting for taste I guess.
Well, if it passes the "Cobra Test", I would consider it a true Spectrum.
I suspect this would mean that a Sinclair ZX Spectrum 48K NTSC would not pass either, due to the reduced frame times from running at 60Hz - I don't think that is a particularly ideal definition.
So I still haven't played Cobra in an emulator that keeps a steady, non-jerky frame rate, without spurious freezes, frame drops, quirks, delayed audio, etc. for a full game. Simply because neither Windows, linux, nor Mac OS is a real-time operating system.
I don't think it has much to do with the operating system - simply the graphics and sound are not precisely synchronised on a PC/Mac type machine, so despite emulating, say, precisely enough sound data for 1s at 44.1kHz, the monitor refreshes will not precisely match, so you have to make a choice about what drives what, and the best choice is the sound - which means you will drop/dup a frame once in a blue moon. Incidentally, I seem to recall that Patrik has mentioned that the DS doesn't suffer from this and operates at a synchronised 50Hz, so ZXDS doesn't have this problem.
A secondary complication is that the LCD standard refresh of 60Hz (or the common CRT 60, 70, 72Hz) is not well suited to 50Hz Speccy emulation, so it will introduce some jitter as some frames are onscreen for longer than others. Spectaculator apparently works with 100Hz CRT monitors for a smooth display (though I guess with the once-in-a-blue-moon frame sync issue mentioned above - Jon?).
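Roughly, that "let the sound drive it" approach might look like the sketch below. The emulator and audio functions here are hypothetical stand-ins for whatever API is actually used; the one real number is that 44100/50 is exactly 882 samples per Spectrum frame:

```c
/* Sketch: pace the emulator off the audio device. Each Spectrum frame
 * produces 44100/50 = 882 samples; a (hypothetical) blocking write to
 * the sound card sets the real-world speed. Video is best-effort, so
 * the odd frame gets dropped or shown twice. */

#include <stdint.h>

#define SAMPLES_PER_FRAME (44100 / 50)   /* 882, exact */

/* Hypothetical hooks - not a real API: */
void emulate_one_frame(int16_t *audio_out, uint32_t *framebuffer);
void audio_write_blocking(const int16_t *samples, int count);
int  display_ready(void);                /* non-zero if a vsync slot is free */
void display_present(const uint32_t *framebuffer);

void run(void)
{
    static int16_t  audio[SAMPLES_PER_FRAME];
    static uint32_t frame[256 * 192];

    for (;;) {
        emulate_one_frame(audio, frame);

        /* The blocking audio write is the clock: it returns when the
         * card has room, i.e. at (close to) real Spectrum speed. */
        audio_write_blocking(audio, SAMPLES_PER_FRAME);

        /* Video is best-effort: if the display isn't ready, this
         * Spectrum frame is simply never shown (a "dropped" frame). */
        if (display_ready())
            display_present(frame);
    }
}
```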
When I first started out doing 'Saucer', and getting a few members of this board to play test it, the 'star field' and stuff jerked around. I couldn't test it on a real speccy, just emulators. I had to mess with code to make it work like it should. Still not sure how it looks on a real speccy!
I don't think it has much to do with the operating system - simply the graphics and sound are not precisely synchronised on a PC/Mac type machine,
With a thorough selection of graphics/sound cards and a proper PAL TV output it's feasible to achieve perfectly synchronized audio/video and smooth frame updates.
The problem is that those OSes are not real time, so the system API cannot guarantee a certain process will be completed in, say, 1/50 of a second. Another process with higher priority, such as a memory swap, buffer flush, etc., can steal the CPU for a Spectrum frame's worth of time, or longer, so you have the flicker.
It's possible to trim the OS to the minimum: kill unnecessary processes, disable network, have only the emulator running, raise the priority of the emulator, etc. but even in that "best case scenario", something steals the CPU and popppp!!!
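For what it's worth, "raise the priority of the emulator" on the POSIX side usually means something like the sketch below (SCHED_FIFO needs root or the right capability; Windows has rough equivalents in SetPriorityClass/SetThreadPriority). It is still only a request, not a hard real-time guarantee:

```c
#include <pthread.h>
#include <sched.h>
#include <stdio.h>
#include <sys/mman.h>

/* Ask the OS to treat the emulation thread as (soft) real-time and to
 * keep our pages resident, so neither the scheduler nor the pager
 * steals a 20 ms frame from us. Still no hard guarantee, of course. */
static void make_soft_realtime(void)
{
    struct sched_param sp = { .sched_priority = 50 };

    if (pthread_setschedparam(pthread_self(), SCHED_FIFO, &sp) != 0)
        fprintf(stderr, "SCHED_FIFO refused (need privileges?)\n");

    if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0)
        fprintf(stderr, "mlockall failed\n");
}

int main(void)
{
    make_soft_realtime();
    /* ... emulator main loop would go here ... */
    return 0;
}
```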
Incidentally, I seem to recall that Patrik has mentioned that the DS doesn't suffer from this and operates at a synchronised 50Hz, so ZXDS doesn't have this problem.
Sounds good! I'll have a peek at ZXDS later. But even though the NDS may be able to sync at 50 Hz, the point is that the NDS does not run a preemptive multitasking OS, the emulator has all the machine's power, and can guarantee a frame will be completed on time.
A secondary complication is that the LCD standard refresh of 60Hz (or the common CRT 60, 70, 72Hz) is not well suited to 50Hz Speccy emulation, so it will introduce some jitter as some frames are onscreen for longer than others.
It would be feasible to work around the 50 Hz refresh on standard 60 Hz LCDs: simply render to a double framebuffer at 50 Hz, and blend a fraction of one frame and the next at 60 Hz, "similar" to what has been done for ages to convert 24 fps film to 60 Hz.
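Presumably something along these lines - a sketch of just the blend step, treating each pixel as an 8-bit grey value for brevity (a real implementation would blend each RGB channel the same way):

```c
#include <stdint.h>

/* Blend two consecutive 50 Hz emulator frames into one 60 Hz output
 * frame.
 *
 * For output refresh number t (at 60 Hz), the position on the 50 Hz
 * timeline is t * 50/60, so:
 *   frac = (t * 50.0 / 60.0) - floor(t * 50.0 / 60.0)
 * and 'prev'/'next' are the 50 Hz frames either side of that point. */
void blend_frames(uint8_t *out, const uint8_t *prev, const uint8_t *next,
                  double frac, int npixels)
{
    for (int i = 0; i < npixels; i++)
        out[i] = (uint8_t)((1.0 - frac) * prev[i] + frac * next[i] + 0.5);
}
```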
It's possible to trim the OS to the minimum: kill unnecessary processes, disable network, have only the emulator running, raise the priority of the emulator, etc. but even in that "best case scenario", something steals the CPU and popppp!!!
Do you actually have evidence of this happening on a reasonable spec modern PC, or are you just quoting buzzwords?
Do you actually have evidence of this happening on a reasonable spec modern PC, or are you just quoting buzzwords?
I have 3 reasonable spec modern PCs (one of them PAL output capable) and a reasonable spec modern Mac, and still haven't managed to run Cobra at a steady full 50 Hz frame rate for a full game without missing a frame, without a spurious flicker or without an occasional sprite jerk.
Please note that I'm not putting the blame on emulators -I've been using yours for ages and it works like a charm- but on the OS.
With a thorough selection of graphics/sound cards and a proper PAL TV output it's feasible to achieve perfectly synchronized audio/video and smooth frame updates.
I don't believe that this is true, but would be prepared to look at any evidence you have of synchronised clocks between sound cards and graphics cards suitable for use on a PAL TV for PC/Mac. I'm sure that many people will be interested for their own emulation setups.
The problem is that those OSes are not real time, so the system API cannot guarantee a certain process will be completed in, say, 1/50 of a second. Another process with higher priority, such as a memory swap, buffer flush, etc., can steal the CPU for a Spectrum frame's worth of time, or longer, so you have the flicker.
I don't think this is the root cause in the vast majority of cases, modern OSs are all pre-emptive, CPUs are very very quick, and an emulator only needs to run for a small portion of time every 20ms. Running on a heavily-loaded system can cause problems if the emulator isn't using high priority tasks to do the job, but I don't think that is the common case. Do you think you see the same problem with DVD playback on those systems? I can't say that I have, and I expect that Speccy emulation is not that different.
While standard Linux, OS X and Windows are not hard real-time OSs, they all feature high-priority tasks that are not so prone to this issue.
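This is easy enough to measure rather than argue about. A small self-contained test (Linux or recent macOS; plain nanosleep and CLOCK_MONOTONIC) that sleeps in 20 ms steps for a minute and reports the worst overshoot the scheduler ever inflicted:

```c
#include <stdio.h>
#include <time.h>

/* Sleep in 20 ms steps for ~60 s and report the worst-case overshoot,
 * i.e. how late the OS ever woke us up. If the max stays well under
 * 20 ms, the scheduler is not the thing dropping emulator frames. */
int main(void)
{
    const long step_ns = 20 * 1000 * 1000;     /* one Spectrum frame */
    struct timespec req = { 0, step_ns }, before, after;
    long worst_ns = 0;

    for (int i = 0; i < 3000; i++) {           /* 3000 * 20 ms = 60 s */
        clock_gettime(CLOCK_MONOTONIC, &before);
        nanosleep(&req, NULL);
        clock_gettime(CLOCK_MONOTONIC, &after);

        long elapsed = (after.tv_sec - before.tv_sec) * 1000000000L
                     + (after.tv_nsec - before.tv_nsec);
        long late = elapsed - step_ns;
        if (late > worst_ns)
            worst_ns = late;
    }
    printf("worst wake-up overshoot: %.2f ms\n", worst_ns / 1e6);
    return 0;
}
```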
It would be feasible to work around the 50 Hz refresh on standard 60 Hz LCDs: simply render to a double framebuffer at 50 Hz, and blend a fraction of one frame and the next at 60 Hz, "similar" to what has been done for ages to convert 24 fps film to 60 Hz.
Looking at this type of approach is one of the things on my to-do list.
I don't believe that this is true, but would be prepared to look at any evidence you have of synchronised clocks between sound cards and graphics cards suitable for use on a PAL TV for PC/Mac.
What evidence do you need? If you have a graphics card that is capable of outputting a true 50.000 Hz PAL video signal, and a decent sound card whose latency you know, where's the problem in synchronizing both? 44,100 Hz is an exact multiple of 50.
I don't think this is the root cause in the vast majority of cases, modern OSs are all pre-emptive, CPUs are very very quick, and an emulator only needs to run for a small portion of time every 20ms.
So what's the root then? Does Cobra run smoothly and without the slightest frame drop, sprite jerk, occasional flicker, etc. on your system?
It's been raining outside all day, I have loads of washing to dry, it's too hot to put the central heating on to warm up the radiators, and I've lost a sock! This bloody "housewifing" is rubbish!
So what's the root then? Does Cobra run smoothly and without the slightest frame drop, sprite jerk, occasional flicker, etc. on your system?
Yes.
Let's put it this way: if you've found something which can actually cause the Linux kernel to prevent an active task running for 20ms, the kernel devs would love to know about it, as that would be a very serious bug in the scheduler. Occam's Razor says something else is going on here.
Let's put it this way: if you've found something which can actually cause the Linux kernel to prevent an active task running for 20ms, the kernel devs would love to know about it, as that would be a very serious bug in the scheduler. Occam's Razor says something else is going on here.
I don't use linux at the moment. Just Mac OS X and XP.
I'm at work, and I've just tested Cobra in both Spectaculator 7.0.1.1310 and Fuse 0.10.0.2 for Windows on my quite powerful office machine (max. CPU usage: 2%) with the very same results: occasional frame miss, occasional sync glitch, occasional sprite flicker.
Also, I've started asking all the people I know who use Speccy emulators whether Cobra runs absolutely smoothly on their systems, and none of them has given me a fully positive response yet.
I'll try to install a decent+recent linux distribution on my most powerful machine. If it runs smoothly, then we've found the issue.
What evidence do you need? If you have a graphics card that is capable of outputting a true 50.000 Hz PAL video signal, and a decent sound card whose latency you know, where's the problem in synchronizing both? 44,100 Hz is an exact multiple of 50.
If there are distinct clock sources (i.e. the sound card has its own timing, the CPU has its own, and the GPU its own again), then they drift relative to one another, which means that they fall out of sync, leading to glitches.
So if you do the "obvious" thing, you wait for the VBLANK, generate the frame and 44100/50 sound samples, play the samples, and still find you get audio glitches where the buffer underruns as the playback rate is not precisely calibrated with the video.
As I mentioned, the DS has a single clock source for all, so doesn't suffer this problem.
I would expect that the evidence would be technical specs for a PC CPU/Video/Sound card, detailing the clock source used for each and how they are synchronised.
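To put a rough number on it - back-of-envelope only, and the 30 ppm mismatch is just an illustrative assumption, not a measured figure for any real card:

```c
#include <stdio.h>

/* Back-of-envelope: how fast do two free-running clocks with a given
 * ppm mismatch diverge, measured in 44.1 kHz samples and in whole
 * 882-sample (20 ms) Spectrum frames? The 30 ppm figure is purely an
 * illustrative assumption. */
int main(void)
{
    const double rate_hz   = 44100.0;
    const double ppm       = 30.0;
    const double drift_sps = rate_hz * ppm / 1e6;   /* samples per second */

    printf("drift: %.3f samples/s\n", drift_sps);
    printf("one full frame (882 samples) of error after %.1f minutes\n",
           882.0 / drift_sps / 60.0);
    /* With 30 ppm: ~1.32 samples/s, a whole frame of error in ~11 min -
     * imperceptible per frame, but enough to under/overrun a small
     * audio buffer during a long game unless something compensates. */
    return 0;
}
```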
So what's the root then? Does Cobra run smoothly and without the slightest frame drop, sprite jerk, occasional flicker, etc. on your system?
I believe that there is more than one thing that could contribute to this list:
1) frame drop, high load on machine/machine not up to the job - I don't think I ever see this, but could probably make it happen
2) sprite jerk, the ratio between display frequency and Spectrum frequency - I intend to look at this one day as I mentioned, and I think I do see this type of effect (basically as different Speccy frames are shown for different periods)
3) occasional flicker, don't know - I've got more than one Cobra TZX, this one (ftp://ftp.worldofspectrum.org/pub/sinclair/games/c/Cobra.tzx.zip) seems to be fine on Fuse to me, but another one I have, cobra_1.tzx, flickers a lot
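To illustrate the cadence issue in (2) above: showing 50 Spectrum frames per second on a 60 Hz display means every fifth Spectrum frame sits on screen for two refreshes, which is exactly the periodic jerk. A trivial check:

```c
#include <stdio.h>

/* Map each 60 Hz display refresh to the 50 Hz Spectrum frame that
 * would be shown (nearest earlier frame, no blending). Counting how
 * often each Spectrum frame appears shows the 10-per-second doubled
 * frames that read as a periodic jerk. */
int main(void)
{
    int shown[50] = { 0 };

    for (int refresh = 0; refresh < 60; refresh++)
        shown[refresh * 50 / 60]++;

    for (int f = 0; f < 50; f++)
        if (shown[f] > 1)
            printf("Spectrum frame %2d held for %d refreshes\n", f, shown[f]);
    return 0;
}
```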
I'm at work, and I've just tested Cobra in both Spectaculator 7.0.1.1310 and Fuse 0.10.0.2 for Windows on my quite powerful office machine (max. CPU usage: 2%) with the very same results: occasional frame miss, occasional sync glitch, occasional sprite flicker.
Did you run Spectaculator with 100Hz mode and a 100Hz CRT?
Also, I've started asking all the people I know who use Speccy emulators whether Cobra runs absolutely smoothly on their systems, and none of them has given me a fully positive response yet.
If there are distinct clock sources (i.e. the sound card has its own timing, the CPU has its own, and the GPU its own again), then they drift relative to one another, which means that they fall out of sync, leading to glitches.
Of course, but doesn't a mechanism exist to sync/compensate for this drift? Can the drift be serious enough to be noticed even if it is compensated for, say, every frame?
Why isn't this drift noticed in video playback?
I must confess I am not an experienced audiovisual "sync'er" on Windows machinery (all my Windows programs are actually MUTE), so I don't have answers to these questions and would appreciate further info.
So if you do the "obvious" thing, you wait for the VBLANK, generate the frame and 44100/50 sound samples, play the samples, and still find you get audio glitches where the buffer underruns as the playback rate is not precisely calibrated with the video.
So it's not possible to generate extra audio data (at least as much as the total drift tolerance of both cards, and that must be VERY FEW samples per frame) just in case the audio buffer underruns, and then compensate for the extra samples that have been played?
An adjustment of a few samples should not be noticed, as long as this adjustment is done on a frame basis (or every few frames).
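That nudging could be as simple as the sketch below, which picks how many samples to synthesise for the next frame from how full the output queue currently is (the queue-depth numbers are passed in, because the real query depends entirely on the sound API):

```c
/* Sketch: decide how many samples to generate for the next emulated
 * frame so the output queue hovers around a target depth. A nudge of
 * one or two samples per 882-sample frame (well under 0.25%) is a
 * tiny pitch change, yet easily outruns crystal drift. */

#define SAMPLES_PER_FRAME 882          /* 44100 / 50 */
#define MAX_NUDGE           2          /* samples added/removed per frame */

int samples_for_next_frame(int queued_samples, int target_queued)
{
    int n = SAMPLES_PER_FRAME;

    if (queued_samples < target_queued - SAMPLES_PER_FRAME / 2)
        n += MAX_NUDGE;                /* running dry: stretch slightly */
    else if (queued_samples > target_queued + SAMPLES_PER_FRAME / 2)
        n -= MAX_NUDGE;                /* backing up: shrink slightly  */

    return n;
}
```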
No. All the people I have asked so far are Wintel-heads :-o but I will keep asking :)
Hey, and I absolutely believe ZXDS runs smoothly (I'll try to check it out this afternoon!). I have only ranted about Windows, Mac OS X and linux! LOL!
If there are distinct clock sources (i.e. the sound card has its own timing, the CPU has its own, and the GPU its own again), then they drift relative to one another, which means that they fall out of sync, leading to glitches.
But the accuracy of even cheap quartz crystal oscillators is such that sufficient drift between the clocks at audio frequencies would probably take hours of runtime to actually manifest itself as a glitch - certainly not 1/50th of a second.
Of course, but doesn't a mechanism exist to sync/compensate for this drift? Can the drift be serious enough to be noticed even if it is compensated for, say, every frame?
I believe the standard approach is to standardise on driving the app with the audio clock, as audio glitches are the most jarring. This means that you will occasionally drop or duplicate a frame.
Why isn't this drift noticed in video playback?
The only approach I am aware of is that you do drop/gain the odd frame and it isn't very noticeable as there will be some time passing before the frame is dropped or gained.
So it's not possible to generate extra audio data (at least as much as the total drift tolerance of both cards, and that must be VERY FEW samples per frame) just in case the audio buffer underruns, and then compensate for the extra samples that have been played?
Fuse for example does generate a frame ahead to ensure that there is enough data to cover the rate of consumption by the sound card (or uses APIs that block when the sound queue is full).
I think that most APIs require you to submit sound to be played, so you will only find out that you don't have enough sound when asked for another sound fragment (when you need more than just the sample or two that you were short), or if you had too much sound you will back up sound in your output queue.
I think "not possible" is a big call, but I think the details are likely to be very platform-specific.
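As a sketch of the "block when the queue is full" idea: a plain single-producer/single-consumer ring of sample blocks (C11 atomics, busy-waiting for brevity; in a real emulator the consumer would be the sound API's callback rather than hand-rolled):

```c
#include <stdatomic.h>
#include <stdint.h>

/* Minimal single-producer/single-consumer ring of audio blocks.
 * The emulator (producer) waits when the ring is full - that back
 * pressure is what locks emulation speed to the sound card. */

#define BLOCK_SAMPLES 882      /* one 50 Hz frame of 44.1 kHz audio */
#define RING_BLOCKS     4      /* ~80 ms of buffered sound */

static int16_t     ring[RING_BLOCKS][BLOCK_SAMPLES];
static atomic_uint head;       /* next block to write (producer) */
static atomic_uint tail;       /* next block to read (consumer)  */

/* Producer: called once per emulated frame. Busy-waits (for brevity)
 * while the queue is full. */
void ring_push(const int16_t *samples)
{
    unsigned h = atomic_load(&head);
    while (h - atomic_load(&tail) == RING_BLOCKS)
        ;                                   /* full: wait for the card */
    for (int i = 0; i < BLOCK_SAMPLES; i++)
        ring[h % RING_BLOCKS][i] = samples[i];
    atomic_store(&head, h + 1);
}

/* Consumer: called from the audio callback. Returns 0 on underrun. */
int ring_pop(int16_t *out)
{
    unsigned t = atomic_load(&tail);
    if (atomic_load(&head) == t)
        return 0;                           /* underrun: caller plays silence */
    for (int i = 0; i < BLOCK_SAMPLES; i++)
        out[i] = ring[t % RING_BLOCKS][i];
    atomic_store(&tail, t + 1);
    return 1;
}
```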
Comments
http://www.youtube.com/watch?v=R9L450iferM
The same thing happened to Terra Cresta - but that was done in-house after I left Ocean. Nothing to do with me!
I could take a look if you're still interested and that's all that's necessary to reignite your rapid Saucer development ;)
Stripped socks! What else? :oops:
Have they been using a 100Hz CRT or ZXDS?
Hey, interesting topic huh? :-)
Not at work, but at home I run both emulators with a PAL 50 Hz CRT.