HDMI generation by cloning video ram

edited May 2014 in Hardware
(coming from http://www.worldofspectrum.org/forums/showthread.php?p=768257&posted=1#post768257).

A thing I was brainstorming about..
Why not use an FPGA/microcontroller with some separate memory to build an HDMI signal, having the FPGA/microcontroller respond to any video RAM change in the ZX Spectrum?
So basically only sensing the address and data lines.

No analog circuits, no video RAM reading, no contention problems.
Simply transparent, innovative digital video cloning into separate RAM, with HDMI output...

No more need for a ULA ;)
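
To make the idea a bit more concrete, here is a rough Verilog sketch of how the write-snooping side could work (module and signal names are my own assumptions, not a tested design): synchronise the Z80's /MREQ and /WR into the FPGA clock domain, and on every write into the 48K screen area mirror the byte into a dual-port shadow RAM that the video side reads independently.

    // Hypothetical sketch: snoop Z80 writes to the screen area
    // ($4000-$5AFF) and mirror them into a shadow block RAM that
    // the DVI/HDMI side reads on its own clock and schedule.
    module vram_snoop (
        input  wire        clk,      // FPGA clock, much faster than the Z80
        input  wire [15:0] a,        // Z80 address bus (sensed only)
        input  wire [7:0]  d,        // Z80 data bus (sensed only)
        input  wire        mreq_n,   // Z80 /MREQ
        input  wire        wr_n,     // Z80 /WR
        input  wire [12:0] rd_addr,  // read port, driven by the video side
        output reg  [7:0]  rd_data
    );
        reg [7:0] shadow [0:6911];   // 6912 bytes: bitmap + attributes

        // Synchronise the asynchronous write strobe, then detect the
        // cycle where the write becomes active (data is already valid).
        reg [2:0] wr_sync;
        always @(posedge clk)
            wr_sync <= {wr_sync[1:0], ~(mreq_n | wr_n)};
        wire wr_pulse = wr_sync[1] & ~wr_sync[2];  // one clk-wide pulse

        always @(posedge clk) begin
            // Capture only writes that land in the screen area
            if (wr_pulse && a >= 16'h4000 && a < 16'h5B00)
                shadow[a - 16'h4000] <= d;
            rd_data <= shadow[rd_addr];            // independent video read
        end
    endmodule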

Comments

  • edited May 2014
I'm not sure how it works exactly, but it seems you need to license HDMI products, and they want a big fee: a $10,000 annual fee plus royalties. :o

    http://www.hdmi.org/manufacturer/adopter_registration.aspx

and even if you're using components that are already licensed, there's still a catch.
04/30/2012--CLARIFICATION: DEFINITION OF HDMI® LICENSED PRODUCTS
    An HDMI product that consists entirely of licensed components does not necessarily mean the final product is a licensed product

Probably not an issue if you're making it for yourself as a hobby, but if you wanted to sell it as a commercial product they might get funny about it.

I think DVI is usable without fees. Winston is planning a dev board with DVI output.
It was discussed earlier today, so I've been doing some reading up.
    http://www.worldofspectrum.org/forums/showthread.php?t=47233&page=8
  • edited May 2014
    bverstee wrote: »
    A thing I was brainstorming about..
Why not use an FPGA/microcontroller with some separate memory to build an HDMI signal,

Already working on it. Had a design in mind around 2 years ago, but just been too lazy to do anything about it - but the FPGA dev board I'm working on has this in mind as its primary purpose - new display options for the Spectrum. I'm about halfway done with the PCB layout. There are already a couple of DVI/HDMI implementations in Verilog which will work with the FPGA that I'm using (Spartan-6).

    However, please call it DVI. DVI and HDMI are actually compatible (video only), but the latter comes with disproportionate licensing fees.

    The FPGA board should also have enough resources to support DisplayPort but DVI/HDMI is a lot more straightforward (basically, it's a lot like VGA in its timings, only digital).
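
To illustrate the "a lot like VGA, only digital" point: the heart of a DVI output is just the familiar VGA-style horizontal/vertical counters, with the sync and data-enable signals feeding a TMDS encoder instead of analogue DACs. A minimal sketch using the standard 640x480@60 numbers (module and signal names are my own; sync polarity is left to the encoder):

    // VGA-style timing counters for 640x480@60: 800x525 total,
    // ~25.175 MHz pixel clock. The same x/y/de/sync outputs that
    // would drive a VGA DAC instead feed TMDS encoders for DVI.
    module dvi_timing (
        input  wire       clk25,  // ~25.175 MHz pixel clock
        output reg  [9:0] x,      // 0..799
        output reg  [9:0] y,      // 0..524
        output wire       hsync,  // pulse window (polarity per encoder)
        output wire       vsync,
        output wire       de      // data enable: high in the visible area
    );
        always @(posedge clk25) begin
            if (x == 10'd799) begin
                x <= 10'd0;
                y <= (y == 10'd524) ? 10'd0 : y + 10'd1;
            end else
                x <= x + 10'd1;
        end

        assign de    = (x < 10'd640) && (y < 10'd480);
        assign hsync = (x >= 10'd656) && (x < 10'd752); // 16 fp, 96 sync
        assign vsync = (y >= 10'd490) && (y < 10'd492); // 10 fp, 2 sync
    endmodule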
  • edited May 2014
The alternative is to move all your assets and factory to China and sell the stuff on eBay, while blowing raspberries at the HDMI consortium. :roll:
  • edited May 2014
    Winston wrote: »
Already working on it. Had a design in mind around 2 years ago, but just been too lazy to do anything about it - but the FPGA dev board I'm working on has this in mind as its primary purpose - new display options for the Spectrum. I'm about halfway done with the PCB layout. There are already a couple of DVI/HDMI implementations in Verilog which will work with the FPGA that I'm using (Spartan-6).

    However, please call it DVI. DVI and HDMI are actually compatible (video only), but the latter comes with disproportionate licensing fees.

    The FPGA board should also have enough resources to support DisplayPort but DVI/HDMI is a lot more straightforward (basically, it's a lot like VGA in its timings, only digital).

Sounds great, Winston, and I am very certain there will be a huge demand for it!
  • edited May 2014
    bverstee wrote: »
    (coming from http://www.worldofspectrum.org/forums/showthread.php?p=768257&posted=1#post768257).

    A thing I was brainstorming about..
Why not use an FPGA/microcontroller with some separate memory to build an HDMI signal, having the FPGA/microcontroller respond to any video RAM change in the ZX Spectrum?
So basically only sensing the address and data lines.

No analog circuits, no video RAM reading, no contention problems.
Simply transparent, innovative digital video cloning into separate RAM, with HDMI output...

    No more need for a ULA ;)

    Well, that is what I was trying to say elsewhere.
If you wanted it to display current multicolour tricks correctly, it would have to sync with the timing.
BUT, if you made it so that it could detect OUTs and mimic the ULA plus palette and multicolour, and 128K paging, then maybe the timing could be ignored (see the sketch below).

    Although, if you do ignore the timing - even if you're in step with the 50Hz refresh, scrolling games like Cobra and Hysteria flicker and tear like crazy as they rely on raster-chasing.
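
Going back to the detect-OUTs idea above, a rough sketch of the I/O snooping half (signal names and the partial address decoding are assumptions; the real machines decode port $FE as "A0 low" and the 128K paging port $7FFD as "A15 and A1 low"):

    // Watch Z80 I/O writes and latch the bits a clone ULA would
    // care about: border colour and the 128K screen-select bit.
    module out_snoop (
        input  wire        clk,
        input  wire [15:0] a,
        input  wire [7:0]  d,
        input  wire        iorq_n,
        input  wire        wr_n,
        output reg  [2:0]  border,        // port $FE, bits 0-2
        output reg         shadow_screen  // port $7FFD bit 3: show bank 7
    );
        reg [2:0] io_sync;
        always @(posedge clk)
            io_sync <= {io_sync[1:0], ~(iorq_n | wr_n)};
        wire io_pulse = io_sync[1] & ~io_sync[2];

        always @(posedge clk) if (io_pulse) begin
            if (!a[0])                    // any even port hits the ULA
                border <= d[2:0];
            if (!a[15] && !a[1] && a[0])  // $7FFD (partial decode)
                shadow_screen <= d[3];
        end
    endmodule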
    Joefish
    - IONIAN-GAMES.com -
  • edited May 2014
    joefish wrote: »
    Although, if you do ignore the timing - even if you're in step with the 50Hz refresh, scrolling games like Cobra and Hysteria flicker and tear like crazy as they rely on raster-chasing.

    This was the reason for my Verilog that does clock recovery, basically to keep the 'virtual scanline' perfectly in sync with where the Spectrum thinks it is so that the various timing tricks still work.

I suspect the Spectra (I don't have one, so I can't be sure!) does it with a sync separator, using the hsync/vsync to do this job, but this limits you to the 48K machines only, since the others don't have the YUV signals on the edge connector (this is why I want to use clock recovery, to be as trouble-free as possible).

The only issue with my clock recovery is the level of jitter. I've since found that some displays will actually sync OK to an HDMI signal with Spectrum timings at a 50Hz refresh (someone did this in a clone on a Pipistrello FPGA dev board), but I think the clock recovery circuit will have too much jitter for the display, especially at the clock frequency the display will want. I'll get to experiment with it when I've finished laying out the board and got some PCBs (I have some ideas on how to get the jitter down to acceptable levels). For other screen refresh rates it doesn't matter, since then you'll be doing double buffering in the manner you'd do it for a VGA converter, and the output of the second buffer will have its own (high-quality) clock.
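
For what it's worth, the position-tracking half of that (leaving aside the hard part, the jitter) might look something like the sketch below: sample the Spectrum's ~3.5MHz CPU clock with a faster FPGA clock, count T-states, and realign once per frame. 48K timings (224 T-states per line, 312 lines per frame) and all names are my assumptions:

    // Track the ULA's "virtual scanline" by counting recovered
    // T-states. frame_sync is assumed to be an already-synchronised
    // pulse derived from the Spectrum's once-per-frame /INT.
    module scanline_track (
        input  wire       clk,        // fast FPGA clock (>= ~7x CPU clock)
        input  wire       cpu_clk,    // Spectrum CPU clock, asynchronous
        input  wire       frame_sync, // one pulse per frame (from /INT)
        output reg  [7:0] tstate,     // 0..223 within the line
        output reg  [8:0] line        // 0..311 within the frame
    );
        reg [2:0] ck_sync;
        always @(posedge clk)
            ck_sync <= {ck_sync[1:0], cpu_clk};
        wire tick = ck_sync[1] & ~ck_sync[2];  // one pulse per CPU cycle

        always @(posedge clk) begin
            if (frame_sync) begin              // realign once per frame
                tstate <= 8'd0;
                line   <= 9'd0;
            end else if (tick) begin
                if (tstate == 8'd223) begin
                    tstate <= 8'd0;
                    line   <= (line == 9'd311) ? 9'd0 : line + 9'd1;
                end else
                    tstate <= tstate + 8'd1;
            end
        end
    endmodule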
  • edited May 2014
    Good idea, a DVI/HDMI solution for my toastrack will be great.
I'm now using an RGB to VGA converter and it works, but I get small white dots when using my toastrack.
  • edited May 2014
    Summarising suggestions from:
    http://www.worldofspectrum.org/forums/showthread.php?t=47233

Besides HDMI, SCART and VGA output would be handy. Although if it had both analogue and digital DVI out, it could be turned into either VGA or HDMI with a suitable adaptor. And flat LCD VGA monitors are quite cheap, cheaper still than early standard-ratio LCD TVs with SCART input. So yeah, forget SCART.

Support for the ULA+ palette and for multicolour mode (with attributes at $6000) is highly desirable in an external video card. (Other features of ULA+, less so.) Screen paging is desirable, but this can be achieved by duplicating the 128K page-mapping behaviour for banks #5 and #7: allow for either screen to be paged into high memory, then skim off writes in the $C000-$FFFF range, and also use the 128K control bit to decide which screen to display (see the sketch below).
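
A hypothetical sketch of that paging behaviour: track the 128K's $7FFD bank-select bits and capture writes to $C000-$FFFF whenever bank #5 or #7 is paged in there (the write strobes are assumed to come from synchroniser logic like the earlier sketches; the finer 6912-byte screen filter is omitted):

    // Decide, for each snooped write, whether it belongs in one of
    // the two shadow screens, and in which one.
    module paged_capture (
        input  wire        clk,
        input  wire [15:0] a,
        input  wire [7:0]  d,
        input  wire        mem_wr_pulse, // synchronised memory-write strobe
        input  wire        io_wr_pulse,  // synchronised I/O-write strobe
        output reg         capture,      // write lands in a shadow screen
        output reg         which_screen  // 0: bank #5 screen, 1: bank #7
    );
        reg [2:0] paged_bank;            // $7FFD bits 0-2: bank at $C000

        always @(posedge clk) begin
            if (io_wr_pulse && !a[15] && !a[1])   // write to $7FFD
                paged_bank <= d[2:0];

            capture <= 1'b0;
            if (mem_wr_pulse) begin
                if (a[15:14] == 2'b01) begin      // $4000-$7FFF: bank #5
                    capture      <= 1'b1;
                    which_screen <= 1'b0;
                end else if (a[15:14] == 2'b11 && // $C000-$FFFF
                             (paged_bank == 3'd5 || paged_bank == 3'd7)) begin
                    capture      <= 1'b1;
                    which_screen <= paged_bank[1]; // #5 -> 0, #7 -> 1
                end
            end
        end
    endmodule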

I would suggest a separate OUT port be used to choose which of the two displays is mapped over the $4000-$7FFF normal screen address range, so that the developer can take advantage of the page flipping even from a 48K Spectrum. This port may duplicate the control bit from the 128K that controls which screen is displayed.
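
A purely hypothetical sketch of that extra port (the port number, $EF, is invented for illustration): bit 0 selects which shadow screen writes to $4000-$7FFF land in, and bit 1 mirrors the 128K's "which screen is displayed" control bit.

    module flip_port (
        input  wire       clk,
        input  wire [7:0] a_lo,         // low byte of the Z80 address bus
        input  wire [7:0] d,
        input  wire       io_wr_pulse,  // synchronised I/O-write strobe
        output reg        write_bank,   // 0: write into bank #5 screen, 1: #7
        output reg        show_bank     // which screen the video side shows
    );
        always @(posedge clk)
            if (io_wr_pulse && a_lo == 8'hEF) begin
                write_bank <= d[0];
                show_bank  <= d[1];
            end
    endmodule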

    Also, as a bonus to (a) sell more of these adaptors and (b) have a little fun, adding a hardware switch to make the memory cloning listen for writes in the $0000-$3FFF range (the ROM) instead of $4000-$7FFF would allow two devices to be used simultaneously to generate two separate screens. That's also assuming all controls were write-only, and any palette/paging controls would then happen simultaneously on both connected devices. The developer would use the extra page-flipping control described above. The device re-mapped to the ROM area would no longer respond to 128K page mapping.

    Finally, if the device has a Kempston joystick port built-in, a second device, when set to second display mode, would change its joystick port to respond to IN 55! :D

    Unless someone is clever enough to make one device detect the presence of the other and automatically re-configure themselves as master/slave?
    Joefish
    - IONIAN-GAMES.com -
  • edited May 2014
    It's certainly possible to get audio through a DVI-D connector, as my HTPC outputs to my TV via one, using a special cable with an HDMI connector at the other end. However, I suspect that they're just driving it as though it's an HDMI port with a different pinout, and it won't support the full DVI standard when it's doing that.

    Anyway, if you do want to use HDMI connectors as they're much smaller and you could use more commonly available cables, you'd only need to pay the (exorbitant) licensing fee if you want to put the logo on the box. Otherwise you can buy the sockets in bulk for less than a pound each.
  • edited May 2014
    Matt_B wrote: »
    Anyway, if you do want to use HDMI connectors as they're much smaller and you could use more commonly available cables, you'd only need to pay the (exorbitant) licensing fee if you want to put the logo on the box. Otherwise you can buy the sockets in bulk for less than a pound each.

This is the approach I've taken for the FPGA development board - basically, use the HDMI connector due to its ubiquity and hope to fly under the radar. It'll be good to know for certain that there's not even a radar that needs to be flown below. However, I was still unsure about the licensing thing (I thought they might have a patent on the connector, rather than it just being a trademark thing; I can't imagine they have a patent on the signalling scheme, since it's already in use by DVI and is hardly novel).
  • edited May 2014
    Matt_B wrote: »
    It's certainly possible to get audio through a DVI-D connector, as my HTPC outputs to my TV via one, using a special cable with an HDMI connector at the other end. However, I suspect that they're just driving it as though it's an HDMI port with a different pinout, and it won't support the full DVI standard when it's doing that.

    Anyway, if you do want to use HDMI connectors as they're much smaller and you could use more commonly available cables, you'd only need to pay the (exorbitant) licensing fee if you want to put the logo on the box. Otherwise you can buy the sockets in bulk for less than a pound each.

I think sticking with DVI on the interface and including (or recommending) a DVI-to-HDMI adapter is a better option.

I didn't think of audio before. I can imagine it's not simple to get AY audio over HDMI, other than by simulating the AY digitally in a CPLD or FPGA so that its output can be sent over HDMI too.
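
For one idea of what that digital simulation could mean, here is a very rough sketch of a single AY tone channel (noise, envelope and mixing omitted; all names are mine): snoop the AY register writes, run the programmed divider, and the resulting square wave could then be resampled into the HDMI audio stream.

    // One AY tone channel: the datasheet gives f_tone = f_AY/(16*TP),
    // so stepping at f_AY/8 and toggling once per TP counts matches.
    module ay_tone (
        input  wire        clk_ay8,  // AY clock divided by 8
        input  wire [11:0] period,   // coarse+fine tone registers combined
        output reg         out       // square wave for this channel
    );
        reg [11:0] cnt;
        always @(posedge clk_ay8) begin
            if (cnt <= 12'd1) begin
                cnt <= (period == 12'd0) ? 12'd1 : period; // 0 acts as 1
                out <= ~out;               // toggle at the programmed rate
            end else
                cnt <= cnt - 12'd1;
        end
    endmodule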
  • edited May 2014
To be fair, they would be unlikely to go after you for money, as you have very little compared to what the case would cost. I think it's more likely they would make a complaint under the CE or WEEE regulations (which you are also probably avoiding) and then have a government body shut you down for free.
  • edited May 2014
    smogit wrote: »
To be fair, they would be unlikely to go after you for money, as you have very little compared to what the case would cost. I think it's more likely they would make a complaint under the CE or WEEE regulations (which you are also probably avoiding) and then have a government body shut you down for free.

Yes, and it's not like they ever seem to have gone after the manufacturers of HDMI to DisplayPort cables either, despite kicking up a fuss about them a few years back.

    Anyone who crosses their trademarks is toast though.
  • edited May 2014
    Matt_B wrote: »
Yes, and it's not like they ever seem to have gone after the manufacturers of HDMI to DisplayPort cables either, despite kicking up a fuss about them a few years back.

    Anyone who crosses their trademarks is toast though.

As a hardware producer I wouldn't risk it anyway, all the more so as we have a good alternative, IMHO.