Monday, 15 October 2018

Stable 50Hz and 60Hz video

The main reason for going to 800x600 is that we can get most monitors to do a decent 50Hz mode and a decent 60Hz mode at this resolution.  This means we can support PAL and NTSC in a very natural way.

Unfortunately, these two video modes use different dot clock rates, which complicates things a bit.  The issue is that the VIC-IV feeds pixels out at a fixed 100MHz, and works out internally when it is time for the next real pixel, which is due at either 30MHz or 40MHz, depending on the video mode.  If we do this naively, then the pixels will either jitter or smear on the VGA output, because the edges will not line up exactly from pixel to pixel.  It also just won't work for HDMI output, which requires a perfectly regular pixel clock.
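To give a rough idea of the mechanism (this is just a sketch, with made-up names rather than the actual VIC-IV signals), the "when is the next real pixel due" decision can be thought of as a phase accumulator: add the target dot clock rate to an accumulator every 100MHz tick, and flag a pixel whenever it wraps past 100.

-- Illustrative sketch only: derive a 30MHz or 40MHz pixel strobe from the
-- 100MHz pixel pipeline clock using a phase accumulator.  Names are made up
-- for this example and are not the actual VIC-IV signals.
library ieee;
use ieee.std_logic_1164.all;

entity pixel_strobe_gen is
  port (
    clk100      : in  std_logic;
    pal_mode    : in  std_logic;   -- '1' = 50Hz/30MHz, '0' = 60Hz/40MHz
    pixel_valid : out std_logic    -- pulses when the next real pixel is due
  );
end entity;

architecture rtl of pixel_strobe_gen is
  signal accum : integer range 0 to 199 := 0;
begin
  process (clk100)
    variable step : integer range 0 to 40;
  begin
    if rising_edge(clk100) then
      if pal_mode = '1' then
        step := 30;   -- 30MHz dot clock for the 50Hz mode
      else
        step := 40;   -- 40MHz dot clock for the 60Hz mode
      end if;
      -- Accumulate; each wrap past 100 means the next real pixel is due.
      if accum + step >= 100 then
        accum       <= accum + step - 100;
        pixel_valid <= '1';
      else
        accum       <= accum + step;
        pixel_valid <= '0';
      end if;
    end if;
  end process;
end architecture;

The trouble with doing only this is exactly what follows: the resulting pixel edges land on 100MHz ticks rather than on a clean 30MHz or 40MHz grid.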

I have already put a bunch of logic in to deal with this, but it doesn't seem to be working properly, and as a result, there is visible jitter and sparkle on the edges of pixels, and it really doesn't look good, as you can see here:


It is actually more annoying in real life, because the ragged edges of the pixels crawl up the frame.

So to try to fix this, I have added an extra buffer that holds a whole raster line and, so theory has it, clocks the pixels out using the correct pixel clock.  The only problem is that the image above was taken with it enabled, and the problem is unchanged.  So now it is time to work my way back through the video pipeline and find out where it is all going wrong.

First step, let's check that the buffer is working correctly.  To test this, I have modified the video pipeline so that instead of buffering a real video pixel, it buffers black and white pixels alternately, to make a simple kind of test pattern. Thus, if it is working correctly, I should see stable vertical lines in both the 50Hz and 60Hz video modes.  This takes only a few lines to implement (and then about half an hour to synthesise...), so it isn't too hard to test.
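The sketch below shows the kind of thing I mean (the entity and signal names are placeholders, not the actual pipeline signals): every pixel written into the line buffer simply toggles between black and white.

-- Minimal sketch of the black/white test feed (placeholder names): toggle
-- the buffered pixel between black and white on every pixel write, giving
-- vertical stripes if the buffer and output clocking are working.
library ieee;
use ieee.std_logic_1164.all;

entity bw_test_feed is
  port (
    clk100             : in  std_logic;
    pixel_write_strobe : in  std_logic;                      -- pulses once per pixel
    buffer_din         : out std_logic_vector(23 downto 0)   -- RGB into the line buffer
  );
end entity;

architecture rtl of bw_test_feed is
  signal toggle : std_logic := '0';
begin
  process (clk100)
  begin
    if rising_edge(clk100) then
      if pixel_write_strobe = '1' then
        if toggle = '1' then
          buffer_din <= x"FFFFFF";   -- white
        else
          buffer_din <= x"000000";   -- black
        end if;
        toggle <= not toggle;
      end if;
    end if;
  end process;
end architecture;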

Well, that was the theory. In fact, it all refused to work for me, so I started instead to completely rework the video pipeline handling so that this is all fixed from one end to the other. 

The first step in this was to make an FPGA target with just the video frame generators for PAL and NTSC, and try to make it so that it can be switched from one to the other, with stable pixels in both.  So I made a test pattern (just alternating black and white lines, as before) and the frame drivers, and tried to think of the most synthesis-friendly way to switch the pixel clocks.

What I settled on for now is to use an asynchronous FIFO, which is great for crossing clock domains, plus logic to multiplex between the 30MHz and 40MHz clocks, to pull the pixels out at the correct rate.
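Roughly, the shape of it is like this. This is only a sketch: pixel_fifo stands in for a FIFO Generator core configured with independent read and write clocks (the real core's ports depend on how it is generated), and the clock switching here uses a BUFGMUX primitive, which may or may not be how the final design ends up doing it.

-- Sketch of the clock-domain crossing: pixels are written into an
-- asynchronous FIFO at 100MHz and read out at 30MHz or 40MHz, with the
-- read clock selected by a BUFGMUX.  Names are illustrative only.
library ieee;
use ieee.std_logic_1164.all;
library unisim;
use unisim.vcomponents.all;

entity pixel_cdc is
  port (
    clk100    : in  std_logic;
    clk30     : in  std_logic;
    clk40     : in  std_logic;
    pal_mode  : in  std_logic;   -- '1' selects the 30MHz (50Hz) clock
    reset     : in  std_logic;
    pixel_in  : in  std_logic_vector(23 downto 0);
    pixel_we  : in  std_logic;
    pixel_re  : in  std_logic;
    pixel_out : out std_logic_vector(23 downto 0)
  );
end entity;

architecture rtl of pixel_cdc is
  -- Assumed component for a FIFO Generator core with independent clocks.
  component pixel_fifo
    port (
      rst    : in  std_logic;
      wr_clk : in  std_logic;
      rd_clk : in  std_logic;
      din    : in  std_logic_vector(23 downto 0);
      wr_en  : in  std_logic;
      rd_en  : in  std_logic;
      dout   : out std_logic_vector(23 downto 0);
      full   : out std_logic;
      empty  : out std_logic
    );
  end component;

  signal pixel_clk : std_logic;
begin
  -- Glitch-free selection of the output pixel clock.
  clkmux : BUFGMUX
    port map ( O => pixel_clk, I0 => clk40, I1 => clk30, S => pal_mode );

  fifo0 : pixel_fifo
    port map (
      rst    => reset,
      wr_clk => clk100,
      rd_clk => pixel_clk,
      din    => pixel_in,
      wr_en  => pixel_we,
      rd_en  => pixel_re,
      dout   => pixel_out,
      full   => open,
      empty  => open
    );
end architecture;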

It took quite a bit of fiddling around to get all the plumbing in the pixel driver cleaned up and using the new multiplexer. It did, at least, eventually synthesise, and the VGA monitor thinks there is an image in both modes, and that it is 800x600 at 50Hz and 60Hz, as desired.  So that looks good.  However, there is no image at all, not even the test pattern, so I need to figure out why.

After a bunch of fiddling, I managed to get some picture. Part of the problem was that I had forgotten to blank the video during the vertical refresh, which is when most monitors work out the baseline voltage of the video signal. As a result, the monitor had decided that the entire image should be black.
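The blanking itself is nothing complicated; it amounts to something like this (a sketch with placeholder names, not the actual pipeline signals):

-- Sketch of output blanking (placeholder names): force the RGB outputs to
-- black whenever the frame generator says we are outside the active picture
-- area, so the monitor sees true black during the blanking intervals.
library ieee;
use ieee.std_logic_1164.all;

entity vga_blanker is
  port (
    pixel_clk   : in  std_logic;
    in_blanking : in  std_logic;                     -- '1' outside the active area
    pixel_rgb   : in  std_logic_vector(23 downto 0);
    vga_rgb     : out std_logic_vector(23 downto 0)
  );
end entity;

architecture rtl of vga_blanker is
begin
  process (pixel_clk)
  begin
    if rising_edge(pixel_clk) then
      if in_blanking = '1' then
        vga_rgb <= (others => '0');
      else
        vga_rgb <= pixel_rgb;
      end if;
    end if;
  end process;
end architecture;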

A bit more fiddling and I got a kind of test pattern working that shows the read and write addresses computed for the FIFO buffer. Well, more correctly, for the memory buffer I was using previously.  It looks, in fact, like the FIFO itself is perhaps the problem now.  I had ditched my home-written BRAM buffer for the official Xilinx standard-issue FIFO, because it is supposed to be the best way to do this kind of thing.  However, right now I am scratching my head as to why it isn't working.  There is a reset line, but the documentation isn't totally clear on whether it is active high or active low, so I connected it to a switch and tried both positions, to no avail.

The FIFO has a line that strobes every time a byte is committed to the FIFO, and that is staying stubbornly silent.  So it looks indeed like I have messed up something with the FIFO, and it is not accepting the pixels I am pushing into it.

Yet more fiddling, and it now seems that the FIFO is working.  Part of what I had to do was add a reset sequence for the FIFO.  So now the next trick is to get columns of alternating pixels showing, to make sure that we are able to properly convert the pixels from the incoming 100MHz clock to the outgoing 30/40MHz clocks with no jitter. 
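The reset sequence is nothing fancy: just hold the FIFO's reset asserted for a number of clock cycles after power-up, before any pixels get written. Something roughly like this (names are placeholders, and the required number of cycles is whatever the FIFO documentation asks for):

-- Sketch of a power-up reset sequence for the FIFO (assumed names): hold
-- reset asserted for a number of clock cycles before allowing any writes.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity fifo_reset_seq is
  port (
    clk100     : in  std_logic;
    fifo_rst   : out std_logic;
    fifo_ready : out std_logic   -- safe to start writing pixels
  );
end entity;

architecture rtl of fifo_reset_seq is
  signal counter : unsigned(4 downto 0) := (others => '0');
begin
  process (clk100)
  begin
    if rising_edge(clk100) then
      if counter /= 31 then
        counter    <= counter + 1;
        fifo_rst   <= '1';   -- hold the FIFO in reset
        fifo_ready <= '0';
      else
        fifo_rst   <= '0';
        fifo_ready <= '1';
      end if;
    end if;
  end process;
end architecture;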

After a little work, I now have it more or less working, but with some artifacts in both the 50Hz and 60Hz video modes, as can be seen in these images:

First, we have 60Hz mode. The vertical banding is an artifact of my camera, but the fuzzy vertical lines are very much real. They appear at regular intervals, roughly every 70 pixels.  My suspicion is that one of the few signals that has to cross the clock domain is responsible, but it is hard to know for sure.


Meanwhile, for the 50Hz display, we have vertical banding that is visible in real life. Also, the pixels don't seem to be of even width:


The uneven width of pixels is really strange, because the FIFO should be using the correct pixel clock to clock the data out.  My only guess, and this might be the cause of both the uneven pixel widths here and the occasional fuzzy vertical line in the other mode, is that the FIFO is emptying, which means that the jitter (from the perspective of the output clocks) of the incoming pixel stream can no longer be concealed.

The best way to fix this is to make sure we don't start reading from the FIFO until its "almost empty" signal clears, which indicates that it has a few words stored in it, and thus should be safe to read from, since the input side should be supplying pixels at least as fast as they are consumed.
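In terms of logic, that is just a small gate on the read side, along these lines (a sketch with assumed signal names):

-- Sketch of gating FIFO reads (assumed names): wait until the FIFO reports
-- that it has a few pixels queued before starting to read each raster line,
-- so brief jitter on the input side cannot drain it mid-line.
library ieee;
use ieee.std_logic_1164.all;

entity fifo_read_gate is
  port (
    pixel_clk       : in  std_logic;
    start_of_raster : in  std_logic;  -- pulses at the start of each line
    almost_empty    : in  std_logic;  -- from the FIFO
    in_active_area  : in  std_logic;  -- '1' while pixels should be displayed
    rd_en           : out std_logic
  );
end entity;

architecture rtl of fifo_read_gate is
  signal reading : std_logic := '0';
begin
  process (pixel_clk)
  begin
    if rising_edge(pixel_clk) then
      if start_of_raster = '1' then
        reading <= '0';              -- hold off at the start of each line
      elsif almost_empty = '0' then
        reading <= '1';              -- enough pixels buffered; start reading
      end if;
      rd_en <= reading and in_active_area;
    end if;
  end process;
end architecture;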

Well, that sounded like a nice theory, but I have confirmed with the oscilloscope that the FIFO only empties at the end of each raster line, and thus should not be the problem.

Closer examination of the 50Hz display shows that there is some tearing on the right edges of some columns of pixels, i.e., there is uncertainty about which columns will be wide or narrow.  This makes me think that the FIFO machinery I have built must not really be working to the output clock.  The first suspect here is the multiplexer that ties either the 30MHz or the 40MHz clock to the read side of the FIFO.  So, let's just force the read side to 30MHz. In theory, that should give us a rock-solid 50Hz display and a messed-up 60Hz display. And, indeed, it does mess up the 60Hz display in quite interesting ways, but the 50Hz display is unchanged.

I am now really at a loss as to quite what is going on, and am increasingly suspecting that the semantics of the FIFO are not quite what I expect for some reason. But first, let's check that using the oscilloscope. The test pattern pixels alternate between full-on and full-off green, so if we stick a probe on the green line of the VGA output, we should be able to get an idea of what is going on.  And the results are indeed interesting...

If the problems I was describing above were in the video signal, then we would expect to see jitter in the widths of the peaks and troughs of the green pulses.  However, they look completely even to me in both modes:




So then I thought to myself: I wonder if the problem is actually the monitor sampling the pixels strangely.  That would explain a lot of things.  So I hit auto-adjust on the monitor, and the vertical bars in 60Hz mode briefly disappeared completely, before the monitor decided that it really wanted to re-adapt to a slightly fuzzy setting again. And checking the mode information detected by the monitor for 50Hz mode, I discovered that it now thought the mode was 1080x610 (I didn't even know that such a mode exists!), so the banding makes sense, because again, the monitor is sampling the pixels at odd times and interpolating them out to full width.




So, this means that my FIFO machinery is all fine, which is great, but that there is something funny with the frames I am outputting, in that the monitor thinks that they are not quite right, and thus has to interpolate a few pixels one way or the other.

By changing the test pattern to a counter of the number of pixels output on each line, so that it displays as a series of vertical binary counters, the problem becomes apparent: 



Basically, there are a bunch of dummy pixels at the left edge of the screen (the part that looks like fat ! signs, plus the clear blue area next to it, before the recurring binary pattern starts).  That means there are more than 800 pixels per line, and so the monitor tries to deal with that by squishing or stretching, depending on what mode it thinks it is in.
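For reference, the counting test pattern itself is just this sort of thing (again only a sketch, with placeholder names):

-- Sketch of the counting test pattern (assumed names): reset a counter at
-- the start of each raster line, count every pixel written, and output the
-- counter bits as the pixel colour, giving vertical binary counters on screen.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity counting_test_feed is
  port (
    clk100             : in  std_logic;
    start_of_raster    : in  std_logic;
    pixel_write_strobe : in  std_logic;
    buffer_din         : out std_logic_vector(23 downto 0)
  );
end entity;

architecture rtl of counting_test_feed is
  signal pixel_count : unsigned(11 downto 0) := (others => '0');
begin
  process (clk100)
  begin
    if rising_edge(clk100) then
      if start_of_raster = '1' then
        pixel_count <= (others => '0');
      elsif pixel_write_strobe = '1' then
        pixel_count <= pixel_count + 1;
      end if;
      -- Low counter bits on the green channel; any extra pixels on a line
      -- shift the whole pattern and are easy to spot.
      buffer_din <= x"00" & std_logic_vector(pixel_count(7 downto 0)) & x"00";
    end if;
  end process;
end architecture;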

A bit of digging around revealed a bug in the frame generator, where an allowance for the video pipeline depth was not being applied in a consistent manner to the start and end of rasters. 

Finally, after a lot more fiddling and fixing silly little bugs, I got the new frame driver working, although in 50Hz mode, the monitor still claims it is 1080 x 610.

So the next step is to integrate all this back into the mainline, which means adjusting the whole video pipeline to use the signals provided by the updated pixel driver here.  That will be a job for another post.
