VGA vs. NTSC timings are not giving the same desired output


Post by doslogo »

[This post includes source code and binaries to test on real hardware, see attachments]

Games need to have the same visuals no matter whether they are displayed on VGA, NTSC, the emulator, or (hopefully) a future HDMI output.

My game requires scroll registers to be latched at the desired scanlines, no matter whether the display mode is VGA or NTSC, and no matter which scroll register it is (changing scroll X for layer 1 might not have the same latch timing as changing scroll Y for layer 0). The latching process is hidden and unknown to programmers, but when timed right it MUST behave the same on every supported video mode and computer hardware.

Just because most people recognize horizontal scrolling as parallax doesn't mean that vertical scrolling shouldn't work the same way internally. Latching the vertical scroll register of one layer should behave the same as latching the horizontal scroll register of another layer. At the very least, the documentation should state the priority/order of latching. A sketch of the kind of per-scanline writes I mean is shown below.
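For illustration, this is roughly the kind of line-IRQ handler I am talking about; a minimal, untested sketch in 65C02 assembly (ca65 syntax). The register addresses are taken from the VERA Programmer's Reference, parallax_x/parallax_y are placeholder variables, and register saving and the KERNAL interrupt chain are omitted:

    ; sketch: update two scroll registers from one line IRQ
    VERA_ISR     = $9F27
    L0_VSCROLL_L = $9F32        ; layer 0 vertical scroll, low byte
    L1_HSCROLL_L = $9F37        ; layer 1 horizontal scroll, low byte

    line_irq:
            lda #%00000010
            sta VERA_ISR        ; acknowledge the LINE interrupt
            lda parallax_x
            sta L1_HSCROLL_L    ; when exactly does this latch?
            lda parallax_y
            sta L0_VSCROLL_L    ; ...and is it the same moment as above?
            rti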



The bug is demonstrated visually by this test program, which can be run and easily modified. Using the backdrop palette color as the test gives a clear visual representation of the problem.

VGA is behaving as expected from the documentation. Setting a palette index such as the first (the backdrop) to a different color than the one set during vertical blank, on a specific scanline, should change the color roughly one third of the way across that scanline, as seen in this screenshot from a capture card:
[Attachment: vga.jpg (VGA)]

Note that 240p-scaled content such as layers and sprites does snap to even pixels, but the backdrop color won't, since it is acting on the interrupted 480p scanline. How layer scroll values are affected by even and odd scanlines in 240p is unknown, and when something works on VGA it must also work exactly the same on NTSC (which it currently doesn't; play the Sonic demo on the SD card for proof).

To get a backdrop color change across a full scanline, one can use CPU timing to delay the write until the raster beam has reached the end of the visible line, by inserting NOPs. In this case, 63 NOPs are inserted to give this result on VGA:
[Attachment: vga_nops.jpg (VGA)]

A few more NOPs and the palette write will happen in hblank, as desired. Normal game consoles allow such effects, no matter whether they are NTSC or PAL (see Yoshi's Island on the SNES). With this many NOPs there is no time to run game logic between scanlines in 480p, but in 320p there are a few cycles to take advantage of. A rough sketch of the technique follows.
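For reference, this is roughly what the test does; a minimal, untested sketch (65C02/ca65, not the attached program; the palette lives at VRAM $1FA00 per the VERA reference, and the color value is arbitrary):

    VERA_ADDR_L = $9F20
    VERA_ADDR_M = $9F21
    VERA_ADDR_H = $9F22
    VERA_DATA0  = $9F23

            ; point data port 0 at palette entry 0 (the backdrop),
            ; VRAM address $1FA00, auto-increment 1
            lda #$00
            sta VERA_ADDR_L
            lda #$FA
            sta VERA_ADDR_M
            lda #$11            ; increment 1 in the high nibble, address bit 16 set
            sta VERA_ADDR_H
            .repeat 63          ; burn cycles until the beam nears the line end
            nop                 ; 2 CPU cycles each
            .endrepeat
            lda #$0F            ; GGGGBBBB byte of the new color
            sta VERA_DATA0
            lda #$00            ; ----RRRR byte
            sta VERA_DATA0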


Now to NTSC 240p over composite. With the same code as before, the backdrop is affected much earlier. It looks like half the distance compared to VGA, but that might be misleading:
[Attachment: composite.jpg (Composite)]

When the same 63 NOPs are added to get the palette change into the next hblank, the change instead happens somewhere in the middle of the screen:
[Attachment: composite_nops.jpg (Composite)]


With these tests, we developers of games that want raster effects hope to see a unified result where an hblank is an hblank, no matter whether it is VGA, composite, S-video, or HDMI.

*A palette entry is 12-bit and requires two bytes to be written to VRAM to produce the desired color. If they are written while the raster beam is in the active area, colors might not show up correctly for a few pixels at times. That's why it is important that we get a chance to write these values, as well as layer scroll values, while the beam is in the non-visible screen area (before the registers are latched, of course).

**The mouse cursor sprite graphics are glitched in NTSC at the bottom of the screen, but that has nothing to do with this bug report!



Source and binary: linetim.zip (43.71 KiB)

Post by doslogo »

Extended post to get a few more attachments:

The Sonic demo from the SD card (the batch from February 2025), where the latching of the horizontal background layer fails and differs between video output modes:
[Attachment: son_vga.png (VGA)]
[Attachment: son_composite.png (Composite)]



Now to NTSC 240p over S-video. With the same code as before, the backdrop is affected much earlier. It looks like half the distance compared to VGA, but that might be misleading:
[Attachment: svid.jpg (S-video)]

When the same 63 NOPs are added to get the palette change into the next hblank, the change instead happens somewhere in the middle of the screen:
[Attachment: svid_nops.jpg (S-video)]

Post by DragWx »

In VGA mode, there are 800 pixel clocks in a scanline.
In NTSC mode, there are 1588 instead, so that's why it looks like the timing is too short by (about) one half.

So, in NTSC mode, if you double your delay, you should be able to get your write to hit at about the same point of the scanline compared to VGA mode.
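
As a rough sanity check of that (assuming the 8 MHz CPU clock and 2-cycle NOPs):

    ; 25 MHz VERA clock / 8 MHz CPU clock = 3.125 VERA clocks per CPU cycle
    ; VGA:  800 / 3.125  = 256    CPU cycles per scanline
    ; NTSC: 1588 / 3.125 = 508.16 CPU cycles per scanline
    ; 63 NOPs = 126 cycles, about half of a 256-cycle VGA line, so
    ; roughly twice as many are needed to reach the same spot in NTSC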

Post by doslogo »

DragWx wrote: Sun May 11, 2025 2:50 am In VGA mode, there are 800 pixel clocks in a scanline.
In NTSC mode, there are 1588 instead, so that's why it looks like the timing is too short by (about) one half.

So, in NTSC mode, if you double your delay, you should be able to get your write to hit at about the same point of the scanline compared to VGA mode.
So every 800 pixel clocks on VGA, a LINE interrupt is fired? And every 1588 pixel clocks on NTSC, a LINE interrupt is fired?

Also, if you have a link to the FPGA code that does this LINE interrupt, I would like to have a look at it. I believe something fishy is going on.

Post by DragWx »

doslogo wrote: Sun May 11, 2025 3:28 pm if you have a link to the FPGA code that does this LINE interrupt, I would like to have a look at it. I believe something fishy is going on.
Sure, I'll provide links to the exact commit I'm looking at, so I can highlight the appropriate lines. (Or attempt to, anyway)

As an overview, there's a VGA core, and a separate composite (NTSC) core, and they keep track of their own timings separately, including when the next line is starting, which is important for the line IRQ. The VERA will listen to whichever core corresponds to the currently-selected video mode.

video_vga.v contains the VGA "next line" signal.
video_composite.v contains the composite "next line" signal.

Both "next line" signals are synchronized to the beginning of the active (visible) portion of the scanline.

Of the two "next line" signals, the appropriate one is selected in top.v, depending on the current video mode.

The selected "next line" signal goes into composer.v, where it clocks the Y counter +2 if in interlaced mode, +1 otherwise.

Finally, the Y counter is compared with the line IRQ number, and the line IRQ is generated if they match. The exact time it's generated is when "next line" toggles, which is at the start of the active portion of the scanline, remember. Note that in interlaced mode, the least significant bit is ignored during the comparison.
doslogo wrote: Sun May 11, 2025 3:28 pm So every 800 pixel clocks on VGA, a LINE interrupt is fired? And on every 1588 pixel clock on NTSC, a LINE interrupt is fired?
Yes:

VGA mode uses 800 clocks during one scanline.
Composite mode uses 1588 clocks during one scanline.

So, if you fired a line IRQ on every line, they'd be spaced 800 VERA clocks apart (=256 CPU clocks) in VGA mode, and 1588 VERA clocks apart (=508.16 CPU clocks) in composite mode.

Remember that there's some amount of jitter due to the CPU needing to finish its current instruction before it can jump to its IRQ vector, and then some amount of time is consumed by the Kernal's ISR preamble before it actually jumps to CINV.

If you want to synchronize exactly with the line IRQ: set your line IRQ one line early; on interrupt, acknowledge it and bump the IRQ line to the next line (or +2 in composite mode); then use the WAI instruction (leaving the CPU's "I" flag set) to suspend the CPU until it happens. The CPU will then resume execution at the beginning of the active portion of the next scanline, as in the sketch below.
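
An untested sketch of that technique (65C02; VERA addresses from the programmer's reference, next_line is a placeholder, and bit 8 of the IRQ line and register saving are omitted):

    VERA_ISR       = $9F27
    VERA_IRQLINE_L = $9F28

            sei                 ; keep the I flag set so WAI falls through
                                ; instead of vectoring to the ISR
            lda #%00000010
            sta VERA_ISR        ; acknowledge any pending LINE IRQ
            lda next_line       ; the line *before* the time-critical one
            sta VERA_IRQLINE_L  ; (+2 instead of +1 in composite mode)
            wai                 ; sleep until the LINE IRQ fires
            ; execution resumes here at the start of the active portion
            ; of the next scanline: do the time-critical writes now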

Post by doslogo »

In the code I posted, running in the official emulator with a breakpoint at the start of the interrupt handler, registers $9F28 and $9F26 report that I am interrupting for vblank at line $1E0 (480) and for the requested IRQ LINE at $1B4 (436) in VGA, which is correct. BUT in NTSC, vblank is now at line $1FF (511), and every other frame the requested IRQ LINE is either $1DE (478) or $1DF (479).

Imagine you want to read the scanline counter in order to set up the next IRQ LINE relative to the currently interrupted one. In VGA, that is easy. But in NTSC mode, the values are just weird and off. For example, try adding 2 scanlines to the value 479 in the hope that it will stop at scanline 436+2; it would probably work, but you get my point. Something like the sketch below.
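
A sketch of what I mean (65C02; reading the current scanline back through $9F28, with bit 8 in $9F26 as the emulator suggests; carry into bit 8 is not handled here):

    VERA_ISR      = $9F27
    VERA_SCANLINE = $9F28       ; read: current scanline / write: IRQ line

    line_irq:
            lda #%00000010
            sta VERA_ISR        ; acknowledge LINE
            lda VERA_SCANLINE   ; low 8 bits of the current scanline
            clc
            adc #2              ; schedule the next IRQ 2 lines later
            sta VERA_SCANLINE   ; in VGA this lands where expected;
                                ; in NTSC the value read back is already off
            rti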

Moving on to the real hardware: I have done everything, including using VIA timers to try to get to the scanline early, but they drift too quickly and the code got too complicated. Vblank sets the IRQ LINE, the IRQ LINE sets a VIA timer IRQ, and the VIA timer measurement is different on every boot of the computer, so that didn't work, even when I used WAI to wait for the timers (yes, WAI will be woken by a VIA IRQ as well, but I got that right, in the emulator at least). I have tried many things for months without a stable result on real hardware.

But at least I know I can wait on NTSC, since I will get those cycles back (it sure takes a lot of time to wait, but that means the next scanline after that will not happen for a very, very long time). All my tests hung my game, because waiting that long screwed up my next scheduled IRQ LINE. I will do more tests with the code I uploaded here, because I know something is wrong; the question is where. Maybe I fail to set the IRQ LINE scanline somehow, since it is wrong in NTSC according to the emulators (it shows up right, but the value in the registers doesn't match)?

Post by DragWx »

In NTSC mode, the screen alternates between having all ODD and all EVEN scanlines drawn. Therefore, the line counter increases by +2 with each scanline, and will alternate between all-odd and all-even line numbers. This is why the line IRQ comparison ignores bit 0: it's so you can set one line IRQ value and it'll fire on both odd and even frames.

I'd need to look at the emulator source code some more, because from your screenshots, it looks like the IRQs are happening on the correct scanlines, but the line register seems like it's reporting the wrong values in NTSC mode, or at least I'd expect them to be similar to VGA mode, where vblank is line 480.

Post by doslogo »

Just waiting every +2 scanlines on real hardware does this:

Green backdrop until scanline 436, then alternating black and red until vblank (240p, so an interrupt every other scanline).


With one NOP of wait:
[Attachment: nops1.JPG]

With 63 NOPs of wait (the same count as the VGA test case):
[Attachment: nops63.JPG]

EDIT: The green color is missing because the IRQ LINE handler keeps running after vblank. The purple color is actually red; my PVM had its phase set incorrectly!

And this is with 169 NOPs, the count required to finally reach the non-visible screen area on NTSC:
[Attachment: nops169.JPG]

The only thing changed in the code is the number of NOPs to wait before the backdrop is changed. This is my problem: waiting too long on an NTSC scanline screws everything up. Also, the mouse cursor disappeared in the 63-NOP case.


If it were just a matter of waiting as long as you'd like, then why is there a purple screen, and why do the alternating lines start higher up? And where did my cycles go? There is no time for anything in between IRQs when adding so many NOPs.

Post by DragWx »

Your ISR is interrupting your main code while the main code is possibly still setting up the VERA for the test, and your ISR is not expecting to need to preserve the state of the VERA's address(es) when it returns. In your main code, try moving that PLP to the end, after the VERA setup is finished, then try again. (A sketch of the save/restore idea is below.)
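
For anyone following along, an untested sketch of the save/restore idea (65C02; only the currently selected data port is preserved, and A is assumed to be saved by the interrupt preamble):

    VERA_ADDR_L = $9F20
    VERA_ADDR_M = $9F21
    VERA_ADDR_H = $9F22
    VERA_CTRL   = $9F25

    line_irq:
            lda VERA_ADDR_L     ; save the selected port's VRAM address
            pha
            lda VERA_ADDR_M
            pha
            lda VERA_ADDR_H
            pha
            lda VERA_CTRL       ; save port select last...
            pha
            ; ... raster effect goes here ...
            pla                 ; ...so it is restored first
            sta VERA_CTRL
            pla
            sta VERA_ADDR_H
            pla
            sta VERA_ADDR_M
            pla
            sta VERA_ADDR_L
            rti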

Post by doslogo »

DragWx wrote: Sun May 11, 2025 10:31 pm Your ISR is interrupting your main code while your main code is possibly still setting up the VERA for the test, and your ISR is not expecting to need to preserve the state of the VERA's address(es) when it returns. In your main code, try moving that PLP to the end of your main code, after it finishes setting up the VERA, then try again.
Obviously, that must be the cause of the mouse glitch.