
List of HDTVs with no (or little) lag

Zodiac

Smash Master
Joined
Aug 10, 2005
Messages
3,557
Hey guys, I thought it would be nice to put together a little list of HDTVs that have no lag, or so little lag that you can use them in a tournament. What inspired this was the new TV I just got, but I must put up this warning: if you have a Wii, component cables are the real game-changer here (I assume GameCube component cables will do the same job). I tested my Wii at 480p and there was no lag with game mode turned on; using composite cables with the GameCube, however, yielded the typical HDTV lag.

Toshiba S4SL410U
Wii 480p: Lagless
GameCube composite: LagFULL

Please feel free to add other lagless HDTVs to the list.
 

Massive

Smash Champion
Joined
Aug 11, 2006
Messages
2,833
Location
Kansas City, MO
480p is composite video.

Are you saying your Wii is doing well and your GameCube isn't, or are you saying the Red-Yellow-White cables are causing lag (as they are known to)?
 

Zodiac

Smash Master
Joined
Aug 10, 2005
Messages
3,557
480p is composite video.

Are you saying your Wii is doing well and your GameCube isn't, or are you saying the Red-Yellow-White cables are causing lag (as they are known to)?
480p with component cables
 

Yung Mei

Where all da hot anime moms at
Joined
Jul 20, 2009
Messages
5,341
I noticed that some Samsung TVs have almost no lag for Melee (played on the Wii).

It might have some lag, but I play pretty well on them compared to other flatscreens.
 

Geenareeno

Smash Lord
Joined
Aug 10, 2010
Messages
1,102
Location
Saskatoon, SK
I noticed that some Samsung TVs have almost no lag for Melee (played on the Wii).

It might have some lag, but I play pretty well on them compared to other flatscreens.
I agree, my friend has a Samsung and once you get used to it, it's playable. But my Toshiba at home is just terrible, unplayable for sure.
 

Windrose

Smash Lord
Joined
Mar 22, 2009
Messages
1,470
Almost no lag isn't good enough...


I read on another thread somewhere that there is a way to get 0 lag on flat screens, given the right tools...
 

Excision

Smash Rookie
Joined
Sep 11, 2011
Messages
14
My Panasonic VIERA has absolutely zero lag with my GameCube, using standard GameCube A/V cables.
 

Zankoku

Never Knows Best
Administrator
BRoomer
Joined
Nov 8, 2006
Messages
22,906
Location
Milpitas, CA
NNID
SSBM_PLAYER
480p is composite video.

Are you saying your Wii is doing well and your GameCube isn't, or are you saying the Red-Yellow-White cables are causing lag (as they are known to)?
Composite is 480i, not 480p. Pretty big difference.

I don't think there are any HDTVs that have absolutely no lag, though there are a couple EDTVs out there like this one that can manage to play without lag.
 

Anthon1996

Smash Ace
Joined
Nov 17, 2010
Messages
995
Location
Bionis
NNID
AnUglyBarnacle
3DS FC
5301-0385-3871
I have a Vizio and it has no lag whatsoever for any console, even the Super Nintendo (but then again, it, along with other older game systems, looks borderline nauseating on HDTVs due to their crystal-clear 240p resolution).
 

Massive

Smash Champion
Joined
Aug 11, 2006
Messages
2,833
Location
Kansas City, MO
Composite is 480i, not 480p. Pretty big difference.

I'm aware of the difference, I just mistyped composite instead of component.

I was confused by his use of two different nomenclatures; it should have been either 480p and 480i or Component and Composite video.

I don't think there are any HDTVs that have absolutely no lag, though there are a couple EDTVs out there like this one that can manage to play without lag.
The Toshiba model he listed supposedly has an "input lag"-reducing mode installed. I'd highly recommend that nobody make any inferences as to the lagginess of their TV without doing some type of standardized testing between the two (a webcam finger/screen correlation test is easiest).
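If you want numbers rather than eyeballing it, the arithmetic behind the webcam test is dead simple. Here's a sketch with made-up example readings (the camera fps and the frame counts are assumptions for illustration, not measurements):

```python
# Sketch of the webcam finger/screen test arithmetic (illustrative only;
# assumes a 60 fps camera and made-up frame counts).

CAMERA_FPS = 60.0

def latency_ms(camera_frames: int) -> float:
    """Camera frames between the button press and the on-screen response -> ms."""
    return camera_frames * 1000.0 / CAMERA_FPS

crt_ms = latency_ms(1)    # e.g. the CRT reacts 1 camera frame after the press
hdtv_ms = latency_ms(4)   # e.g. the HDTV reacts 4 camera frames after the press

extra = hdtv_ms - crt_ms
print(f"CRT ~{crt_ms:.1f} ms, HDTV ~{hdtv_ms:.1f} ms, "
      f"HDTV adds ~{extra:.1f} ms (~{extra / (1000.0 / 60):.1f} game frames)")
```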
 

Battlecow

Play to Win
Joined
May 19, 2009
Messages
8,740
Location
Chicago
Yeah, it's easy to say that a TV that has only a couple frames of delay has "no lag" when you're comparing it to a really bad one. I'm pretty sure that all flat-screen TVs, except for like some weird special $10,000 ones that they use in hospitals or some ****, have at least a frame of lag.

Also, Novice, are you sure about that? Have you tested it, or are you just going off of a gut feeling?
 

Zankoku

Never Knows Best
Administrator
BRoomer
Joined
Nov 8, 2006
Messages
22,906
Location
Milpitas, CA
NNID
SSBM_PLAYER
I'm pretty sure that the EDTV I linked does not have lag, because 640x480 is its native resolution and thus it does not have to waste any time on upscaling, which is what normally makes HDTVs so horrible to play on. It's flatscreen, too.

I don't think CRTs would lag, considering they have no fixed resolution to speak of.
 

ajp_anton

Smash Lord
Joined
Jan 9, 2006
Messages
1,462
Location
Stockholm
Many CRTs lag horribly. Sometimes you can find some brightness/contrast/sharpness setting that doesn't lag, however. And I'm not even particularly sensitive to lag.
Note also that "no lag" is impossible, as the console itself likely has some kind of buffer that adds one frame of lag, plus it probably takes almost 1/60 of a second to render each frame, adding yet another frame of lag.

I don't know exactly how flat TVs work, but I'm pretty sure they always add at least one frame of lag, because they need to receive the whole frame before they can start processing it. Then they likely add another frame of lag to actually process the image (only one frame if you manage to skip everything unnecessary; why would they spend money on a processor that works faster than it needs to?).
Actually sending the image to the panel may be near-instantaneous, if the panel starts updating the pixels as it receives them. In that case, you can just add the 2 ms or whatever the absolute fastest panels manage nowadays.
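To put rough numbers on that chain, here's a quick back-of-the-envelope sum; every figure is an assumption for illustration, not a measurement of any particular TV:

```python
# Back-of-the-envelope latency budget for the chain described above.
# All numbers are assumptions, not measurements of any particular TV.

FRAME_MS = 1000.0 / 60  # ~16.7 ms per frame at 60 Hz

budget_ms = {
    "console buffer/render (up to 1 frame)": FRAME_MS,
    "TV waits for the whole incoming frame": FRAME_MS,
    "TV image processing (assume 1 frame)":  FRAME_MS,
    "panel pixel response (fast panel)":     2.0,
}

total = sum(budget_ms.values())
print(f"worst-case estimate: ~{total:.0f} ms, i.e. about {total / FRAME_MS:.1f} frames")
```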
 

Battlecow

Play to Win
Joined
May 19, 2009
Messages
8,740
Location
Chicago
IDK, playing 64 you can definitely notice the difference between offline emulation (0 frames) and online emulation with one frame. I feel like some of the crazy tech-skill stuff would definitely suffer if there were always a frame between input and effect.
 

Zodiac

Smash Master
Joined
Aug 10, 2005
Messages
3,557
Alright, there is understandable doubt about the statement "lag free", so I'm going to set up a demonstration where I split the cables between my EDTV (no lag) and my HDTV (no lag?) and film it with a camera.

Also, Anthon and Excision, care to link those models so I can add them to the front page?
 
Joined
Feb 3, 2008
Messages
858
Location
PWN
I have a Vizio and it has no lag whatsoever for any console, even the Super Nintendo (but then again, it, along with other older game systems, looks borderline nauseating on HDTVs due to their crystal-clear 240p resolution).
For one, the SNES's resolution isn't exactly 240p, but it's really close to that; the interlaced resolution is double the progressive one, depending on which mode it outputs. Confusing? Yeah, I don't fully understand it either (OK, not right now, at least).

For two, I think that, like this person:
My Panasonic VIERA has absolutely zero lag with my GameCube, using standard GameCube A/V cables.
you're both just bad at detecting lag.

Which is fine, really... it means you get to enjoy regular games more easily these days.

Many CRTs lag horribly. Sometimes you can find some brightness/contrast/sharpness setting that doesn't lag, however. And I'm not even particularly sensitive to lag.
Note also that "no lag" is impossible, as the console itself likely has some kind of buffer that adds one frame of lag, plus it probably takes almost 1/60 of a second to render each frame, adding yet another frame of lag.

I don't know exactly how flat TVs work, but I'm pretty sure they always add at least one frame of lag, because they need to receive the whole frame before they can start processing it. Then they likely add another frame of lag to actually process the image (only one frame if you manage to skip everything unnecessary; why would they spend money on a processor that works faster than it needs to?).
Actually sending the image to the panel may be near-instantaneous, if the panel starts updating the pixels as it receives them. In that case, you can just add the 2 ms or whatever the absolute fastest panels manage nowadays.
It's not a whole frame of lag, unless you're the PS3. Signal transfer for CRTs is much faster than 17 ms (one frame); try about 66-70% of the speed of light? "No lag" to us means less than one frame of lag.

almost no lag isn't good enough....


i read on another thread somewhere that there is a way to get 0 lag on flat screens given the right tools.....
Yeah... and yeah, there are signal-processor box thingies you can buy ($200-$300?) whose sole purpose is to upconvert signals as fast as possible. Never tried one myself, though.

Composite is 480i, not 480p. Pretty big difference.
I'm aware of the difference, I just mistyped composite instead of component.

I was confused by his use of two different nomenclatures; it should have been either 480p and 480i or Component and Composite video.
Yeah, it's weird. Component can do both anyway :/.

I'm pretty sure that the EDTV I linked does not have lag, because 640x480 is its native resolution and thus it does not have to waste any time on upscaling, which is what normally makes HDTVs so horrible to play on. It's flatscreen, too.

I don't think CRTs would lag, considering they have no fixed resolution to speak of.
Oh, didn't know they made those.


Many CRT TVs also lag; it seems that big TVs lag more than small ones
Yeah, some of my friends and I have noticed that too. I used to just think the bigger TVs took longer to display the signal, but that didn't really make sense. I just read, however, that some bigger CRTs are designed to upconvert all signals to 1080i, which is kind of dumb. But that might explain the slight lag. (Solution found, anyone? This one was bugging me for a while.)
 

Zodiac

Smash Master
Joined
Aug 10, 2005
Messages
3,557
The game mode on HDTVs basically gets rid of the upscaling process that people have been mentioning. It doesn't totally eliminate it, because at the very least the TV has to make the image fit its full screen, which, depending on what type of TV you have, is either 720p or 1080p; obviously, the higher the resolution, the longer it takes to upscale. But the kicker is that HDTVs apply a bunch of filtering and cleaning effects to the image so that it doesn't look like crap when it's upscaled from a lower resolution. Game mode turns off most of those filters; it depends on the brand/model, but some of them do a handy job of making the input lag undetectable.
 

ajp_anton

Smash Lord
Joined
Jan 9, 2006
Messages
1,462
Location
Stockholm
It's not a whole frame of lag, unless you're the PS3. Signal transfer for CRTs is much faster than 17 ms (one frame); try about 66-70% of the speed of light? "No lag" to us means less than one frame of lag.
Why would you need to be a PS3? If anything, the PS3 has less lag, because the signal is already digital, so the TV can (I don't know if it does, though) just skip some of the steps needed to convert the GameCube's analog signal into digital and upscale it.
"66% of the speed of light" is just for the signal to reach the TV. Then the TV itself does some stupid processing before displaying it. It's not a simple pass-through thing.
 
Joined
Feb 3, 2008
Messages
858
Location
PWN
^Ah... alright, I'll back up. (I was being pointed, I'll admit.) Though I'm a bit confused by your post. Many CRTs "lag horribly"? I don't understand. I've never played on a CRT that lagged at all (minus those giant ones, which have incredibly hard-to-detect lag), and furthermore... nobody else has. And of course "no lag" is impossible; it'd be silly to argue that anyway.

First off, are you sure you're not confusing "CRT" with "LCD"? And...

OK, so you're using 1/60 of a second, which seems pretty fast, to describe how long it's going to take for a console to acknowledge an input (your "buffer", I'm guessing), render a frame, and then have that signal/frame be processed by the TV. Then it takes another 1/60 of a second for the "flat" (like... LCD flat? or flat-screen CRT?) TV to receive this frame, and then another 1/60 of a second to process it before it's displayed.

Well, that doesn't make sense physically, and since we're trying to be accurate here, down to the frame, it's important that you be, well, precise with your measurements.

For one, CRTs (except for the giant ones, I guess?) don't have post-processing (AFAICT), thus my speed-of-light crack. (I wasn't trying to poke fun at your knowledge of signal speed, just the absence of post-processing.)

For two, does your computer lag when you're typing? Case in point: it's the same setup as your console and CRT.

For three... forget it, I'm pretty sure you just mixed up CRT and LCD. And apparently the PS3 has 1 frame of rendering lag, and it's the only console that has that.
 

ajp_anton

Smash Lord
Joined
Jan 9, 2006
Messages
1,462
Location
Stockholm
I know exactly what CRT and LCD mean. By "flat panel" I mean any digital display with a fixed resolution, which is mostly LCD. They have to digitally process the image so that it can be displayed. The speed of the panel itself (the "ms" number you often see) is often meaningless, because it's so small compared to all the other stuff that's never mentioned.

I don't know what CRTs do to make them lag, but they do something, because they do lag =).
But since it can sometimes be fixed by finding some optimum setting for brightness/contrast/sharpness, I'm guessing that's the key. Maybe some ye-old b&w TVs don't lag because they didn't try to do anything to the image, but they probably don't even have the right inputs...
And it's not an NTSC/PAL thing. In my experience, TVs at American tournaments generally lag more.


About the "console lag"... because Melee sometimes drops frames (for example 4 players on FoD, or when many people die at the same time), I'm going to assume that most of the time the game is pretty close to dropping frames, meaning each frame takes a little under 1/60 seconds to render by the GPU. Add the time required for the CPU (which is faster because the game never slows down), and these probably add up to about 1/60 seconds. There's your first frame of lag. The actual input from the controller IIRC updates at a much higher rate, so it's negligible.
Now when the frame is ready, it can't be sent to the display right away, because it has to be synced with the actual output. This alone adds an average of half a frame of lag. But most likely the frame is copied onto some kind of buffer/queue which takes even longer. Ever heard of double or triple buffering? They are used in many computer games to reduce tearing.
Note that this is just my "educated guess", as I have never actually looked into the details of how the Gamecube works...
 

SCOTU

Smash Hero
Joined
Mar 16, 2007
Messages
6,636
Location
MI
To clarify things a bit: No CRTs should lag. Even the large ones. I have a 65" CRT in my living room. It doesn't lag. If you care why, keep reading.

I'm pretty sure that all flat-screen TVs, except for like some weird special $10,000 ones that they use in hospitals or some ****, have at least a frame of lag.
Funny story about hospital TVs. The ones that they have in patient rooms, I have no idea why, but at all of the last several hospitals I've been to (my grandma's been going to a lot of different places recently), they always have the worst possible picture quality I've ever seen on a TV, lol.

I know exactly what CRT and LCD mean. By "flat panel" I mean any digital display with a fixed resolution, which is mostly LCD. They have to digitally process the image so that it can be displayed.
This "digital processing" takes 2 forms: 1) deinterlacing: Most digital displays are progressive displays, meaning that to put an interlaced signal (anything via composite) the display has to actually wait for more of the signal before it can display the current frame. There are several different methods of doing this that usually add around 17 or 34ms of delay. This is the main factor in lag on TVs, and an interlaced (read: CRT) TV does not have this problem. 2) upscaling: To take an image at one resolution and display it at another resolution (in the most quality/cost effective method) takes 2 cubic calculations per pixel, something that's a non-trivial operation to do for an embedded processor @ 1920x1080. Therefore, there are often shortcuts taken in designing TVs to use cheaper methods that get the job done after a some time (they can keep up with the signal, but they have some signal propagation delay). Since all digital TVs have a fixed resolution, any other source must be converted into their native resolution, and when cheaper methods are used, this instantiates delay. A TV with a native resolution of 480i/p will not generate this problem, furthermore, CRTs don't actually have a fixed resolution, so this is never a problem for them.

The speed of the panel itself (the "ms" number you often see) is often meaningless, because it's so small compared to all the other stuff that's never mentioned.
I'm assuming you're referring to the panel's response time, which is actually a measurement of how quickly it can change color, and is mostly used for determining if any ghosting will occur. This really has no impact on display lag, as even if it were in a range where you could easily tell (i.e. a 16ms response time), sure, your picture would be a bit slower, but you'd also be seeing a hard blend of 2 frames at any given time and would have trash picture quality.

I don't know what CRTs do to make them lag, but they do something, because they do lag =).
For the above mentioned reasons, CRTs don't lag. Almost always. I suppose it's possible to make one that does, but that would basically cost more, and you'd be designing it specifically to do that.

But since it can sometimes be fixed by finding some optimum setting for brightness/contrast/sharpness, I'm guessing that's the key.
This really has nothing to do with display lag. Have you ever heard of the placebo effect?

Maybe some ye-old b&w TVs don't lag because they didn't try to do anything to the image, but they probably don't even have the right inputs...
ye-old color TVs, ye-old HDTVs, and ye-old every other CRT process the signal in almost the exact same way. Also, it's pretty easy to build a composite -> RF bridge, in case you'd prefer to play on a B&W TV to make you feel better.

And it's not an NTSC/PAL thing. In my experience, TVs on american tournaments generally lag more.
TVs that lag should lag about the same. If anything, PAL TVs should lag a tiny bit more because they'd need to wait longer to deinterlace the frames, and have more pixels to upscale. While I'm sure this difference is measurable, it probably doesn't even amount to a whole extra frame on an NTSC TV, and thus is negligible.

About the "console lag"... because Melee sometimes drops frames (for example 4 players on FoD, or when many people die at the same time), I'm going to assume that most of the time the game is pretty close to dropping frames, meaning each frame takes a little under 1/60 seconds to render by the GPU.
To be fair, on this point I could very much be wrong, 'cause I haven't looked at it in depth. But from memory, and the memory of some consultants, Melee doesn't actually drop frames. It does slow down. If it did drop frames, it would actually need to predictively opt to not draw those frames (i.e. the designers would have had to specify when the game can't handle it and tell it to drop frames there).

Add the time required for the CPU (which is faster because the game never slows down), and these probably add up to about 1/60 seconds.
That's not how game rendering works. It's not if the gpu is slow frames are dropped and if the cpu is slow there's slow down. There's a rendering mechanism that decides how things are rendered. In Melee's case (This varies by game), it uses a fixed framerate rendering method, where it always performs 1/60 of a second's worth of calculation + rendering each frame (I'm tempted to say that on PAL this is 1/50 of a second, but this would actually change game mechanics, so it might be 1/60th quantized to 1/50th for display).
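For anyone who hasn't seen one, a fixed-timestep loop looks roughly like the sketch below. This is a generic illustration, not actual Melee or GameCube code; the function names are placeholders.

```python
# Generic fixed-timestep game loop sketch (placeholders, not Melee code).
# Every pass simulates exactly one frame's worth of game time and then
# waits for the display's refresh, so game time stays locked to display
# time; overload shows up as slowdown, not as a variable simulation step.

import time

FRAME_DT = 1.0 / 60  # one frame of game time, in seconds

def poll_inputs():
    return {}                    # placeholder: read controller state once per frame

def simulate(inputs, dt):
    pass                         # placeholder: advance the game state by exactly dt

def render_and_wait_for_vsync():
    time.sleep(FRAME_DT)         # placeholder for drawing + waiting on vsync

def game_loop(frames=600):
    for _ in range(frames):
        inputs = poll_inputs()           # one poll per frame
        simulate(inputs, FRAME_DT)       # always exactly 1/60 s of game time
        render_and_wait_for_vsync()      # output locked to the display refresh
```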

There's your first frame of lag. The actual input from the controller IIRC updates at a much higher rate, so it's negligible.
The controller doesn't actually update at a higher rate, it's polled at one point during each frame, so inputs that last less than a frame risk being dropped. Furthermore, when you input, say, a jab, the frame that the console receives this input, it starts doing the first frame of a jab. This has the result of a 1 frame duration action being displayed ~ 1 Frame after you input it, so there isn't actually any delay generated by this process.

Now when the frame is ready, it can't be sent to the display right away, because it has to be synced with the actual output. This alone adds an average of half a frame of lag. But most likely the frame is copied onto some kind of buffer/queue which takes even longer. Ever heard of double or triple buffering? They are used in many computer games to reduce tearing.
As I've indicated, Melee (and most other console games) use a fixed timestep game loop that is synced with the display, so this actually isn't needed. Melee probably does use double buffering, but because the render step is already the same length as the refresh rate, it doesn't add any additional delay. Double/triple buffering only adds delay when the rendering framerate differs from the display framerate.

Note that this is just my "educated guess", as I have never actually looked into the details of how the Gamecube works...
Nor how game rendering/processing works, nor how digital or CRT TVs work, nor really anything you mentioned.
 
Joined
Feb 3, 2008
Messages
858
Location
PWN
Melee doesn't (usually) drop frames; it slows down. Slowdown happens because the CPU has more calculations to do than its hardware is designed for, which would be a software fault for under-predicting the amount of calculation the CPU would have to do, among other things. Once the time to render the graphics exceeds the time given to draw a frame, the frame gets buffered, and slowdown is visually achieved.

Pretty sure that's accurate, sorry if I'm wrong a little.
 

SCOTU

Smash Hero
Joined
Mar 16, 2007
Messages
6,636
Location
MI
Melee doesn't drop frames; it slows down. Slowdown happens because the CPU has more calculations to do than its hardware is designed for, which would be a software fault for under-predicting the amount of calculation the CPU would have to do, among other things. Once the time to render the graphics exceeds the time given to draw a frame, the frame gets buffered, and slowdown is visually achieved.

Pretty sure that's accurate, sorry if I'm wrong a little.
That's more or less right. I can't guarantee whether it's because the CPU is too slow or the GPU. I'm certain it's the GPU for FoD (due to the reflections), but things like dying simultaneously (I didn't know this caused slowdowns) and Mute City could either be the CPU tracking the different things, or the bandwidth of the CPU issuing GPU commands. But your description of the effect is correct, as I explained in my post.
 

ajp_anton

Smash Lord
Joined
Jan 9, 2006
Messages
1,462
Location
Stockholm
This "digital processing" takes 2 forms: 1) deinterlacing: Most digital displays are progressive displays, meaning that to put an interlaced signal (anything via composite) the display has to actually wait for more of the signal before it can display the current frame. There are several different methods of doing this that usually add around 17 or 34ms of delay. This is the main factor in lag on TVs, and an interlaced (read: CRT) TV does not have this problem.
I thought about the waiting for the whole interlaced frame, but theoretically you could already bob-deinterlace the image after you receive the first field (shift it up/down and upscale). I doubt this is how it's implemented though...

I'm assuming you're referring to the panel's response time, which is actually a measurement of how quickly it can change color, and is mostly used for determining if any ghosting will occur. This really has no impact on display lag, as even if it were in a range where you could easily tell (i.e. a 16ms response time), sure, your picture would be a bit slower, but you'd also be seeing a hard blend of 2 frames at any given time and would have trash picture quality.
Sure it will also add ghosting, but if it takes 16ms to shift pixels into the next image, obviously it will add some more lag (it's easy to say it's 16ms, but it depends on when your eyes start to think they see a new image =)).

For the above mentioned reasons, CRTs don't lag. Almost always. I suppose it's possible to make one that does, but that would basically cost more, and you'd be designing it specifically to do that.

This really has nothing to do with display lag. Have you ever heard of the placebo effect?
Well, I know for a fact that most CRTs lag. Not nearly as much as LCDs, but they do lag. I'm not even that sensitive and if I don't think about it, I won't notice it with most TVs (but some lag so much they are unplayable).

ye-old color TVs, ye-old HDTVs, and ye-old every other CRT process the signal in almost the exact same way. Also, it's pretty easy to build a composite -> RF bridge, in case you'd prefer to play on a B&W TV to make you feel better.
No thanks. On Pound4 there was a TV that used an RF adapter, and it was as laggy as the worst LCDs.

TVs that lag should lag about the same. If anything, PAL TVs should lag a tiny bit more because they'd need to wait longer to deinterlace the frames, and have more pixels to upscale. While I'm sure this difference is measurable, it probably doesn't even amount to a whole extra frame on an NTSC TV, and thus is negligible.
When I talk about PAL, I mean PAL60. Nobody plays Melee in 50Hz. American TVs lagging more could be just me though, I've heard bad things about them so maybe I was looking for the lag more.

To be fair, on this point I could very much be wrong, 'cause I haven't looked at it in depth. But from memory, and the memory of some consultants, Melee doesn't actually drop frames. It does slow down. If it did drop frames, it would actually need to predictively opt to not draw those frames (i.e. the designers would have had to specify when the game can't handle it and tell it to drop frames there).
I've tested this by recording matches on FoD and other situations where the game "lags", and confirmed that the game does run at full speed all the time (according to the in-game timer at the top). Frames are simply skipped.

That's not how game rendering works. It's not if the gpu is slow frames are dropped and if the cpu is slow there's slow down. There's a rendering mechanism that decides how things are rendered. In Melee's case (This varies by game), it uses a fixed framerate rendering method, where it always performs 1/60 of a second's worth of calculation + rendering each frame (I'm tempted to say that on PAL this is 1/50 of a second, but this would actually change game mechanics, so it might be 1/60th quantized to 1/50th for display).
See above. I agree that the CPU/GPU doesn't make a difference if the game is indeed slowed down, but in my tests it doesn't, so I'm just guessing that there's some mechanism where the GPU knows when to drop a frame and start with the next instead.
And again, nobody plays in 50Hz =). Though I know the game still runs internally at 60Hz and frames are dropped for 50Hz. Actually, every 1001st frame is also dropped in 60Hz (the game runs at exactly 60Hz internally, but NTSC and PAL60 are 60/1.001 Hz).
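For reference, the arithmetic behind that remark, assuming the internal rate really is an even 60Hz and the output is 60/1.001Hz:

```python
# Arithmetic behind the "every 1001st frame" remark, assuming an internal
# rate of exactly 60 Hz and an NTSC/PAL60 output rate of 60/1.001 Hz.

internal_hz = 60.0
output_hz = 60.0 / 1.001                                   # ~59.94 Hz

surplus_per_second = internal_hz - output_hz               # ~0.06 internal frames/s
seconds_per_drop = 1.0 / surplus_per_second                # ~16.7 s
internal_frames_per_drop = internal_hz * seconds_per_drop  # ~1001

print(f"output ~{output_hz:.3f} Hz; one internal frame dropped every "
      f"~{internal_frames_per_drop:.0f} frames (~{seconds_per_drop:.1f} s)")
```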

The controller doesn't actually update at a higher rate, it's polled at one point during each frame, so inputs that last less than a frame risk being dropped. Furthermore, when you input, say, a jab, the frame that the console receives this input, it starts doing the first frame of a jab. This has the result of a 1 frame duration action being displayed ~ 1 Frame after you input it, so there isn't actually any delay generated by this process.
Maybe it works differently for different games, but I remember reading something about the polling frequency being much higher than 60Hz.
Actually, I take back the "high frequency -> no lag" statement. If you press the button just after the game asks for inputs and starts rendering a frame, your input will have to wait until the next frame, adding an average of half a frame.
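(Putting a number on that average: if the press lands uniformly at random somewhere within a 1/60 s (≈16.7 ms) frame, the expected wait until the next poll is half of that, roughly 8 ms.)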

As I've indicated, Melee (and most other console games) use a fixed timestep game loop that is synced with the display, so this actually isn't needed. Melee probably does use double buffering, but because the render step is already the same length as the refresh rate, it doesn't add any additional delay. Double/triple buffering only adds delay when the rendering framerate differs from the display framerate.
Whether the game slows down or drops frames when "overloaded", the GPU has to keep outputting frames at 60Hz. In those cases it's no longer synced with the display.
 

Zankoku

Never Knows Best
Administrator
BRoomer
Joined
Nov 8, 2006
Messages
22,906
Location
Milpitas, CA
NNID
SSBM_PLAYER
If frames are skipped rather than the game slowing down, that means the reasoning of banning FoD for doubles is completely invalid because for all intents and purposes the game is running at full speed anyway. I'd like to know the exact procedure in which you're verifying game speed...
 

SCOTU

Smash Hero
Joined
Mar 16, 2007
Messages
6,636
Location
MI
I thought about the waiting for the whole interlaced frame, but theoretically you could already bob-deinterlace the image after you receive the first field (shift it up/down and upscale). I doubt this is how it's implemented though...
Only when using non-typical displays does it do this.

Sure it will also add ghosting, but if it takes 16ms to shift pixels into the next image, obviously it will add some more lag (it's easy to say it's 16ms, but it depends on when your eyes start to think they see a new image =)).
Of course it does. I admitted this; however, my point was that at the point where it's impactful, you have much worse problems.

Well, I know for a fact that most CRTs lag. Not nearly as much as LCDs, but they do lag. I'm not even that sensitive and if I don't think about it, I won't notice it with most TVs (but some lag so much they are unplayable).
I have measured delay on many a CRT and many a digital display. I have never once run into a CRT with more than 1ms delay, within margin of error for zero.

No thanks. On Pound4 there was a TV that used an RF adapter, and it was as laggy as the worst LCDs.
An RF adapter, if I'm remembering correctly (this could easily be wrong), should be an entirely passive converter, meaning there should be no delay whatsoever.

When I talk about PAL, I mean PAL60. Nobody plays Melee in 50Hz. American TVs lagging more could be just me though, I've heard bad things about them so maybe I was looking for the lag more.
You'll have to forgive my ignorance, as I have already stated, I do not have any experience with PAL. I was under the impression that PAL video would only run at 50fps and not 60. I did admit that it makes far more sense for the game to play at 60fps but to have its graphical content quantized to 50fps.

I've tested this by recording matches on FoD and other situations where the game "lags", and confirmed that the game does run at full speed all the time (according to the in-game timer at the top). Frames are simply skipped.
You know, I find this really hard to believe. This would actually change mechanics during the slowdown. I wonder if this isn't just an artifact of video capture.

See above. I agree that the CPU/GPU doesn't make a difference if the game is indeed slowed down, but in my tests it doesn't, so I'm just guessing that there's some mechanism where the GPU knows when to drop a frame and start with the next instead.
And again, nobody plays in 50Hz =). Though I know the game still runs internally at 60Hz and frames are dropped for 50Hz. Actually, every 1001 frame is also dropped in 60Hz (the game runs at exactly 60Hz internally, but NTSC and PAL60 are 60/1.001 Hz).
1 Frame isn't dropped every 1000, that's just a framerate. I am quite aware of this and why it's like that.

Maybe it works differently for different games, but I remember reading something about the polling frequency being much higher than 60Hz.
Actually, I take back the "high frequency -> no lag" statement. If you press the button just after the game asks for inputs and starts rendering a frame, your input will have to wait until the next frame, adding an average of half a frame.
Higher frequency polling isn't particularly necessary for things other than acceleration based devices like mice. Absolute position devices are typically fine being polled once every frame.

On a side note, why is potential lag induced by the console even being brought up here? We're talking about TVs. We should be comparing display latency only, as that's the theoretical limiting point.

I can't believe there are still people who think PAL regions play with 50Hz...
If this was directed at me, I apologize for any confusion I have caused. I did not under any circumstance believe PAL to have game play at 50fps, merely it's display frequency. I hope to not have confused other readers.
 

ajp_anton

Smash Lord
Joined
Jan 9, 2006
Messages
1,462
Location
Stockholm
I have measured delay on many a CRT and many a digital display. I have never once run into a CRT with more than 1ms delay, within margin of error for zero.
How do you measure a CRT's delay anyway (only the CRT and not together with the game console's internal lag)?

An RF adapter, if I'm remembering correctly (this could easily be wrong), should be an entirely passive converter, meaning there should be no delay whatsoever.
Either it was the adapter or the TV, I don't know, but I'm telling you it was like playing on a bad LCD.
Are you sure passive adapters won't have some kind of capacitance-lag?

You know, I find this really hard to believe. This would actually change mechanics during the slowdown. I wonder if this isn't just an artifact of video capture.
Well, my capture devices detect the framerate automatically, no matter what I choose (30, 29.97 or 25fps). The video will be flagged with the framerate I choose, but all the frames are there (and the sound will go out of sync if the framerate is wrong). I know my capture cards can randomly drop/add a frame, but it'd be weird if all 4 of them, recording the same stream at the same time, dropped (drop current + duplicate previous) exactly the same frames, when and only when the game "lags". Jump 600 frames forward over the heavily "lagging" parts, and the game clock will increase by 10 seconds (sometimes +1 frame, see below).

1 Frame isn't dropped every 1000, that's just a framerate. I am quite aware of this and why it's like that.
At a tournament that I recorded, it dropped lots of frames, and it always drops them in pairs (2 fields). I built a script that finds the frame drops (by reading the timer) and interpolates them. Every 1000 frames, a single frame was dropped in addition to the pairs. It's not just the timer; the game also jumps forward a little extra, and these single frames are always dropped, even when the capture seems to work perfectly.
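The idea behind the script was basically this (a simplified sketch of the approach, not the actual code):

```python
# Simplified sketch of the frame-drop finder (not the actual script):
# given the in-game timer value (in frames) read off each captured frame,
# flag the spots where the timer jumps by more than one tick and report
# how many frames would need to be interpolated there.

def find_frame_drops(timer_ticks):
    """timer_ticks[i] = in-game timer, in frames, read from captured frame i."""
    drops = []
    for i in range(1, len(timer_ticks)):
        step = timer_ticks[i] - timer_ticks[i - 1]
        if step > 1:
            drops.append((i, step - 1))  # (capture position, missing frame count)
    return drops

# Example: the timer skips one tick between captured frames 3 and 4.
print(find_frame_drops([100, 101, 102, 103, 105, 106]))  # -> [(4, 1)]
```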

On a side note, why is potential lag induced by the console even being brought up here? We're talking about TVs. We should be comparing display latency only, as that's the theoretical limiting point.
I brought it up when people said they see "absolutely no lag" as if the console itself didn't add anything.
 

SCOTU

Smash Hero
Joined
Mar 16, 2007
Messages
6,636
Location
MI
edit: I forgot to mention this earlier: deathbysuarez: <3 Yandere~

How do you measure a CRT's delay anyway (only the CRT and not together with the game console's internal lag)?
Quit it with the console internal lag. No one cares; I thought we'd been over this. There is no console delay, really. The frame you input your command on, one frame-time later, your action (one frame in) is displayed. Unless for some reason you're claiming that because you can press an input before the polling takes place, this introduces some "delay". While this is true, it's not a delay that matters. For one, delay should really be considered from the point of view of the game itself, because that's what's actually doing the processing and modelling the simulation. And for two, that "delay" is on average only ~8ms, which is more or less undetectable.

Now, to clarify what I mean by thinking of delay in terms of the game, I mean like this: no one really cares how long it takes for you to push a button and have the action show up on screen. People only care how quickly they can respond to what they see on the screen. This means that if I see something on the screen and need to react to it and press a button, no matter when I press the button during the frame, it will have the same effect. Because the game hasn't progressed at all, all of your changes are made as if they were instantaneous relative to everything going on in the game. Stop me if this isn't clear. I'm trying here, but I'm not coming up with a nice wording (today's not the best of days).
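Maybe a tiny worked example makes the point clearer (my own illustration, assuming input is polled once at the start of every frame):

```python
# Illustration of the point above: no matter when within a frame you press,
# the game picks the input up at the next poll, so the action starts on the
# same game frame. (Assumes one input poll at the start of every frame.)

import math

FRAME = 1.0 / 60  # seconds per game frame

def frame_action_appears(press_time_s: float) -> int:
    """Frame index on which a press made at press_time_s takes effect."""
    return math.floor(press_time_s / FRAME) + 1

# Presses early, mid, and late within frame 10 all take effect on frame 11.
for t in (10 * FRAME + 0.001, 10 * FRAME + 0.008, 10 * FRAME + 0.016):
    print(frame_action_appears(t))  # -> 11, 11, 11
```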

Either it was the adapter or the TV, I don't know, but I'm telling you it was like playing on a bad LCD.
Are you sure passive adapters won't have some kind of capacitance-lag?
Well, the CRT shouldn't have lag. Such a passive adapter should really only be made out of resistors, which, while they do induce some propagation delay, it's on the order of nanoseconds.

Well, my capture devices detect the framerate automatically, no matter what I choose (30, 29.97 or 25fps). The video will be flagged with the framerate I choose, but all the frames are there (and the sound will go out of sync if the framerate is wrong). I know my capture cards can randomly drop/add a frame, but it'd be weird if all 4 of them, recording the same stream at the same time, dropped (drop current + duplicate previous) exactly the same frames, when and only when the game "lags". Jump 600 frames forward over the heavily "lagging" parts, and the game clock will increase by 10 seconds (sometimes +1 frame, see below).
I would still be a bit surprised if this were the case. Maybe if I care enough later I'll look into it myself. But this really has nothing to do with this discussion, so I'll drop it for now.

At a tournament that I recorded, it dropped lots of frames, and it always drops them in pairs (2 fields). I built a script that finds the frame drops (by reading the timer) and interpolates them. Every 1000 frames, a single frame was dropped in addition to the pairs. It's not just the timer; the game also jumps forward a little extra, and these single frames are always dropped, even when the capture seems to work perfectly.
This shouldn't count as a dropped frame, because it was never there in the first place. You simply have a slightly less than 30/60 framerate. A dropped frame as reported by your capture device means one existed but it wasn't captured completely. I think you've just got some problem capturing. I have captured video for long segments with no dropped frames and the expected framecount.

I brought it up when people said they see "absolutely no lag" as if the console itself didn't add anything.
Well, on a fundamental level, the console really doesn't add anything. And since this is a topic on TV delay, they're talking about the TV having absolutely no lag at all. Which is synonymous with completely undetectable by usage (really anything less than 16ms is pretty undetectable, but "absolutely lag free" as in a CRT will have less than 1ms)
 
Joined
Feb 3, 2008
Messages
858
Location
PWN
Interesting, both Novice and ajp_anton, from Finland and Sweden respectively, have said that CRTs lag. ...Is this really what we're missing here?

because...
When I talk about PAL, I mean PAL60. Nobody plays Melee in 50Hz. American TVs lagging more could be just me though, I've heard bad things about them so maybe I was looking for the lag more.
I can't believe there are still people who think PAL regions play with 50Hz...
I guess I didn't know this/didn't really understand this.

Here, CRTs have always been lag-free, and LCDs are "always" laggy, which is why discussion sprang up in Zodiac's thread here: until it can really be tested and shown pretty conclusively, those of us who can recognize lag immediately are going to keep noticing it, based on experience.

So I think that might be causing our confusion. Though, ajp_anton, I still think you're a bit confused because even the knowledgeable-sounding stuff you say still has mixed terms or doesn't seem to grasp the concept right, but... this has all been interesting to discuss, even though we're having to guess at a few things.

I've tested this by recording matches on FoD and other situations where the game "lags", and confirmed that the game does run at full speed all the time (according to the in-game timer at the top). Frames are simply skipped.
Technically, the in-game timer doesn't have to be synced with the graphics. The timer can run by itself, and its frames can be dropped, while the graphics are still updating themselves. The same goes for sound: the game can be running slowly, yet the sound will play at normal speed/pitch. That said, I think a better way of describing it is that frames are both dropped and slowed down. And a prime example is...

The black hole glitch. It's obviously slowed down, but if you look closely, it drops frames as well. Take a look for yourself. You can not only hear the Super Scope's shots get slowed down, but also see the characters move more slowly. Peach's turnip-pulling war cry and the "pluck" noise sound normal. However, you can also see animation skipping, and therefore frame dropping.

So I think the game is going to slow down first before it knows when to drop frames. I think this describes FoD.
If frames are skipped rather than the game slowing down, that means the reasoning of banning FoD for doubles is completely invalid because for all intents and purposes the game is running at full speed anyway. I'd like to know the exact procedure in which you're verifying game speed...
Another valid point.

Maybe it works differently for different games, but I remember reading something about the polling frequency being much higher than 60Hz.
Actually, I take back the "high frequency -> no lag" statement. If you press the button just after the game asks for inputs and starts rendering a frame, your input will have to wait until the next frame, adding an average of half a frame.
Technically, input can be polled multiple times per frame, but the lag will still average half a frame, I guess you could say. Even this latency is what everyone still calls "zero lag", because it's what has to exist, what has existed for years, for everyone, and it's as fast as you can get.

I can't really comment on your frame data and tests, but...

Sure it will also add ghosting, but if it takes 16ms to shift pixels into the next image, obviously it will add some more lag (it's easy to say it's 16ms, but it depends on when your eyes start to think they see a new image =)).
Almost all LCDs these days have <5ms response times.


*Subscribing to this thread 'cause I want to understand this stuff better*
If you want to know a few basics about input lag and response time and such, there's some discussion/a concise post I made in this other thread here.

Hertz, don't it?
Bones, you're a champ; you have the right frame of mind.
 