In the early days of networked gaming (I mean the days of Doom, not to mention things like imaginet), connections were slow.
If a game or application tried to pump too much data over the connection, it took time. Sometimes the architecture itself and the hardware involved introduced a sort of “time delay” into the equation as well. This was called “network lag”; a Doom game might not be enjoyable due to network lag, for instance.
As online gaming has become more prevalent and requires less PC familiarity, the term has come to be used erroneously in all sorts of situations.
For example, if your PC is underpowered for a new game, or you set the detail too high and it runs slowly, it’s not “lag” because there is no actual lag between your actions and what you see on screen. The term “lag” was originally used to indicate the time difference between when you pressed a key and when the server acknowledged you pressing that key; even with the lowest framerates, your key presses are sent to the buffer on your local machine immediately. The term practically redefined itself when games such as Quake implemented client-side prediction: you could see another player moving forward, and instead of that character stopping dead while your PC awaits game state information, the client continues to draw the character moving in that direction.
The short story is: “lag” is completely unrelated to framerate. They are separate things. You can have a high lag/ping and a low framerate, or vice versa, or both, but that doesn’t suddenly make the two interchangeable.
EDIT:
To further express what I mean:
There is latency between an input and when you see that input’s effect on the screen (say, for example, a program or game is running at only 1 FPS; it could take a full second to see the effect). There is, however, no latency between when you press the button and when the game receives it.
So, for example, if your game is running at only a few frames per second, your actions are still being dealt with in the time between frames (if the game is written properly, anyway). You could easily, say, shoot an arrow or a gun in the game, and because the framerate is so low, you could miss seeing that event at all.
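To make that concrete, here is a rough sketch of a fixed-timestep game loop (the function names are hypothetical stand-ins, not any real engine’s API). Input and simulation keep running at full speed even when the renderer crawls at 1 FPS:

```python
import time

TICK_RATE = 60            # simulation updates per second
TICK_LEN = 1.0 / TICK_RATE

def poll_input():
    """Stand-in for draining the OS input buffer; returns queued key events."""
    return []             # pretend no keys were pressed this tick

def update(events, dt):
    """Advance the game state; arrow shots, movement, etc. happen here."""
    pass

def render():
    """Stand-in for an expensive draw call; pretend we can only manage 1 FPS."""
    time.sleep(1.0)

previous = time.monotonic()
accumulator = 0.0

for frame in range(3):    # a real game would loop until quit
    now = time.monotonic()
    accumulator += now - previous
    previous = now

    # The simulation catches up in fixed steps, consuming input each step.
    # Even though render() takes a full second, every key press queued in
    # that second is still applied; you just don't see the result until the
    # next frame. An arrow can be fired and land entirely between frames.
    while accumulator >= TICK_LEN:
        update(poll_input(), TICK_LEN)
        accumulator -= TICK_LEN

    render()
```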
With networked games, latency is a given, with 50 ms being something of a norm. Earlier online gaming dealt with much higher ping times, sometimes up to 500 ms.
The way this is combated is with client-side prediction. For example, if your friend Billy is moving forward, and your game doesn’t receive packets from Billy (or the server) for a while, it will just assume Billy kept moving forward. The other option is to show every single “gamestate” sent from the server separately, but then you only ‘see’ movement if the latency is extremely low; and this includes your own player. Without client-side prediction, your game is really a “window” into another game running on a server, and it only shows what the server tells you you can see. With client-side prediction, the game lets you do what you will, sends the packets for your movements, and hopes the server keeps you updated.

Sometimes this causes weirdness. For example, in Quake there might be a touchplate that controls a vat of lava; foolish individuals are supposed to try to run across, hit the touchplate, which opens up a hole, and die. Then you run across, and it doesn’t open… until after you get to the other side, when a few moments later you suddenly find yourself already dead at the bottom of the vat of lava. This is because the button was controlled “on the server”, so your game just let you do your thing and sent your movements to the server, which obligingly allowed you to tread lava and then die, for those few seconds while you thought you were still alive.
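To sketch what the client does for Billy in the meantime, here is a toy extrapolation (“dead reckoning”) predictor; the class and packet fields are assumptions of mine for illustration, since every engine does this a little differently:

```python
import time
from dataclasses import dataclass

@dataclass
class RemotePlayer:
    x: float = 0.0
    y: float = 0.0
    vx: float = 0.0   # last known velocity, units per second
    vy: float = 0.0
    last_update: float = 0.0

    def apply_packet(self, x, y, vx, vy):
        """Authoritative state from the server replaces our guess."""
        self.x, self.y, self.vx, self.vy = x, y, vx, vy
        self.last_update = time.monotonic()

    def predicted_position(self):
        """Where we draw Billy: last known position plus assumed motion.
        If no packets arrive for a while, we simply assume he kept going."""
        dt = time.monotonic() - self.last_update
        return (self.x + self.vx * dt, self.y + self.vy * dt)

billy = RemotePlayer()
billy.apply_packet(x=0.0, y=0.0, vx=2.0, vy=0.0)  # heading +x at 2 units/s
time.sleep(0.5)                                   # half a second of silence
print(billy.predicted_position())                 # roughly (1.0, 0.0)
```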
What most people refer to as lag are things that defy the client-side prediction; for example, the game might show Billy walking off into the distance, and then when your game receives an update, the server says “oh, actually, Billy turned and he’s over here now”, at which point Billy’s “avatar” (character) suddenly plops over to that position. Framerate doesn’t fit into this at all; you can have a high framerate and still get latency, of course. In fact, you are more likely to notice these corrections at a high framerate, because a lower framerate helps hide the flaws in client-side prediction.
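That sudden “plop” is the naive correction: teleport the avatar to wherever the server says it is. A common alternative, sketched here under assumptions of my own rather than from any particular game, is to blend the error away over a few frames so the correction is less jarring:

```python
def correct(drawn_pos, server_pos, blend=0.2):
    """Each frame, move the drawn position a fraction of the way toward the
    server's authoritative position instead of teleporting instantly."""
    dx = server_pos[0] - drawn_pos[0]
    dy = server_pos[1] - drawn_pos[1]
    return (drawn_pos[0] + dx * blend, drawn_pos[1] + dy * blend)

# Billy was drawn at (10, 0), but the server says he is actually at (0, 5).
pos = (10.0, 0.0)
for frame in range(5):
    pos = correct(pos, (0.0, 5.0))
    print(f"frame {frame}: drawn at ({pos[0]:.2f}, {pos[1]:.2f})")
```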
2 thoughts on “It’s not network lag unless there is a network involved.”
Are you talking about “lag” in general, or “network lag”..?
‘it’s not “lag” because there is no actual lag between your actions and what you see on screen’
Erm.. the term “lag” means a time delay – pressing a key and seeing the action a brief time later *is* a lag; the effect is lagging behind the action.
‘the term “lag” was originally used to indicate the time difference between when you pressed a key and when the server acknowledged you pressing that key’
I think you’re talking specifically about “network lag” there, not “lag” in general. Best not confuse “lag” with automatically meaning network context.
Pressing a key and seeing the action a moment later can be perceived as “lag”. The definition is:
slowdown: the act of slowing down or falling behind
hang (back) or fall (behind) in movement, progress, development, etc.
In terms of a game, for example, many people would call 12 frames per second “laggy”. However, the computer is accepting and processing the inputs just as quickly as it usually would; the fact that the “effect” of these inputs is only seen at 12 frames per second can hardly be called a result of lag.
In terms of, say, an edit box that takes a second to register input, it depends on the context. Following the definition, the only thing falling behind anything else is the actual display, as opposed to what the user “thinks” should be there. Therefore, “lag” as experienced on a local machine is not really measurable except by perception. What one person considers “lag” another might consider perfectly acceptable, and in both cases the result is the same.
With a network, however, there is a measurable delay. The very concept was originally used with regard to “ping” times, or, more precisely, delays based on distance from a server. If PlayerA is a mile from the game server and PlayerB is on the other side of the continent, then PlayerB will always be “lagged” behind PlayerA, and there is nothing anybody can do about it. Additionally, spikes and dips in connectivity and speed, as well as the condition of the route being taken by the packets, can mean that packets sent by the server take a different amount of time to reach each client, and sometimes there is complete loss of packets, or duplicate packets (as a result of the sender retransmitting packets it presumes lost). This is a physically measurable quantity of time that doesn’t rely on a person’s perception, and there is no doubt whether there is any slowdown.
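Since this delay is physically measurable, it’s easy to show what measuring it looks like: time a small packet’s round trip. The sketch below uses a made-up server name and the plain UDP echo port; a real game would ping its own server with its own protocol:

```python
import socket
import time

def measure_rtt(host="example-game-server.net", port=7, timeout=2.0):
    """Time one small packet's round trip; returns milliseconds, or None on loss."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        start = time.monotonic()
        sock.sendto(b"ping", (host, port))
        sock.recvfrom(64)              # block until the echo comes back
    except socket.timeout:
        return None                    # the packet (or the reply) was lost
    finally:
        sock.close()
    return (time.monotonic() - start) * 1000.0

rtt = measure_rtt()
print(f"ping: {rtt:.0f} ms" if rtt is not None else "request timed out")
```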
Case in point: quite some time ago, I played through several games at a speed of only 18-25 FPS, and I found them quite playable. However, many people would call that “laggy”. The fact is, when a concept has a strict definition but its application relies purely on a perceptual measuring of that definition, the term itself becomes nearly meaningless.