In the early days of networked gaming (I mean the days of Doom, not things like ImagiNet), connections were slow.
If a game or application tried to pump too much data over the connection, it would take time. Sometimes the architecture itself and the hardware involved introduced a sort of "time delay" into the equation as well. This was called "network lag": a Doom game might not be enjoyable due to network lag, for instance.
As online gaming has become more prevalent and requires less PC familiarity, the term has started to be used completely erroneously in all sorts of situations.
For example, if your PC is underpowered for a new game, or you set the detail too high and it runs slowly, that isn't "lag", because there is no actual lag between your actions and what you see on screen. The term "lag" was originally used to indicate the time difference between when you pressed a key and when the server acknowledged that key press; even with the lowest framerates, your key presses are being sent to the buffer in your local machine immediately. The term practically redefined itself when games such as Quake implemented client-side prediction: you could see another player moving forward, and, instead of that player stopping dead while your PC awaits game-state information, the client continues to draw that character moving in that direction.
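To make that original sense of "lag" concrete, here's a minimal sketch of measuring it: the time from sending an input to the server until the server's acknowledgement comes back. The `send_to_server` and `wait_for_ack` callables are hypothetical stand-ins for whatever a real game's networking layer would use.

```python
import time

def measure_lag(send_to_server, wait_for_ack):
    """Return seconds between sending a key press and getting its ack."""
    sent_at = time.monotonic()
    send_to_server("key: forward")
    wait_for_ack()  # blocks until the server echoes the input back
    return time.monotonic() - sent_at

# With a fake, instantaneous "server", the measured lag is near zero;
# over a real network it would be the round-trip time.
lag = measure_lag(lambda msg: None, lambda: None)
print(f"{lag * 1000:.0f} ms")
```

Nothing in that measurement involves the framerate at all, which is the whole point.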
The short story is: "lag" is completely unrelated to framerate. They are separate. You can have a high lag/ping and a low framerate, or vice versa, or both, but that doesn't suddenly make them interchangeable.
To further express what I mean:
There is latency between the input and when you see that input on the screen (if a program or game is only running at 1 fps, for example, it could take a full second to see the effect). There is, however, no latency between when you press the button and when the game receives it.
So, for example, if your game is running at only a few frames per second, your actions are still being dealt with in the time between frames (if the game is written properly, anyway). You could easily shoot an arrow or a gun in the game and, because the framerate is so low, miss seeing that event at all.
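A rough sketch of why this works: input events are buffered the instant they occur, independent of rendering, and each frame drains whatever has piled up. The function names here are hypothetical, not from any particular engine.

```python
from collections import deque

# Input events go into a queue the moment the OS/input layer sees them,
# no matter how rarely the game draws a frame.
input_queue = deque()

def on_key_press(key, timestamp):
    """Called immediately when a key is pressed, regardless of framerate."""
    input_queue.append((key, timestamp))

def update_frame():
    """Runs once per frame; even at 1 fps, every buffered press is handled."""
    handled = []
    while input_queue:
        handled.append(input_queue.popleft())
    return handled

# Two presses arrive between frames...
on_key_press("fire", 0.10)
on_key_press("jump", 0.35)
# ...and the next frame, however late it is, still processes both.
print(update_frame())  # → [('fire', 0.1), ('jump', 0.35)]
```

So a low framerate delays *seeing* the result, but the presses themselves were never lost or delayed on the way into the game.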
With networked games, latency is a given, with 50 ms being something of a norm. Earlier online gaming dealt with much higher ping times, sometimes up to 500 ms.
The way this is combated is with client-side prediction. For example, if your friend Billy is moving forward and your game doesn't receive packets from Billy (or the server) for a while, it will just assume Billy kept moving forward. The other option is to show every single "gamestate" sent from the server separately, but then you only 'see' movement if the latency is extremely low, and this includes your own player: without client-side prediction, your game is really a "window" into another game running on a server, and it only shows what the server tells you you can see. With client-side prediction, the game lets you do what you will, sends the packets for your movements, and hopes the server keeps you updated.

Sometimes this causes weirdness. For example, in Quake there might be a touchplate that controls a vat of lava; foolish individuals are supposed to try to run across, hit the touchplate, which opens up the hole, and die. Then you run across and it doesn't open... after you get to the other side, a few moments later you suddenly find yourself already dead at the bottom of the vat of lava. This is because the button was controlled "on the server", so your game just let you do your thing and sent your movements to the server, which obligingly allowed you to tread lava and then die during those few seconds while you thought you were still alive.
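The "assume Billy kept moving forward" part is basically dead reckoning: between server updates, keep extrapolating the last known velocity. A minimal sketch (the function and its parameters are illustrative, not any engine's actual API):

```python
def predict_position(last_pos, last_vel, time_since_update):
    """Extrapolate where a remote player probably is right now,
    given his last known position, velocity, and how long it has
    been since the last packet arrived."""
    return (last_pos[0] + last_vel[0] * time_since_update,
            last_pos[1] + last_vel[1] * time_since_update)

# Last packet said Billy was at (10, 0) moving +x at 5 units/sec.
# No packets for 0.4 s: assume he kept going in the same direction.
print(predict_position((10, 0), (5, 0), 0.4))  # → (12.0, 0.0)
```

The client draws Billy at the predicted spot every frame, so motion stays smooth even while no fresh data is arriving.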
What most people refer to as lag are things that defy the client-side prediction. For example, your game might show Billy walk off into the distance, and then when it receives an update, the server says "oh, actually Billy turned and he's over here now", at which point Billy's "avatar" (character) suddenly plops over to that position. Usually, FPS doesn't fit into this at all; you can have a high framerate and still get latency, of course. In fact, you are more likely to notice it, because a lower framerate helps hide the flaws in client-side prediction.
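That sudden "plop" can be sketched as a reconciliation step: when the authoritative server position disagrees with the prediction by more than some threshold, the client snaps to the server's answer. The function and threshold here are made up for illustration; real games often blend rather than snap.

```python
def reconcile(predicted, authoritative, snap_threshold=2.0):
    """If the prediction drifted too far from the server's answer,
    snap to the server position; otherwise keep the smooth motion."""
    dx = authoritative[0] - predicted[0]
    dy = authoritative[1] - predicted[1]
    if (dx * dx + dy * dy) ** 0.5 > snap_threshold:
        return authoritative  # the "plop": Billy teleports to where he really is
    return predicted          # close enough, keep drawing the predicted path

print(reconcile((12.0, 0.0), (12.5, 0.0)))  # small error → (12.0, 0.0)
print(reconcile((12.0, 0.0), (4.0, 6.0)))   # big error  → (4.0, 6.0)
```

The server always wins the argument; the only question is whether the correction is small enough to hide from you.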