500Hz USB mouse polling in XP SP3

Windows XP SP3 release candidate updates usbport.sys from SP2, meaning all of the existing patches and replacement files for changing your USB mouse polling rate to 500Hz (from the default 125Hz) no longer work. Using them will either fail (complaining about the wrong file) or cause your USB ports to stop working in XP.

Luckily, the byte sequence in usbport.sys that controls the polling rate is only slightly changed from SP2, and can easily be tweaked with a hex editor.

Default hex string:
XP SP2: 3C 08 73 09 C6 86 0A 01
XP SP3 RC: 3C 08 73 09 C6 86 0E 01

One hex character difference. Luckily, we leave this character alone when patching.

500Hz hex string:
XP SP2: B0 02 73 09 52 8C 0A 01
XP SP3 RC: B0 02 73 09 52 8C 0E 01

To do this, you need a hex editor (e.g. xvi32) and you must edit the file in Safe Mode. It's not enough to edit usbport.sys in C:\WINDOWS\system32\drivers; you must also copy the patched file into C:\WINDOWS\system32\dllcache (hidden by default; change your Folder Options to show protected operating system files), otherwise Windows File Protection will restore the original.

Once you've got the patched usbport.sys in both the drivers/ and dllcache/ directories, reboot and you're good to go.
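If you'd rather script the patch than click around in a hex editor, here's a minimal sketch (Python) of the same byte swap, assuming the SP3 RC strings above - it's a throwaway helper, so work on a copy and keep a backup of the original:

ORIGINAL = bytes.fromhex("3C 08 73 09 C6 86 0E 01")   # SP3 RC default (125Hz)
PATCHED  = bytes.fromhex("B0 02 73 09 52 8C 0E 01")   # SP3 RC 500Hz

def patch(src_path, dst_path):
    data = open(src_path, "rb").read()
    hits = data.count(ORIGINAL)
    if hits != 1:
        raise SystemExit("expected exactly 1 match, found %d - wrong usbport.sys version?" % hits)
    open(dst_path, "wb").write(data.replace(ORIGINAL, PATCHED))

patch("usbport.sys", "usbport.500hz.sys")

You'd still have to drop the result into both directories in Safe Mode as described above.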

To check your mouse polling rate, use the dx_mouse_timer_dialog utility.

125hz is a great default value put in there by someone who knew what they were doing. It is higher than most high-end CRT refresh rates (as well as LCD pixel-change refresh) and is way above the line where the eye can tell the difference.

It is impossible to take advantage of the difference between 125hz and 500hz in gaming, even hardcore competitive gaming, short of you being a cyborg transmitting mouse coordinates directly from your neural processor into the USB driver.

Likely, setting it to 500Hz just bogs something down in a multi-device USB chain and/or the CPU, with no actual benefit to the user. Talk about an entirely pointless tweak.

shihonage wrote:

125hz is a great default value put in there by someone who knew what they were doing. It is higher than most high-end CRT refresh rates (as well as LCD pixel-change refresh) and is way above the line where the eye can tell the difference.

That would matter if all the video card did was wait for the mouse polls and draw the framebuffer the nanosecond it received one. However, that's not the case.

Instead, because the image drawn on the screen is asynchronous with the mouse polling, what a higher polling rate means is a more timely "reading" when the game fetches one.

Consider: Source engine games by default run at 100 "ticks", or polling input 100 times per second. So 125Hz is more than enough, right? Well, until we recognize the fact that the 125Hz aren't necessarily in perfect harmony with the instant that the engine reads input values.

We can have a situation where the sequence of (very rapid) events is:
Hand movement
Mouse poll
Engine reads input
Mouse poll
Frame draws
Hand movement
Engine reads input <-- no poll between movement and input read
Frame draws w/old value
Mouse poll
... etc

So, no, there's no way to see all 500 individual mouse readings on screen. But that's completely missing what the problem is. 500Hz polling helps in making the frames that get on my screen be in sync with my movements and not slightly behind. 125Hz means a maximum delay of 8ms between input and having that input read by the system. 8ms behind? Yeah, you can "see" that. The fact that even the 125Hz mouse may be polled multiple times between the last frame drawn and the upcoming one doesn't matter. What matters is how timely the last poll was before the game engine reads it. There's already a time gap between when the engine reads the input and when it renders to the screen (as these are typically the first and last tasks of the game loop, with all of the world processing in-between), so it's easy to see how the hardware latency just piles on top of the existing gap.
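If you want to put rough numbers on that, here's a quick simulation sketch (Python, two free-running clocks with a random phase offset; the 100Hz tick rate is just the example figure from above, not any particular engine's setting):

import random

def average_staleness_ms(poll_hz, tick_hz=100.0, seconds=10.0):
    # Average age of the newest mouse sample at the moment the engine reads input.
    poll_period = 1.0 / poll_hz
    tick_period = 1.0 / tick_hz
    offset = random.random() * poll_period   # unknown phase between the two clocks
    total, reads = 0.0, 0
    t = 0.0
    while t < seconds:
        last_poll = ((t - offset) // poll_period) * poll_period + offset
        total += (t - last_poll) * 1000.0
        reads += 1
        t += tick_period
    return total / reads

for hz in (125, 500):
    print("%d Hz polling: average staleness ~%.2f ms (worst case %.1f ms)"
          % (hz, average_staleness_ms(hz), 1000.0 / hz))

On average the last sample is about half a poll period old (roughly 4ms at 125Hz versus 1ms at 500Hz), and the worst case is a full period, which is where the 8ms figure comes from.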

Now let's also factor in the fact that USB polling interrupts can be delayed or lost (if any part of the interrupt packet or token is corrupt, the packet is simply discarded). That 8ms delay can be extended further - and the lower the polling rate, the longer the wait until the packet resend! No, I'm afraid the problem is a tad more complicated than simply comparing a mouse polling rate to a monitor refresh rate.

When using 500Hz polling, especially at sufficient sensitivity, it's easy to see the difference in the smoothness of movement. And all of the "125Hz was decided by some smart guy" appeal to authority doesn't make a difference. Yes, the guy was probably very smart. He knew 99% of the people using PCs weren't going to be using them for hardcore games where extremely fine yet fast mouse movement was something of concern. He was also dealing with PCs with an order of magnitude less power than now, where the CPU load increase would actually have mattered some.

*Legion* wrote:

Consider: Source engine games by default run at 100 "ticks", or polling input 100 times per second.

They don't, though. I don't know about other Source games, but HL2DM runs by default at 66 ticks.

Legion wrote:

tech-talk

You forgot something that, well, renders everything you said irrelevant. The frequency at which online games exchange packets doesn't go anywhere near 1/125th of a second. It's somewhere between 5hz (WoW, I presume) and 15-30hz (Quake Wars/UT and the like).

As someone who co-founded a competitive UT clan and was into the whole zen-breathing-matters meditational-gaming-state thing, the only (slight as it may be) advantage I could still sense was in jacking up my mouse frequency from 60Hz to 80Hz (back when it was a PS/2 mouse, which defaulted to 60Hz).

Other than that... games' internal mouse poller post-processes out the movement. Windows mouse drivers post-process out the movement. Precision is reduced. Finally, the packet rate of online games is astronomically far from even 125hz to make any difference whatsoever, AND, movement interpolation smoothes out other people's movement, so they're not THAT precisely where you see them. Not where a difference between 125hz and 500hz would make the tiniest difference.

In the end all you got is snake oil. But hey, everyone needs a little good luck charm to believe in

Podunk wrote:

They don't, though. I don't know about other Source games, but HL2DM runs by default at 66 ticks.

That's true by default for online games, but that's a function of trying to keep bandwidth down (though, of course, many servers are turned up to 100 ticks). However, for offline games, I think my statement may have been incorrect as well, as it looks like HL2 doesn't run at a fixed tick rate offline.

Of course, that's all tangential to the main point of the example, which still stands.

shihonage wrote:

You forgot something that, well, renders everything you said irrelevant. The frequency at which online games exchange packets doesn't go anywhere near 1/125th of a second. It's somewhere between 5hz (WoW, I presume) and 15-20hz (Quake Wars/UT and the like).

Which completely fails to be relevant to the discussion at all.

Other than that... games' internal mouse poller post-processes out the movement. Windows mouse drivers post-process out the movement. Precision is reduced

Care to provide some support to these claims? I've yet to find anything that suggests the Windows HID mouse driver does anything to the input values other than pass them on and perhaps convert them into legacy format. And I've written a 3D engine and looked at source code of older engines and seen nothing to what you suggest, save for the "mouse smoothing" menu options some games have that DOES do what you're claiming.

Of course, it's a remedial Garbage In-Garbage Out principle here. The existence of any post-processing does not mean that bad input and better input all arrive at the same destination.

In the end all you got is snake oil.

Do you know how to talk to someone without being a prick? I've yet to see any evidence to suggest so.

shihonage wrote:

You forgot something that, well, renders everything you said irrelevant. The frequency at which online games exchange packets doesn't go anywhere near 1/125th of a second. It's somewhere between 5hz (WoW, I presume) and 15-30hz (Quake Wars/UT and the like).

That still doesn't address the simple fact that in any poll-for-input model, the more often you poll, the more likely you are to have the most up-to-date status when you actually act on your last measured reading.

Legion wrote:

Care to provide some support to these claims? I've yet to find anything that suggests the Windows HID mouse driver does anything to the input values other than pass them on and perhaps convert them into legacy format. And I've written a 3D engine and looked at source code of older engines and seen nothing to what you suggest, save for the "mouse smoothing" menu options some games have that DOES do what you're claiming.

This really depends on how a game polls the mouse.

For instance, Windows XP auto-accelerates and interpolates mouse movement by default. There was an option in Windows 2000 to stop the acceleration, but now it is gone. Instead, XP has an "Enhance pointer precision" checkbox, which does not emulate the same behavior as the Windows 2000 checkbox did.

If the game polls the mouse on a deeper level, then perhaps it can avoid the unholy post-processing that Windows does to it. My own engine, for instance, doesn't.
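For reference, the setting being described here is exposed through SystemParametersInfo; a rough ctypes sketch of reading it (and zeroing it out) looks something like this - Windows-only, and just a sketch:

import ctypes

SPI_GETMOUSE, SPI_SETMOUSE = 0x0003, 0x0004

def get_mouse_params():
    # Three values: two acceleration thresholds and the acceleration flag
    # tied to the "Enhance pointer precision" checkbox.
    params = (ctypes.c_int * 3)()
    ctypes.windll.user32.SystemParametersInfoW(SPI_GETMOUSE, 0, params, 0)
    return list(params)

def disable_acceleration():
    flat = (ctypes.c_int * 3)(0, 0, 0)   # all zeros = no acceleration curve
    ctypes.windll.user32.SystemParametersInfoW(SPI_SETMOUSE, 0, flat, 0)

print("current [threshold1, threshold2, accel]:", get_mouse_params())

Engines that read the mouse through DirectInput or raw input bypass this pointer-ballistics layer entirely, which is the "deeper level" of polling mentioned above.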

*Legion* wrote:
shihonage wrote:

You forgot something that, well, renders everything you said irrelevant. The frequency at which online games exchange packets doesn't go anywhere near 1/125th of a second. It's somewhere between 5hz (WoW, I presume) and 15-20hz (Quake Wars/UT and the like).

Which completely fails to be relevant to the discussion at all.

This is the core of my argument and it is 100% relevant.

Having written a fully playable coop client-server engine with client-side movement interpolation before, I do have some idea as to how things happen in that world.

Unless you need 500hz for a competitive edge in single-player games, at which point, uh, I'm really at a loss of what to tell you.

Mr Crinkle wrote:

That still doesn't address the simple fact that in any poll-for-input model, the more often you poll, the more likely you are to have the most up-to-date status when you actually act on your last measured reading.

The relevance of 125hz vs 500hz input from a HUMAN within an engine that polls you at 20-30hz, and then adds your opponent movement interpolation into the mix, is similar to Fatality selling a "gaming sound card" which will let you hear your opponents better and thus make it easier to frag them into oblivion.

In other words, precisely NIL. The 500hz hack may be your good luck amulet, but nothing more.

*Legion* wrote:

Of course, that's all tangential to the main point of the example, which still stands.

Yeah. I guess I just felt the need to pitch in my two cents.

shihonage wrote:

The 500hz hack may be your good luck amulet, but nothing more.

Negative, I always keep my lucky beers with me for this purpose when I game online.

You can argue that the subjective experience going from 125Hz to say 500Hz is irrelevant, but that doesn't change the technical reality of what's going on. Legion is right in that the two processes (the mouse driver updating and the game engine polling it) running asynchronously at different frequencies will yield a worst-case delay equal to the period corresponding to the higher of the two frequencies. For these values it means reading values that are anywhere from 0 to 8ms old, versus 0 to 2ms old.

The packet rate of an online game isn't relevant to this. WoW for instance synching its state with the server at a very low frequency as you say and your mouse cursor movement being sluggish or accurate and smooth (say the difference between an old serial mouse and FancySchmancy Mouse 3000+) really have nothing to do with each other. FPSes are not that different from this. The feedback loop of you moving your hand and the crosshair updating on the screen is usually decoupled from the network packet rate, the fixed time step physics simulation etc.

Again you can argue that the average difference of about 6ms we are talking about here with these numbers won't improve your aim by any amount worth mentioning, but please don't do it on bogus technical merits.

nossid wrote:

You can argue that the subjective experience going from 125Hz to say 500Hz is irrelevant, but that doesn't change the technical reality of what's going on. Legion is right in that the two processes (the mouse driver updating and the game engine polling it) running asynchronously at different frequencies will yield a worst-case delay equal to the period corresponding to the higher of the two frequencies. For these values it means reading values that are anywhere from 0 to 8ms old, versus 0 to 2ms old.

The purpose of this tweak is to change the subjective experience, namely, to give one an advantage in an online FPS, as the OP implied. My point is, it doesn't, and it stands, making statements like the above entirely irrelevant.

The packet rate of an online game isn't relevant to this. WoW for instance synching its state with the server at a very low frequency as you say and your mouse cursor movement being sluggish or accurate and smooth (say the difference between an old serial mouse and FancySchmancy Mouse 3000+) really have nothing to do with each other. FPSes are not that different from this. The feedback loop of you moving your hand and the crosshair updating on the screen is usually decoupled from the network packet rate, the fixed time step physics simulation etc.

Indeed it is decoupled. The only event that matters is the mouse click. The lower the packet refresh rate, the larger the average delay between the mouse click and those coordinates being sent to the game server. As it stands, the aforementioned delay is 5 to 20 times larger than a single mouse refresh, which brings the difference between 500 and 125Hz refresh down to, well, nothing. If the packet refresh were approaching the mouse refresh in any meaningful way, then we'd have semi-meaningful results.

Again you can argue that the average difference of about 6ms we are talking about here with these numbers won't improve your aim by any amount worth mentioning

That's kind of the point.

, but please don't do it on bogus technical merits.

You sir would be served well to do some research on client-side movement interpolation. When you aim at someone in an online FPS, they're often not exactly where they seem to be, as they're sending out their coordinates at a limited rate (15-30 pps), and your client interpolates where they were/are-likely-to-be-now in-between, which makes, again the difference between 125hz and 500hz on client end even more so "bogus", as the ultra-smooth (and seemingly approaching the 125hz refresh you have on your mouse !) framerate at which everyone around you moves is an imprecise illusion. When you fire, the server checks against the exact coordinates of your enemy, not against the non-exact, interpolated results you see on your screen.
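(For anyone who hasn't run into the technique: a toy version of that client-side interpolation, with made-up snapshot times, looks something like the sketch below. The client renders other players slightly in the past, blended between the two server snapshots that bracket the render time.)

def interpolate(snapshots, render_time):
    # snapshots: list of (timestamp, (x, y)) received from the server, oldest first.
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            a = (render_time - t0) / (t1 - t0)
            return (p0[0] + a * (p1[0] - p0[0]),
                    p0[1] + a * (p1[1] - p0[1]))
    return snapshots[-1][1]   # nothing brackets render_time: show the newest known position

# The server sends ~15-30 snapshots per second; the client picks a render time slightly
# in the past so it (almost) always has two snapshots to blend between.
snaps = [(0.00, (0.0, 0.0)), (0.05, (1.0, 0.0)), (0.10, (2.0, 1.0))]
print(interpolate(snaps, 0.075))   # -> (1.5, 0.5)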

nossid wrote:

Again you can argue that the average difference of about 6ms we are talking about here with these numbers won't improve your aim by any amount worth mentioning, but please don't do it on bogus technical merits.

No, it's not a performance enhancer (of course, my initial post didn't say anything about gaming at all, a whole heap of unfounded assumptions were run with). It's more of a slight improvement in "feel", a bit smoother of tracking. Because we're talking about small variations on a frame-by-frame basis, it's certainly no "become an aiming god" magic bullet. It merely helps smooth out some subtle jitter. Nothing earth-shattering, but of course, I never said it was.


shihonage wrote:

You sir would be served well to do some research on client-side movement interpolation. When you aim at someone in an online FPS, they're often not exactly where they seem to be, as they're sending out their coordinates at a limited rate (15-30 pps), and your client interpolates where they were/are-likely-to-be-now in-between, which makes, again the difference between 125hz and 500hz on client end even more so "bogus", as the ultra-smooth (and seemingly approaching the 125hz refresh you have on your mouse !) framerate at which everyone around you moves is an imprecise illusion. When you fire, the server checks against the exact coordinates of your enemy, not against the non-exact, interpolated results you see on your screen.

The Source engine disagrees.

Source SDK docs wrote:

This doesn't mean you have to lead your aiming when shooting at other players since the server-side lag compensation knows about client entity interpolation and corrects this error. If you turn off interpolation on a listen server (cl_interpolate 0), the drawn hitboxes will match the rendered player model again, but the animations and moving objects will become very jittery.

The server does indeed backtrace the shot to confirm the "hit", but it most certainly is not clueless about the interpolation.

Just some fun numbers, to fan the flames:

Involuntary muscle response (i.e. a reflex) takes ~50ms.
Voluntary reaction to a non-visual stimulus takes ~100ms.
Voluntary reaction to a visual stimulus takes ~150ms.
Hand movement of 10cm takes ~300ms.

shihonage wrote:

The purpose of this tweak is to change the subjective experience, namely, to give one an advantage in an online FPS, as the OP implied. My point is, it doesn't, and it stands, making statements like the above entirely irrelevant.

You started off this thread of discussion by claiming something to be impossible and didn't address what I saw to be the core of the issue. I found it reasonable to clearly establish what the technicalities of the frequencies in question were.

shihonage wrote:
, but please don't do it on bogus technical merits.

You sir would be served well to do some research on client-side movement interpolation. When you aim at someone in an online FPS, they're often not exactly where they seem to be, as they're sending out their coordinates at a limited rate (15-30 pps), and your client interpolates where they were/are-likely-to-be-now in-between, which makes, again the difference between 125hz and 500hz on client end even more so "bogus", as the ultra-smooth (and seemingly approaching the 125hz refresh you have on your mouse !) framerate at which everyone around you moves is an imprecise illusion.

No, I don't currently need to do any additional research into client side interpolation (extrapolation really), but thanks anyway. I do believe we have different definitions of aiming here. With aiming I mean the act of moving the crosshair (cursor, or whatever) to the exact intended spot as I can see it on my screen without overshooting, having to readjust etc. I thought my second paragraph would be enough to show this. This is something that is clearly improved by an increased refresh rate, albeit usually by a small amount for any modern hardware.

This to me is the only kind of aim that a change at the mouse level can improve and seems to be what Legion talks about in his second post. You bringing up packet rates etc. after that and claiming what he wrote to be irrelevant is what brought me to post, as it isn't irrelevant. Surely you can agree that an improvement in aiming as I described it above matters and isn't "impossible to take advantage of". If you can't hit what you see in your client, then what does it matter what the state is on the server? The bogus tech claim comment is from you bringing up all that other stuff in order to dismiss the improvement that does exist, however small it might be. Which brings us to the next part that Legion already commented a bit on.

shihonage wrote:

When you fire, the server checks against the exact coordinates of your enemy, not against the non-exact, interpolated results you see on your screen.

Except if you do the hit detection client side. This is a whole other can of worms with regard to trusting the client etc. and is another discussion. Personally, I'm pretty happy with whatever magic Valve has concocted since I play cross-Atlantic.

If you want me to not overanalyze things like this in the future then I can recommend going easy on absolute terms like impossible and irrelevant when there is at least some epsilon worth of technical merit.

It strikes me that the 500hz mouse refresh could very well matter for fine positioning before you press the fire button.

nossid wrote:
shihonage wrote:

The purpose of this tweak is to change the subjective experience, namely, to give one an advantage in an online FPS, as the OP implied. My point is, it doesn't, and it stands, making statements like the above entirely irrelevant.

You started off this thread of discussion by claiming something to be impossible and didn't address what I saw to be the core of the issue. I found it reasonable to clearly establish what the technicalities of the frequencies in question were.

It seems what you see as core of the issue has nothing at all to do with the issue. You're nitpicking on a microscopic level which borders on rephrasing what I already said differently, all to establish some fictional superiority of precision over my facts while the end result remains precisely the same.

No, I don't currently need to do any additional research into client side interpolation (extrapolation really), but thanks anyway. I do believe we have different definitions of aiming here. With aiming I mean the act of moving the crosshair (cursor, or whatever) to the exact intended spot as I can see it on my screen without overshooting, having to readjust etc. I thought my second paragraph would be enough to show this. This is something that is clearly improved by an increased refresh rate, albeit usually by a small amount for any modern hardware.

As I said in my initial posts, it is clearly improved to a degree. Once you reach something like 125hz, the difference between that and 500hz is negligible to the point of being statistically zilch, should one forego the obvious and go ahead to test it in experimental conditions.

This to me is the only kind of aim that a change at the mouse level can improve and seems to be what Legion talks about in his second post. You bringing up packet rates etc. after that and claiming what he wrote to be irrelevant is what brought me to post, as it isn't irrelevant. Surely you can agree that an improvement in aiming as I described it above matters and isn't "impossible to take advantage of". If you can't hit what you see in your client, then what does it matter what the state is on the server? The bogus tech claim comment is from you bringing up all that other stuff in order to dismiss the improvement that does exist, however small it might be.

Again... negligible. Zero. Negligible in theory, if you allow for the possibility of a cyborg playing the game. ZERO in practice, when you consider that it is played by a human.

doihaveto wrote:

Involuntary muscle response (i.e. a reflex) takes ~50ms.
Voluntary reaction to a non-visual stimulus takes ~100ms.
Voluntary reaction to a visual stimulus takes ~150ms.
Hand movement of 10cm takes ~300ms.

I didn't want to resort to pointing out the obvious, but there it is, which actually fits with the above poster's nickname

Except if you do the hit detection client side. This is a whole other can of worms with regard to trusting the client etc. and is another discussion. Personally, I'm pretty happy with whatever magic Valve has concocted since I play cross-Atlantic.

No game worth playing relies on client-side hit detection. The days of Diablo 1 open battle.net idealism are about a million townkills behind us.

If you want me to not overanalyze things like this in the future then I can recommend going easy on absolute terms like impossible and irrelevant when there is at least some epsilon worth of technical merit.

I do believe that a lot of us technonerds have a mild case of autism/OCD that flares up in proximity to its manifestation in others, which may be reason for this entire exchange. I do actually regret starting this argument at this point and will not pursue it any further.

*Legion* wrote:

No, it's not a performance enhancer (of course, my initial post didn't say anything about gaming at all, a whole heap of unfounded assumptions were run with). It's more of a slight improvement in "feel", a bit smoother of tracking. Because we're talking about small variations on a frame-by-frame basis, it's certainly no "become an aiming god" magic bullet. It merely helps smooth out some subtle jitter. Nothing earth-shattering, but of course, I never said it was.

Well, you're absolutely right there. I assumed this was about gaming precision because I simply cannot fathom anyone other than a hardcore FPS person tweaking out their USB driver in order to get a gameplay advantage.

Sir shihonage, master troll, arbitrator, arrogator. I felt the need to register on this forum to comment on this months-old thread to make sure no one is persuaded by your irrational rhetoric. You are either wrong, irrelevant, or trivial with every contention you've brought forth.

You claim the frame rate of online games is 5Hz to 15-20Hz. I don't know about modern games, but Quakeworld, which is 12 years old, has a network frame rate of 77Hz (unless the video fps drops below 77), i.e. a period of approximately 12.98ms. And since my latency to a particular server is less than this, I can play online with an invariable ~12.98ms ping, at which point it makes no difference whether I enable client-side prediction.

I'll skip over what Legion et al. have explained articulately, and mention that the performance of the mouse itself is increased, in some cases drastically so, by a higher USB poll rate. Take for instance the Logitech MX500, which has noticeable negative acceleration at 125Hz that is drastically rectified by an increased polling rate. That makes it comparable, in terms of linear response vs. speed (absence of a negative-acceleration plateau, or total malfunction of tracking), with the currently best-tracking mouse (afaik), the Razer DeathAdder (though the DeathAdder is able to perform at 1800dpi resolution compared to 400dpi).
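One plausible mechanism for that negative acceleration, assuming the mouse sends the common 8-bit signed delta per axis per report: each report can only carry ±127 counts, so the poll rate puts a hard ceiling on how fast you can move before the deltas clip. A back-of-envelope sketch at 400dpi:

def max_speed_m_per_s(poll_hz, dpi, max_counts_per_report=127):
    # Hand speed above which per-report deltas saturate (8-bit signed HID reports assumed).
    counts_per_metre = dpi / 0.0254
    return max_counts_per_report * poll_hz / counts_per_metre

for hz in (125, 500):
    print("%d Hz @ 400 dpi: deltas clip above ~%.1f m/s" % (hz, max_speed_m_per_s(hz, 400)))

That works out to roughly 1 m/s at 125Hz versus 4 m/s at 500Hz, and 1 m/s is well within reach of a fast low-sensitivity flick.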

I reiterate you are wrong/irrelevant/trivial with your contentions and you shall not have the last word.

Didn't Quake have separate modes for network versus modem play?

Malor wrote:

Didn't Quake have separate modes for network versus modem play?

Quakeworld didn't have "modem" play. It had Internet play.

However, all the multiplayer games of the time had to accommodate the bandwidth and latency limitations forced by the predominant use of modems to connect to the Internet.

Quake was played a lot on LANs, and I suspect it had a mode for that, which could easily have become the default in later patches.

Sir foe, master of necromancy. I have trouble pulling up exact information on Quakeworld, but these are John Carmack's own words on the games whose network code was done shortly BEFORE and AFTER Quakeworld.

John Carmack wrote:

Quake 1 had all entities in the world strictly interpolated (with a 20 hz default clock), but had several aspects of the game hard coded on the client, like the view bobbing, damage flashes, and status bar.

QuakeWorld was my experimentation with adding a lot of specialized logic to improve network play. An advantage I had was that the gameplay was all done, so I didn't mind adding quite hardcoded things to improve nailguns and shotguns, among other things. The largest change was adding client side movement prediction, which basically threw out the notion of the general purpose client.

Quake 2 was intended to be more general and flexible than Q1/QW, with almost everything completely configurable by the server. At the time of q2test, it was (with a fixed 10 hz clock).

I'll leave it to you to figure out the likelihood of sending 77 world updates per second (back and forth, i.e. x2) within the 2000-byte limit that Carmack imposed on himself to accommodate the widespread modem use of the time.
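Back-of-envelope, using the numbers in this thread (77 updates per second against the 2000-byte figure quoted above) versus a 28.8k modem of the era:

updates_per_second = 77
bytes_per_update = 2000                 # the limit quoted above
modem_bytes_per_second = 28800 / 8      # 28.8 kbps modem, ~3.6 KB/s

needed = updates_per_second * bytes_per_update   # one direction only
print("%d bytes/s needed vs ~%d bytes/s available" % (needed, modem_bytes_per_second))
# -> 154000 bytes/s needed vs ~3600 bytes/s available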

One of today's latest and greatest, Quake Wars: Enemy Territory, sends world updates every 99ms on ranked servers, but the player-to-server update is supposedly faster ("up" to 33ms?).

As for the mouse refresh numbers, I believe you missed my point entirely. Hint: it has something to do with being human.

It is simply nonsense to claim that humans can't discern a difference between 125 and 500 (or even 250) Hz.

Set your desktop to black (so the cursor stands out nice and sharp) and make circles and figure-8s with your mouse @ 125Hz. Now patch to 500Hz and do it again. If you don't detect a glaringly obvious difference, make an appointment with an optometrist, ASAP.

Oh, and thanks for this tip. I just got around to installing SP3, and the first thing I noticed after the reboot was that I was back to the choppy 125Hz mouse rate. If I hadn't found this I would have rolled back to SP2. I don't know how anyone can stand 125Hz even just for everyday computing, let alone gaming.

In really competitive play, this stuff is obvious.

The mouse and the monitor are decoupled, so 500Hz polling means the cursor position on screen will more closely track the mouse's position on the pad. Whatever the mouse position is, at 500Hz the sampled value is always going to be closer to the truth than it is at 125Hz.

At the instants that the other, asynchronous events happen in the system (a frame is drawn in the game engine, or a frame is drawn on the monitor), those subsystems will have a more accurate picture of where the mouse was.

Now, in the case of an LCD with 40ms lag, it's just not going to matter. But if you've got a fast LCD or a CRT, and if your game runs at a fairly high tick rate (Source at 100 ticks, say), the faster your reflexes are, the more visible the difference will be. I used to play Cstrike pretty competitively, and you needed amazing accuracy and speed to be great. You had to be able to aim at something INSTANTLY and PERFECTLY every single time, and to do that, you needed a great mouse.

I can't reach that level anymore, having aged too much, but if these younger guys say they need 500hz mouse refresh, as long as they're on a low-lag LCD or CRT, I believe them. The display of the aim does not matter, only the actual aim itself. You aim and fire, and your bullets are already flying before the screen even updates. That's how quick these guys are. I know, because I was _almost_ that fast myself.

The game is constantly playing catchup to the operator, and with a high-resolution mouse, it does a more accurate job. The bullets go closer to where the operator really meant them to go. It's the position of the mouse on the desk that's the actual important bit, so measuring that with the best possible accuracy means the rest of the system will more closely correspond to that physical reality.

gphil wrote:

Set your desktop to black (so the cursor stands out nice and sharp) and make circles and figure-8s with your mouse @ 125Hz. Now patch to 500Hz and do it again. If you don't detect a glaringly obvious difference, make an appointment with an optometrist, ASAP.

EDIT: I just realized that this claim may not have been technically true. Modern top-of-the-line LCD displays may allow one to see the difference between 125 and 500hz.

However, that is also irrelevant, because my argument wasn't that humans can't discern the difference, but that this difference is entirely negligible in online games, given a combination of their packet refresh rates, the average rendered framerate of such games (usually not even approaching 125Hz, if half of that), and the timing of the human perceive-process-react cycle.

Malor, I was a competitive FPS gamer and clan co-founder myself. I played these games since Doom came out, and up to and including now.

I spent many months playing Unreal Tournament 2003/2004 INSTAGIB in particular, and I was quite good at it. Instagib is the perfect showcase for these claims, if there ever existed one.

All that considered, I have to say, with all due respect, that your assessment sounds more like the movie "Wanted" than a combination of human and computer reality. The combination of human reaction time, a 10-20fps world update rate (that's without latency being in the equation), mandatory client-side movement interpolation (generating a variably imprecise position of objects on the screen depending on the latency of the moment), and the natural imprecision of muscular movement really puts this difference completely within the margin of error.

The claims of this being beneficial are closer to snake oil than anything else, IMO, and they could be disproved by means of test groups and experiments, seeing if their SCORES would change outside the margin of error when their mouse polling rate is jacked up to 500Hz. Unfortunately such an experiment has not been done, as far as I know, so we'll just have to rely on technical facts and common sense.

During INSTAGIB play it actually matters much more what your mouse SENSITIVITY is set to, and whether it corresponds to the level you're in. If it is a giant field, your sensitivity must be set really low for accurate aim at those tiny people on the other end of the field (this has to do with muscle control more than anything - it's easier to be accurate during larger, sweeping movements). If it is a close-corridor level, where people spin past and around you, it must be set relatively high so you can spin quickly and deeply, and there precision matters less because they're closer to you and thus bigger. That's an example of a factor that actually makes a notable, observable difference.
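To make the sensitivity point concrete, the usual way to compare settings is physical mouse travel per full turn. A sketch for a Quake-derived engine, where each mouse count turns you sensitivity x m_yaw degrees and m_yaw defaults to 0.022 (UT uses its own scale, so treat the numbers as illustrative only):

def cm_per_360(dpi, sensitivity, m_yaw=0.022):
    # Physical mouse travel (cm) needed for a full 360-degree turn.
    counts_per_cm = dpi / 2.54
    degrees_per_count = sensitivity * m_yaw
    return 360.0 / (counts_per_cm * degrees_per_count)

print("%.1f cm per 360 at 400 dpi, sens 2.0" % cm_per_360(400, 2.0))   # big open maps
print("%.1f cm per 360 at 400 dpi, sens 8.0" % cm_per_360(400, 8.0))   # tight corridors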

Psychology and positioning also matter a whole lot more than some low-percentage increase in a potential something that likely gets canceled out by several other factors on the way to the server. Let's not mistake the precision of a mouse for its refresh rate. Most laser mice are more than sufficiently precise.