What is the point of high DPI mice?

Special K

Diamond Member
Jun 18, 2000
7,098
0
76
I have an MX518 mouse that offers 3 different DPI settings. Although I can see the value of being able to adjust the DPI in the middle of a game, I don't understand why mouse manufacturers are creating mice with extremely high DPI settings. What is the benefit of this when I can adjust the mouse sensitivity within the game and/or Windows?

Right now, my MX518 is set to its middle DPI value, and I have the windows and in-game sensitivities set at about the 50% point. If I were to use the MX518's highest DPI setting, the mouse would become uncontrollable for me. I can't even imagine what it would be like to use one of those 4000+ DPI mice.

So what is the point of these extremely high DPI mice? Are there people out there who are already using a 1600 DPI mouse with the windows sensitivity and the in-game sensitivity set to the maximum value, and still find the cursor movement too slow? That seems almost unbelievable to me, even though sensitivity is a very individual setting.
 

videogames101

Diamond Member
Aug 24, 2005
6,783
27
91
OS cursor sensitivity is just that, sensitivity. But DPI provides a higher resolution of movement, more precision.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Because changing sensitivity on the software side of things can add software interpolation to the equation, which can cause tracking errors. Pixels are digital: if the software calculation that multiplies movement by the sensitivity (to increase or decrease pointer speed) comes to a result with decimals, it has to round it, since you cannot move a cursor a fraction of a pixel.

Also, screen resolution can make a big difference. Players on 1920x1200 or 2560x1600 screens, playing games at those resolutions, can require higher DPI from their mice.

A lot of it is really just a marketing gimmick, though.
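To sketch the rounding problem (made-up numbers, and `apply_sensitivity` is purely illustrative, not how any actual driver works):

```python
# Sketch: a software sensitivity multiplier scales each per-poll movement
# report, but the cursor can only move whole pixels, so fractions get rounded.

def apply_sensitivity(counts, multiplier):
    """Scale each movement report and round to whole pixels."""
    return [round(c * multiplier) for c in counts]

# Ten polls reporting 1 count each = 10 counts of physical movement.
reports = [1] * 10

moved = sum(apply_sensitivity(reports, 1.5))  # each 1.5 rounds to 2 pixels
ideal = sum(reports) * 1.5
print(moved, ideal)  # 20 vs 15.0 -- the rounded path overshoots the ideal one
```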
 

AmberClad

Diamond Member
Jul 23, 2005
4,914
0
0
I don't know about the MX518, but some gaming mice allow you to change sensitivity on the fly using a built-in mechanism, without requiring you to open up the Windows mouse settings.
 

zagood

Diamond Member
Mar 28, 2005
4,102
0
71
Originally posted by: videogames101
OS cursor sensitivity is just that, sensitivity. But DPI provides a higher resolution of movement, more precision.

:thumbsup:
 

Special K

Diamond Member
Jun 18, 2000
7,098
0
76
Originally posted by: videogames101
OS cursor sensitivity is just that, sensitivity. But DPI provides a higher resolution of movement, more precision.

I guess I don't quite understand the distinction. If I increase the DPI of my mouse, it seems to behave the same as if I had just increased the sensitivity. So what is the difference? If I increase the sensitivity in windows or the game, is it just making the mouse move faster by skipping pixels or what?

Also, what effect does the mouse/USB poll rate have?
 

imported_Scoop

Senior member
Dec 10, 2007
773
0
0
It's about precision, not sensitivity. When I got my Copperhead I switched between the different DPI and poll rates until it felt good, and settled on 1600 DPI with 500 Hz. After that it was about getting used to it. A higher DPI/poll rate isn't necessarily better; for example, I don't use 2000 DPI and 1000 Hz.
 

videogames101

Diamond Member
Aug 24, 2005
6,783
27
91
Originally posted by: Special K
Originally posted by: videogames101
OS cursor sensitivity is just that, sensitivity. But DPI provides a higher resolution of movement, more precision.

I guess I don't quite understand the distinction. If I increase the DPI of my mouse, it seems to behave the same as if I had just increased the sensitivity. So what is the difference? If I increase the sensitivity in windows or the game, is it just making the mouse move faster by skipping pixels or what?

Also, what effect does the mouse/USB poll rate have?

OK, say your mouse reports 500 dpi. You move it diagonally and, for the sake of example, it goes past 500 dots, or 1 inch (as in dpi). Now, your PC sees you've moved 500 dots; if the sensitivity is set high, it'll amplify that movement by, say, 2x. Effectively, you've doubled each dot's movement across the screen, creating the effect of moving a mouse with double the dpi: that 1 inch acts like 1000 dots of unamplified movement. Now, say the 500 dots * 2x results in your cursor moving 5 inches across the screen: your 500 dots are mapped across 5 inches.

Now with a higher-dpi mouse, you don't need to amplify movement. You move a 1000 dpi mouse 1 inch and your computer doesn't amplify it by 2x, but you still have 1000 dots. Those 1000 dots, just like with the other mouse, get your pointer 5 inches across your screen. Now you have 1000 dots mapped across those 5 inches.

Meaning, with the 500 dpi mouse your movement is 1/2 as precise as your movement with the 1000 dpi mouse, assuming you're increasing the sensitivity on the lower-dpi mouse. The same 5 inches of screen are mapped to 500 dots and 1000 dots respectively; even though the same amount of movement on each mouse will get you the same distance, you'll have half the number of possible points between where you are and where you're going. That's 1/2 the precision.

I rambled, but I hope I got the concept across.
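The mapping can be put in a few lines (same example numbers as above; `cursor_stops_per_inch` is just an illustration):

```python
# 500 dpi mouse with 2x software sensitivity vs. a 1000 dpi mouse:
# both cross the same 1000 screen pixels per inch of hand movement, but the
# low-dpi mouse moves in 2-pixel jumps, so it can stop on fewer points.

def cursor_stops_per_inch(dpi, multiplier):
    pixels_covered = dpi * multiplier  # screen pixels crossed per inch
    step = multiplier                  # pixels jumped per reported dot
    return pixels_covered // step      # distinct reachable positions (= dpi)

low  = cursor_stops_per_inch(500, 2)   # 500 reachable positions
high = cursor_stops_per_inch(1000, 1)  # 1000 reachable positions
print(low, high)  # same distance covered, but half the precision at 500 dpi
```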
 

Howard

Lifer
Oct 14, 1999
47,982
10
81
Originally posted by: videogames101
Originally posted by: Special K
Originally posted by: videogames101
OS cursor sensitivity is just that, sensitivity. But DPI provides a higher resolution of movement, more precision.

I guess I don't quite understand the distinction. If I increase the DPI of my mouse, it seems to behave the same as if I had just increased the sensitivity. So what is the difference? If I increase the sensitivity in windows or the game, is it just making the mouse move faster by skipping pixels or what?

Also, what effect does the mouse/USB poll rate have?

OK, say your mouse reports 500 dpi. You move it diagonally and, for the sake of example, it goes past 500 dots, or 1 inch (as in dpi). Now, your PC sees you've moved 500 dots; if the sensitivity is set high, it'll amplify that movement by, say, 2x. Effectively, you've doubled each dot's movement across the screen, creating the effect of moving a mouse with double the dpi: that 1 inch acts like 1000 dots of unamplified movement. Now, say the 500 dots * 2x results in your cursor moving 5 inches across the screen: your 500 dots are mapped across 5 inches.

Now with a higher-dpi mouse, you don't need to amplify movement. You move a 1000 dpi mouse 1 inch and your computer doesn't amplify it by 2x, but you still have 1000 dots. Those 1000 dots, just like with the other mouse, get your pointer 5 inches across your screen. Now you have 1000 dots mapped across those 5 inches.

Meaning, with the 500 dpi mouse your movement is 1/2 as precise as your movement with the 1000 dpi mouse, assuming you're increasing the sensitivity on the lower-dpi mouse. The same 5 inches of screen are mapped to 500 dots and 1000 dots respectively; even though the same amount of movement on each mouse will get you the same distance, you'll have half the number of possible points between where you are and where you're going. That's 1/2 the precision.

I rambled, but I hope I got the concept across.
What he's trying to say is that you skip the dots in between if you artificially increase the sensitivity.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
The setting in Windows only multiplies what it gets from the mouse; it doesn't change the data coming in from the mouse. If I play noisy music into an amplifier, it doesn't make the output sound better, just louder.

USB polling rate is totally separate from DPI. It is basically how often the PC checks for new information from the mouse.

So a polling rate of 500 means the PC checks the mouse 500 times a second for new data. How high that value can go depends on the mouse's chipset; the typical rate is 125 times a second. If it's set too high, you get junk data from the mouse since it can't keep up. It's like me calling you on your cell every 2 seconds: you can only answer so fast :)
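For reference, here's the interval each rate works out to (just the arithmetic):

```python
# Polling rate (reports per second) -> time between reports in milliseconds.
for hz in (125, 250, 500, 1000):
    print(f"{hz:4d} Hz -> one report every {1000 / hz:g} ms")
# 125 Hz (every 8 ms) is the usual default; 1000 Hz is one report per millisecond.
```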


 

Howard

Lifer
Oct 14, 1999
47,982
10
81
Originally posted by: Modelworks
The setting in Windows only multiplies what it gets from the mouse; it doesn't change the data coming in from the mouse. If I play noisy music into an amplifier, it doesn't make the output sound better, just louder.

USB polling rate is totally separate from DPI. It is basically how often the PC checks for new information from the mouse.

So a polling rate of 500 means the PC checks the mouse 500 times a second for new data. How high that value can go depends on the mouse's chipset; the typical rate is 125 times a second. If it's set too high, you get junk data from the mouse since it can't keep up. It's like me calling you on your cell every 2 seconds: you can only answer so fast :)
Which mice suffer from this problem? I haven't heard of any issues with mice themselves, even at 1000 Hz.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Howard
Which mice suffer from this problem? I haven't heard of any issues with mice themselves, even at 1000 Hz.

It really is the luck of the draw. Most higher-end mice are not going to have a problem. Manufacturers know that MS only expects them to poll at 125 Hz, so if they can save cash by using a cheaper processor in their device, they will.

The processor inside the mouse has to be able to handle the routine of:

Ask mouse for data
Mouse sends data

And the cycle repeats, 1000 times a second.

The max any device can poll is 1 kHz; anything more is outside the USB specification.

The main issue is that changing the polling rate does not mean it will check only the mouse at 1 kHz, but every device connected to the USB ports on the PC.

I don't recommend setting it to the max. I think 500 is a safer setting and is less likely to cause problems with other devices.

Increasing it to the max increases the chances of adding noise to the data stream, which would result in just the opposite of what raising the rate is trying to accomplish. Any corrupted data is discarded and requested again, which takes away from the max 1 kHz rate.

I tell people to start slow. Change it to 250 or so, make sure everything on USB is working correctly, then increment up.




 

Special K

Diamond Member
Jun 18, 2000
7,098
0
76
Originally posted by: videogames101
Originally posted by: Special K
Originally posted by: videogames101
OS cursor sensitivity is just that, sensitivity. But DPI provides a higher resolution of movement, more precision.

I guess I don't quite understand the distinction. If I increase the DPI of my mouse, it seems to behave the same as if I had just increased the sensitivity. So what is the difference? If I increase the sensitivity in windows or the game, is it just making the mouse move faster by skipping pixels or what?

Also, what effect does the mouse/USB poll rate have?

OK, say your mouse reports 500 dpi. You move it diagonally and, for the sake of example, it goes past 500 dots, or 1 inch (as in dpi). Now, your PC sees you've moved 500 dots; if the sensitivity is set high, it'll amplify that movement by, say, 2x. Effectively, you've doubled each dot's movement across the screen, creating the effect of moving a mouse with double the dpi: that 1 inch acts like 1000 dots of unamplified movement. Now, say the 500 dots * 2x results in your cursor moving 5 inches across the screen: your 500 dots are mapped across 5 inches.

Now with a higher-dpi mouse, you don't need to amplify movement. You move a 1000 dpi mouse 1 inch and your computer doesn't amplify it by 2x, but you still have 1000 dots. Those 1000 dots, just like with the other mouse, get your pointer 5 inches across your screen. Now you have 1000 dots mapped across those 5 inches.

Meaning, with the 500 dpi mouse your movement is 1/2 as precise as your movement with the 1000 dpi mouse, assuming you're increasing the sensitivity on the lower-dpi mouse. The same 5 inches of screen are mapped to 500 dots and 1000 dots respectively; even though the same amount of movement on each mouse will get you the same distance, you'll have half the number of possible points between where you are and where you're going. That's 1/2 the precision.

I rambled, but I hope I got the concept across.

Is a "dot" just a unit the mouse uses to convert the surface it detects with its laser into something quantifiable? What is the relationship between a dot and a pixel on the screen?
 

videogames101

Diamond Member
Aug 24, 2005
6,783
27
91
Originally posted by: Special K
Originally posted by: videogames101
Originally posted by: Special K
Originally posted by: videogames101
OS cursor sensitivity is just that, sensitivity. But DPI provides a higher resolution of movement, more precision.

I guess I don't quite understand the distinction. If I increase the DPI of my mouse, it seems to behave the same as if I had just increased the sensitivity. So what is the difference? If I increase the sensitivity in windows or the game, is it just making the mouse move faster by skipping pixels or what?

Also, what effect does the mouse/USB poll rate have?

OK, say your mouse reports 500 dpi. You move it diagonally and, for the sake of example, it goes past 500 dots, or 1 inch (as in dpi). Now, your PC sees you've moved 500 dots; if the sensitivity is set high, it'll amplify that movement by, say, 2x. Effectively, you've doubled each dot's movement across the screen, creating the effect of moving a mouse with double the dpi: that 1 inch acts like 1000 dots of unamplified movement. Now, say the 500 dots * 2x results in your cursor moving 5 inches across the screen: your 500 dots are mapped across 5 inches.

Now with a higher-dpi mouse, you don't need to amplify movement. You move a 1000 dpi mouse 1 inch and your computer doesn't amplify it by 2x, but you still have 1000 dots. Those 1000 dots, just like with the other mouse, get your pointer 5 inches across your screen. Now you have 1000 dots mapped across those 5 inches.

Meaning, with the 500 dpi mouse your movement is 1/2 as precise as your movement with the 1000 dpi mouse, assuming you're increasing the sensitivity on the lower-dpi mouse. The same 5 inches of screen are mapped to 500 dots and 1000 dots respectively; even though the same amount of movement on each mouse will get you the same distance, you'll have half the number of possible points between where you are and where you're going. That's 1/2 the precision.

I rambled, but I hope I got the concept across.

Is a "dot" just a unit the mouse uses to convert the surface it detects with its laser into something quantifiable? What is the relationship between a dot and a pixel on the screen?

Yeah, I'm not sure about the relationship or what the size of a dot is, but it's called "dots per inch," so I assume it's just the quantified mouse movement.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Special K


Is a "dot" just a unit the mouse uses to convert the surface it detects with its laser into something quantifiable? What is the relationship between a dot and a pixel on the screen?


It gets a bit technical, so bear with me :)

DPI really doesn't tell the whole story. You also need to know the sensor sampling size of the mouse. That's not easy to find out, since manufacturers want you to see the high DPI number in the ads, not the low pixel count of the sensor image.

It factors into how an optical mouse works. Inside, the mouse is taking pictures of the surface very fast. Computers can't do dots; they do pixels. The image the mouse takes is usually about 15x15 pixels. If I wanted to print that image at 1000 dpi across one inch, I would need to copy it 66 times. So at 1000 dpi the mouse divides each inch into 66 parts, or 1 inch / 66 = .015 inches.

So you would have to move the mouse .015 inches for it to detect any movement.

At a lower DPI the mouse becomes less precise, since 300 dpi would mean 300/15 = 20, and 1 inch / 20 = .05 inches. The mouse will not notice any movement smaller than .05 inches.

When the mouse sends data to the PC, it doesn't send anything more than "mouse moved x,y".

So at 1000 dpi, or .015 inches per step: you move the mouse 1 inch to the right, and it sends Windows X = +66; move it to the left, and it sends X = -66, since 1 inch / .015 inches = 66.

At 300 dpi, you move the mouse 1 inch to the right and 1 inch / .05 inches = 20, so x = +20 (and moving left, x = -20).

Windows takes the X value and moves the cursor that number of pixels right or left.

Assuming 15 pixels is the sensor size, you can figure out that dpi / 15 = pixels. That will tell you how many pixels are assigned to each inch with the Windows setting at normal, with no multiplier.
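Putting that arithmetic in one place (the 15x15-pixel sensor image is the assumption from above, and these helper names are just for illustration):

```python
# The arithmetic from the post, assuming a 15x15-pixel sensor image.
SENSOR_SIZE = 15  # pixels per side of the sensor's snapshot (assumed)

def steps_per_inch(dpi):
    return dpi // SENSOR_SIZE        # how many sensor-widths divide an inch

def min_movement_inches(dpi):
    return 1 / steps_per_inch(dpi)   # smallest movement the mouse can report

print(steps_per_inch(1000), round(min_movement_inches(1000), 3))  # 66 0.015
print(steps_per_inch(300), min_movement_inches(300))              # 20 0.05
```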





 

octopus41092

Golden Member
Feb 23, 2008
1,840
0
76
Yes, what everyone has said here is pretty much true. I have found that anything over 2000 dpi feels the same; there's really no difference. When I'm gaming I usually just set it to 2000 dpi, even though my mouse is capable of 4000 dpi.

In most games I have my sensitivity low, and in CS:S I have it down to 1. At that point I can't get my sensitivity any lower unless I turn down the DPI, so having DPI that high is really kind of pointless, IMO.