How come when I scroll my mouse to the far right of my screen, it leaves the screen?

Phantom589

Member
Sep 21, 2005
142
0
0
When I scroll far right, the pointer leaves the monitor screen until I scroll back left. The picture on the monitor is fine and everything else is OK. This doesn't happen when I try to scroll past the top, left or right boundaries. It is really annoying, mainly when I try to quit a program, because I always end up off the screen. Help please. The monitor is 1280 x 1024, which is the resolution I have it set to, and everything (games, movies) is working perfectly.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
It has to do that because the active part of the arrow tip is the top-left pixel. If it stopped before going offscreen you couldn't ever click the last few rightmost pixels of the screen.
 

YOyoYOhowsDAjello

Moderator, A/V & Home Theater, Elite Member
Aug 6, 2001
31,205
45
91
"When I scroll far right"

"This doesn't happen when I try to scroll past the top, left or right boundaries"

:confused:

Did you mean bottom not right in the second part?

You probably have dual-monitor mode set up, with your desktop extended to a second desktop that you don't have a display for. Try looking in your video options and disabling multiple monitors, etc.
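The extended-desktop theory is easy to sanity-check with a little arithmetic: if a phantom second display is attached to the right, the virtual desktop is wider than the physical monitor, so the pointer can travel far past the visible edge. A minimal Python sketch with made-up monitor rectangles (illustrative only — not read from any real driver API):

```python
def rightmost_cursor_x(monitors):
    """monitors: list of (left, top, width, height) desktop rectangles.
    The cursor can reach the last column of the combined virtual desktop."""
    return max(left + width for left, top, width, height in monitors) - 1

single = [(0, 0, 1280, 1024)]
extended = [(0, 0, 1280, 1024), (1280, 0, 1280, 1024)]  # phantom display to the right

print(rightmost_cursor_x(single))    # 1279: stops at the visible edge
print(rightmost_cursor_x(extended))  # 2559: pointer keeps going offscreen
```

If the pointer can travel a full screen-width past the right edge, a stale second display is the likely culprit.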
 

Markbnj

Elite Member, Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
Well, the mouse location, as far as Windows is concerned, is figured from a point at the top left corner of the rectangle that contains the pointer. If you try to move it offscreen to the left you'll notice that you can't. Usually if you move it offscreen to the right you will still see the last vertical column of pixels. If you do not, then your monitor may be adjusted so that the image is slightly wider than the viewable area of the screen.
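That hotspot behavior can be modeled in a few lines: Windows clamps the hotspot (the cursor's top-left "active" pixel) to the desktop rectangle, while the rest of the cursor bitmap can hang off the right and bottom edges. A rough Python sketch, assuming a 1280 x 1024 desktop and a 32 x 32 cursor bitmap (typical numbers for illustration, not queried from Windows):

```python
CURSOR_SIZE = 32  # a typical cursor bitmap is 32x32 pixels

def clamp_hotspot(x, y, width=1280, height=1024):
    """Clamp the hotspot (top-left active pixel) to the desktop rectangle."""
    return min(max(x, 0), width - 1), min(max(y, 0), height - 1)

def visible_cursor_columns(hotspot_x, width=1280):
    """How many columns of the cursor bitmap remain on screen."""
    return max(0, min(CURSOR_SIZE, width - hotspot_x))

# Hotspot pushed hard right: it stops at x = 1279 (still clickable),
# but only 1 of the bitmap's 32 columns is drawn on screen.
x, y = clamp_hotspot(5000, 500)
print(x, visible_cursor_columns(x))  # 1279 1
```

So on a single monitor the cursor can only ever *look* like it left through the right edge — the clickable pixel itself never goes past column 1279.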
 

YOyoYOhowsDAjello

Moderator, A/V & Home Theater, Elite Member
Aug 6, 2001
31,205
45
91
Originally posted by: DaveSimmons
It has to do that because the active part of the arrow tip is the top-left pixel. If it stopped before going offscreen you couldn't ever click the last few rightmost pixels of the screen.

Oh, is that what you mean Phantom?
 

Skeeedunt

Platinum Member
Oct 7, 2005
2,777
3
76
Originally posted by: YOyoYOhowsDAjello

...

You probably have dual-monitor mode set up, with your desktop extended to a second desktop that you don't have a display for. Try looking in your video options and disabling multiple monitors, etc.

This happens to me when I undock my laptop. The second monitor is physically gone, but I can still move the mouse pointer completely off the screen to the right, as if the monitor were still there. Sounds like something along those lines.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Originally posted by: YOyoYOhowsDAjello
You probably have dual-monitor mode set up, with your desktop extended to a second desktop that you don't have a display for. Try looking in your video options and disabling multiple monitors, etc.
This is probably the cause if you can scroll way past the right edge. If you can only scroll a little bit past the edge then what I said is the cause.

 

Phantom589

Member
Sep 21, 2005
142
0
0
Well, when I installed my video card (nVidia 6800) I used all the options for plugging it in. I put in the VGA (the blue one), the optional DVI (the white one) and the USB. I have all three connecting my monitor and computer. Could that be it?

The only real problem is that the pointer can leave through the right side. How do I fix it?
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
If you only have one monitor and one computer, you should only attach the VGA or the DVI from the video card, not both (unless it's an Apple 30" monster).

Can you scroll only slightly past the right edge or a long ways?
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Most people say the DVI looks better. Try powering down before switching around, and you may need to press a "source select" button on your LCD to switch to DVI input.
 

crazylegs

Senior member
Sep 30, 2005
779
0
71
Sounds like you've got it set to Dualview... (you can change this: right-click desktop > Properties > Settings > Advanced > ___name of video card > nView Display Settings > nView display mode... change this to single digital display - you should leave the digital input plugged in). That's if you've got an nVidia; not sure about ATI's... Unplug the VGA and just use the DVI - you might need to select DVI input in your monitor's on-screen menu if you get a black screen!!

Hope this helped???

GL
 
Mar 11, 2004
23,444
5,852
146
Yeah, sounds like your video card has a second monitor activated. Try going into your video driver's control panel and disabling the second monitor.

I guess you could actually just disable it in the Windows Display Properties, on the Settings tab. Make sure "Extend my Windows desktop onto this monitor" is unchecked on the second one.
 

Phantom589

Member
Sep 21, 2005
142
0
0
Originally posted by: crazylegs
Sounds like you've got it set to Dualview... (you can change this: right-click desktop > Properties > Settings > Advanced > ___name of video card > nView Display Settings > nView display mode... change this to single digital display - you should leave the digital input plugged in). That's if you've got an nVidia; not sure about ATI's... Unplug the VGA and just use the DVI - you might need to select DVI input in your monitor's on-screen menu if you get a black screen!!

Hope this helped???

GL



Worked like a charm, thanks a lot.
 

Phantom589

Member
Sep 21, 2005
142
0
0
Originally posted by: crazylegs
Sounds like you've got it set to Dualview... (you can change this: right-click desktop > Properties > Settings > Advanced > ___name of video card > nView Display Settings > nView display mode... change this to single digital display - you should leave the digital input plugged in). That's if you've got an nVidia; not sure about ATI's... Unplug the VGA and just use the DVI - you might need to select DVI input in your monitor's on-screen menu if you get a black screen!!

Hope this helped???

GL


I'm having another problem. On my monitor there is a button to choose inputs. When I press it once, it says digital input and the screen is black, as if the computer was turned off; when I press it again, it returns to analog input and displays what it should from the computer (BTW, this is with both the VGA and DVI cords in)... Why is it doing this? Am I supposed to just use analog input?

BTW, this is a Dell UltraSharp 19 in flat panel.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Originally posted by: DaveSimmons
If you only have one monitor and one computer, you should only attach the VGA or the DVI from the video card, not both (unless it's an Apple 30" monster).

Originally posted by: Phantom589
Ok, I'll take one of the connections off. Which one should I keep?

Most people say the DVI looks better. Try powering down before switching around, and you may need to press a "source select" button on your LCD to switch to DVI input.
 

crazylegs

Senior member
Sep 30, 2005
779
0
71
Not sure about that specific monitor... (I have a cheapo Digimate 19in LCD - does the job for me!) In my situation I had both VGA and DVI plugged in for a while, because I was moving the base unit from my house (DVI into the monitor) to my mate's house (only VGA into the monitor) for data transfer :)

Anyway, when it came to moving it back to my house... I had to plug both connections in, then choose digital output from the nVidia settings (same place as last time; on the drop-down tab for monitor choice I could choose analogue or digital) - the screen may go blank now until you use the monitor menu settings to choose DVI input... Make any sense...? Bit long-winded, I know!!! Good luck!

No probs, by the way...
 

Skyhanger

Senior member
Jul 16, 2005
341
0
0
Originally posted by: Phantom589
Originally posted by: crazylegs
Sounds like you've got it set to Dualview... (you can change this: right-click desktop > Properties > Settings > Advanced > ___name of video card > nView Display Settings > nView display mode... change this to single digital display - you should leave the digital input plugged in). That's if you've got an nVidia; not sure about ATI's... Unplug the VGA and just use the DVI - you might need to select DVI input in your monitor's on-screen menu if you get a black screen!!

Hope this helped???

GL


I'm having another problem. On my monitor there is a button to choose inputs. When I press it once, it says digital input and the screen is black, as if the computer was turned off; when I press it again, it returns to analog input and displays what it should from the computer (BTW, this is with both the VGA and DVI cords in)... Why is it doing this? Am I supposed to just use analog input?

BTW, this is a Dell UltraSharp 19 in flat panel.

Just plug in one cord. The monitor can read from DVI (digital) or VGA (analog); it does not need both.
Try putting only one in - you're probably just confusing the bejeezus out of your computer with both.
Try putting in only the DVI (remove the VGA, USB, etc.) and see if your computer autodetects it.