tft turning on and off
I got me a nice 24" SyncMaster 2494HS with digital and analog inputs - the desktop is running 1920x1080 and so is Worms - awesome res xD But... when I use the DVI output, the monitor suddenly turns on and off when a game starts. With an analog signal this does not occur. There is no message like 'out of scan range' or anything like that on the screen, and DVI works properly in Windows. Any ideas what the problem could be? :S This happens in both Win7 and XP, making it hard for me to believe it's some driver issue.
Stranger still, I managed to play for a couple of minutes with no problem once, and then it suddenly started doing this. When I minimize to the desktop it keeps turning on and off, argh...
For now I solved it by connecting both analog and DVI, and making analog my primary display before I load Worms, but I want to get rid of the analog cable - quite understandable, I hope xD
I own a 24" LCD and I can't notice any output difference between the D-sub / DVI-D / HDMI input signals. Just use the analog interface. And doesn't your LCD support a maximum resolution of 1920x1200?
CyberShadow
1 Mar 2009, 10:38
Analog 1920x1200 was pretty blurry when I tried it, even at 60 Hz...
I guess that depends on the Auto Adjust feature of your screen. Also, I've heard that some video cards don't handle this resolution well. In some cases, adjusting the pixel clock and phase settings of your screen may solve the problem.
I own a 24" LCD and I can't notice any output difference between the D-sub / DVI-D / HDMI input signals. Just use the analog interface. And doesn't your LCD support a maximum resolution of 1920x1200?
Yeah, I see no difference either, with DVI or D-sub connected - and 1920x1080 is the max res, it's Full HD. Blurriness seems to be out of the question on this one, lol
I guess that depends on the Auto Adjust feature of your screen. Also, I've heard that some video cards don't handle this resolution well. In some cases, adjusting the pixel clock and phase settings of your screen may solve the problem.
Yeah, it could be my old ATI 9600 Pro being the problem here... no idea... anyway, D-sub solves it.
VGA is blurry compared to DVI; just test it by displaying 1 white pixel on a black background. You'll notice a huge difference - which is to be expected, since there is still a D => A => D conversion going on with VGA.
The result of the test will be:
DVI: exactly 1 pixel of your monitor lights up fully white.
VGA: a greyish, blurred-out dot is displayed.
Oh, and PS: I am talking about digital display devices of course (like LCDs).
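For reference, a minimal sketch of that test in Python/tkinter (the names are my own and it assumes you run at the panel's native resolution, so no scaling blurs the dot):

import tkinter as tk

# Cover the whole screen with black.
root = tk.Tk()
root.attributes("-fullscreen", True)
canvas = tk.Canvas(root, bg="black", highlightthickness=0)
canvas.pack(fill="both", expand=True)

# A 1x1 PhotoImage holding a single white pixel; keep a reference
# so it isn't garbage-collected while the window is up.
pixel = tk.PhotoImage(width=1, height=1)
pixel.put("white", (0, 0))
canvas.create_image(root.winfo_screenwidth() // 2,
                    root.winfo_screenheight() // 2,
                    image=pixel)

root.bind("<Escape>", lambda e: root.destroy())  # Esc quits
root.mainloop()

Over DVI the dot should stay a single sharp pixel; over VGA you can watch it smear, and tweaking the pixel clock / phase in the OSD (as mentioned above) changes how badly.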
Looks like turning on MagicBright in the monitor options helped :p
...besides, I hadn't noticed that the default M$ drivers for the vid card were installed, lol - the problem was solved even before I installed the newest ATI 9.2 drivers, but still... :blush: :ashamed: all running fine now on DVI hehe
VGA is blurry compared to DVI; just test it by displaying 1 white pixel on a black background. You'll notice a huge difference - which is to be expected, since there is still a D => A => D conversion going on with VGA.
The result of the test will be:
DVI: exactly 1 pixel of your monitor lights up fully white.
VGA: a greyish, blurred-out dot is displayed.
Oh, and PS: I am talking about digital display devices of course (like LCDs).
I can't notice any difference on my 26" LCD at 1920x1200, whether I use my video card's VGA output into my LCD's VGA input or the card's DVI-I into HDMI on my LCD.
The only difference I do notice is that on occasion I can see it refresh to a slight degree when it's using VGA.