LCD Computer Monitor + Graphic Card Res question.


Lucky #9

Recommended Posts

  • Members

I've been looking to upgrade my CRT computer monitor to a newer 19" LCD screen, probably the wide version. However, I can't get a straight answer on one subject:

 

I've read that you get the best picture when your graphics card is set to the monitor's native resolution. For example, CompUSA had an Acer 19" on special several days ago for $159, with a native resolution of 1440 x 900. This is also the native resolution of a lot of other monitors.
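As a quick aside on why that resolution stands out (this is just arithmetic on the resolutions mentioned in this thread, not anything vendor-specific): reducing a resolution to its simplest ratio shows that 1440 x 900 is a 16:10 widescreen mode, unlike the 5:4 mode common on standard 19" LCDs.

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print(aspect_ratio(1440, 900))   # 8:5, i.e. 16:10 widescreen
print(aspect_ratio(1280, 1024))  # 5:4, the common "standard" 19" LCD mode
```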

 

I checked every graphics card in the place to see if it displayed at that resolution, but none had that particular resolution listed on the box. High-priced ATI cards, for example, listed about 20 different resolutions, but not 1440 x 900.

 

I asked a salesman about this, and he said that all the cards they carried would display at that resolution --- but since I have a great distrust of salespeople in general, based on lame suggestions in previous scenarios, I didn't take his word for it and buy the monitor.

 

I have yet to see one LCD monitor display text with any clarity whatsoever in the stores I've been in. I just got through shopping for a new TV for the folks for Christmas as well, and was also totally unimpressed with the 32-37" LCD HDTVs I saw --- lots of "blurriness," for lack of a better term.

 

So what is the accurate story here?


  • Members

I've currently got a 19" LCD which I run at 1280x1024 (or something like that; I'm not at my computer atm), and I didn't even consider what resolution my graphics card was capable of doing. I just assume they can do most resolutions you throw at them, provided they're not too old.

 

Anyway, it looks fine, and mmmmm, the big resolution is great --- so much more room, especially when I'm running Cubase.


  • Members
Originally posted by Lucky #9

1440 X 900.

I checked every graphic card in the place to see if they displayed at that resolution but none had that particular resolution listed on the box. High-priced ATI cards, for example, had about 20 different resolutions listed, but not 1440 X 900.

I went through the same confusion, but as it turns out my low-end ATI 7000 (64MB) displayed 1440x900 just fine after I hooked the monitor up, and so did my girlfriend's old 34MB Nvidia card when I was redoing her PC. But my really old ATI Rage Fury (32MB) only made it to 1280x1024 tops, though it looks fine anyway. You should be fine with just about any decent modern card these days.


  • Members

When you go to Display Properties/Settings in Windows, the screen resolutions you find there are mostly limited to the modes your display adapter can do with the specific monitor attached, so you might see a max of only 1280x1024. However, when you hook up a monitor capable of other resolutions, those other modes will come up. For example, my GeForce 3 Ti 200 right now runs at 1280x1024 on my CRT monitor, but when I go to the dedicated display adapter tab and uncheck "hide modes this monitor cannot support," I can see that this video card supports up to 2048x1536. Any Nvidia GeForce card (1, 2, 3, 4, etc.) or its ATI counterpart will easily accommodate 1440x900 --- although it might not be exactly that; it might be 1600x900.
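The behavior described above can be sketched roughly like this (a toy model with made-up mode lists, not a real driver API): Windows normally shows only the intersection of what the card can drive and what the attached monitor reports, unless you untick the hide-modes checkbox.

```python
# Hypothetical mode tables: what the card can drive vs. what the monitor reports.
card_modes = [(1024, 768), (1280, 1024), (1440, 900), (1600, 1200), (2048, 1536)]
monitor_modes = [(1024, 768), (1280, 1024), (1440, 900)]  # a hypothetical 19" LCD

def visible_modes(card, monitor, hide_unsupported=True):
    """Mimic the Settings dialog: hide card modes the monitor doesn't report."""
    if hide_unsupported:
        return [mode for mode in card if mode in monitor]
    return list(card)

print(visible_modes(card_modes, monitor_modes))         # only the shared modes
print(visible_modes(card_modes, monitor_modes, False))  # everything the card can do
```

This is why a card's box can omit 1440x900 from its printed list and still drive it: the mode shows up once a monitor that reports it is attached.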

 

So you're probably safe with any current video card.


  • Members

I ran into that problem when the LCD died on my laptop. I decided to work with an external monitor, as it was cheaper. The shop had a nice widescreen monitor, but we couldn't get it to work with my laptop because the video driver wouldn't cover the resolution, so I had to settle for a standard 19" with a standard aspect ratio.


Archived

This topic is now archived and is closed to further replies.
