DVI cable vs DVI Adapter - Signal issue

Discussion in 'Newbie Lounge' started by macros951, Sep 1, 2005.

  1. macros951

    macros951 Member

    Joined:
    Mar 7, 2002
    Messages:
    204
    Location:
    sydney
    Just got an R9600XT (DVI) and I'm connecting it to a Samsung 730MW LCD (which also has DVI).

    Currently using a DVI adapter (connected to the DVI port on the video card) through the existing VGA cable to the VGA port on the LCD.

    Would it be better to just get a DVI cable and connect DVI -> DVI?
    Would this provide better image quality?
    When viewing display properties it's showing as analog. If I change to DVI -> DVI, would this become digital?

    Did a search but still wanted to clarify.

    Thanks guys & gals
     
    Last edited: Sep 2, 2005
  2. annandin

    annandin Member

    Joined:
    Jul 11, 2004
    Messages:
    912
    Location:
    Hobart
    DVI -> DVI would give a better picture, one would assume.
     
  3. SLATYE

    SLATYE SLATYE, not SLAYTE

    Joined:
    Nov 11, 2002
    Messages:
    26,849
    Location:
    Canberra
    (1) Yes
    (2) Yes
    (3) Yes

    I wish all questions were that simple.

    In more detail:

    (1) DVI-DVI means the signal is digital all the way from the GPU to the monitor, so there's no quality loss. DVI-VGA means it gets converted to analogue and back again, losing information in the process (lower quality). The analogue signal is also subject to interference in the cable (digital tends not to be, unless the interference is massive). There's a rough sketch of this below.

    (2) As above.

    (3) DVI-D (what LCDs use) is straight digital, so unless something really weird is going on it'll show up as digital.
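
    As a very rough way to picture (1) - a toy sketch with made-up numbers, not a model of any real hardware:

    Code:
    # Toy sketch only: a digital pixel value becomes a voltage, picks up a
    # little cable noise, and is re-quantised by the LCD's ADC - it may not
    # come back identical to what the GPU sent.
    import random

    pixel = 200                                   # 8-bit value the GPU wants to show
    volts = pixel / 255 * 0.7                     # DAC output, 0-0.7 V analogue level
    volts += random.gauss(0, 0.005)               # interference picked up in the cable
    recovered = min(255, max(0, round(volts / 0.7 * 255)))  # LCD's ADC re-quantises
    print(pixel, recovered)                       # often close, but not guaranteed equal

    # Over DVI-D the 8-bit value itself is sent, so (barring severe interference)
    # the LCD receives exactly the number the GPU sent.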
     
  4. OP
    OP
    macros951

    macros951 Member

    Joined:
    Mar 7, 2002
    Messages:
    204
    Location:
    sydney
    Thanks for the fast responses. :thumbup:

    I had a feeling this would be the case but just wanted to confirm.
     
  5. OP
    OP
    macros951

    macros951 Member

    Joined:
    Mar 7, 2002
    Messages:
    204
    Location:
    sydney
    A small issue has arisen which I'm hoping I can get some additional assistance with.

    It turns out my LCD has DVI-D and the video card is DVI-I. I've purchased a DVI-D cable (DVI-I has the extra pins which won't connect to the LCD) but when I hook it up I'm not getting any signal on my monitor. It just says check cable or something like that.

    Is this a compatibility issue or something else? Any ideas?

    Thanks again
    ----
    I've just found that compatibility won't be an issue, but now I need to find out why it won't work. Ideas would still be appreciated.
     
    Last edited: Sep 2, 2005
  6. SLATYE

    SLATYE SLATYE, not SLAYTE

    Joined:
    Nov 11, 2002
    Messages:
    26,849
    Location:
    Canberra
    All DVI-based LCDs are DVI-D; you just plug them into the DVI-I port anyway (some pins won't be connected - they're the ones used by DVI-A).

    That's an interesting problem. Is that the only monitor connected to the video card? If it is, you may find that it takes a while to actually come out of standby mode (on my computer, monitors on the VGA port start instantly when I turn the computer on. The LCD on the DVI port waits about 5 seconds before waking up).
     
  7. lagmaster

    lagmaster Member

    Joined:
    Oct 8, 2004
    Messages:
    3,068
    A question: what kind of quality loss is there? I mean, I have a VGA CRT, and there's no real fuzz or anything that I can tell, so what does the quality loss actually look like?
     
  8. SLATYE

    SLATYE SLATYE, not SLAYTE

    Joined:
    Nov 11, 2002
    Messages:
    26,849
    Location:
    Canberra
    In reality the loss is very small - and even smaller for CRTs (it hurts LCDs more because they have to convert the analogue signal back to digital. CRTs don't).

    The really big advantage for LCDs using DVI is that you never have to adjust where the picture is - it always fits perfectly in the middle of the screen and goes right to the edges. That rarely happens for CRTs, and rarely happens for LCDs on VGA (although they're getting better. The auto-adjust normally does a good job).
     
  9. lagmaster

    lagmaster Member

    Joined:
    Oct 8, 2004
    Messages:
    3,068
    Oh OK, thanks. My dad has an HP LCD on VGA and when his computer boots the image is off the screen to the left a bit. Does this mean that DVI doesn't use timing formulas etc.?
     
  10. SLATYE

    SLATYE SLATYE, not SLAYTE

    Joined:
    Nov 11, 2002
    Messages:
    26,849
    Location:
    Canberra
    With DVI, each pixel is individually defined. The video card just says "first pixel is ***. Second pixel is ***. Third pixel is ***" and so on, until the whole screen is filled. Then it starts again. Because the LCD has clearly defined pixels, it can map that across directly.

    On CRTs, the 'picture' is just one big waveform. The monitor really has to guess where it starts and ends. That tends to result in the picture not being perfectly centered (you can adjust it - essentially changing where the electronics look for the start of the wave).

    I'm not sure how to explain that properly, so that'll have to do for now.
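
    To make that a bit more concrete, a toy sketch (made-up numbers, nothing to do with real monitor electronics): the monitor has to guess where each analogue scanline starts, and a small timing error shifts every pixel sideways.

    Code:
    # Toy sketch only: why an analogue (VGA) picture can sit off-centre.
    line = [10, 20, 30, 40, 50, 60, 70, 80]   # pixel values the GPU meant to send

    offset = 1                                 # monitor samples one position too late
    recovered = line[offset:] + [0] * offset   # picture ends up shifted to the left
    print(line)                                # [10, 20, 30, 40, 50, 60, 70, 80]
    print(recovered)                           # [20, 30, 40, 50, 60, 70, 80, 0]

    # Over DVI the values in 'line' are transmitted digitally, pixel by pixel,
    # so the LCD maps them straight onto its own pixels - no guessing, no shift.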
     
  11. lagmaster

    lagmaster Member

    Joined:
    Oct 8, 2004
    Messages:
    3,068
    That's OK, it makes perfect sense! I think my mum is saying there's a possibility I'll get an LCD for Christmas. Hopefully by then I'll have a DVI video card. Then again the LCD may be D-SUB only, so I guess it's OK. (I couldn't live with anything below 1280x1024 though... :( )
     
  12. SLATYE

    SLATYE SLATYE, not SLAYTE

    Joined:
    Nov 11, 2002
    Messages:
    26,849
    Location:
    Canberra
    I think even cheap LCDs are increasingly heading towards being DVI-based. By the time you get one, there's a fair chance that it'll be DVI (which is great for use with a DVI video card, especially since it frees up the D-sub port for a second monitor).

    Having 1280x1024 won't be an issue. Only 15" LCDs use a lower resolution, and they're becoming extremely rare.
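
    For what it's worth, 1280x1024 is also well within what a single DVI link can carry - rough arithmetic using the standard VESA timing:

    Code:
    # Rough check: 1280x1024 @ 60 Hz over single-link DVI.
    # The standard VESA timing pads the visible 1280x1024 out to 1688x1066
    # per frame with blanking, which is where the usual 108 MHz figure comes from.
    total_width, total_height, refresh = 1688, 1066, 60
    pixel_clock_mhz = total_width * total_height * refresh / 1e6
    print(pixel_clock_mhz)   # ~108 MHz, well under single-link DVI's 165 MHz limit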
     
  13. MrHanky

    MrHanky Member

    Joined:
    Jun 13, 2002
    Messages:
    819
    Location:
    Melbourne, Aus
    The 'adapter' you purchased isn't really an adapter at all; all it does is pass the DVI-A signal directly to the correct pins on the 15-pin D-SUB (VGA) connector...

    DVI-D = Digital signal
    DVI-A = Analogue (same as VGA) signal, included in some extra pins on the DVI connector
    DVI-I = A connector which has both the DVI-D pins & the DVI-A pins, meaning you can hook up the 'adapter' (to get VGA straight out of the adapter's VGA plug...) or a DVI-D cable to connect straight to a digital device (LCD, projector etc.)

    DVI-D is the superior method of LCD connectivity in every regard:

    When you use your 'adapter', you're converting the PC's digital image data to analogue, transmitting it across a lossy cable, and then re-converting it back to digital to be displayed on the LCD matrix.

    DVI-D eliminates the analogue phase altogether. If your LCD is not recognising the DVI signal it *could* be faulty? Just a thought... mine worked straight away :confused:
     
  14. FLB

    FLB Member

    Joined:
    Jan 30, 2003
    Messages:
    3,580
    Location:
    Adelaide
    This might sound like a silly question but... does your LCD monitor have a switch on the back to select D-Sub or DVI? Or perhaps through the OSD menu from the front panel? I know with mine I need to select the correct input source.
     
  15. Remote Man

    Remote Man Member

    Joined:
    Jul 23, 2002
    Messages:
    1,035
    Location:
    Melbourne.vic.au
    Just while we're talking about DVI, I'll ask this question:
    I heard that DVI causes lag in the transmission, but I can't see how unless it's getting converted on the video card or something.
    So does it cause lag? And if so, how?
     
