
HDMI vs DVI?

Discussion in 'Monitors' started by Jack Bauer, 24 May 2006.

  1. Jack Bauer

    Wise Guy

    Joined: 6 Nov 2005

    Posts: 1,673

    Location: Oxford

    Bear with me, I might not explain this very well :p

    I've been looking at HD LCD TVs. They come with an HDMI input and a DVI.

    However, if I plug my cpu into the DVI port the TV has a max res of e.g. 1300-ish.
    If I use the HDMI I can do 1080i, which has a res of 1920x1080.

    Am I understanding this right? Is this the only difference of using a DVI input?
     
  2. Fr0dders

    Caporegime

    Joined: 18 Oct 2002

    Posts: 33,300

    Location: West Yorks

    What are you doing putting your processor anywhere near your TV?
     
  3. Jack Bauer

    Wise Guy

    Joined: 6 Nov 2005

    Posts: 1,673

    Location: Oxford

    Apart from the fact that I'm speaking hypothetically, I'm looking at buying an LCD TV which will be fine near my CPU
     
  4. McDaniel

    Capodecina

    Joined: 15 Aug 2003

    Posts: 19,872

    Location: Southend-on-Sea

    I think he was trying to point out that the CPU is on the motherboard, and wouldn't plug into any sort of DVI - I think you mean your graphics card.
     
  5. Jack Bauer

    Wise Guy

    Joined: 6 Nov 2005

    Posts: 1,673

    Location: Oxford

    Really, I knew I was doing something wrong :eek:
    Sorry, by CPU I was referring to the computer, bad habit.

    Can anyone confirm the above for me or put me right if I'm not understanding it
     
  6. utherpendragon

    Hitman

    Joined: 4 May 2006

    Posts: 508

    DVI inputs often have a bandwidth limit; by lowering the refresh rate you should be able to go slightly higher. To go any higher than that you will need a dual-link DVI port on your graphics card. HDMI doesn't have this problem as its bandwidth is specced very high.
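    [Editor's note: a rough back-of-envelope sketch of the bandwidth point above. The link limits (165 MHz single-link, ~330 MHz dual-link) are from the DVI spec; the ~20% blanking overhead is my own round-number assumption, real timings vary per mode.]

```python
# Single-link DVI tops out at a 165 MHz pixel clock; dual-link roughly doubles
# that. Required pixel clock ~ pixels per frame (plus blanking) x refresh rate.
SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0
BLANKING_OVERHEAD = 1.20  # assumed ~20% extra for horizontal/vertical blanking

def required_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock needed for a given mode, in MHz."""
    return width * height * BLANKING_OVERHEAD * refresh_hz / 1e6

for mode in [(1280, 720, 60), (1920, 1080, 60), (2560, 1600, 60)]:
    clock = required_pixel_clock_mhz(*mode)
    link = "single-link OK" if clock <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {clock:.0f} MHz -> {link}")
```

    With these rough numbers even 1080 modes at 60 Hz squeeze inside a single link; it's the really big desktop resolutions that force dual-link.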
     
  7. Jack Bauer

    Wise Guy

    Joined: 6 Nov 2005

    Posts: 1,673

    Location: Oxford

    My card's an X1800 XT so I have both 2x DVI and an HDMI. It's more the TV that's my problem. Some say they are HDTVs but don't have an HDMI input. Can I still do 1080i (or p, whichever it is) if it only has a DVI input?
    Also, they state a max res of 1330 (something like that), however they say they can do 1080i, but to my understanding this is 1920 res. I'm just a little confused
     
  8. utherpendragon

    Hitman

    Joined: 4 May 2006

    Posts: 508

    What is the TV's native res? There's a difference between native resolution and what it can accept. I very much doubt you have a 1920x1080 panel, so the input will be scaled down.
     
  9. Jack Bauer

    Wise Guy

    Joined: 6 Nov 2005

    Posts: 1,673

    Location: Oxford

    I don't have an LCD TV at all, I'm trying to understand what I'm buying before I get it. If you look up the spec of HDTV it has a 1920x1080 res. Therefore any TV rated as 1080i can do a res of 1920x1080. My question is: can I do 1080i through a DVI input or do I have to use an HDMI input?
     
  10. Baddass

    Don

    Joined: 12 Jan 2003

    Posts: 20,029

    Location: UK

    HD signals are typically either 720p or 1080i. 720p refers to a 'progressive' image of 1280 x 720 pixels - that is, 720 horizontal lines drawn in a single pass - so you need a screen with a native resolution of at least 1280 x 720 to show 720p HD content. 1080i involves two 'interlaced' halves of the image at a resolution of 1920 x 1080 (hence 1080i = 1080 lines).

    The native resolution of the LCD/plasma TV is still 1366 x 768 or whatever it might be. However, it can accept a 1080i HD input, but will scale this down internally to a resolution it can handle. This does give a better picture, but a 1366 x 768 panel can really only truly operate at 720p HD level, as it only has 768 lines, not the full 1080. You would need a higher-resolution TV to display a true 1080i image, and obviously you'd need a 1080i broadcast instead of a 720p broadcast. A lot of Sky's HD is going to be 720p for instance, and any more would be pretty much wasted over here in the UK anyway with the TV support we have.


    So in answer to your question, you can still send a 1080i HD input over both DVI and HDMI; however, because the native resolution of the TV is still 1366 x 768, it will get scaled down anyway.
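    [Editor's note: a small sketch of the scaling described above. The 1366 x 768 panel and the input modes are the figures from the post; the uniform-scale maths is my own simplified illustration of what a TV's scaler does.]

```python
# A panel only has its native grid of physical pixels, so any other input
# resolution gets resampled to fit. Here: fit each input inside 1366x768
# while preserving aspect ratio.
NATIVE = (1366, 768)  # typical "HD ready" panel
INPUTS = {"720p": (1280, 720), "1080i": (1920, 1080)}

for name, (w, h) in INPUTS.items():
    scale = min(NATIVE[0] / w, NATIVE[1] / h)
    shown = (round(w * scale), round(h * scale))
    note = "upscaled" if scale > 1 else "downscaled"
    print(f"{name}: {w}x{h} -> {shown[0]}x{shown[1]} ({note}, factor {scale:.3f})")
```

    So a 1080i feed reaches the panel at roughly 0.71x its broadcast resolution - better source material than 720p, but the extra detail beyond 768 lines is thrown away by the scaler.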
     
  11. THC_SsSsSnake

    Wise Guy

    Joined: 7 Jan 2003

    Posts: 1,469

    Location: Leicester

    Do dedicated TFT monitors generally give better image quality than HD LCD TVs when both are used for PC gaming?
     
  12. Jack Bauer

    Wise Guy

    Joined: 6 Nov 2005

    Posts: 1,673

    Location: Oxford

    cheers baddass
     
  13. MichaelHo

    Wise Guy

    Joined: 19 Jan 2006

    Posts: 1,237

    Location: London, UK

    Jack B, I have recently bought an HDTV LCD monitor and I would advise you to get an HDTV with 1:1 VGA mapping, so you can just link the PC via VGA, which will give excellent quality anyway as it's 1:1 mapping with the screen. Therefore you can save the HDMI input for something else e.g. Xbox 360, Sky HD etc. My LCD screen is running at a PC res of 1360 x 768.
     
  14. Jack Bauer

    Wise Guy

    Joined: 6 Nov 2005

    Posts: 1,673

    Location: Oxford

    Not too sure exactly what 1:1 VGA mapping is, but I will Google it and have a read when I get home. Will this give image quality as good as 1080i? Really this is my only issue, as I intend everything to run through my HTPC (this is already built, just need the TV to go with it :/ ) so inputs to the TV aren't too important to me at the expense of image quality.
     
  15. MichaelHo

    Wise Guy

    Joined: 19 Jan 2006

    Posts: 1,237

    Location: London, UK

    I think it depends on what resolution your TV will go up to; bigger TVs will have a bigger res, I think.
     
  16. Fr0dders

    Caporegime

    Joined: 18 Oct 2002

    Posts: 33,300

    Location: West Yorks

    1:1 pixel mapping refers to the fact that most PCs will drive a resolution of 1360 x 768, but most LCD TVs have a native resolution of 1366 x 768. That 6-pixel difference can cause a problem. Your TV may force you to stretch the image over the extra pixels, meaning that 1 pixel from your signal takes up slightly more than 1 pixel on screen, blurring the image.

    This is very much the same effect you get if you run your monitor outside of its "native resolution".

    With 1:1 pixel mapping the extra 6 pixels just remain black, ensuring that 1 pixel from your monitor signal is 1 pixel on screen.
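    [Editor's note: putting a number on the stretch described above, using the 1360 vs 1366 figures from the post; the per-side split of the spare pixels is my own guess at how a set would centre the image.]

```python
# Stretching a 1360-wide signal across 1366 panel pixels means each source
# pixel covers slightly more than one physical pixel, so sharp edges land
# between pixels and get interpolated (blurred). With 1:1 mapping the six
# spare columns simply stay black.
SIGNAL_W, PANEL_W = 1360, 1366

stretch = PANEL_W / SIGNAL_W   # panel pixels consumed per source pixel
spare = PANEL_W - SIGNAL_W     # columns left black under 1:1 mapping

print(f"stretch factor: {stretch:.4f}")
print(f"1:1 mapping: {spare} panel pixels unused ({spare // 2} each side)")
```

    The stretch factor works out at about 1.0044, so the signal falls out of alignment with the pixel grid roughly every 227 columns - small, but enough to soften fine text.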
     
  17. THC_SsSsSnake

    Wise Guy

    Joined: 7 Jan 2003

    Posts: 1,469

    Location: Leicester

    Mr Lol, very interesting - where in Leicester?