HDMI vs DVI?

Associate
Joined
6 Nov 2005
Posts
1,675
Location
Oxford
Bear with me, I might not explain this very well :p

I've been looking at HD LCD TVs. They come with an HDMI input and a DVI input.

However, if I plug my cpu into the DVI port, the TV has a max res of around 1300.
If I use the HDMI input I can do 1080i, which has a res of 1920x1080.

Am I understanding this right? Is this the only difference when using a DVI input?
 
Soldato
Joined
15 Aug 2003
Posts
19,916
Location
Essex
I think he was trying to point out that the CPU is on the motherboard and wouldn't plug into any sort of DVI - I think you mean your graphics card.
 
Associate
OP
Joined
6 Nov 2005
Posts
1,675
Location
Oxford
Really, I knew I was doing something wrong :eek:
Sorry, by CPU I was referring to my computer, bad habit.

Can anyone confirm the above for me, or put me right if I'm not understanding it?
 
Associate
Joined
4 May 2006
Posts
499
DVI inputs often have a bandwidth limit; by lowering the refresh rate you can push the resolution slightly higher. To go higher still you will need a dual-link DVI port on your graphics card. HDMI doesn't have this problem as its bandwidth is specced very high.
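
To put rough numbers on that bandwidth limit, here's a quick back-of-envelope sketch in Python. Single-link DVI tops out at a 165 MHz pixel clock; the blanking overhead factors below are my approximations of classic GTF and reduced-blanking (CVT-RB) timings, not exact figures:

Code:
# Approximate pixel clock needed for a video mode, in MHz.
# overhead accounts for the blanking intervals around the visible image.
def pixel_clock_mhz(width, height, refresh_hz, overhead):
    return width * height * refresh_hz * overhead / 1e6

SINGLE_LINK_DVI_MHZ = 165.0  # single-link TMDS pixel clock limit
GTF = 1.39     # rough overhead of classic GTF blanking (assumption)
CVT_RB = 1.11  # rough overhead of CVT reduced blanking (assumption)

for w, h, hz, ov, label in [
    (1920, 1080, 60, GTF,    "1920x1080@60 (GTF)"),
    (1920, 1080, 60, CVT_RB, "1920x1080@60 (reduced blanking)"),
    (1920, 1080, 50, GTF,    "1920x1080@50 (GTF)"),
]:
    clk = pixel_clock_mhz(w, h, hz, ov)
    verdict = "fits" if clk <= SINGLE_LINK_DVI_MHZ else "exceeds"
    print(f"{label}: ~{clk:.0f} MHz -> {verdict} single-link DVI")

With full GTF blanking, 1920x1080@60 needs roughly 173 MHz and exceeds the single-link limit, while dropping to 50 Hz or using reduced blanking brings it back under 165 MHz - which is why lowering the refresh rate helps, and why dual-link exists.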
 
Associate
OP
Joined
6 Nov 2005
Posts
1,675
Location
Oxford
My card's an X1800 XT, so I have both 2x DVI and an HDMI. It's more the TV that's my problem. Some say they are HDTVs but don't have an HDMI input. Can I still do 1080i (or p, whichever it is) if it only has a DVI input?
Also, they state a max res of 1330 (something like that), yet they say they can do 1080i, which to my understanding is a 1920 res. I'm just a little confused.
 
Associate
OP
Joined
6 Nov 2005
Posts
1,675
Location
Oxford
I don't have an LCD TV at all; I'm trying to understand what I'm buying before I get it. If you look up the spec of HDTV it has a 1920x1080 res, therefore any TV rated as 1080i can do a res of 1920x1080. My question is: can I do 1080i through a DVI input, or do I have to use an HDMI input?
 
Man of Honour
Joined
12 Jan 2003
Posts
20,567
Location
UK
HD signals are typically either 720p or 1080i. 720p refers to a 'progressive' image which is 1280 pixels across on 720 lines, so you need a screen with a native resolution of at least 1280 x 720 to show 720p HD content. 1080i involves two 'interlaced' halves of the image at a resolution of 1920 x 1080 (hence 1080i = 1080 lines).

The native resolution of the LCD/plasma TV is still 1366 x 768 or whatever it might be. It can accept a 1080i HD input, but it will scale this down internally to a resolution it can handle. That still gives a good picture, but a 1366 x 768 panel can really only truly operate at 720p HD level, as it has 768 lines rather than the full 1080. You would need a higher-resolution TV to display a true 1080i image, and obviously you'd need a 1080i broadcast instead of a 720p one. A lot of Sky's HD is going to be 720p for instance, and any more would be pretty much wasted over here in the UK anyway with the TV support we have.


So in answer to your question: you can send a 1080i HD input over both DVI and HDMI; however, because the native resolution of the TV is still 1366 x 768, it will get scaled down anyway.
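
To make the scaling concrete, here's a small Python sketch (it assumes the TV deinterlaces 1080i to a full 1920x1080 frame before scaling, which is a simplification of what real scaler chips do):

Code:
# How a 1366x768 panel handles 1080i and 720p sources.
SRC_1080_W, SRC_1080_H = 1920, 1080  # deinterlaced 1080i frame
SRC_720_W, SRC_720_H = 1280, 720     # 720p frame
PANEL_W, PANEL_H = 1366, 768         # native panel resolution

# 1080i must be scaled DOWN: each panel pixel blends several source pixels.
dx, dy = SRC_1080_W / PANEL_W, SRC_1080_H / PANEL_H
print(f"1080i: each panel pixel covers ~{dx:.2f} x {dy:.2f} source pixels")

# 720p is scaled UP slightly to fill the panel.
ux, uy = PANEL_W / SRC_720_W, PANEL_H / SRC_720_H
print(f"720p: image stretched by ~{ux:.2f} x {uy:.2f}")

The 1080i frame ends up with roughly 1.41 source pixels per panel pixel in each direction, so fine detail gets averaged away - you'd need a true 1920x1080 panel to see all of it.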
 
Associate
Joined
19 Jan 2006
Posts
1,252
Location
London, UK
Jack B, I have recently bought an HDTV LCD monitor and I would advise you to get an HDTV with 1:1 VGA mapping, so you can just link your PC via VGA, which will give excellent quality anyway as it's 1:1 mapping with the screen. That way you can save the HDMI input for something else, e.g. Xbox 360, Sky HD etc. My LCD screen is running at a PC res of 1360 x 768.
 
Associate
OP
Joined
6 Nov 2005
Posts
1,675
Location
Oxford
Not too sure exactly what 1:1 VGA mapping is, but I will Google it and have a read when I get home. Will this give image quality as good as 1080i? Really this is my only issue, as I intend everything to run through my HTPC (this is already built, just need the TV to go with it :/ ), so inputs to the TV aren't too important to me at the expense of image quality.
 
Caporegime
Joined
18 Oct 2002
Posts
33,396
Location
West Yorks
1:1 pixel mapping refers to the fact that most PCs will drive a resolution of 1360 x 768, but most LCD TVs have a native resolution of 1366 x 768. That 6-pixel difference can cause a problem: your TV may force you to stretch the image over the extra pixels, meaning that 1 pixel from your signal takes up roughly 1.004 pixels on screen, blurring the image.

This is very much the same effect you get if you run your monitor outside of its "native resolution".

With 1:1 pixel mapping, the extra 6 pixels just remain black, ensuring that 1 pixel from your signal is 1 pixel on screen.
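
A tiny Python sketch of that arithmetic (it assumes the TV centres the image in 1:1 mode, which is an assumption - some sets may align it differently):

Code:
SIGNAL_W, PANEL_W = 1360, 1366  # PC output width vs native panel width

# Stretch mode: every source pixel is resampled across slightly more
# than one panel pixel, so nothing lands exactly on a pixel boundary.
stretch = PANEL_W / SIGNAL_W
print(f"stretch mode: each source pixel covers {stretch:.4f} panel pixels")

# 1:1 mode: the image is left unscaled and the leftover columns stay black
# (assuming the set centres the picture).
left = (PANEL_W - SIGNAL_W) // 2
right = PANEL_W - SIGNAL_W - left
print(f"1:1 mode: {left}px black border left, {right}px right")

The stretch factor works out at about 1.0044, i.e. roughly every 227 columns the scaler has to invent an extra one, which is what softens the image; in 1:1 mode you just get a 3-pixel black border on each side instead.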
 