
Question about HDMI cable and PS3

Nahmeanz

Senior Member
Joined
Dec 27, 2007
Messages
270
Reaction score
1
Originally Posted by haganah
I ordered a PS3 for someone that should be arriving soon. I went to monoprice to get a cable and there are a million choices. Can someone tell me the differences? And is there an advantage to monoprice over blue jeans?

And next question (sorry teddie for derailing your thread), but what's a good game to get for a girl? She said she liked mortal kombat but is prone to saying stupid things that don't pan out.


http://www.monoprice.com/products/pr...seq=1&format=2

The other ones are more expensive because they have more shielding for in-wall installations.
 

A Y

Distinguished Member
Joined
Mar 12, 2006
Messages
6,084
Reaction score
1,038
Originally Posted by haganah
I ordered a PS3 for someone that should be arriving soon. I went to monoprice to get a cable and there are a million choices. Can someone tell me the differences? And is there an advantage to monoprice over blue jeans?
For short runs (less than 10 feet), get the cheapest one that's rated to 1080p, and is thin and flexible. Life is so much better when you're not trying to route heavy, stiff cable.
And next question (sorry teddie for derailing your thread), but what's a good game to get for a girl? She said she liked mortal kombat but is prone to saying stupid things that don't pan out.
Some of the download games are a lot of fun and are pretty rewarding for a casual gamer, especially the ones with really cute or beautiful design. I like Echochrome and Loco Roco. There are often demo levels you can download so you can try before you buy. --Andre
 

haganah

Distinguished Member
Joined
Nov 24, 2007
Messages
6,325
Reaction score
30
Originally Posted by Nahmeanz
http://www.monoprice.com/products/pr...seq=1&format=2 The other ones are more expensive because they have more shielding for in-wall installations.
haha awesome - is this rated to 1080p like A Y says? It says HDMI 1.3a...
Originally Posted by A Y
For short runs (less than 10 feet), get the cheapest one that's rated to 1080p, and is thin and flexible. Life is so much better when you're not trying to route heavy, stiff cable. Some of the download games are a lot of fun and are pretty rewarding for a casual gamer, especially the ones with really cute or beautiful design. I like Echochrome and Loco Roco. There are often demo levels you can download so you can try before you buy. --Andre
This is becoming a pain. Please tell me that I don't need to buy an internet cable too and that this all feeds off the tv's cable connection. Please.
 

Nahmeanz

Senior Member
Joined
Dec 27, 2007
Messages
270
Reaction score
1
I'm pretty sure all HDMI cables are capable of 1080p. The revisions of the HDMI spec dealt with its ability to handle better sound formats.
 

Tokyo Slim

In Time Out
Timed Out
Joined
Apr 28, 2004
Messages
18,360
Reaction score
16
Originally Posted by A Y
I have not confused them. Digital bandwidth is fundamentally limited by analog bandwidth.
And that's what I'm telling you. That is not the problem here. Read the damn links I posted.
Of course they do. The information carried by HDMI is digital --- the video signal itself is digital. However, that information has to be transmitted on a wire. That is all analog.
*sigh* No, it is ELECTRICAL. Analog means something else, as I've explained.
ana·log (an′ə lôg′, -läg′) adjective
1. of a system of measurement in which a continuously varying value, as sound, temperature, etc., corresponds proportionally to another value, esp. a voltage
2. of or by means of an analog computer
3. of or having to do with transmission of a signal that varies continuously and analogously with the waveform of the voice or other source: analog TVs, telephones, and recordings
4. using hands, dials, etc. to show numerical amounts, as on a clock
None of those things describe a digital signal. Or electrical transmission of a digital signal. Analog and digital are both transmitted electrically, but electricity itself is not analog or digital.
And your second sentence is wrong, because cables can't tell how you've chosen to encode your information. They only see an analog waveform and will degrade that waveform the same way regardless of whether you're trying to transmit digital information or not.
No it's not. If you'd bothered to read anything I posted instead of assuming I was wrong, you'd know better. The cables used for analog video (i.e., component video) are much better at carrying a signal over distance than HDMI cables. It has nothing to do with there being more "bandwidth," because even basic HDMI can send more information than component video. Nevertheless, the protocol and wire architecture are more stable over distance than HDMI's.
I'm not sure we disagree here. If you have a cable that is incapable of 330 MHz of bandwidth, you will drop bits when you use that cable for 1080p. That was my original point.
No, you won't. A 1080p 330 MHz refresh rate television doesn't exist. The fact that this is what HDMI 2 is certified to do has no bearing on your current hardware, which will refresh at a maximum of 120 MHz, and more likely 60 MHz (which is standard 1080p). Increasing the BANDWIDTH of the cable won't do you any good. That was my original point, and I explained why. You keep mixing up what the issue is. You seem to not want to read what I write. Much like with HDMI, sending more data to you is meaningless. Since even HDMI 1 already puts out greater than a 1080p signal (1200p), why would increasing the information output matter? It wouldn't. HDMI 2 puts out over 1600p. It will still fade over distance. Why? Because it's not a bandwidth issue, it's a low-voltage-mixed-with-crappy-architecture issue.
Guess what? Attenuation is a purely analog phenomenon. Attenuation happens because of resistive losses in the cable, low-pass filtering due to the interaction of the cable's capacitance and various resistances (source and termination) in the circuit, and low-pass filtering by the inductance of the cable. All of these effects are analog.
No, attenuation is an electrical phenomenon. ANALOG has a meaning, and you are misusing it. You continue to misuse it. You will probably always continue to misuse it, because nothing I've said makes a difference to you. An analog signal is not what you think it is, and it is definitely not the same as a digital signal.
Attenuation reduces the voltage levels so that the receiving end may not recognize the difference between a 0 and 1.
This is true. But again, increasing the bandwidth will not solve this problem. Increasing the voltage might, or refining TMDS so that the differential is greater, but then you wouldn't be using HDMI anymore. You'd be using something else, and you'd then NEED extra bandwidth, because the low-voltage, minimized differential is what makes HDMI and DVI fast.
HDCP has nothing to do with this.
Says you. Not a lot of other people who've had HDCP handshaking problems with HDMI 1, including bit-rate transfer issues over the HDMI cable. Those will cause some pretty significant artifacting and sparklies.
 

A Y

Distinguished Member
Joined
Mar 12, 2006
Messages
6,084
Reaction score
1,038
Originally Posted by haganah
This is becoming a pain. Please tell me that I don't need to buy an internet cable too and that this all feeds off the tv's cable connection. Please.
You do need the PS3 hooked up to the Internet to get the downloadable games.
Originally Posted by Tokyo Slim
No, it is ELECTRICAL. Analog means something else, as I've explained.
Both use electricity, but at high speeds, you must deal with analog electrical phenomena.
Analog and digital are both transmitted electrically, but electricity itself is not analog or digital.
Yes, I know that. However, both are models of reality that are useful for various situations. Using analog models of electricity when dealing with high speed digital signals is one of them. No one's saying they're the same thing.
The cables used for analog video (I.E. Component video) are much better at carrying signal over distance than HDMI. It has nothing to do with there being more "bandwidth" because even basic HDMI can send more information than component video. Nevertheless, the protocol and wire architecture are more stable over distance than HDMI.
You are comparing apples and oranges. The distance you can carry component video has a completely different set of constraints than HDMI because the signals are different. Because HDMI uses high-speed digital signals, it uses up more analog bandwidth in the cable and transmission systems due to the sharp edges of the digital encoding. There is also no protocol or wire architecture for component.
No you won't. a 1080p 330mhz refresh rate television doesn't exist.
You are confused. I wasn't talking about a 330 MHz refresh rate. 330 MHz is the analog bandwidth of the HDMI cable required to carry 1080p. It has nothing to do with refresh rates since refresh rates are encoded within the digital payload of HDMI. It has everything to do with how much analog bandwidth the cable needs in order to properly transmit a legible waveform so that the receiver can decode the digital signal. You have a fundamental misunderstanding of how signal transmission works. I suggest you go and learn about it, because most of the things you're saying make no sense at all. For example:
This is true. But again increasing the bandwidth will not solve this problem. Increasing the voltage might, or refining TMDS, so that the differential is greater but then, you wouldn't be using HDMI anymore. You'd be using something else, and you'd then NEED extra bandwidth because the low voltage minimized differential is what makes HDMI and DVI fast.
It depends on what causes your attenuation. Attenuation caused by resistive losses (the cable's too thin) is not a bandwidth issue. Attenuation of the high frequencies necessary to recognize signal edges is a bandwidth issue. As I mentioned in my previous post, this is caused by filters formed from the capacitance and inductance of the cable, and the various resistances in the circuit. TMDS does not increase bandwidth or mitigate attenuation. It minimizes noise interference, and seeks to put an average DC level of 0 volts on the wires to minimize power usage. Making a cable with more bandwidth and less loss is a geometry and materials problem.
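The low-pass filtering described here can be sketched with a lumped-element approximation: the source resistance and the cable's shunt capacitance form a first-order RC filter. The component values below are assumed purely for illustration (a real cable calls for transmission-line analysis, which is considerably less pessimistic than this lumped model):

```python
import math

def rc_cutoff_hz(r_ohms, c_farads):
    """First-order RC low-pass cutoff frequency: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Assumed, illustrative values: a 50-ohm source driving a 6 ft cable
# with roughly 15 pF of shunt capacitance per foot.
r_source = 50.0              # ohms
c_cable = 6 * 15e-12         # farads (90 pF total)
cutoff = rc_cutoff_hz(r_source, c_cable)
print(f"cutoff ~ {cutoff / 1e6:.0f} MHz")
```

The point of the sketch is the scaling, not the exact number: doubling the capacitance (a longer cable) halves the cutoff, which is the sense in which attenuation of the high frequencies is a length problem.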
Says you. Not a lot of other people who've had HDCP handshaking problems with HDMI 1, including bit-rate transfer issues over the HDMI cable. Those will cause some pretty significant artifacting and sparklies.
The fact that HDCP has issues wasn't my point. HDCP has nothing to do with the analog bandwidth required in a cable to transmit 1080p. The problems HDCP causes are much higher level and occur in the protocol layer. Sparklies and artifacting are not caused by HDCP. Blank screens are. Sparklies and artifacting happen when you drop bits, which happens when you use an HDMI cable with inadequate analog bandwidth. --Andre
 

Tokyo Slim

In Time Out
Timed Out
Joined
Apr 28, 2004
Messages
18,360
Reaction score
16
Originally Posted by A Y
Both use electricity, but at high speeds, you must deal with analog electrical phenomena.
Why, when the electricity is not sent as an analog signal, as you claimed?
Yes, I know that. However, both are models of reality that are useful for various situations. Using analog models of electricity when dealing with high speed digital signals is one of them. No one's saying they're the same thing.
You did. Several times.
The distance you can carry component video has a completely different set of constraints than HDMI because the signals are different.
Exactly. HDMI is a discrete, low-voltage digital signal. And component is an ANALOG signal. That's what I said. And you can carry component video farther with less loss.
There is also no protocol or wire architecture for component.
YpBR Component is 75ohm coaxial. HDMI is 212ohm (spec1) or 84ohm (spec2) twisted pair. ( ...I think I did that calculation right)
You are confused. I wasn't talking about a 330 MHz refresh rate. 330 MHz is the analog bandwidth of the HDMI cable required to carry 1080p.
I'm not sure where you were getting that. HDMI 1 single-link runs at 165 MHz signal bandwidth (what you've been talking about incessantly), which is enough "bandwidth" for a 1920x1200p60 signal to be transmitted, which is already higher than what your television can display. The 330 MHz I was talking about is the 1920x1200p330 signal (or a 2560x1600p75) that HDMI 2 can transmit, which is way more data than anything today can use. I was referring to the refresh rate as it applies to the datastream or digital "bandwidth" of HDMI. HDMI 2's signal bandwidth is 340 MHz, by the way, not 330.
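The arithmetic behind these clock figures can be checked directly. The 2200 x 1125 total frame size below is the standard CEA-861 1080p60 timing (an assumption about the exact blanking in use), and the ten bits per character come from TMDS's 8b/10b encoding:

```python
# Assumed CEA-861 1080p60 total timing: 2200 x 1125 pixels per frame.
h_total, v_total, fps = 2200, 1125, 60

pixel_clock_hz = h_total * v_total * fps        # TMDS character clock
bits_per_char = 10                              # 8b/10b TMDS encoding
lane_bit_rate = pixel_clock_hz * bits_per_char  # per-lane serial bit rate

print(pixel_clock_hz / 1e6)   # character clock in MHz
print(lane_bit_rate / 1e9)    # serial rate in Gbit/s per data lane
```

At 148.5 MHz, 1080p60 sits comfortably under a 165 MHz single-link character clock, which is consistent with the point that plain HDMI already carries more than 1080p needs.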
TMDS does not increase bandwidth or mitigate attenuation.
Never said it does. You didn't read what I wrote. I know what TMDS is, and I proposed a solution to the problem of long-distance dropouts and data recovery on the manufacturing and planning side of HDMI (of course, it's too late to think about it logically; they've already screwed it up). It was hypothetical. You've never been able to explain how a fixed-voltage on/off protocol with a fixed bitrate that depends first and foremost on your television would benefit from more data bandwidth when the maximum bandwidth already exceeds what contemporary equipment will ask of it. It would benefit from lower resistance, but that isn't what you are talking about. Regardless of what you think you know, there are very serious limitations to HDMI; they just aren't what you think they are. It's still one of the best short-distance solutions for 1080p video and audio transmission, and ANY HDMI 1 rated cable will be able to transport the signal across a wire within reason. If you are having image problems within the standard lengths of HDMI cable, it's probably NOT the cable's fault. Spending more than $15 on a 6-8 foot HDMI cable is pretty much because you want a shiny cable, and not really much to do with performance; there's nothing wrong with that. It's just not necessary, and telling people it is does them a disservice.
 

A Y

Distinguished Member
Joined
Mar 12, 2006
Messages
6,084
Reaction score
1,038
Originally Posted by Tokyo Slim
Why, when a the electricity is not sent as an analog signal as you claimed?

It's sent as electricity. The analog model is useful for explaining why certain things happen to that signal.

You did. Several times.
Never did. Try rereading them.

Exactly. HDMI is a discreet, low voltage digital signal. And component is an ANALOG signal. Thats what I said. And you can carry component video farther with less loss.
You are missing an important but subtle distinction. Component video is encoded as analog. This is what you are talking about. HDMI video is encoded as digital. No one disputes either fact.

However, the speed of the digital signals being used to send HDMI down a cable requires a good engineer to consider the analog properties of the digital signals and the cable. This is because many assumptions one usually makes about digital (sharp, clean edges, for example) are breaking down for the kinds of signalling speeds HDMI uses.

YpBR Component is 75ohm coaxial. HDMI is 212ohm (spec1) or 84ohm (spec2) twisted pair. ( ...I think I did that calculation right)
Whatever. That is hardly any kind of protocol, considering how many different kinds of component signalling there are --- not only in color space but also where the syncs are carried. Even you conflated different color-space schemes by saying "YpBR," which is nonsensical and non-existent.

The 330Mhz I was talking about is the 1920x1200p330 signal (or a 2560x1600p75) that HDMI2 can transmit. Which is way more data than anything today can use. I was referring to the refresh rate as it applies to the datastream or digital "bandwidth" of HDMI. HDMI2's signal bandwidth is 340Mhz, by the way, not 330.
Whether it's 330 or 165 or 340 MHz, it has nothing to do with the TV's refresh rate. It's the analog bandwidth required to transmit the signal.

You've never been able to explain how a fixed voltage on/off protocol with a fixed bitrate that depends first and foremost on your television, would benefit from more data bandwidth when the maximum bandwidth already exceeds what contemporary equipment will ask of it. It would benefit from lower resistance, but that isn't what you are talking about.
I'm not sure where you're getting this, but I have never linked the bitrate of the protocol to a TV's refresh rate. That's a completely absurd thing to do. All I have said is that if your digital signal needs 330 MHz of analog bandwidth, and you use a cable with less bandwidth than that, you will drop bits.

If you use a cable that cannot support the analog bandwidth required to transmit a 1080p HDMI signal, you will drop bits.

Spending more than $15 on a 6-8 foot HDMI cable is pretty much because you want a shiny cable, and not really much to do with performance; there's nothing wrong with that. It's just not necessary, and telling people it is does them a disservice.
I told people to get the cheapest cable that is certified for 1080p. The only constraints I put on the cable were to make sure that it's as thin and flexible as possible so it would be easy to handle.

Look at post 32.

--Andre
 

Tokyo Slim

In Time Out
Timed Out
Joined
Apr 28, 2004
Messages
18,360
Reaction score
16
Originally Posted by A Y
Whatever. That is hardly any kind of protocol, considering how many different kinds of component signalling there is --- not only in color space but also where the syncs are carried. Even you conflated different color spaces schemes by saying "YpBR" which is nonsensical, and non-existent.

Sorry, YpBPR. Considering all the spelling mistakes you make, I'd think you could spot me a letter instead of being childish.
 

Tokyo Slim

In Time Out
Timed Out
Joined
Apr 28, 2004
Messages
18,360
Reaction score
16
Originally Posted by A Y
If you use a cable that cannot support the analog bandwidth required to transmit a 1080p HDMI signal, you will drop bits.
And I've explained to you about a dozen times that that is not the problem. And why it's not the problem. And how it could not possibly be the problem. The analog bandwidth of a wire is not higher at the beginning of the wire than at the end. It has the same bandwidth all the way down the wire. Since the wire already starts out with way more bandwidth than is needed to transmit the signal, why would increasing it make any difference?
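One way to square "the bandwidth is the same all along the wire" with long cables dropping bits: loss is not an on/off property but accumulates with length, and for skin-effect-dominated cables it grows roughly with the square root of frequency. This is a toy model with an assumed constant k, not data from any real cable:

```python
import math

def attenuation_db(length_m, freq_hz, k=2e-4):
    """Toy skin-effect loss model: attenuation in dB grows linearly with
    cable length and with the square root of frequency. k is an assumed,
    cable-dependent constant chosen only for illustration."""
    return k * length_m * math.sqrt(freq_hz)

# The same hypothetical cable at a short and a long run:
for length_m in (2, 10):
    loss = attenuation_db(length_m, 340e6)
    print(f"{length_m} m at 340 MHz: {loss:.1f} dB")
```

A short run of a cheap cable loses little even at the highest frequencies the signal uses, so it works fine; a long run of the same cable attenuates those frequencies enough that edges smear and the receiver starts dropping bits.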
 

Tokyo Slim

In Time Out
Timed Out
Joined
Apr 28, 2004
Messages
18,360
Reaction score
16
Ok... whatever.

I still don't know why you are stuck on this whole MHz thing. It's easy and it makes sense.
I'll say it one more time, as simple as it gets:


HDMI1 = Already transports more information than your television uses, unless you have a 120 MHz refresh TV or a TV with more than 24-bit/16-million-color TrueColor.

HDMI2 = Transports WAY more information than your television uses, period. No matter what television you currently have.
 

A Y

Distinguished Member
Joined
Mar 12, 2006
Messages
6,084
Reaction score
1,038
Originally Posted by Tokyo Slim
Sorry, YpBPR. Considering all the spelling mistakes you make, I'd think you could spot me a letter instead of being childish.

If you want to use technical terms, get them right instead of whining about it when someone calls you on them. It's YPbPr, BTW.

Originally Posted by Tokyo Slim
And I've explained to you about a dozen times that that is not the problem. And why it's not the problem. And how it could not possibly be the problem. The analog bandwidth of a wire is not higher at the beginning of the wire than at the end. It has the same bandwidth all the way down the wire. Since the wire already starts out with way more bandwidth than is needed to transmit the signal, why would increasing it make any difference?

First, cable bandwidths don't change over their length, and I don't even know where you pulled that one from. More importantly, if you have a cable that has more than enough bandwidth to support 1080p, then you have a cable that can support 1080p. What's the problem?

I'm talking about a cable that has inadequate bandwidth. There are plenty of examples of that out there.

Try this: look at the number of digital transitions that you need to run 1080p HDMI. Now figure out the edge speeds necessary to recover those transitions. You now know the analog bandwidth of the cable necessary to transmit a 1080p HDMI signal without dropping bits. Use a cable with less bandwidth than that, and you drop bits. More, and you're fine.
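The "edge speeds" step in this recipe is commonly approximated with the first-order rule of thumb that analog bandwidth is about 0.35 divided by the 10-90% rise time. The 1 ns edge below is an assumed example value, not a figure from the HDMI spec:

```python
def bandwidth_from_rise_time(t_rise_s):
    """Classic first-order rule of thumb: analog BW ~ 0.35 / rise time."""
    return 0.35 / t_rise_s

# Assumed example: a 1 ns 10-90% edge needs roughly 350 MHz of bandwidth.
print(f"{bandwidth_from_rise_time(1e-9) / 1e6:.0f} MHz")
```

Halving the rise time doubles the required bandwidth, which is why faster digital signalling pushes a cable's analog limits even though the payload is "just bits."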

--Andre
 

A Y

Distinguished Member
Joined
Mar 12, 2006
Messages
6,084
Reaction score
1,038
Originally Posted by Tokyo Slim
Ok... whatever.

I still don't know why you are stuck on this whole MHz thing. It's easy and it makes sense.

As simple as it gets:


HDMI1 = Already transports more information than your television uses, unless you have a 120 MHz refresh TV or a TV with more than 24-bit/16-million-color TrueColor.

HDMI2 = Transports WAY more information than your television uses, period. No matter what television you currently have.


Sure, but that's all beside the point. You want to use a cable that is engineered so that it can carry the bandwidth required by the HDMI spec. If you use an inadequate cable, you're going to drop bits.

BTW, no one transmits 120 Hz (not MHz). That is processing done within the TV.

There were plenty of HDMI cables made before 1080p became popular. Some of them, especially in longer lengths, could not support transmission of 1080p signals, mostly because of bandwidth constraints in the cable. All I'm saying is that you should avoid them when you're buying a cable.

--Andre
 

Tokyo Slim

In Time Out
Timed Out
Joined
Apr 28, 2004
Messages
18,360
Reaction score
16
Originally Posted by A Y
First, cable bandwidths don't change over their length, and I don't even know where you pulled that one from. More importantly, if you have a cable that has more than enough bandwidth to support 1080p, then you have a cable that can support 1080p. What's the problem?
******* a. You finally get it! That's what I've been telling you since we started this. HDMI cables have MORE BANDWIDTH than you need to push 1080p. It doesn't matter that they are only "rated to 1080i". If it's less than 8 feet, your 1080i cable can push a 1080p signal JUST FINE. The 1080i HDMI cable standard was engineered to pass 1080p. Always has been. Just like HDMI2 is designed to pass future standards that don't exist yet. It's ****** engineering and wire architecture that is screwing up distance travel. I've seen commercial-grade digital hi-def before, and nobody in their right mind would use anything but coax to run it. You can run the same type of low-voltage digital signal a hundred yards without boosting it. Twisted pair was marsupialed.
 
