{ "numMessagesInTopic": 10, "nextInTime": 928, "senderId": "lOV1AOXkFRcN3-ovzxx7tVeeXx_krvCzkRHwlQl_r1D0x6FR-_czwlm0fsintK5R2jbcF77-uPtIyCxvZWMM3u96tUUra0slv6rYMg", "systemMessage": false, "subject": "Re: [IBM_T2X_LCD] Re: New DisplayPort Standard for PCs, Monitors, TV Displays and Projectors", "from": "David Evans <key-yahoo@...>", "authorName": "David Evans", "msgSnippet": "... In general, using base 2 k instead of base 10 k is a computer thing. The main push for using 1000 instead of 1024 has been from the marketing people arena", "msgId": 927, "profile": "makyen1", "topicId": 914, "spamInfo": { "reason": "12", "isSpam": false }, "replyTo": "LIST", "userId": 159820010, "messageBody": "
> --- In IBM_T2X_LCD@yahoogroups.com, David Evans <key-yahoo@z...> wrote:
> > fluppeteer wrote:
> >
> > > --- In IBM_T2X_LCD@yahoogroups.com, "sonar211" <vlado79@r...> wrote:
> > > > The Main Link bandwidth enables data transfer at up to 10.8
> > > > Gbits/second using a total of four lanes.
> > >
> > > Quick bit of maths, 3840x2400x24x48Hz = 10.6 Gbits/second.
> > > Coincidence, or DG6? :-)
> >
> > Ummm... you used 1000 (base 10 "k"), not 1024 (base 2 "k"). Using
> > 1024, 3840x2400x24x48Hz = 9.89 Gb/s.
> > [I use 1024 for the rest of this email.]
>
> Hi David. It's *probably* in decimal multiples; I don't see
> why it should be mapped onto powers of two (in the way that
> memory is, for example). It's not being used (except abstractly)
> to transfer a number of megabytes, for example. I could be wrong
> though.

In general, using base 2 k instead of base 10 k is a computer thing.
The main push for using 1000 instead of 1024 has been from the
marketing people arena
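[The 10.6 vs. 9.89 figure above is purely a choice of divisor. A short Python sketch of the arithmetic, not part of the original mail:]

```python
# Raw pixel bandwidth for the mode under discussion:
# 3840 x 2400 pixels, 24 bits per pixel, 48 Hz refresh.
bits_per_second = 3840 * 2400 * 24 * 48   # 10,616,832,000 bits/s

# Decimal (SI) gigabits, as fluppeteer used:
print(round(bits_per_second / 1000**3, 1))   # 10.6 Gbits/second

# Binary gigabits (1 G = 1024^3), as David used:
print(round(bits_per_second / 1024**3, 2))   # 9.89 Gb/s
```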
> > to compare. It should be noted that this does _NOT_ include any
> > overhead for sync signals. Assuming that the DG5 uses the same
> > percentage of sync overhead as the DG3 uses for its 2x 1920x2400 mode,
> > 33.32%, the actual bandwidth required from DVI to get 3840x2400x24x48Hz
> > *(1+0.3332) is 13.18 Gb/s.
> > [This assumption of same overhead appears to be invalid due to the DG5
> > using 1 dual-link DVI + 1 single-link DVI to get 3840x2400x24x48Hz.]
>
> I'm hoping that they've given up on trying to make it all look
> like a signal for CRTs.

I would hope so. However, the numbers that I used came from IBM's
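[The overhead figure above works out as follows - a sketch in Python, using the same binary-gigabit convention as the quoted mail:]

```python
# Payload bandwidth of 3840x2400x24x48Hz in binary gigabits (1 G = 1024^3):
payload_gb = 3840 * 2400 * 24 * 48 / 1024**3   # ~9.89 Gb/s

# Assumed sync/blanking overhead, borrowed from the DG3's 2x 1920x2400 mode:
overhead = 0.3332

# Total link bandwidth required over DVI including that overhead:
total_gb = payload_gb * (1 + overhead)
print(round(total_gb, 2))   # 13.18 Gb/s
```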
> For a new standard, it would probably be
> better to require that CRTs have an internal frame buffer (from
> which they can mess with the timings to their hearts' content)
> anyway; there are too many TFTs out there to be wasting lumps
> of the bandwidth on a fly-back which never happens.

Agreed, but I think it unlikely that this will be the case. The main
> > Yeah, don't you just _love_ to see "new", "future-proof", "industry
> > standard", "broad application" specifications that can only just
> > barely handle 2 year old technology?
>
> At least four years old, allowing for the original Bertha. :-)

I agree that we can go back more years than I stated. I was using the
> That said, it *can* handle it, unlike dual link DVI, and it
> doesn't seem to require a special variant like the dual link
> versions of DVI and HDMI do. Given nVidia's problems getting
> a decent DVI signal, here's hoping they can do the next one
> right...

Unfortunately, it does appear that it requires a special variant. Based
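[Why dual-link DVI falls short, assuming the usual 165 MHz single-link pixel-clock limit - a sketch in decimal Gb/s, not part of the original mail:]

```python
# Single-link DVI tops out at a 165 MHz pixel clock carrying 24 bits/pixel;
# dual link doubles that. Decimal gigabits throughout.
single_link = 165e6 * 24 / 1e9      # 3.96 Gb/s of pixel data
dual_link = 2 * single_link         # 7.92 Gb/s

# Raw requirement of the 3840x2400x24x48Hz mode, before any blanking overhead:
mode = 3840 * 2400 * 24 * 48 / 1e9  # ~10.62 Gb/s

print(dual_link < mode)             # True: one dual link is not enough
```

Hence the DG5's reported use of one dual-link plus one single-link DVI connection.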
> I'll look forward to seeing the spec; for now I've no idea how
> their plans for increasing the bandwidth limits are implemented
> (or indeed the details of the rest of the spec).

I also look forward to seeing the spec. It will be interesting to see
> Shame this
> turns up just as everyone finally gets over to DVI, but there
> you go. I'm not sure how soon any future-proofing will need to
> happen - higher resolution than a T221 seems unlikely to happen
> for a few years (not that I have internal industry knowledge),
> and it's not clear what going bigger would gain before you start
> to get to holographic levels (which is *much* higher resolution, if
> transmitted raw).

3D displays: Orders of magnitude higher, assuming you are not just
> I guess it could be used for more conventional
> (stereoscopic/lenticular) 3D display, though.

DVI, even analog, can be used for this purpose.
> I realize that people with big Phase One studio cameras (or, just
> about, the latest digital consumer cameras) could, just, make use of
> higher resolution. Mapping the RGB channels better would even it out
> a bit, though, and although I like a 1:1 pixel mapping from my 300D
> even I would argue that the benefits over zooming in are limited once
> you can show most of the image in one go. There's always *two* T221s,
> or a merged projector wall...

Alternate pixel representation methods can save bandwidth, but usually
> It's all much more promising than if they'd not stretched the limits,
> though. I wonder if there's better support for >8 bits per channel?

Unknown, but appears to be yes.
" }