{ "numMessagesInTopic": 23, "nextInTime": 1852, "senderId": "OGTrQqMSAWiDxbaZtyTNGODemq0HyPgaRYZtcW9mijzBmwAT5XzNVcOI9uRzQz6UDrXF4_Dx-OKHZNTcsVagPFNJUL56AZQbzAqFacQ--w", "systemMessage": false, "subject": "Re: T221 + FX 4500 + Linux?", "from": ""fluppeteer" <yahoo@...>", "authorName": "fluppeteer", "msgSnippet": "I m guessing here, but I suspect the problem might be with the hot plug detect rather than with the EDID itself. The hot plug pin is only supposed to go high", "msgId": 1851, "profile": "fluppeteer", "topicId": 1761, "spamInfo": { "reason": "0", "isSpam": false }, "replyTo": "LIST", "userId": 192443393, "messageBody": "
--- In IBM_T2X_LCD@yahoogroups.com, Sandon Van Ness <sandon@...> wrote:
\n>
\n> Try:
\n>
\n> Option "ConnectedMonitor" "DFP-0, DFP-1"
\n> Option "ExactModeTimingsDVI" "true"
\n> Option "ModeValidation" "NoMaxPClkCheck, NoEdidMaxPClkCheck,
\n> Noedidmodes, AllowNon60HzDFPModes, NoMaxSizeCheck, NoHorizSyncCheck,
\n> NoVertRefreshCheck, NoEdidDFPMaxSizeCheck, NoEdidModes"
\n>
\n> This should pretty much completely disable the EDID checks and just
\n> output regardless. Keep in mind that you will need modelines for the
\n> resolutions, though.
\n>
\n> David Egts wrote:
\n> > Hi everyone,
\n> >
\n> > I've made some headway.
\n> >
\n> > After a lot of fiddling, I'm now able to drive the T221 with two
\n> > stripes of 1920x2400 at 25 Hz using single link DVI. In this case I'm
\n> > using two GPUs, but I'm pretty confident that I can do this off two
\n> > channels of one GPU.
\n> >
\n> > The trick was to use the Gefen DVI Detective to clone the EDID of A1
\n> > and then move the DVI Detective to sit in between the first
\n> > channel of the second GPU and A2. Once I did this it worked
\n> > perfectly.
\n> >
\n> > The thing that's a bummer is that it seems like I must use the DVI
\n> > Detective instead of having the NVIDIA GPU simply push the pixels out
\n> > no matter what. I tried setting "UseEdid" and "UseEdidFreqs" to
\n> > false, "ExactModeTimingsDVI" to true, etc., but I've struck out
\n> > figuring out what the magic setting is to simply tell the GPU to push
\n> > out the pixels.
\n> >
\n> > If anyone has any suggestions on how to do this, that would be great!
\n> >
\n> > I also tried this with two ATI FireGL X2s and it worked perfectly on
\n> > the first try with no DVI Detective needed.
\n> >
\n> > Thanks,
\n> >
\n> > Dave
\n>
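\n
\nFor what it's worth, here's a rough xorg.conf sketch of the sort of
\nthing Sandon is describing -- two 1920x2400 stripes at 25 Hz over
\nsingle-link DVI from one NVIDIA GPU via TwinView. I'm guessing at the
\nmodeline timings and the DFP names (check what the driver actually
\ncalls the outputs in your X log), so treat it only as a starting point:
\n
\n  Section "Monitor"
\n      Identifier  "T221"
\n      # ranges wide enough to cover ~62 kHz / 25 Hz per stripe
\n      HorizSync   30-65
\n      VertRefresh 24-60
\n      # roughly 124.6 MHz keeps each stripe under the single-link limit
\n      ModeLine "1920x2400_25" 124.62  1920 1928 1960 2000  2400 2401 2404 2492
\n  EndSection
\n
\n  Section "Device"
\n      Identifier "nvidia0"
\n      Driver     "nvidia"
\n      Option     "ConnectedMonitor"    "DFP-0, DFP-1"
\n      Option     "UseEDID"             "false"
\n      Option     "UseEdidFreqs"        "false"
\n      Option     "ExactModeTimingsDVI" "true"
\n      Option     "ModeValidation"      "NoMaxPClkCheck, NoEdidMaxPClkCheck, NoEdidModes, AllowNon60HzDFPModes, NoMaxSizeCheck, NoHorizSyncCheck, NoVertRefreshCheck, NoEdidDFPMaxSizeCheck"
\n      Option     "TwinView"            "true"
\n      # place the two stripes side by side for the full 3840x2400 desktop
\n      Option     "MetaModes"           "DFP-0: 1920x2400_25 +0+0, DFP-1: 1920x2400_25 +1920+0"
\n  EndSection
\n
\n  Section "Screen"
\n      Identifier   "Screen0"
\n      Device       "nvidia0"
\n      Monitor      "T221"
\n      DefaultDepth 24
\n  EndSection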
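\n
\nOn the DVI Detective: newer NVIDIA driver releases also have a
\n"CustomEDID" option that points the driver at an EDID dump on disk,
\nwhich might do in software what the Detective does in hardware. I'm not
\nsure which driver version introduced it, so this is only a guess for
\nthe build you're running:
\n
\n  # dump the EDID of the channel that does respond (read-edid package)
\n  get-edid > /etc/X11/t221-a1.bin
\n
\n  # then, in the Device section:
\n  Option "CustomEDID" "DFP-0:/etc/X11/t221-a1.bin; DFP-1:/etc/X11/t221-a1.bin"
\n
\nnvidia-settings can also save an EDID to a file from its GUI if
\nget-edid doesn't cooperate.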