> A better way to perform scaling (it is more capable and gives better ...
Sorry for this delayed message, but I was off for a couple weeks...
I'm now using these calls for image scaling and they do work better.
However, the way QuickTime processes CMYK turned out not to be
acceptable. You called black processing "sophisticated," but having
QuickTime ignore it completely causes a loss of information in the
color conversion. Bolting on ColorSync after the fact would be too
late. I've submitted a bug report to Apple explaining the problem.
A good online source for what I'm talking about is here...
...QT's current algorithm has to cap the conversion value at 255 for
each channel, since it uses a cruder add/subtract calculation where a
multiplication/division is required for accuracy.
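To illustrate what I mean (my own sketch in plain C, not QT's actual
code; the function names are made up):

    #include <stdint.h>

    /* Naive conversion: treat black as an additive offset on each
       CMY channel and clamp the sum at 255. Everything past the cap
       is thrown away -- that's the information loss. */
    static uint8_t NaiveChannelToRGB(uint8_t ink, uint8_t black)
    {
        unsigned sum = (unsigned)ink + (unsigned)black;
        if (sum > 255) sum = 255;   /* the cap */
        return (uint8_t)(255 - sum);
    }

    /* Multiplicative conversion: scale the remaining brightness by
       the black channel instead. No clamping, nothing discarded. */
    static uint8_t AccurateChannelToRGB(uint8_t ink, uint8_t black)
    {
        return (uint8_t)(((255 - ink) * (255 - black)) / 255);
    }

For, say, cyan 200 with black 100, the naive path clamps 300 to 255 and
yields 0, while the multiplicative path yields (55 * 155) / 255 = 33,
so dark regions keep their detail instead of crushing to black.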
As for QT 6.0's use of codecLosslessQuality for bicubic interpolation,
I'm only able to get it to work while using Graphics Importers, using
GraphicsImportSetQuality. I had to read Ice Floe Dispatch #8 before I
could grok what you were referring to as a "sequence"; I've tried using
both FDecompressImage and DecompressSequenceFrameS with bad results.
The former call still does point-sampling, and the latter always
produces garbage if I specify anything other than the identity matrix.
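For comparison, the Graphics Importer path that does work for me looks
roughly like this (a from-memory sketch, error checks omitted;
theFileSpec is a placeholder for wherever the image actually lives):

    GraphicsImportComponent theImporter = NULL;
    Rect theDestBounds;

    GetGraphicsImporterForFile(&theFileSpec, &theImporter);
    GraphicsImportSetGWorld(theImporter, inNewWorld, NULL);
    GraphicsImportSetQuality(theImporter, codecLosslessQuality);
    GetPortBounds(inNewWorld, &theDestBounds);
    GraphicsImportSetBoundsRect(theImporter, &theDestBounds); /* forces the scale */
    GraphicsImportDraw(theImporter);
    CloseComponent(theImporter);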
Here's the code excerpt I'm trying to get to work... The following
takes two GWorlds, inOldWorld and inNewWorld, as source and destination.
The source is either k24RGBPixelFormat, k8IndexedGrayPixelFormat, or
k32ARGBPixelFormat. (The last is after an in-place manual conversion
from CMYK.) The destination is k24RGBPixelFormat. Error checking
removed for clarity. The code fails with QT 6.0.3 on OS 9 and with
QT 6.3 on Windows XP. (My code base is also Carbon-compliant, but I
don't have access to an OS X system to test against just now.)
PixMapHandle theInMap, theOutMap;
ImageDescriptionHandle imageDesc;   /* set up earlier; see sketch below */
MatrixRecord scaleMatrix;           /* likewise */
/* ImageSequence theSequence;
CodecFlags theResults; */

theInMap = GetGWorldPixMap(inOldWorld);
theOutMap = GetGWorldPixMap(inNewWorld);

FDecompressImage(GetPixBaseAddr(theInMap), imageDesc,
    theOutMap, NULL, &scaleMatrix, srcCopy, NULL, NULL, NULL,
    codecLosslessQuality, anyCodec, (**imageDesc).dataSize,
    NULL, NULL);

/* The sequence variant, which always produces garbage:
DecompressSequenceBeginS(&theSequence, imageDesc, NULL, 0,
    (CGrafPtr)inNewWorld, NULL, NULL, &scaleMatrix, srcCopy,
    NULL, 0, codecLosslessQuality, NULL);
DecompressSequenceFrameS(theSequence, GetPixBaseAddr(theInMap),
    (**imageDesc).dataSize, 0, &theResults, NULL);
CDSequenceEnd(theSequence); */
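The setup I elided above is roughly this (imageDesc built from the
source PixMap, scaleMatrix mapping the source bounds onto the
destination bounds; same variables as in the excerpt):

    Rect theSrcRect, theDstRect;

    GetPortBounds(inOldWorld, &theSrcRect);
    GetPortBounds(inNewWorld, &theDstRect);
    RectMatrix(&scaleMatrix, &theSrcRect, &theDstRect);
    MakeImageDescriptionForPixMap(theInMap, &imageDesc);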
According to the QuickTime documentation, the Ice Floe note, and
the info in the "What's New in QT 6" document, I appear to be doing
everything correctly... Any suggestions as to what I could be doing
wrong? I've tried changing anyCodec to bestFidelityCodec, using
codecMaxQuality instead of codecLosslessQuality, using ditherCopy
instead of srcCopy, and explicitly passing the identifier for the
Raw/None codec. None of the variations worked.