Type: Bug
Resolution: Unresolved
Priority: P3: Somewhat important
Version: 4.7.1
Here is the testcase:
void KisPrescaledProjectionTest::testQtScaling()
{
    // Create a canvas image
    QImage canvas(6, 6, QImage::Format_ARGB32);
    canvas.fill(0);

    // Image we are going to scale down
    QImage image(7, 7, QImage::Format_ARGB32);
    QPainter imagePainter(&image);
    imagePainter.fillRect(QRect(0, 0, 7, 7), Qt::green);
    imagePainter.end();

    QPainter gc(&canvas);

    // Scale-down transformation
    qreal scale = 3.49 / 7.0;
    gc.setTransform(QTransform::fromScale(scale, scale));

    // Draw a rect scale*(7x7)
    gc.fillRect(QRectF(0, 0, 7, 7), Qt::red);

    // Draw an image scale*(7x7)
    gc.drawImage(QPointF(), image, QRectF(0, 0, 7, 7));
    gc.end();

    canvas.save("canvas.png");

    // Create the expected result
    QImage expectedResult(6, 6, QImage::Format_ARGB32);
    expectedResult.fill(0); // note: the original testcase filled 'canvas' here, presumably a typo
    QPainter expectedPainter(&expectedResult);
    expectedPainter.fillRect(QRect(0, 0, 4, 4), Qt::green);
    expectedPainter.end();

    QCOMPARE(canvas, expectedResult);
}
I would expect both objects, the filled rectangle and the drawn image, to end up with the same size on the canvas, but currently they do not. The image size is rounded to the nearest integer, while the filled rect is rounded up to the next integer (like ceil). That is, in this example the scaled image (green rect) ends up 3x3, while the filled rect (red rect) ends up 4x4.
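For illustration only (this is not part of the testcase above, and qRound/qCeil are used here merely to mimic the two rounding behaviors described), the same 3.49-pixel extent rounds differently under the two rules:

// Hypothetical sketch: 3.49 device pixels round to 3 under round-to-nearest
// (what drawImage appears to do) and to 4 under round-up (what fillRect
// appears to do).
#include <QtGlobal>
#include <QtMath>
#include <QDebug>

int main()
{
    const qreal extent = (3.49 / 7.0) * 7.0;   // 3.49 device pixels, as in the testcase
    qDebug() << "nearest:" << qRound(extent);  // 3 -> size of the drawn image
    qDebug() << "ceil:"    << qCeil(extent);   // 4 -> size of the filled rect
    return 0;
}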
The documentation does not mention this behavior. According to the chapter on QRectF rendering, it looks like both rectangles should have the size 4x4.
The resulting image is attached.