01-25-2018, 06:45 PM
I tried using Python-fu clothify(). It worked the first time and it works great.
First I tried it on a smaller image, 1,000 x 1,000 px (viewed in a 6 x 6 in area on my monitor). Then I tried it on a larger image, 2,000 x 2,000 px (still viewed in the same 6 x 6 in area).
The resulting larger image looks different when viewed at that size, but if I zoom in, the clothify effect looks the same. So what I see is that clothify() is image-size agnostic: the effect is applied at a fixed per-pixel scale rather than relative to the image's dimensions. Is there a way to adjust clothify() so that its effect is spread out over a larger area for a larger image?
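One workaround is to scale Clothify's pixel-based parameters with the image dimensions before calling it, so the weave appears the same size relative to the image at any resolution. Below is a minimal sketch; the parameter names, the default values, and the `pdb.script_fu_clothify` call shown in the comments are assumptions based on the Clothify dialog, so check them against your GIMP version:

```python
# Assumed defaults from the Clothify dialog: blur x/y = 9,
# azimuth = 135, elevation = 45, depth = 3. Verify in your GIMP.
REF_SIZE = 1000        # image size the default blur is tuned for (assumed)
DEFAULT_BLUR = 9.0     # Clothify's default blur in pixels (assumed)

def scaled_blur(width, height, ref=REF_SIZE, base=DEFAULT_BLUR):
    """Return a blur value proportional to the image's larger dimension."""
    return base * max(width, height) / float(ref)

# Inside GIMP's Python-Fu console this might be used like
# (call signature assumed, not verified):
#   blur = scaled_blur(image.width, image.height)
#   pdb.script_fu_clothify(image, image.active_drawable,
#                          blur, blur, 135, 45, 3)

print(scaled_blur(2000, 2000))  # 18.0: twice the default for a 2000 px image
```

For a 1,000 px image this returns the default blur unchanged; for your 2,000 px image it doubles it, which should roughly double the scale of the weave.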