I'm trying to generate "crumpled" images from images obtained with a flatbed scanner.
Following the method described in section 3.1 of the paper [Link], I've written the code below to generate the perturbed mesh.
import numpy as np
import cv2
import matplotlib.pyplot as plt
img = cv2.imread('/home/lab/render_docs/scan/4.png')

def mesh_perturb(img):
    mr = img.shape[0]
    mc = img.shape[1]
    # Build a regular mesh with one (x, y) vertex per pixel
    xx = np.arange(mr - 1, -1, -1)
    yy = np.arange(0, mc, 1)
    [Y, X] = np.meshgrid(xx, yy)
    ms = np.transpose(np.asarray([X.flatten('F'), Y.flatten('F')]), (1, 0))
    perturbed_mesh = ms
    nv = np.random.randint(20) - 1
    for k in range(nv):
        # Choose one vertex randomly
        vidx = np.random.randint(np.shape(ms)[0])
        vtex = ms[vidx, :]
        # Vector between all vertices and the selected one
        xv = perturbed_mesh - vtex
        # Random movement
        mv = (np.random.rand(1, 2) - 0.5) * 20
        # Perpendicular distance of every vertex from the movement line
        hxv = np.zeros((np.shape(xv)[0], np.shape(xv)[1] + 1))
        hxv[:, :-1] = xv
        hmv = np.tile(np.append(mv, 0), (np.shape(xv)[0], 1))
        d = np.cross(hxv, hmv)
        d = np.absolute(d[:, 2])
        d = d / (np.linalg.norm(mv, ord=2))
        wt = d
        curve_type = np.random.rand(1)
        if curve_type > 0.3:
            # Folding-style deformation
            alpha = np.random.rand(1) * 50 + 50
            wt = alpha / (wt + alpha)
        else:
            # Curving-style deformation
            alpha = np.random.rand(1) + 1
            wt = 1 - (wt / 100) ** alpha
        msmv = mv * np.expand_dims(wt, axis=1)
        perturbed_mesh = perturbed_mesh + msmv
    return perturbed_mesh
perturbed_mesh = mesh_perturb(img)
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
dst = cv2.remap(img, perturbed_mesh[:, 0], perturbed_mesh[:, 1], cv2.INTER_CUBIC)
cv2.imshow('perturbed_im', dst)
cv2.waitKey(0)
cv2.destroyAllWindows()
When I run the code I get the following error:
Traceback (most recent call last):
File "pert_mesh.py", line 58, in <module>
dst = cv2.remap(img, perturbed_mesh[:, 0], perturbed_mesh[:, 1], cv2.INTER_CUBIC)
cv2.error: OpenCV(3.4.2) /io/opencv/modules/imgproc/src/imgwarp.cpp:1728: error: (-215:Assertion failed) dst.cols < 32767 && dst.rows < 32767 && src.cols < 32767 && src.rows < 32767 in function 'remap'
I understand that the error is because perturbed_mesh is too large, but how do I distort the image at its original size?
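If I understand cv2.remap correctly, the two map arguments need to be float32 arrays with the same 2D shape as the output image, whereas perturbed_mesh[:, 0] here is one flat array of mr*mc values, which is why the size assertion fires. Below is a minimal sketch of how I imagine the mesh would be reshaped back into per-pixel maps (the order='F' reshape and the transpose are my guesses based on how the mesh is flattened above, not something taken from the paper):

# Sketch only: turn the flat (mr*mc, 2) mesh into two float32 maps with
# the same shape as the image before calling cv2.remap.
# The reshape order is an assumption on my part.
rows, cols = img.shape[:2]
map_x = perturbed_mesh[:, 0].reshape(cols, rows, order='F').T.astype(np.float32)
map_y = perturbed_mesh[:, 1].reshape(cols, rows, order='F').T.astype(np.float32)
dst = cv2.remap(img, map_x, map_y, cv2.INTER_CUBIC)

Is something along those lines the right way to go, or is there a better approach?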
This is what the perturbed mesh looks like:
And this is a screenshot from the paper illustrating the synthetic image generation:
Sample source image for testing: https://i.stack.imgur.com/26KN4.jpg