
dnn resize_layer assertion 'getMemoryShapes'

asked 2018-11-05 12:46:15 -0600 by aguila

Hello! I was testing a model with the OpenCV dnn module, with the goal of implementing the whole algorithm from https://github.com/wywu/LAB, but I get the following error:

opencv-master/modules/dnn/src/layers/resize_layer.cpp:200: error: (-215:Assertion failed) inputs.size() == 1 in function 'getMemoryShapes'

I can read the network and print its details, but it crashes during forward(). Has anyone encountered something similar?

Thanks and greetz!


Comments

Hi aguila, can you be a bit more specific about what you are doing here? There must be a prebuilt model, code, etc.

berak (2018-11-05 12:59:47 -0600)

Hi @berak, I downloaded the prebuilt model from the repo https://wywu.github.io/projects/LAB/s... and there is a Caffe implementation at https://github.com/wywu/LAB/blob/mast... I based my code on that one and on https://github.com/HandsomeHans/Easy-LAB

I started with something like this:

// load the Caffe model and prepare a single 256x256 grayscale face as input
Net net_lab = readNet("rel.prototxt", "model.bin");
Mat imge = imread("faceImg.jpg", 0);               // read as grayscale

resize(imge, imge, Size(256, 256));                // network input size
imge.convertTo(imge, CV_32F);
// Size() = no further resizing; subtract a mean of 127; no channel swap, no crop
Mat inputBlob = blobFromImage(imge, 1.0, Size(), Scalar(127, 127, 127), false, false);
net_lab.setInput(inputBlob);
Mat outputBlob = net_lab.forward();                // <-- crashes here
aguila (2018-11-05 13:39:27 -0600)

The author uses a customized Interp layer (see https://github.com/wywu/LAB/blob/mast...). It resizes the first input blob to the size of the second one. If your model always works with a 1x1x256x256 input, you can easily compute the destination shape and use the original Interp layer introduced in https://github.com/cdmh/deeplab-public.

dkurt (2018-11-06 01:10:08 -0600)
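For reference, another route (instead of rewriting the prototxt) is to provide the two-bottom Interp behaviour through OpenCV's custom-layer API (cv::dnn::Layer plus CV_DNN_REGISTER_LAYER_CLASS). The sketch below is only illustrative: the class name is made up, the forward pass approximates the Caffe layer with cv::resize (align_corners handling may differ slightly), and the forward() signature shown is the OpenCV 4.x one (OpenCV 3.4 may additionally require the older std::vector<Mat*> overload).

#include <opencv2/dnn.hpp>
#include <opencv2/dnn/layer.details.hpp>   // CV_DNN_REGISTER_LAYER_CLASS
#include <opencv2/imgproc.hpp>

// Hypothetical stand-in for the author's two-bottom Interp layer:
// resizes the first input blob to the spatial size of the second one.
class TwoInputInterpLayer : public cv::dnn::Layer
{
public:
    TwoInputInterpLayer(const cv::dnn::LayerParams& params) : Layer(params) {}

    static cv::Ptr<cv::dnn::Layer> create(cv::dnn::LayerParams& params)
    {
        return cv::Ptr<cv::dnn::Layer>(new TwoInputInterpLayer(params));
    }

    // Output shape: N and C from the first bottom, H and W from the second.
    virtual bool getMemoryShapes(const std::vector<cv::dnn::MatShape>& inputs,
                                 const int /*requiredOutputs*/,
                                 std::vector<cv::dnn::MatShape>& outputs,
                                 std::vector<cv::dnn::MatShape>& /*internals*/) const CV_OVERRIDE
    {
        CV_Assert(inputs.size() == 2);
        cv::dnn::MatShape outShape = inputs[0];
        outShape[2] = inputs[1][2];    // height of the second bottom
        outShape[3] = inputs[1][3];    // width of the second bottom
        outputs.assign(1, outShape);
        return false;
    }

    virtual void forward(cv::InputArrayOfArrays inputs_arr,
                         cv::OutputArrayOfArrays outputs_arr,
                         cv::OutputArrayOfArrays /*internals_arr*/) CV_OVERRIDE
    {
        std::vector<cv::Mat> inputs, outputs;
        inputs_arr.getMatVector(inputs);
        outputs_arr.getMatVector(outputs);

        const cv::Mat& inp = inputs[0];   // NxCxHxW, CV_32F
        cv::Mat& out = outputs[0];
        // Bilinear resize of every channel plane of the first bottom.
        for (int n = 0; n < out.size[0]; ++n)
            for (int c = 0; c < out.size[1]; ++c)
            {
                cv::Mat inPlane(inp.size[2], inp.size[3], CV_32F,
                                (void*)inp.ptr<float>(n, c));
                cv::Mat outPlane(out.size[2], out.size[3], CV_32F,
                                 out.ptr<float>(n, c));
                cv::resize(inPlane, outPlane, outPlane.size(), 0, 0, cv::INTER_LINEAR);
            }
    }
};

int main()
{
    // Register before the model is parsed; the type name must match the
    // "type:" field in the prototxt. Registering under the original "Interp"
    // name is meant to take precedence over the built-in layer, but verify
    // this in your build (otherwise rename the type in the prototxt).
    CV_DNN_REGISTER_LAYER_CLASS(Interp, TwoInputInterpLayer);

    cv::dnn::Net net = cv::dnn::readNet("rel.prototxt", "model.bin");
    // ... setInput() / forward() as in the snippet above
    return 0;
}

With the layer registered this way, the prototxt can keep both bottoms, and the destination size is taken from the second one at runtime.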

Hello @dkurt, thank you for your help; sorry, I did not get around to checking the two different layers in the past week. I am now looking at the differences between the Interp layer in OpenCV (the one from deeplab) and the author's custom layer.

In my case, I will always have the same input size, 256x256.

For this, if I am correct, I should explicitly give the output size as a parameter of the Interp layer, no? Something like:

layer {
  name: "hourglass_1/upper_4/interp"
  type: "Interp"
  bottom: "hourglass_1/lower_4_3/res"
  bottom: "hourglass_1/lower_3/res"
  top: "hourglass_1/upper_4/interp"
  param {
    width:  XX        <--- my inputs
    height: XX
  }
}
aguila (2018-11-12 06:34:06 -0600)

@aguila, try this:

layer {
  name: "hourglass_1/upper_4/interp"
  type: "Interp"
  bottom: "hourglass_1/lower_4_3/res"
  # bottom: "hourglass_1/lower_3/res"  Now we don't need it with explicit output sizes
  top: "hourglass_1/upper_4/interp"
  interp_param {
    width:  256
    height: 256
  }
}
dkurt (2018-11-12 07:46:45 -0600)

OK, I will try it later today and post my results here. Thanks!

aguila (2018-11-12 07:59:40 -0600)

@benjamin, please do not post an answer here if you have a question or a comment, thank you.

(Apart from that, questions like yours never work here; leave older questions alone, no one's here anymore.)

berak (2019-04-17 02:22:33 -0600)

1 answer


answered 2018-12-10 17:16:02 -0600 by aguila

Hello, I forgot to finish this example; thanks for the hints @dkurt. I just made the modifications to the network. The trick is to find the correct size for the "Interp" layers (using the OpenCV Interp layer with one input).

As in the example, the size of the "hourglass_1/lower_3/res" layer must be given as the parameter; in this case that would be:

layer {
  name: "hourglass_1/upper_4/interp"
  type: "Interp"
  bottom: "hourglass_1/lower_4_3/res"
  # bottom: "hourglass_1/lower_3/res"  not needed anymore with explicit output sizes
  top: "hourglass_1/upper_4/interp"
  interp_param {
    width:  8    # spatial size of the "hourglass_1/lower_3/res" blob
    height: 8
  }
}

The same process is applied to all the "Interp" layers.
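To find which width/height belongs in each rewritten Interp layer, one option is to let OpenCV infer all layer output shapes for the fixed 1x1x256x256 input and read off the shape of the blob that used to be the second bottom. A rough sketch, assuming the network already loads (i.e. the prototxt has been rewritten as above) and using the file names from the comments:

#include <iostream>
#include <opencv2/dnn.hpp>

int main()
{
    using namespace cv::dnn;

    Net net = readNet("rel.prototxt", "model.bin");

    MatShape inputShape = {1, 1, 256, 256};   // N x C x H x W

    std::vector<int> layerIds;
    std::vector<std::vector<MatShape> > inShapes, outShapes;
    net.getLayersShapes(inputShape, layerIds, inShapes, outShapes);

    // Print N x C x H x W of every layer output; the entry of the layer that
    // produces "hourglass_1/lower_3/res" gives the width/height for interp_param.
    for (size_t i = 0; i < layerIds.size(); ++i)
    {
        if (outShapes[i].empty())
            continue;
        const MatShape& s = outShapes[i][0];
        std::cout << net.getLayer(layerIds[i])->name << ":";
        for (size_t d = 0; d < s.size(); ++d)
            std::cout << " " << s[d];
        std::cout << std::endl;
    }
    return 0;
}

The same printout gives the sizes for the remaining Interp layers.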


Comments

interp_param { width: 8 height: 8 } Where do the parameters come from?

benjamin (2019-04-17 03:26:36 -0600)

I have changed the entire network file as suggested and it works for me, thanks, but the execution time is very high (300 ms per face on an i5 CPU).

anup-deshmukh (2020-07-25 07:00:38 -0600)
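If you want to see where those 300 ms go, OpenCV's Net::getPerfProfile() reports the time spent in the last forward() call (per layer and in total). A small sketch, reusing the file names from this thread and a dummy 1x1x256x256 input:

#include <iostream>
#include <vector>
#include <opencv2/dnn.hpp>

int main()
{
    using namespace cv;
    using namespace cv::dnn;

    Net net = readNet("rel.prototxt", "model.bin");

    // Dummy 1x1x256x256 blob, just for timing.
    int dims[] = {1, 1, 256, 256};
    Mat blob(4, dims, CV_32F, Scalar(0));

    net.setInput(blob);
    net.forward();        // warm-up: the first call includes one-time initialization

    net.setInput(blob);   // re-set the input so the next forward() really recomputes
    net.forward();

    std::vector<double> layerTimes;                // per-layer timings, in ticks
    double freq = getTickFrequency() / 1000.0;     // ticks per millisecond
    double totalMs = net.getPerfProfile(layerTimes) / freq;
    std::cout << "forward(): " << totalMs << " ms" << std::endl;
    return 0;
}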
