I realized that with 1000 learners and a depth of 20, that's potentially 2^20 nodes per tree times 1000 trees, i.e. about a billion learning parameters, which is on the order of a gigabyte. So it turns out the learning model really does need all of that space to store the trees. A rough back-of-envelope estimate:
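This is just the arithmetic from above as a quick Python sketch; the one-byte-per-node figure is an assumption for illustration, and the real per-node cost is larger and implementation dependent:

```python
# Rough upper bound, assuming every tree is a full binary tree at the
# given depth and an assumed one byte per node.
depth = 20
learners = 1000
nodes_per_tree = 2 ** depth                  # ~1 million nodes per full depth-20 tree
total_nodes = nodes_per_tree * learners      # ~1.05 billion nodes overall
print(total_nodes / 2 ** 30, "GiB at 1 byte per node")
```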
To reduce the size I have to lower the tree depth and/or the number of learners. For example, reducing the tree depth to 5 brought the model down to only 21 MB (though building it seemed to take about the same amount of time). Perhaps decreasing the weight trim rate would also cause more trees to be pruned before reaching depth 20, and thus shrink the model as well, but I haven't tested this yet. A configuration sketch follows below.
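For reference, here is a minimal sketch of the smaller configuration, assuming OpenCV's `cv2.ml.Boost` API (where the weight trim rate parameter lives); the synthetic data and the exact parameter values are just placeholders for illustration:

```python
import numpy as np
import cv2

# Hypothetical synthetic data: 200 samples, 10 features, binary labels.
rng = np.random.RandomState(0)
samples = rng.rand(200, 10).astype(np.float32)
responses = (samples[:, 0] > 0.5).astype(np.int32)

boost = cv2.ml.Boost_create()
boost.setWeakCount(100)         # far fewer weak learners than the original 1000
boost.setMaxDepth(5)            # depth 5 instead of 20
boost.setWeightTrimRate(0.95)   # OpenCV default; lowering it trims more low-weight samples

boost.train(samples, cv2.ml.ROW_SAMPLE, responses)
boost.save("boost_model.yml")   # inspect the file size to gauge the model's footprint
```

Saving the trained model and checking the file size is an easy way to compare the footprint of different depth / learner-count settings.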
Case closed.