SEMI-SORTED BATCHING WITH VARIABLE LENGTH INPUT FOR EFFICIENT TRAINING
Abstract:
Techniques are described for training neural networks on variable-length datasets. The numeric length of each training sample is randomly perturbed to yield a pseudo-length, and the samples are sorted by pseudo-length for batching. This achieves a lower zero-padding rate (ZPR) than completely randomized batching (thus saving computation time) while retaining more randomness than strictly sorted batching (thus achieving better model performance than strictly sorted batching).
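The following is a minimal sketch of the batching idea described in the abstract: each sample's length is perturbed into a pseudo-length, the samples are sorted by pseudo-length, and consecutive samples form batches. The function name, the multiplicative form of the perturbation, and the `perturb_scale` parameter are illustrative assumptions; the abstract does not specify how the perturbation is computed.

```python
import random

def semi_sorted_batches(samples, batch_size, perturb_scale=0.2, rng=None):
    """Group variable-length samples into batches by semi-sorting.

    Each sample's length is randomly perturbed to a pseudo-length, and the
    samples are sorted by pseudo-length before being chunked into batches.
    Same-batch lengths stay similar (lower zero-padding rate than fully
    random batching), while the random perturbation preserves some shuffling
    (more randomness than strictly sorted batching).

    Note: `perturb_scale` (relative magnitude of the perturbation) is a
    hypothetical parameter, not taken from the source.
    """
    rng = rng or random.Random()
    # Pseudo-length = true length times a random perturbation factor (assumed form).
    keyed = [(len(s) * (1.0 + rng.uniform(-perturb_scale, perturb_scale)), s)
             for s in samples]
    # Sort by pseudo-length so nearby samples have similar true lengths.
    keyed.sort(key=lambda pair: pair[0])
    ordered = [s for _, s in keyed]
    # Chunk the semi-sorted samples into fixed-size batches.
    return [ordered[i:i + batch_size]
            for i in range(0, len(ordered), batch_size)]


if __name__ == "__main__":
    # Toy dataset of 50 sequences with lengths between 1 and 199.
    data = [[0] * n for n in random.sample(range(1, 200), 50)]
    for batch in semi_sorted_batches(data, batch_size=8):
        print([len(s) for s in batch])  # lengths within a batch stay close
```

In this sketch, a larger `perturb_scale` moves batching toward fully random (more padding, more randomness), while a value of zero reduces it to strictly sorted batching.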