I am training a Tacotron2 model on a new dataset, but loading the pretrained checkpoint fails with the following error:

```
RuntimeError: Error(s) in loading state_dict for Tacotron2: size mismatch for embedding.weight: copying a param with shape torch.Size([148, 512]) from checkpoint, the shape in current model is torch.Size([88, 512]).
```

How can I resolve it?
It looks like your `symbols.py` defines fewer symbols (88) than the one used to train the pre-trained model (148), so the embedding table shapes don't match. Set `n_symbols` in your `hparams.py` to match your symbol set (88), and add `"embedding.weight"` to `ignore_layers` so the mismatched embedding is skipped when warm-starting from the checkpoint (pass `--warm_start` when training).
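If you are not using the stock `--warm_start` path, you can filter the mismatched layer out of the state dict yourself before loading. A minimal sketch, assuming the checkpoint stores its weights under a `"state_dict"` key as NVIDIA's Tacotron2 does (the helper function name here is hypothetical):

```python
import torch


def load_checkpoint_ignoring(model, checkpoint_path, ignore_layers):
    """Load a pretrained state_dict into `model`, skipping the layers
    named in `ignore_layers` (e.g. "embedding.weight" when the symbol
    set changed); skipped layers keep their freshly initialized weights."""
    checkpoint = torch.load(checkpoint_path, map_location="cpu")
    pretrained = checkpoint["state_dict"]
    # Drop the keys whose shapes no longer match the current model.
    filtered = {k: v for k, v in pretrained.items() if k not in ignore_layers}
    # Start from the current model's weights and overlay the rest.
    model_dict = model.state_dict()
    model_dict.update(filtered)
    model.load_state_dict(model_dict)
    return model
```

This way every layer except the embedding is restored from the checkpoint, and the new embedding (sized for your 88 symbols) is trained from scratch.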