here, p. 2:
Normalization depends on the step pattern, i.e. the allowed transitions and
the weights between matched pairs when searching for an optimal path.
Normalization is then performed by dividing the distance by n, m, or n+m,
depending on the step pattern and slope weighting [5].
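For illustration, here is a minimal sketch assuming the dtw-python package (one possible library choice; others behave differently): the step pattern you pick determines whether the normalized distance divides by n, m or n+m, or is not defined at all.

```python
import numpy as np
from dtw import dtw, symmetric1, symmetric2, asymmetric  # dtw-python

query = np.sin(np.linspace(0, 2 * np.pi, 80))        # length n = 80
reference = np.sin(np.linspace(0, 2 * np.pi, 100))   # length m = 100

for name, pattern in [("symmetric1", symmetric1),
                      ("symmetric2", symmetric2),
                      ("asymmetric", asymmetric)]:
    aln = dtw(query, reference, step_pattern=pattern)
    # distance: raw accumulated cost along the optimal warping path
    # normalizedDistance: distance divided by n, m or n + m, or NaN when
    # the step pattern defines no normalization rule
    print(name, round(aln.distance, 2), aln.normalizedDistance)
```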
-- taking into consideration that DTW overcomes some of the restrictions of simpler similarity measures such as Euclidean distance... I think the answer to your question depends on the library you use: check the docs for your chosen library and pay attention to its similarity metrics, because some of them are still Euclidean while others are defined as DTW in a different library. Only testing with fake data can reveal whether normalization is necessary or not (depending on the library) -- see the sketch after the quoted description below. But since it all comes down to the distance, its choice is the governing factor for model construction, e.g.
Dynamic Time Warping (DTW) is a prominent similarity metric in
time-series analysis, particularly when the data sets are of varying
durations or exhibit phase changes or time warping. Unlike Euclidean
distance, DTW allows for non-linear warping of the time axis to align
analogous patterns between time-series data sets.
-- so it seems not to need normalization
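As a fake-data check of the above (again assuming dtw-python; the signals are made up): a phase-shifted copy of a signal looks far away under point-by-point Euclidean distance but close under DTW, and for series of different lengths a point-by-point Euclidean distance is not even defined while DTW still aligns them.

```python
import numpy as np
from dtw import dtw  # dtw-python, as in the sketch above

t = np.linspace(0, 4 * np.pi, 200)
a = np.sin(t)
b = np.sin(t - 0.5)                          # same shape, phase-shifted
c = np.sin(np.linspace(0, 4 * np.pi, 150))   # same shape, different length

print("Euclidean (a, b):      ", round(float(np.linalg.norm(a - b)), 3))
print("DTW normalized (a, b): ", round(dtw(a, b).normalizedDistance, 3))
# Euclidean needs equal lengths, so (a, c) cannot be compared that way,
# but DTW warps the time axis and still produces a distance:
print("DTW normalized (a, c): ", round(dtw(a, c).normalizedDistance, 3))
```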
P.S. You can also look into the differences and nuances of the algorithms, as well as some other metrics for judging time-series similarity.
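Two such "other metrics", purely as illustration (my own picks, not from the linked material): Pearson-correlation distance and cosine similarity, both shape-oriented but with no time warping at all.

```python
import numpy as np

def correlation_distance(x, y):
    # 1 - Pearson correlation: ~0 for series that move together
    return 1.0 - np.corrcoef(x, y)[0, 1]

def cosine_similarity(x, y):
    # angle-based similarity; ignores overall amplitude
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

t = np.linspace(0, 2 * np.pi, 100)
x, y = np.sin(t), 2 * np.sin(t) + 0.1   # scaled and offset copy

print(correlation_distance(x, y))   # ~0: same shape
print(cosine_similarity(x, y))      # close to 1
```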
P.P.S. If you'd like to check for exact signal similarity, use some convolution techniques, e.g. like here
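For example, a cross-correlation sketch with SciPy (cross-correlation is convolution with one signal time-reversed): a high normalized peak says one signal is essentially a shifted copy of the other, and the peak position gives the shift.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

t = np.linspace(0, 1, 500, endpoint=False)
sig = np.sin(2 * np.pi * 5 * t)
delayed = np.roll(sig, 40)          # same signal, shifted by 40 samples

xcorr = correlate(sig, delayed, mode="full")
xcorr /= np.linalg.norm(sig) * np.linalg.norm(delayed)   # scale into [-1, 1]
lags = correlation_lags(len(sig), len(delayed), mode="full")
peak = int(np.argmax(xcorr))
print(f"peak correlation {xcorr[peak]:.2f} at lag {lags[peak]}")
```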
P.P.P.S. If you have a stochastic process, you can use an HMM for dynamic modelling and then cluster the similarities of the results, OR use the Discrete Wavelet Transform "to decompose a time series into a time-frequency spectrum and use the frequency energies as features".
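To illustrate the DWT half of that (a sketch assuming the PyWavelets package; the wavelet, level and test signals are my own choices): decompose each series and use the per-band energies as a fixed-length feature vector that any ordinary clustering algorithm can consume.

```python
import numpy as np
import pywt  # PyWavelets

def dwt_energy_features(series, wavelet="db4", level=4):
    """Energy of each DWT band, usable as a feature vector for clustering."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    return np.array([float(np.sum(c ** 2)) for c in coeffs])

t = np.linspace(0, 1, 512, endpoint=False)
slow = np.sin(2 * np.pi * 3 * t)     # low-frequency series
fast = np.sin(2 * np.pi * 40 * t)    # high-frequency series

# Series dominated by different frequencies get clearly different
# energy profiles across the wavelet bands:
print(dwt_energy_features(slow).round(1))
print(dwt_energy_features(fast).round(1))
```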