The examples you provided look similar to one another in duration and amplitude, so normalization might not be a problem. But consider whether you will always be comparing series of the same duration, and whether you care about the relative change in the time series or the absolute change. Relative change can be measured by normalizing each series so that its first value (the starting point) is 0. Depending on your problem, this might be worth considering.
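As a quick sketch of that normalization (the series values here are made up for illustration), subtracting the first value gives absolute change from the start, and dividing by it gives relative change:

```python
import numpy as np

# Hypothetical example series; any 1-D array of samples works.
series = np.array([10.0, 10.5, 11.2, 10.8, 12.0])

# Absolute change from the starting point: first value becomes 0.
relative = series - series[0]

# Relative (fractional) change from the starting point.
relative_pct = (series - series[0]) / series[0]
```

The second form makes series with different baselines directly comparable.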
Second, think about applying a smoothing algorithm to remove some of the jitter from your data. Examples include an Exponentially Weighted Moving Average (EWMA) or a second-order Holt-Winters EWMA.
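A minimal EWMA sketch using pandas (the noisy sine series and the `alpha` value are illustrative choices, not part of the original answer):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Noisy sine wave as a stand-in for a jittery time series.
t = np.linspace(0, 4 * np.pi, 200)
noisy = pd.Series(np.sin(t) + rng.normal(0, 0.3, t.size))

# Exponentially Weighted Moving Average; alpha controls how quickly
# old observations are forgotten (smaller alpha = smoother output).
smoothed = noisy.ewm(alpha=0.1).mean()
```

Note the trade-off: a smaller `alpha` suppresses more jitter but also lags and attenuates real structure in the signal.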
The suggestions in the comments are good ones: both a Fourier transform via the FFT and a wavelet formulation are interesting, but they are still one-to-one, in that the amount of data in (in temporal space) equals the amount of data out (in frequency space).
I would suggest first including the raw data, or a down-sampling of it, along with some additional features such as the area under the curve, the maximum, and the minimum. I would stay away from procedures that amplify the jitter in your data (like derivatives) in favor of those that suppress it (like integrals).
A nice option, which implicitly smooths, is to take a running integral of your data and then plot the running integral rather than the values themselves. This doesn't require a priori smoothing and is still deterministic.
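A sketch of the running-integral idea (again on a made-up noisy sine): a cumulative sum scaled by the sample spacing approximates the running integral, and the successive increments of the integral vary far less than the raw samples do.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 500)
dt = t[1] - t[0]
noisy = np.sin(t) + rng.normal(0, 0.2, t.size)

# Running (cumulative) integral via a scaled cumulative sum.
# Integration averages out jitter, so plotting running_integral vs. t
# gives a much smoother curve than the raw data.
running_integral = np.cumsum(noisy) * dt
```

For non-uniform sampling, `scipy.integrate.cumulative_trapezoid` is the more careful choice.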
Following this, I suggest doing a whole bunch of feature extraction/feature engineering and then trying a LASSO regression to pick out the most useful features.
Some feature engineering ideas include:
- Fourier Transform via FFT
- Possibly keeping only the n (~10) largest Fourier modes from above rather than the full set.
- Integrals of your data.
- Number of zero crossings.
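The ideas above can be sketched end-to-end. Everything here is illustrative: the `extract_features` helper, the synthetic amplitude-prediction task, and the `alpha` value are assumptions, not part of the original answer. The point is that LASSO's L1 penalty drives the coefficients of uninformative features to exactly zero, which is what makes it a feature selector.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(42)

def extract_features(series, n_modes=10):
    """Turn one time series into a small fixed-length feature vector.
    The feature choices are illustrative, not prescriptive."""
    dt = 1.0
    magnitudes = np.abs(np.fft.rfft(series))
    top_modes = np.sort(magnitudes)[-n_modes:]   # n largest Fourier modes
    integral = series.sum() * dt                 # area under the curve
    zero_crossings = np.count_nonzero(np.diff(np.sign(series)))
    return np.concatenate(
        [top_modes, [integral, series.max(), series.min(), zero_crossings]]
    )

# Synthetic dataset: 100 noisy sinusoids; the target is the amplitude.
X, y = [], []
for _ in range(100):
    amp = rng.uniform(0.5, 2.0)
    t = np.linspace(0, 4 * np.pi, 128)
    series = amp * np.sin(t) + rng.normal(0, 0.1, t.size)
    X.append(extract_features(series))
    y.append(amp)
X, y = np.array(X), np.array(y)

# LASSO zeros out the coefficients of redundant/uninformative features.
model = Lasso(alpha=0.1).fit(X, y)
kept = np.flatnonzero(model.coef_)
```

In practice you would standardize the features before fitting (LASSO is scale-sensitive) and pick `alpha` by cross-validation, e.g. with `sklearn.linear_model.LassoCV`.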
Suggestions to count the changes in the second derivative are tough to apply to time-series data because of its stochastic nature. The slopes (derivatives) are very wild and depend heavily on how much smoothing has been applied and how the data is sampled.