These are my notes for the 4th course of the TensorFlow in Practice Specialization, given by deeplearning.ai and taught by Laurence Moroney on Coursera.

👉 Check the code on my Github.
👉 Official notebooks on Github.

👉 Go to course 1 - Intro to TensorFlow for AI, ML, DL.
👉 Go to course 2 - CNN in TensorFlow.
👉 Go to course 3 - NLP in TensorFlow.

• Sequence models: focus on time series (there are others) – stock prices, weather,…
• At the end, we want to model sunspot activity cycles, which are important to NASA and other space agencies.
• Using RNNs on time series data.

## Sequences and prediction

### Time Series

👉 Notebook: introduction to time series + explaining video. => How to create synthetic time series data and plot it.

• Time series are everywhere: stock prices, weather forecasts, historical trends (Moore's law),…
• Univariate TS and multivariate TS.
• Types of things we can do with ML over TS:
• Anything that has a time factor can be analysed as a TS.
• Forecasting (e.g. births & deaths in Japan -> predict future needs for retirement, immigration, impacts…).
• Imputation: project back into the past.
• Fill holes in the data.
• Anomaly detection (e.g. website attacks).
• Spotting patterns (e.g. speech recognition).
• Common patterns in TS:
• Trend: a specific direction that the series is moving in.

• Seasonality: patterns repeat at predictable intervals (e.g. active users on a website).

• Combination of both trend and seasonality.

• Stationary TS.

• Autocorrelated TS: a time series that is linearly related to a lagged version of itself. There is no trend, no seasonality.

• Multiple autocorrelations.

• Maybe trend + seasonality + autocorrelation + noise.

• Non-stationary TS:

In this case, we base predictions only on the more recent data, not on the whole series.
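The patterns above (trend + seasonality + noise) can be combined to generate a synthetic series, as the course notebook does. A minimal sketch; the `seasonal_pattern` shape and all constants here are illustrative assumptions, not the notebook's exact values:

```python
import numpy as np

def trend(time, slope=0.0):
    # Straight-line trend component
    return slope * time

def seasonal_pattern(season_time):
    # An arbitrary repeating shape (assumption, for illustration)
    return np.where(season_time < 0.4,
                    np.cos(season_time * 2 * np.pi),
                    1 / np.exp(3 * season_time))

def seasonality(time, period, amplitude=1):
    # Repeat the same pattern every `period` time steps
    season_time = (time % period) / period
    return amplitude * seasonal_pattern(season_time)

def noise(time, noise_level=1, seed=None):
    # White noise on top of the deterministic components
    rng = np.random.RandomState(seed)
    return rng.randn(len(time)) * noise_level

time = np.arange(4 * 365)  # four "years" of daily data
series = (10
          + trend(time, 0.05)
          + seasonality(time, period=365, amplitude=40)
          + noise(time, noise_level=5, seed=42))
```

Plotting `series` against `time` shows a rising line with a yearly repeating bump, plus jitter from the noise.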

### Train / Validation / Test

• Fixed partitioning (this course's focus) = splitting the TS data into a training period, a validation period and a test period.
• If the TS is seasonal, we want each period to contain a whole number of seasons.

• We can split + train + test to get a model, then re-train on the data that also contains the test period so that the model is optimized! In that case, the test set effectively comes from the future.

• Roll-forward partitioning: start with a short training period and gradually increase it (one day or one week at a time). At each iteration, train the model on the training period and use it to forecast the following day/week in the validation period. = Fixed partitioning repeated a number of times!
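A fixed-partitioning split is just slicing. A quick sketch; the toy series and the three-years-train / one-year-validation split are assumptions for illustration:

```python
import numpy as np

time = np.arange(4 * 365)  # four "years" of daily steps
series = np.sin(time / 30.0)  # placeholder series (assumption)

split_time = 3 * 365  # train on the first three years
time_train, x_train = time[:split_time], series[:split_time]
time_valid, x_valid = time[split_time:], series[split_time:]
```

With a yearly season, both periods here contain whole seasons: three for training, one for validation.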

### Metrics

For evaluating models:

```python
import numpy as np

errors = forecasts - actual

# Mean squared error (squaring gets rid of negative values)
# e.g. used if large errors are potentially dangerous
mse = np.square(errors).mean()
# Get back to the same scale as the errors
rmse = np.sqrt(mse)

# Mean absolute error (his favorite)
# This doesn't penalize large errors as much as MSE does;
# used if the loss is proportional to the size of the error
mae = np.abs(errors).mean()

# Mean absolute percentage error:
# gives an idea of the size of the errors compared to the values
mape = np.abs(errors / x_valid).mean()
```
```python
# MAE with TF
from tensorflow import keras

keras.metrics.mean_absolute_error(x_valid, naive_forecast).numpy()
```
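Putting the metrics together with a naive forecast (predict the previous value) on a toy series; the sine series and the split point are illustrative assumptions:

```python
import numpy as np

series = np.sin(np.arange(100) / 5.0)  # toy series (assumption)
split_time = 80
x_valid = series[split_time:]

# Naive forecast: each prediction is simply the previous value
naive_forecast = series[split_time - 1:-1]

errors = naive_forecast - x_valid
mse = np.square(errors).mean()
rmse = np.sqrt(mse)
mae = np.abs(errors).mean()
```

The naive forecast is the baseline any fancier model has to beat.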

### Moving average and differencing

👉 Notebook: Forecasting + explaining video.

Moving average: a simple forecasting method. Calculate the average of the series values (blue line in the plot) within a fixed "averaging window".

• This eliminates noise but doesn't anticipate trend or seasonality.
• Depending on the "averaging window", it can give worse results than a naive forecast.

Take the average over each yellow window. MAE = 7.14 (optimal is 4).

```python
import numpy as np

def moving_average_forecast(series, window_size):
    """Forecasts the mean of the last few values.
    If window_size=1, then this is equivalent to a naive forecast."""
    forecast = []
    for time in range(len(series) - window_size):
        forecast.append(series[time:time + window_size].mean())
    return np.array(forecast)
```
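A quick usage sketch of the function above: forecast each validation point from the 30 values preceding it. The toy series and the split point are illustrative assumptions:

```python
import numpy as np

def moving_average_forecast(series, window_size):
    # Same function as above, condensed
    return np.array([series[t:t + window_size].mean()
                     for t in range(len(series) - window_size)])

time = np.arange(4 * 365 + 1)
series = 10 + 0.05 * time + 40 * np.sin(2 * np.pi * time / 365)  # toy series
split_time = 1000

# Slice so forecast i lines up with validation point i
moving_avg = moving_average_forecast(series, 30)[split_time - 30:]
x_valid = series[split_time:]
mae = np.abs(moving_avg - x_valid).mean()
```

Note the `[split_time - 30:]` slice: `moving_average_forecast` returns one prediction per window, so we drop the predictions that fall inside the training period.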

Differencing: removes the trend and seasonality from the TS. We study the differences between each point and the point one period earlier.

Left image: we take the differenced series, then compute its moving average (orange line). Right image: restore the trend and seasonality by adding back the value from one period earlier. MAE = 5.8 (optimal is 4).

The above method still keeps the noise (because we add the differenced forecast back to past noisy values). We can remove the past noise by applying a moving average to those past values too.

Smoothing both past and present values gives MAE = 4.5 (optimal is 4).
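The differencing + smoothing pipeline can be sketched end to end. This follows the shape of the course's approach, but the window sizes, slice offsets and noiseless synthetic series here are illustrative assumptions:

```python
import numpy as np

def moving_average_forecast(series, window_size):
    return np.array([series[t:t + window_size].mean()
                     for t in range(len(series) - window_size)])

# Toy series with trend + yearly seasonality (assumption, for illustration)
time = np.arange(4 * 365 + 1)
series = 10 + 0.05 * time + 40 * np.sin(2 * np.pi * time / 365)
split_time = 1000

# 1) Differencing with a one-period (365-step) lag removes trend + seasonality
diff_series = series[365:] - series[:-365]

# 2) Moving average of the differenced series, sliced to line up with validation
diff_ma = moving_average_forecast(diff_series, 50)[split_time - 365 - 50:]

# 3) Restore trend + seasonality by adding back the values from one period ago,
#    themselves smoothed with a small moving average to remove past noise
smooth_past = moving_average_forecast(series[split_time - 370:-360], 10)
forecast = smooth_past + diff_ma

x_valid = series[split_time:]
mae = np.abs(forecast - x_valid).mean()
```

The fiddly part is index alignment: each slice exists only so that `diff_ma`, `smooth_past` and `x_valid` all end up the same length and refer to the same time steps.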

Keep in mind: before reaching for Deep Learning, sometimes simple approaches just work fine!

• Notes with this notation aren't good enough. They are being updated. If you can see this, you are so smart. ;)