Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [11] [baseline] [seq2seq]

Statement

  1. No real data will appear in this blog series.
  2. Private messages will not be answered during the competition; if you have questions, please leave a comment.

Catalogue of series articles

Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [0] [scoring rules for wind condition prediction - Calculation of final score R] [abandoned]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [1] [production of verification set] [abandoned]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [2] [use of verification set] [abandoned]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [3] [calculate final score]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [4] [data management]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [5] [data visualization] [test set preliminary competition]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [6] [data visualization] [training set]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [7] [data management] [localization of verification set]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [8] [data visualization] [verification set]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [9] [data visualization] [meteorological data]
Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [10] [baseline] [LSTM]

Update description

None.

Preface

Reminder

This article is an upgraded version of Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [10] [baseline] [LSTM].

1, Necessary preparation


For where the training set comes from, see Short term wind condition prediction of wind farm driven by unit data in the 5th national industrial Internet data innovation and application competition [4] [data management].

The latest main.py is as follows:

import datetime
import itertools
import os
import shutil
from tqdm import trange

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

import torch
import torch.nn as nn
import torch.nn.functional as F

plt.rcParams['font.sans-serif'] = ['SimHei']
plt.rcParams['axes.unicode_minus'] = False
np.set_printoptions(suppress=True)
pd.set_option('display.float_format', lambda x:'%.7f'%x)

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# ----------------------------------------------------------------------------------------------------
# Config
# ----------------------------------------------------------------------------------------------------

field = np.array([
    'Wind farm 1',
    'Wind farm 2'
])
field_len = field.shape[0]

machine = np.array([
    [f'x{i}' for i in range(26, 50+1)],
    [f'x{i}' for i in range(25, 49+1)]
])
machine_len = machine[0].shape[0]

season = np.array(['spring', 'summer', 'autumn', 'winter'])
season_len = season.shape[0]

period = np.array([
    [f'{s}_{str(i).zfill(2)}' for s in season for i in range(1, 20+1)],
    [f'{s}_{str(i).zfill(2)}' for s in season for i in range(21, 40+1)]
])
period_len = period[0].shape[0]

# ----------------------------------------------------------------------------------------------------
# Coordinate system conversion
# ----------------------------------------------------------------------------------------------------

def p2c(data):
    """
    Polar Coordinate System -> Cartesian coordinate system
    polar coordinates(ρ, θ) -> Rectangular coordinates(x, y)
    """
    
    ρ, θ = data.T
    xy = np.full_like(data, np.nan, dtype=np.float32)    # stores the corresponding Cartesian coordinates
    
    c = ρ < 0     # condition: polar radius ρ less than 0
    assert not np.any(c), 'Polar radius ρ cannot be less than 0'
    
    c = ρ == 0    # condition: polar radius ρ equal to 0
    xy[c] = np.zeros_like(xy[c], dtype=np.float32)    # the corresponding x and y are both 0
    
    c = ρ > 0     # condition: polar radius ρ greater than 0
    xy[c, 0] = ρ[c] * np.cos(2 * np.pi * θ[c])      # x = ρ * cos(2πθ), where θ is stored as a fraction of a full turn
    xy[c, 1] = ρ[c] * np.sin(2 * np.pi * θ[c])      # y = ρ * sin(2πθ)
    
    return np.around(xy, 4)    # keep 4 decimal places

def c2p(data):
    """
    Cartesian coordinate system -> Polar Coordinate System
    Rectangular coordinates(x, y) -> polar coordinates(ρ, θ)
    """
    x, y = data.T
    ρθ = np.full_like(data, np.nan, dtype=np.float32)    # stores the corresponding polar coordinates
    
    ρ = np.sqrt(np.power(x, 2) + np.power(y, 2))    # ρ = sqrt(x**2 + y**2)
    ρθ[:, 0] = ρ    # polar radius
    
    c = ρ < 0    # this case cannot occur (sqrt is non-negative), so it needs no handling
    
    c = ρ == 0    # condition: polar radius ρ equal to 0
    ρθ[c, 1] = np.zeros_like(ρθ[c, 1], dtype=np.float32)    # the polar angle is undefined (any direction works); it is initialized to 0° here
    
    c = np.bitwise_and(ρ > 0, y == 0)    # condition: polar radius ρ greater than 0 and y equal to 0
    ρθ[c, 1] = 0.5 * (np.sign(x[c]) < 0).astype(int) * np.ones_like(ρθ[c, 1], dtype=np.float32)    # polar angle θ is determined by the sign of x: when sign(x) < 0, θ is 0.5 (i.e. 180°); otherwise θ is 0 (i.e. 0°)
    
    c = np.bitwise_and(ρ > 0, y != 0)    # condition: polar radius ρ greater than 0 and y not equal to 0
    ρθ[c, 1] = np.mod(np.sign(y[c]) * np.arccos(x[c] / ρ[c]) / (2 * np.pi), 1)    # θ = (sign(y) * arccos(x / ρ) / (2π)) mod 1, as a fraction of a full turn
    
    return np.around(ρθ, 4)    # keep 4 decimal places

# ----------------------------------------------------------------------------------------------------
# Data Manager
# ----------------------------------------------------------------------------------------------------

def read_csv(path):
    return pd.read_csv(path, encoding='utf-8') if os.path.exists(path) else None

class Data_Manager:
    def __init__(self):
        
        self._root = 'Training set'
        self._root_p = 'Test set_Preliminary'
        self._root_f = 'Test set_finals'
        
        if self._check_file:
            print('Merging data from the same turbine and repairing missing time periods')
            self._merge_data

    def generate_sample(self, test_findals):
        return pd.DataFrame(
            data = np.array([
                [*x0, x1, x2, x3, x4] for x0, x1, x2, x3, x4 in 
                    itertools.product(
                        np.vstack([list(itertools.product([field[f]], machine[f])) for f in range(field_len)]).tolist(),    # 'wind farm', 'fan'
                        period[1 if test_findals else 0],    # 'time period '
                        np.arange(1, 20+1) * 30,             # 'time '
                        [None],                              # 'wind speed'
                        [None]                               # 'wind direction'
                    )
            ]),
            columns = ['Wind field', 'Fan', 'time interval', 'time', 'wind speed', 'wind direction']
        )

    @property
    def load_data_t(self):
        self.X_t = np.zeros(shape=(field_len, machine_len, 2 * 365 * 24 * 120, 2), dtype=np.float32)    # shape: (2, 25, 2102400, 2)
        for f in range(field_len):
            for m in trange(machine_len):
                datas = read_csv(os.path.join(self._root, field[f], machine[f][m]) + '.csv')
                self.X_t[f, m] = datas[['wind speed', 'wind direction']].fillna(0).values

        self.X_t_spd = self.X_t[..., 0].reshape(field_len * machine_len, 2102400, 1)             # (2 * 25, 2102400, 1)
        self.X_t_dir = p2c(np.vstack([np.ones_like(self.X_t_spd, dtype=np.float32).flatten(), self.X_t[..., 1].flatten()]).T).reshape(field_len * machine_len, 2102400, 2)             # (2 * 25, 2102400, 2)

        print('Training set')
        print(f'self.X_t     , shape : {self.X_t.shape} <- (Wind field , Fan , time , features <- (wind speed, wind direction))')
        print(f'self.X_t_spd , shape : {self.X_t_spd.shape}    <- (Wind field * Fan , time , wind speed <- ρ)')
        print(f'self.X_t_dir , shape : {self.X_t_dir.shape}    <- (Wind field * Fan , time , wind direction <- (x, y) <- θ)', end='\n\n')

    @property
    def load_data_p(self):
        self.X_p = np.zeros(shape=(field_len, machine_len, period_len, 120, 2), dtype=np.float32)       # shape: (2, 25, 80, 120, 2)
        for f in range(field_len):
            for m in trange(machine_len):
                for p in range(period_len):
                    datas = read_csv(os.path.join(self._root_p, field[f], machine[f][m], period[0][p]) + '.csv')
                    if datas is not None:
                        self.X_p[f, m, p] = datas[['wind speed', 'wind direction']].fillna(0).values

        self.X_p_spd = self.X_p[..., 0].reshape(field_len * machine_len * period_len, 120, 1)    # (2 * 25 * 80, 120, 1)
        self.X_p_dir = p2c(np.vstack([np.ones_like(self.X_p_spd, dtype=np.float32).flatten(), self.X_p[..., 1].flatten()]).T).reshape(field_len * machine_len * period_len, 120, 2)    # (2 * 25 * 80, 120, 2)

        print('Test set_Preliminary')
        print(f'self.X_p     , shape : {self.X_p.shape} <- (Wind field , Fan , time interval , time , features <- (wind speed, wind direction))')
        print(f'self.X_p_spd , shape : {self.X_p_spd.shape}      <- (Wind field * Fan * time interval , time , wind speed <- ρ)')
        print(f'self.X_p_dir , shape : {self.X_p_dir.shape}      <- (Wind field * Fan * time interval , time , wind direction <- (x, y) <- θ)', end='\n\n')

    @property
    def load_data_f(self):
        self.X_f = np.zeros(shape=(field_len, machine_len, period_len, 120, 2), dtype=np.float32)       # shape: (2, 25, 80, 120, 2)
        for f in range(field_len):
            for m in trange(machine_len):
                for p in range(period_len):
                    datas = read_csv(os.path.join(self._root_f, field[f], machine[f][m], period[1][p]) + '.csv')
                    if datas is not None:
                        self.X_f[f, m, p] = datas[['wind speed', 'wind direction']].fillna(0).values

        self.X_f_spd = self.X_f[..., 0].reshape(field_len * machine_len * period_len, 120, 1)    # (2 * 25 * 80, 120, 1)
        self.X_f_dir = p2c(np.vstack([np.ones_like(self.X_f_spd, dtype=np.float32).flatten(), self.X_f[..., 1].flatten()]).T).reshape(field_len * machine_len * period_len, 120, 2)    # (2 * 25 * 80, 120, 2)

        print('Test set_finals')
        print(f'self.X_f     , shape : {self.X_f.shape} <- (Wind field , Fan , time interval , time , features <- (wind speed, wind direction))')
        print(f'self.X_f_spd , shape : {self.X_f_spd.shape}      <- (Wind field * Fan * time interval , time , wind speed <- ρ)')
        print(f'self.X_f_dir , shape : {self.X_f_dir.shape}      <- (Wind field * Fan * time interval , time , wind direction <- (x, y) <- θ)', end='\n\n')

    @property
    def _merge_data(self):

        os.makedirs(self._root, exist_ok=True)
        [os.makedirs(os.path.join(self._root, field[f]), exist_ok=True) for f in range(field_len)]
        [shutil.copyfile(os.path.join('train', field[f], 'weather.csv'), os.path.join(self._root, field[f], 'weather.csv')) for f in range(field_len)]

        # Year corresponding to each wind farm
        years = {
            field[0]: [2018, 2019],
            field[1]: [2017, 2018]
        }

        # Complete time series
        time_series = {field[f]: pd.DataFrame(data={'time': pd.date_range(datetime.datetime(years[field[f]][0], 1, 1, 0, 0, 0), datetime.datetime(years[field[f]][1], 12, 31, 23, 59, 30), freq='30S')}, dtype=str) for f in range(field_len)}

        for f in range(field_len):
            for m in range(machine_len):

                machine_dir = os.path.join('train', field[f], machine[f][m])

                machine_data_save_path = os.path.join(self._root, field[f], machine[f][m]) + '.csv'

                print(f'Merge {machine_dir} to {machine_data_save_path} ... \t', end='')

                # Merge the two years of data with plain file operations; doing it with pandas is too slow
                with open(machine_data_save_path, 'a', encoding='utf-8') as f1:
                    f1.write('time,Active power at grid side of frequency converter,Outside temperature,wind direction,wind speed\n')    # column header
                    for data_file in os.listdir(machine_dir):
                        with open(os.path.join(machine_dir, data_file), 'r', encoding='utf-8') as f2:
                            f1.writelines(f2.readlines()[1:])    # [1:] -> skip the first row (the column names)

                # Merge data according to the column 'time'
                df = pd.merge(
                    left = time_series[field[f]],
                    right = read_csv(machine_data_save_path),
                    how = 'left',
                    on = ['time']
                )

                df.loc[:, ['time', 'Active power at grid side of frequency converter', 'Outside temperature', 'wind speed', 'wind direction']].to_csv(machine_data_save_path, float_format='%.7f', index=False, encoding='utf-8')

                print('done!')

    @property
    def _check_file(self):
        for f in range(field_len):
            for m in range(machine_len):
                file = os.path.join(self._root, field[f], machine[f][m]) + '.csv'
                if not os.path.exists(file):
                    return True
            file = os.path.join(self._root, field[f], 'weather.csv')
            if not os.path.exists(file):
                return True
        return False
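
Before moving on, the two conversion helpers deserve a quick sanity check. The snippet below is a minimal sketch run separately from main.py (the sample values are made up): it converts four directions given as fractions of a full turn to (x, y) with p2c and recovers them with c2p.

# Sanity check for p2c / c2p (illustrative values, not part of main.py)
import numpy as np
from main import p2c, c2p

# each row is (ρ, θ): unit radius, θ as a fraction of a full turn (0.25 -> 90°)
polar = np.array([[1.0, 0.00],
                  [1.0, 0.25],
                  [1.0, 0.50],
                  [1.0, 0.75]], dtype=np.float32)

xy = p2c(polar)     # -> [[1, 0], [0, 1], [-1, 0], [0, -1]] (up to rounding)
back = c2p(xy)      # recovers the original (ρ, θ) to 4 decimal places
print(xy)
print(back)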

2, Whole process

1. Preparation

from main import *
dm = Data_Manager()
dm.load_data_t

2. Model

2.1 Parameters

i_len = 120    # input length
o_len = 20     # output length
Len = i_len + o_len

indexes = np.arange(2102400 - 1 - Len)

m = np.ceil(np.log2(i_len)).astype(int)    # smallest m such that 2**m >= i_len
n_len = np.power(2, m)    # padded sequence length 2**m (not used by the seq2seq model below)
padding = n_len - i_len
batch_size = period_len

epochs = 5000 + 1
e = len(str(epochs))
lr_s = 1e-3
lr_e = 1e-7
lr_decay = (lr_s - lr_e) / epochs
hidden_size = 32
z = np.arange(o_len)
size = 10

y_err = (100 / 28 - 1) / 160
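
To make the random sampling in the training loops below concrete, here is a minimal sketch of how one sampled index expands into an (input, target) pair; the starting index 1000 is arbitrary and only for illustration:

# Illustrative only: how one sampled index becomes a 140-step window
start = 1000                              # an arbitrary position in the 2,102,400-step series
window = np.arange(start, start + Len)    # Len = i_len + o_len = 140 consecutive time steps
x_idx = window[:i_len]                    # first 120 steps -> encoder input
y_idx = window[i_len:]                    # last 20 steps   -> decoder target
print(x_idx.shape, y_idx.shape)           # (120,) (20,)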

2.2 Encoder

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        
        self.lstm_cell = nn.LSTMCell(input_size = input_size, hidden_size = hidden_size)
        
    def forward(self, x):
        
        hx = torch.zeros(x.size(1), hidden_size, dtype=torch.float32, device=device)
        cx = torch.zeros(x.size(1), hidden_size, dtype=torch.float32, device=device)
        
        for i in range(i_len):
            # (N, input_size), ((N, hidden_size), (N, hidden_size)) -> (N, hidden_size), (N, hidden_size)
            hx, cx = self.lstm_cell(x[i], (hx, cx))
        
        return hx, cx    # (N, hidden_size), (N, hidden_size)

2.3 Decoder

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        
        self.lstm_cell = nn.LSTMCell(input_size = input_size, hidden_size = hidden_size)
        self.l = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, output_size)
        )
        
    def forward(self, x, hx, cx):
        
        outputs = torch.zeros(o_len, x.size(0), output_size, dtype=torch.float32, device=device)
        
        for i in range(o_len):
            # (N, input_size), ((N, hidden_size), (N, hidden_size)) -> (N, hidden_size), (N, hidden_size)
            hx, cx = self.lstm_cell(x, (hx, cx))
            x = self.l(hx)    # (N, hidden_size) -> (N, output_size)
            outputs[i] = x    # (N, output_size)
        
        return outputs    # (o_len, N, output_size)

2.4 Seq2seq_Net

class Seq2seq_Net(nn.Module):
    def __init__(self):
        super().__init__()
        
        self.encoder = Encoder()
        self.decoder = Decoder()
        
    def forward(self, x):
        
        x = x.permute(1, 0, 2)    # (N, i_len, input_size) -> (i_len, N, input_size)
        
        hx, cx = self.encoder(x)    # (i_len, N, input_size) -> (N, hidden_size), (N, hidden_size)
        
        outputs = self.decoder(x[-1], hx, cx)    # (N, input_size), (N, hidden_size), (N, hidden_size) -> (o_len, N, output_size)
        
        outputs = outputs.permute(1, 0, 2)    # (o_len, N, output_size) -> (N, o_len, output_size)
        
        outputs = outputs.reshape(outputs.size(0) * outputs.size(1), outputs.size(2))    # (N, o_len, output_size) -> (N * o_len, output_size)
        
        return outputs    # (N * o_len, output_size)
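
A quick dummy forward pass confirms that the shapes behave as annotated in the comments. This is only a sketch: input_size and output_size are set to 1 here purely for the dummy tensor (the real values are assigned in section 3).

# Dummy forward pass to verify shapes (illustrative)
input_size = output_size = 1
net = Seq2seq_Net().to(device)
dummy = torch.zeros(4, i_len, input_size, device=device)    # (N, i_len, input_size) = (4, 120, 1)
print(net(dummy).shape)    # torch.Size([80, 1]) -> (N * o_len, output_size) = (4 * 20, 1)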

3. Training

3.1. For wind speed

input_size = output_size = dm.X_t_spd.shape[2]
model_spd = Seq2seq_Net().to(device)
optimizer_spd = torch.optim.Adam(model_spd.parameters(), lr=lr_s, weight_decay=0)
# train

pbar = trange(epochs)
for epoch in pbar:

    # ----------------------------------------------------------------------------------------------------
    # Extract data
    # ----------------------------------------------------------------------------------------------------
    
    i = np.array([np.arange(i, i + Len) for i in np.random.choice(indexes, batch_size, False)])    # (batch_size, Len)

    X = torch.tensor(dm.X_t_spd[:, i[:, :i_len].flatten(), :], dtype=torch.float32, device=device).reshape(field_len * machine_len * batch_size, i_len, input_size)
    Y = torch.tensor(dm.X_t_spd[:, i[:, i_len:].flatten(), :], dtype=torch.float32, device=device).reshape(field_len * machine_len * batch_size * o_len, input_size)

    # ----------------------------------------------------------------------------------------------------
    # train
    # ----------------------------------------------------------------------------------------------------

    y_pred = model_spd(X)
    y_true = Y

    loss = nn.PairwiseDistance()(y_pred, y_true).mean()

    optimizer_spd.zero_grad()
    loss.backward()
    optimizer_spd.step()

    optimizer_spd.param_groups[0]['lr'] -= lr_decay
    
    torch.cuda.empty_cache()
    
    # ----------------------------------------------------------------------------------------------------
    # visualization
    # ----------------------------------------------------------------------------------------------------

    if epoch % 500 == 0 and epoch > 0:

        y_pred = y_pred.reshape(field_len * machine_len * batch_size, o_len).detach().cpu().numpy()
        y_true = y_true.reshape(field_len * machine_len * batch_size, o_len).detach().cpu().numpy()

        fig, ax = plt.subplots(1, size, figsize=(14 * size, 14))

        for j in range(size):

            i = np.random.randint(0, field_len * machine_len * batch_size)

            ax[j].fill_between(z, y_true[i] - y_err, y_true[i] + y_err, alpha=0.2)
            ax[j].plot(z, y_pred[i], label='Prediction interval prediction value', color='blue')
            ax[j].plot(z, y_true[i], label='Real value of prediction interval', color='red')
            ax[j].legend()

        plt.show()
    
    # ----------------------------------------------------------------------------------------------------
    # Show progress
    # ----------------------------------------------------------------------------------------------------

    pbar.set_description(f'[Epoch: {epoch+1:>{e}}/{epochs:>{e}}] [lr: {optimizer_spd.state_dict()["param_groups"][0]["lr"]:.7f}] [loss {loss.item():.3f}]')


3.2. For wind direction

input_size = output_size = dm.X_t_dir.shape[2]
model_dir = Seq2seq_Net().to(device)
optimizer_dir = torch.optim.Adam(model_dir.parameters(), lr=lr_s, weight_decay=0)
# train

pbar = trange(epochs)
for epoch in pbar:

    # ----------------------------------------------------------------------------------------------------
    # Extract data
    # ----------------------------------------------------------------------------------------------------
    
    i = np.array([np.arange(i, i + Len) for i in np.random.choice(indexes, batch_size, False)])    # (batch_size, Len)

    X = torch.tensor(dm.X_t_dir[:, i[:, :i_len].flatten(), :], dtype=torch.float32, device=device).reshape(field_len * machine_len * batch_size, i_len, input_size)
    Y = torch.tensor(dm.X_t_dir[:, i[:, i_len:].flatten(), :], dtype=torch.float32, device=device).reshape(field_len * machine_len * batch_size * o_len, input_size)

    # ----------------------------------------------------------------------------------------------------
    # train
    # ----------------------------------------------------------------------------------------------------

    y_pred = model_dir(X)
    y_true = Y

    loss = nn.PairwiseDistance()(y_pred, y_true).mean()

    optimizer_dir.zero_grad()
    loss.backward()
    optimizer_dir.step()

    optimizer_dir.param_groups[0]['lr'] -= lr_decay
    
    torch.cuda.empty_cache()
    
    # ----------------------------------------------------------------------------------------------------
    # visualization
    # ----------------------------------------------------------------------------------------------------

    if epoch % 500 == 0 and epoch > 0:

        y_pred = c2p(y_pred.detach().cpu().numpy()).T[1].reshape(field_len * machine_len * batch_size, o_len)
        y_true = c2p(y_true.detach().cpu().numpy()).T[1].reshape(field_len * machine_len * batch_size, o_len)

        fig, ax = plt.subplots(1, size, figsize=(14 * size, 14))

        for j in range(size):

            i = np.random.randint(0, field_len * machine_len * batch_size)

            ax[j].fill_between(z, y_true[i] - y_err, y_true[i] + y_err, alpha=0.2)
            ax[j].plot(z, y_pred[i], label='Prediction interval prediction value', color='blue')
            ax[j].plot(z, y_true[i], label='Real value of prediction interval', color='red')
            ax[j].legend()

        plt.show()
    
    # ----------------------------------------------------------------------------------------------------
    # Show progress
    # ----------------------------------------------------------------------------------------------------

    pbar.set_description(f'[Epoch: {epoch+1:>{e}}/{epochs:>{e}}] [lr: {optimizer_dir.state_dict()["param_groups"][0]["lr"]:.7f}] [loss {loss.item():.3f}]')
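
A brief note on the design choice in this subsection: the direction is regressed as Cartesian (x, y) components rather than as the raw angle θ, because θ wraps around at 1.0 (i.e. 360°). A plain distance on θ would heavily penalise a prediction of 0.99 against a target of 0.01 even though the two directions are nearly identical, while the distance in (x, y) space stays small. A hedged illustration with made-up values:

# Why direction is trained as (x, y) instead of θ (illustrative values)
true_dir = np.array([[1.0, 0.01]])    # (ρ, θ): unit radius, θ as a fraction of a full turn
pred_dir = np.array([[1.0, 0.99]])    # almost the same direction, on the other side of the wrap-around
print(abs(true_dir[0, 1] - pred_dir[0, 1]))            # 0.98 -> looks like a huge error on θ
print(np.linalg.norm(p2c(true_dir) - p2c(pred_dir)))   # ≈ 0.13 -> small error in (x, y) space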

4. Test

dm.load_data_p

4.1. Generate answers

test_pred_df = dm.generate_sample(test_findals=False)

with torch.no_grad():
    
    X_spd = torch.tensor(dm.X_p_spd, dtype=torch.float32, device=device)[:, -i_len:, :]
    y_spd = model_spd(X_spd).cpu().numpy()[:, 0]
    
    X_dir = torch.tensor(dm.X_p_dir, dtype=torch.float32, device=device)[:, -i_len:, :]
    y_dir = c2p(model_dir(X_dir).cpu().numpy()).T[1]

test_pred_df.loc[:, ['wind speed', 'wind direction']] = np.vstack([y_spd, y_dir]).T

test_pred_df.fillna(0).to_csv('test_pred.csv', float_format='%.4f', index=False, encoding='utf-8')
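
Optionally, a small consistency check: the model outputs and the generated sample both follow the same (Wind field, Fan) → time interval → time ordering, so their lengths must agree for the values written above to line up row by row:

# Optional consistency check between model outputs and the sample rows
assert y_spd.shape[0] == y_dir.shape[0] == len(test_pred_df)    # 2 * 25 * 80 * 20 = 80000 rows
print(len(test_pred_df), 'rows written to test_pred.csv')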

4.2. Test data visualization

test_pred_df = read_csv('test_pred.csv')
f = np.random.randint(0, field_len)
p = np.random.randint(0, period_len)
m = np.random.randint(0, machine_len)

print('Wind field', field[f])
print('time interval', period[0][p])
print('Fan', machine[f][m])
z_1 = np.arange(120)
z_2 = np.arange(120, 140)

spd_test, dir_test = dm.X_p[f, m, p].T
spd_pred, dir_pred = test_pred_df.loc[(test_pred_df['Wind field'] == field[f]) & (test_pred_df['Fan'] == machine[f][m]) & (test_pred_df['time interval'] == period[0][p]), ['wind speed', 'wind direction']].values.T

fig, ax = plt.subplots(2, 2, figsize=(10 * 2, 10))

ax[0, 0].set_xlabel('time')
ax[0, 0].set_ylabel('wind speed')
ax[0, 0].plot(z_1, spd_test, label='Test interval', color='darkred')
ax[0, 0].plot(z_2, spd_pred, label='Prediction interval prediction value', color='blue')
ax[0, 0].legend()

ax[0, 1].set_xlabel('time')
ax[0, 1].set_ylabel('wind direction')
ax[0, 1].plot(z_1, dir_test, label='Test interval', color='darkred')
ax[0, 1].plot(z_2, dir_pred, label='Prediction interval prediction value', color='blue')
ax[0, 1].legend()

ax[1, 0].set_xlabel('time')
ax[1, 0].set_ylabel('wind speed')
ax[1, 0].scatter(z_1, spd_test, label='Test interval', color='darkred')
ax[1, 0].scatter(z_2, spd_pred, label='Prediction interval prediction value', color='blue')
ax[1, 0].legend()

ax[1, 1].set_xlabel('time')
ax[1, 1].set_ylabel('wind direction')
ax[1, 1].scatter(z_1, dir_test, label='Test interval', color='darkred')
ax[1, 1].scatter(z_2, dir_pred, label='Prediction interval prediction value', color='blue')
ax[1, 1].legend()
z_1_diff1 = np.arange(119)
z_2_diff1 = np.arange(119, 138)

spd_test_diff1 = spd_test[1:] - spd_test[:-1]
spd_pred_diff1 = spd_pred[1:] - spd_pred[:-1]

dir_test_diff1 = dir_test[1:] - dir_test[:-1]
dir_pred_diff1 = dir_pred[1:] - dir_pred[:-1]

fig, ax = plt.subplots(2, 2, figsize=(10 * 2, 10))

ax[0, 0].set_xlabel('time')
ax[0, 0].set_ylabel('wind speed')
ax[0, 0].plot(z_1_diff1, spd_test_diff1, label='First order difference test interval', color='darkred')
ax[0, 0].plot(z_2_diff1, spd_pred_diff1, label='First order difference prediction interval prediction value', color='blue')
ax[0, 0].legend()

ax[0, 1].set_xlabel('time')
ax[0, 1].set_ylabel('wind direction')
ax[0, 1].plot(z_1_diff1, dir_test_diff1, label='First order difference test interval', color='darkred')
ax[0, 1].plot(z_2_diff1, dir_pred_diff1, label='First order difference prediction interval prediction value', color='blue')
ax[0, 1].legend()

ax[1, 0].set_xlabel('time')
ax[1, 0].set_ylabel('wind speed')
ax[1, 0].scatter(z_1_diff1, spd_test_diff1, label='First order difference test interval', color='darkred')
ax[1, 0].scatter(z_2_diff1, spd_pred_diff1, label='First order difference prediction interval prediction value', color='blue')
ax[1, 0].legend()

ax[1, 1].set_xlabel('time')
ax[1, 1].set_ylabel('wind direction')
ax[1, 1].scatter(z_1_diff1, dir_test_diff1, label='First order difference test interval', color='darkred')
ax[1, 1].scatter(z_2_diff1, dir_pred_diff1, label='First order difference prediction interval prediction value', color='blue')
ax[1, 1].legend()
z_1_diff2 = np.arange(118)
z_2_diff2 = np.arange(118, 136)

spd_test_diff2 = spd_test_diff1[1:] - spd_test_diff1[:-1]
spd_pred_diff2 = spd_pred_diff1[1:] - spd_pred_diff1[:-1]

dir_test_diff2 = dir_test_diff1[1:] - dir_test_diff1[:-1]
dir_pred_diff2 = dir_pred_diff1[1:] - dir_pred_diff1[:-1]

fig, ax = plt.subplots(2, 2, figsize=(10 * 2, 10))

ax[0, 0].set_xlabel('time')
ax[0, 0].set_ylabel('wind speed')
ax[0, 0].plot(z_1_diff2, spd_test_diff2, label='Second order difference test interval', color='darkred')
ax[0, 0].plot(z_2_diff2, spd_pred_diff2, label='Second order difference prediction interval prediction value', color='blue')
ax[0, 0].legend()

ax[0, 1].set_xlabel('time')
ax[0, 1].set_ylabel('wind direction')
ax[0, 1].plot(z_1_diff2, dir_test_diff2, label='Second order difference test interval', color='darkred')
ax[0, 1].plot(z_2_diff2, dir_pred_diff2, label='Second order difference prediction interval prediction value', color='blue')
ax[0, 1].legend()

ax[1, 0].set_xlabel('time')
ax[1, 0].set_ylabel('wind speed')
ax[1, 0].scatter(z_1_diff2, spd_test_diff2, label='Second order difference test interval', color='darkred')
ax[1, 0].scatter(z_2_diff2, spd_pred_diff2, label='Second order difference prediction interval prediction value', color='blue')
ax[1, 0].legend()

ax[1, 1].set_xlabel('time')
ax[1, 1].set_ylabel('wind direction')
ax[1, 1].scatter(z_1_diff2, dir_test_diff2, label='Second order difference test interval', color='darkred')
ax[1, 1].scatter(z_2_diff2, dir_pred_diff2, label='Second order difference prediction interval prediction value', color='blue')
ax[1, 1].legend()

3, Conclusion

This baseline is for reference only; using it to chase leaderboard points is pointless.
