Week R3: Weather Prediction

  • This post is a study-log entry for the 365-day deep-learning training camp
  • Original author: K同学啊

Task description: the dataset provides roughly 10 years of daily weather observations from many locations across Australia. The goal is to predict RainTomorrow from these observations. Unlike previous assignments, this one adds an exploratory data analysis (EDA) stage, which I hope will be helpful.

My environment:
● Language: Python 3.8
● Editor: Jupyter Lab
● Deep-learning framework: TensorFlow (CPU)
● Data: the weatherAUS dataset

Part 1: Importing the Data

import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import warnings
warnings.filterwarnings('ignore')

from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation,Dropout
from tensorflow.keras.callbacks import EarlyStopping
from sklearn.metrics import classification_report,confusion_matrix
from sklearn.metrics import r2_score
from sklearn.metrics import mean_absolute_error , mean_absolute_percentage_error , mean_squared_error
data = pd.read_csv("./R3/weatherAUS.csv")
df   = data.copy()
data.head()

Output:

         Date Location  MinTemp  MaxTemp  Rainfall  Evaporation  Sunshine WindGustDir  WindGustSpeed WindDir9am  ...  Humidity9am  Humidity3pm  Pressure9am  Pressure3pm  Cloud9am  Cloud3pm  Temp9am  Temp3pm RainToday RainTomorrow
0  2008-12-01   Albury     13.4     22.9       0.6          NaN       NaN           W           44.0          W  ...         71.0         22.0       1007.7       1007.1       8.0       NaN     16.9     21.8        No           No
1  2008-12-02   Albury      7.4     25.1       0.0          NaN       NaN         WNW           44.0        NNW  ...         44.0         25.0       1010.6       1007.8       NaN       NaN     17.2     24.3        No           No
2  2008-12-03   Albury     12.9     25.7       0.0          NaN       NaN         WSW           46.0          W  ...         38.0         30.0       1007.6       1008.7       NaN       2.0     21.0     23.2        No           No
3  2008-12-04   Albury      9.2     28.0       0.0          NaN       NaN          NE           24.0         SE  ...         45.0         16.0       1017.6       1012.8       NaN       NaN     18.1     26.5        No           No
4  2008-12-05   Albury     17.5     32.3       1.0          NaN       NaN           W           41.0        ENE  ...         82.0         33.0       1010.8       1006.0       7.0       8.0     17.8     29.7        No           No

5 rows × 23 columns

data.describe()

Output:

             MinTemp        MaxTemp       Rainfall    Evaporation       Sunshine  WindGustSpeed   WindSpeed9am   WindSpeed3pm
count  143975.000000  144199.000000  142199.000000   82670.000000   75625.000000  135197.000000  143693.000000  142398.000000
mean       12.194034      23.221348       2.360918       5.468232       7.611178      40.035230      14.043426      18.662657
std         6.398495       7.119049       8.478060       4.193704       3.785483      13.607062       8.915375       8.809800
min        -8.500000      -4.800000       0.000000       0.000000       0.000000       6.000000       0.000000       0.000000
25%         7.600000      17.900000       0.000000       2.600000       4.800000      31.000000       7.000000      13.000000
50%        12.000000      22.600000       0.000000       4.800000       8.400000      39.000000      13.000000      19.000000
75%        16.900000      28.200000       0.800000       7.400000      10.600000      48.000000      19.000000      24.000000
max        33.900000      48.100000     371.000000     145.000000      14.500000     135.000000     130.000000      87.000000

         Humidity9am    Humidity3pm    Pressure9am    Pressure3pm       Cloud9am       Cloud3pm        Temp9am        Temp3pm
count  142806.000000  140953.000000  130395.000000  130432.000000   89572.000000   86102.000000  143693.000000  141851.000000
mean       68.880831      51.539116    1017.649940    1015.255889       4.447461       4.509930      16.990631      21.683390
std        19.029164      20.795902       7.106530       7.037414       2.887159       2.720357       6.488753       6.936650
min         0.000000       0.000000     980.500000     977.100000       0.000000       0.000000      -7.200000      -5.400000
25%        57.000000      37.000000    1012.900000    1010.400000       1.000000       2.000000      12.300000      16.600000
50%        70.000000      52.000000    1017.600000    1015.200000       5.000000       5.000000      16.700000      21.100000
75%        83.000000      66.000000    1022.400000    1020.000000       7.000000       7.000000      21.600000      26.400000
max       100.000000     100.000000    1041.000000    1039.600000       9.000000       9.000000      40.200000      46.700000
data.dtypes

Output:

Date              object
Location          object
MinTemp          float64
MaxTemp          float64
Rainfall         float64
Evaporation      float64
Sunshine         float64
WindGustDir       object
WindGustSpeed    float64
WindDir9am        object
WindDir3pm        object
WindSpeed9am     float64
WindSpeed3pm     float64
Humidity9am      float64
Humidity3pm      float64
Pressure9am      float64
Pressure3pm      float64
Cloud9am         float64
Cloud3pm         float64
Temp9am          float64
Temp3pm          float64
RainToday         object
RainTomorrow      object
dtype: object
data['Date'] = pd.to_datetime(data['Date'])
data['Date']

Output:

0        2008-12-01
1        2008-12-02
2        2008-12-03
3        2008-12-04
4        2008-12-05
            ...    
145455   2017-06-21
145456   2017-06-22
145457   2017-06-23
145458   2017-06-24
145459   2017-06-25
Name: Date, Length: 145460, dtype: datetime64[ns]
data['year']  = data['Date'].dt.year
data['Month'] = data['Date'].dt.month
data['day']   = data['Date'].dt.day
data.head()

Output:

         Date Location  MinTemp  MaxTemp  Rainfall  Evaporation  Sunshine WindGustDir  WindGustSpeed WindDir9am  ...  Pressure3pm  Cloud9am  Cloud3pm  Temp9am  Temp3pm RainToday RainTomorrow  year  Month  day
0  2008-12-01   Albury     13.4     22.9       0.6          NaN       NaN           W           44.0          W  ...       1007.1       8.0       NaN     16.9     21.8        No           No  2008     12    1
1  2008-12-02   Albury      7.4     25.1       0.0          NaN       NaN         WNW           44.0        NNW  ...       1007.8       NaN       NaN     17.2     24.3        No           No  2008     12    2
2  2008-12-03   Albury     12.9     25.7       0.0          NaN       NaN         WSW           46.0          W  ...       1008.7       NaN       2.0     21.0     23.2        No           No  2008     12    3
3  2008-12-04   Albury      9.2     28.0       0.0          NaN       NaN          NE           24.0         SE  ...       1012.8       NaN       NaN     18.1     26.5        No           No  2008     12    4
4  2008-12-05   Albury     17.5     32.3       1.0          NaN       NaN           W           41.0        ENE  ...       1006.0       7.0       8.0     17.8     29.7        No           No  2008     12    5

5 rows × 26 columns

data.drop('Date',axis=1,inplace=True)
data.columns

Output:

Index(['Location', 'MinTemp', 'MaxTemp', 'Rainfall', 'Evaporation', 'Sunshine',
       'WindGustDir', 'WindGustSpeed', 'WindDir9am', 'WindDir3pm',
       'WindSpeed9am', 'WindSpeed3pm', 'Humidity9am', 'Humidity3pm',
       'Pressure9am', 'Pressure3pm', 'Cloud9am', 'Cloud3pm', 'Temp9am',
       'Temp3pm', 'RainToday', 'RainTomorrow', 'year', 'Month', 'day'],
      dtype='object')

Part 2: Exploratory Data Analysis (EDA)

  1. Correlation analysis
plt.figure(figsize=(15,13))
# data.corr() gives the pairwise correlation between the variables in data
ax = sns.heatmap(data.corr(), square=True, annot=True, fmt='.2f')
ax.set_xticklabels(ax.get_xticklabels(), rotation=90)
plt.show()

Output:

(figure: correlation heatmap of the numeric features)
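A compatibility note: data still contains object columns here (Location, the wind directions, RainToday, RainTomorrow). Older pandas silently drops them in corr(), but pandas 2.x raises a TypeError instead. A minimal sketch of the numeric_only workaround, on a toy frame rather than the weather data:

```python
import pandas as pd

# toy frame mixing numeric and object columns, like the weather DataFrame
df = pd.DataFrame({"a": [1, 2, 3], "b": [2, 4, 6], "c": ["x", "y", "z"]})

# numeric_only=True makes corr() skip the object column explicitly,
# which works on both older and newer pandas versions
corr = df.corr(numeric_only=True)
print(corr.loc["a", "b"])  # the two toy columns are perfectly correlated: 1.0
```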

  2. Will it rain tomorrow?
sns.set(style="darkgrid")
plt.figure(figsize=(4,3))
sns.countplot(x='RainTomorrow',data=data)

Output:

<matplotlib.axes._subplots.AxesSubplot at 0x1afec66fcc0>

(figure: count plot of RainTomorrow)

plt.figure(figsize=(4,3))
sns.countplot(x='RainToday',data=data)

Output:

<matplotlib.axes._subplots.AxesSubplot at 0x1afed091208>

(figure: count plot of RainToday)

x=pd.crosstab(data['RainTomorrow'],data['RainToday'])
x

Output:

RainToday        No    Yes
RainTomorrow
No            92728  16858
Yes           16604  14597
y=x/x.transpose().sum().values.reshape(2,1)*100
y

Output:

RainToday           No        Yes
RainTomorrow
No           84.616648  15.383352
Yes          53.216243  46.783757

●Of the days that were not followed by rain, 84.62% had no rain that day
●Of the days that were followed by rain, 53.22% had no rain that day and 46.78% did

y.plot(kind="bar",figsize=(4,3),color=['#006666','#d279a6']);

(figure: bar chart of the percentages above)
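The manual row normalization done above with reshape can also be delegated to pd.crosstab itself. A small sketch on toy data (the column names mirror the weather frame; the values are made up):

```python
import pandas as pd

# toy rain log standing in for data[['RainToday', 'RainTomorrow']]
df = pd.DataFrame({
    "RainToday":    ["No", "No", "Yes", "No", "Yes", "No"],
    "RainTomorrow": ["No", "Yes", "Yes", "No", "No", "No"],
})

# normalize='index' divides each row by its row total, i.e. the same thing
# as x / x.transpose().sum().values.reshape(2, 1) done by hand
pct = pd.crosstab(df["RainTomorrow"], df["RainToday"], normalize="index") * 100
print(pct)
```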

  3. Location vs. rain
x = pd.crosstab(data['Location'], data['RainToday'])
# percentage of rainy vs. non-rainy days for each city
y = x / x.transpose().sum().values.reshape((-1, 1)) * 100
# sort by each city's share of rainy days
y = y.sort_values(by='Yes', ascending=True)
color = ['#cc6699','#006699','#006666','#862d86','#ff9966']
y.Yes.plot(kind="barh", figsize=(15,20), color=color)

Output:

<matplotlib.axes._subplots.AxesSubplot at 0x1afe642bf28>

(figure: horizontal bar chart of rainy-day percentage by city)

Location clearly affects rainfall: Portland has rain about 36% of the time, while Woomera has rain only about 6% of the time.

  4. Humidity and pressure vs. rain
data.columns

Output:

Index(['Location', 'MinTemp', 'MaxTemp', 'Rainfall', 'Evaporation', 'Sunshine',
       'WindGustDir', 'WindGustSpeed', 'WindDir9am', 'WindDir3pm',
       'WindSpeed9am', 'WindSpeed3pm', 'Humidity9am', 'Humidity3pm',
       'Pressure9am', 'Pressure3pm', 'Cloud9am', 'Cloud3pm', 'Temp9am',
       'Temp3pm', 'RainToday', 'RainTomorrow', 'year', 'Month', 'day'],
      dtype='object')
plt.figure(figsize=(8,6))
sns.scatterplot(data=data,x='Pressure9am',y='Pressure3pm',hue='RainTomorrow');

(figure: Pressure9am vs. Pressure3pm scatter, colored by RainTomorrow)

plt.figure(figsize=(8,6))
sns.scatterplot(data=data,x='Humidity9am',y='Humidity3pm',hue='RainTomorrow');

Output:

(figure: Humidity9am vs. Humidity3pm scatter, colored by RainTomorrow)

Low pressure combined with high humidity increases the probability of rain the next day, especially the 3 pm humidity.
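The same humidity signal can be checked numerically with a group mean rather than a scatter plot. A toy sketch (made-up readings, same column names as the weather frame):

```python
import pandas as pd

# made-up Humidity3pm readings; in the post this would be run on `data`
df = pd.DataFrame({
    "Humidity3pm":  [30, 40, 35, 85, 90, 80],
    "RainTomorrow": ["No", "No", "No", "Yes", "Yes", "Yes"],
})

# average 3 pm humidity per outcome: rainy-next-day rows sit much higher
means = df.groupby("RainTomorrow")["Humidity3pm"].mean()
print(means["No"], means["Yes"])  # 35.0 85.0
```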

  5. Temperature vs. rain
plt.figure(figsize=(8,6))
sns.scatterplot(x='MaxTemp', y='MinTemp', data=data, hue='RainTomorrow');

Output:

(figure: MaxTemp vs. MinTemp scatter, colored by RainTomorrow)

Conclusion: when a day's maximum and minimum temperatures are close together, the chance of rain the next day increases.

Part 3: Data Preprocessing

  1. Handling missing values
# percentage of missing data in each column
data.isnull().sum()/data.shape[0]*100

Output:

Location          0.000000
MinTemp           1.020899
MaxTemp           0.866905
Rainfall          2.241853
Evaporation      43.166506
Sunshine         48.009762
WindGustDir       7.098859
WindGustSpeed     7.055548
WindDir9am        7.263853
WindDir3pm        2.906641
WindSpeed9am      1.214767
WindSpeed3pm      2.105046
Humidity9am       1.824557
Humidity3pm       3.098446
Pressure9am      10.356799
Pressure3pm      10.331363
Cloud9am         38.421559
Cloud3pm         40.807095
Temp9am           1.214767
Temp3pm           2.481094
RainToday         2.241853
RainTomorrow      2.245978
year              0.000000
Month             0.000000
day               0.000000
dtype: float64
# fill each column with values sampled at random from that same column
lst = ['Evaporation', 'Sunshine', 'Cloud9am', 'Cloud3pm']
for col in lst:
    fill_list = data[col].dropna()
    data[col] = data[col].fillna(pd.Series(np.random.choice(fill_list, size=len(data.index))))
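One caveat with this random fill: np.random.choice is not seeded, so every run imputes different values. A self-contained sketch on a toy Series, with a seed added for repeatability (the seed is my addition, not part of the original loop):

```python
import numpy as np
import pandas as pd

np.random.seed(0)  # added for reproducibility; the original loop is unseeded

s = pd.Series([1.0, np.nan, 3.0, np.nan, 5.0])
fill_pool = s.dropna()  # same role as fill_list = data[col].dropna()

# sample one candidate per row, then let fillna use them only where s is NaN;
# passing index=s.index keeps the alignment explicit
filled = s.fillna(pd.Series(np.random.choice(fill_pool, size=len(s)), index=s.index))
print(filled.isna().sum())  # 0 -- every gap is filled with an observed value
```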
s = (data.dtypes == "object")
object_cols = list(s[s].index)
object_cols

Output:

['Location', 'WindGustDir', 'WindDir9am', 'WindDir3pm', 'RainToday', 'RainTomorrow']
# inplace=True: modify the original object directly, without creating a copy
# data[i].mode()[0] returns the most frequent value (the mode)
for i in object_cols:
    data[i].fillna(data[i].mode()[0], inplace=True)
t = (data.dtypes == "float64")
num_cols = list(t[t].index)
num_cols

Output:

['MinTemp', 'MaxTemp', 'Rainfall', 'Evaporation', 'Sunshine', 'WindGustSpeed', 'WindSpeed9am', 'WindSpeed3pm', 'Humidity9am', 'Humidity3pm', 'Pressure9am', 'Pressure3pm', 'Cloud9am', 'Cloud3pm', 'Temp9am', 'Temp3pm']
# .median(): fill with the column median
for i in num_cols:
    data[i].fillna(data[i].median(), inplace=True)
data.isnull().sum()

Output:

Location         0
MinTemp          0
MaxTemp          0
Rainfall         0
Evaporation      0
Sunshine         0
WindGustDir      0
WindGustSpeed    0
WindDir9am       0
WindDir3pm       0
WindSpeed9am     0
WindSpeed3pm     0
Humidity9am      0
Humidity3pm      0
Pressure9am      0
Pressure3pm      0
Cloud9am         0
Cloud3pm         0
Temp9am          0
Temp3pm          0
RainToday        0
RainTomorrow     0
year             0
Month            0
day              0
dtype: int64
  2. Building the dataset
from sklearn.preprocessing import LabelEncoder

label_encoder = LabelEncoder()
for i in object_cols:
    data[i] = label_encoder.fit_transform(data[i])
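Reusing one LabelEncoder across columns works here only because each column is transformed immediately after fitting; after the loop the encoder retains just the last column's classes. A minimal sketch of what fit_transform does to a single column:

```python
from sklearn.preprocessing import LabelEncoder

le = LabelEncoder()
# classes are sorted alphabetically, so 'No' -> 0 and 'Yes' -> 1
codes = le.fit_transform(["No", "Yes", "No", "Yes"])
print(list(codes))        # [0, 1, 0, 1]
print(list(le.classes_))  # ['No', 'Yes']
```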
X = data.drop(['RainTomorrow','day'],axis=1).values
y = data['RainTomorrow'].values
X_train, X_test, y_train, y_test = train_test_split(X,y,test_size=0.25,random_state=101)
scaler = MinMaxScaler()
scaler.fit(X_train)
X_train = scaler.transform(X_train)
X_test  = scaler.transform(X_test)
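Note that the scaler is fit on the training split only and then applied to both splits, which avoids leaking test-set statistics into the scaling. A tiny sketch of the consequence, with toy numbers:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X_train_toy = np.array([[0.0], [10.0]])   # training range is [0, 10]
X_test_toy  = np.array([[5.0], [20.0]])

scaler = MinMaxScaler().fit(X_train_toy)  # min/max come from training data only
scaled = scaler.transform(X_test_toy).ravel()
print(scaled)  # test values outside the training range can exceed 1
```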

Part 4: Predicting Rain

  1. Building the network
from tensorflow.keras.optimizers import Adam

model = Sequential()
model.add(Dense(units=24, activation='tanh'))
model.add(Dense(units=18, activation='tanh'))
model.add(Dense(units=23, activation='tanh'))
model.add(Dropout(0.5))
model.add(Dense(units=12, activation='tanh'))
model.add(Dropout(0.2))
model.add(Dense(units=1, activation='sigmoid'))

optimizer = tf.keras.optimizers.Adam(learning_rate=1e-4)
model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics="accuracy")
early_stop = EarlyStopping(monitor='val_loss', mode='min', min_delta=0.001,
                           verbose=1, patience=25, restore_best_weights=True)
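In plain Python, the EarlyStopping callback configured above behaves roughly like the loop below: training stops after `patience` epochs without an improvement of at least `min_delta` in val_loss, and restore_best_weights rolls the model back to the best epoch. This is a simplified sketch of the logic, not Keras's actual implementation, and the val_loss values are hypothetical:

```python
def best_epoch(val_losses, patience=3, min_delta=0.001):
    """Return the index of the epoch whose weights would be restored."""
    best, best_i, wait = float("inf"), 0, 0
    for i, v in enumerate(val_losses):
        if v < best - min_delta:          # genuine improvement
            best, best_i, wait = v, i, 0
        else:                             # no (sufficient) improvement this epoch
            wait += 1
            if wait >= patience:          # patience exhausted: stop training
                break
    return best_i                         # restore_best_weights=True -> use this epoch

print(best_epoch([0.50, 0.40, 0.39, 0.41, 0.42, 0.43]))  # 2
```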
  2. Training the model
model.fit(x=X_train, y=y_train,
          validation_data=(X_test, y_test),
          verbose=1, callbacks=[early_stop],
          epochs=100, batch_size=32)

Output:

Epoch 1/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.5179 - accuracy: 0.7616 - val_loss: 0.3891 - val_accuracy: 0.8306
Epoch 2/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.4012 - accuracy: 0.8307 - val_loss: 0.3778 - val_accuracy: 0.8360
Epoch 3/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3923 - accuracy: 0.8343 - val_loss: 0.3749 - val_accuracy: 0.8371
Epoch 4/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3856 - accuracy: 0.8366 - val_loss: 0.3735 - val_accuracy: 0.8384
Epoch 5/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3855 - accuracy: 0.8371 - val_loss: 0.3721 - val_accuracy: 0.8388
Epoch 6/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3835 - accuracy: 0.8366 - val_loss: 0.3714 - val_accuracy: 0.8393
Epoch 7/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3808 - accuracy: 0.8387 - val_loss: 0.3706 - val_accuracy: 0.8389
Epoch 8/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3790 - accuracy: 0.8373 - val_loss: 0.3698 - val_accuracy: 0.8400
Epoch 9/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3818 - accuracy: 0.8368 - val_loss: 0.3695 - val_accuracy: 0.8397
Epoch 10/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3784 - accuracy: 0.8383 - val_loss: 0.3691 - val_accuracy: 0.8398
Epoch 11/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3775 - accuracy: 0.8394 - val_loss: 0.3688 - val_accuracy: 0.8402
Epoch 12/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3798 - accuracy: 0.8370 - val_loss: 0.3687 - val_accuracy: 0.8399
Epoch 13/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3767 - accuracy: 0.8389 - val_loss: 0.3684 - val_accuracy: 0.8401
Epoch 14/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3751 - accuracy: 0.8397 - val_loss: 0.3690 - val_accuracy: 0.8398
Epoch 15/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3736 - accuracy: 0.8415 - val_loss: 0.3682 - val_accuracy: 0.8404
Epoch 16/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3743 - accuracy: 0.8410 - val_loss: 0.3678 - val_accuracy: 0.8409
Epoch 17/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3752 - accuracy: 0.8406 - val_loss: 0.3680 - val_accuracy: 0.8409
Epoch 18/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3777 - accuracy: 0.8382 - val_loss: 0.3675 - val_accuracy: 0.8404
Epoch 19/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3735 - accuracy: 0.8394 - val_loss: 0.3673 - val_accuracy: 0.8409
Epoch 20/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3751 - accuracy: 0.8389 - val_loss: 0.3673 - val_accuracy: 0.8407
Epoch 21/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3715 - accuracy: 0.8394 - val_loss: 0.3670 - val_accuracy: 0.8408
Epoch 22/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3758 - accuracy: 0.8391 - val_loss: 0.3667 - val_accuracy: 0.8406
Epoch 23/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3715 - accuracy: 0.8411 - val_loss: 0.3668 - val_accuracy: 0.8406
Epoch 24/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3744 - accuracy: 0.8405 - val_loss: 0.3686 - val_accuracy: 0.8401
Epoch 25/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3754 - accuracy: 0.8393 - val_loss: 0.3666 - val_accuracy: 0.8414
Epoch 26/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3720 - accuracy: 0.8415 - val_loss: 0.3658 - val_accuracy: 0.8412
Epoch 27/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3694 - accuracy: 0.8413 - val_loss: 0.3656 - val_accuracy: 0.8417
Epoch 28/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3733 - accuracy: 0.8411 - val_loss: 0.3676 - val_accuracy: 0.8390
Epoch 29/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3715 - accuracy: 0.8413 - val_loss: 0.3655 - val_accuracy: 0.8417
Epoch 30/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3707 - accuracy: 0.8417 - val_loss: 0.3655 - val_accuracy: 0.8420
Epoch 31/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3754 - accuracy: 0.8393 - val_loss: 0.3653 - val_accuracy: 0.8419
Epoch 32/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3694 - accuracy: 0.8423 - val_loss: 0.3651 - val_accuracy: 0.8416
Epoch 33/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3717 - accuracy: 0.8422 - val_loss: 0.3657 - val_accuracy: 0.8411
Epoch 34/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3743 - accuracy: 0.8399 - val_loss: 0.3648 - val_accuracy: 0.8417
Epoch 35/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3690 - accuracy: 0.8426 - val_loss: 0.3647 - val_accuracy: 0.8416
Epoch 36/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3723 - accuracy: 0.8408 - val_loss: 0.3645 - val_accuracy: 0.8423
Epoch 37/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3715 - accuracy: 0.8414 - val_loss: 0.3649 - val_accuracy: 0.8402
Epoch 38/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3680 - accuracy: 0.8429 - val_loss: 0.3644 - val_accuracy: 0.8421
Epoch 39/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3711 - accuracy: 0.8416 - val_loss: 0.3664 - val_accuracy: 0.8411
Epoch 40/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3726 - accuracy: 0.8395 - val_loss: 0.3641 - val_accuracy: 0.8415
Epoch 41/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3695 - accuracy: 0.8425 - val_loss: 0.3643 - val_accuracy: 0.8411
Epoch 42/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3698 - accuracy: 0.8429 - val_loss: 0.3637 - val_accuracy: 0.8416
Epoch 43/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3666 - accuracy: 0.8425 - val_loss: 0.3632 - val_accuracy: 0.8414
Epoch 44/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3659 - accuracy: 0.8426 - val_loss: 0.3637 - val_accuracy: 0.8416
Epoch 45/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3696 - accuracy: 0.8434 - val_loss: 0.3635 - val_accuracy: 0.8411
Epoch 46/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3660 - accuracy: 0.8436 - val_loss: 0.3628 - val_accuracy: 0.8419
Epoch 47/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3728 - accuracy: 0.8401 - val_loss: 0.3659 - val_accuracy: 0.8418
Epoch 48/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3696 - accuracy: 0.8408 - val_loss: 0.3628 - val_accuracy: 0.8420
Epoch 49/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3674 - accuracy: 0.8426 - val_loss: 0.3629 - val_accuracy: 0.8409
Epoch 50/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3654 - accuracy: 0.8439 - val_loss: 0.3663 - val_accuracy: 0.8414
Epoch 51/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3695 - accuracy: 0.8421 - val_loss: 0.3624 - val_accuracy: 0.8422
Epoch 52/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3671 - accuracy: 0.8424 - val_loss: 0.3621 - val_accuracy: 0.8422
Epoch 53/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3710 - accuracy: 0.8410 - val_loss: 0.3621 - val_accuracy: 0.8419
Epoch 54/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3686 - accuracy: 0.8418 - val_loss: 0.3621 - val_accuracy: 0.8421
Epoch 55/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3673 - accuracy: 0.8429 - val_loss: 0.3622 - val_accuracy: 0.8419
Epoch 56/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3670 - accuracy: 0.8416 - val_loss: 0.3627 - val_accuracy: 0.8406
Epoch 57/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3665 - accuracy: 0.8413 - val_loss: 0.3620 - val_accuracy: 0.8412
Epoch 58/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3680 - accuracy: 0.8428 - val_loss: 0.3616 - val_accuracy: 0.8420
Epoch 59/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3679 - accuracy: 0.8413 - val_loss: 0.3626 - val_accuracy: 0.8400
Epoch 60/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3660 - accuracy: 0.8427 - val_loss: 0.3613 - val_accuracy: 0.8428
Epoch 61/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3659 - accuracy: 0.8431 - val_loss: 0.3612 - val_accuracy: 0.8421
Epoch 62/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3681 - accuracy: 0.8411 - val_loss: 0.3610 - val_accuracy: 0.8417
Epoch 63/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3661 - accuracy: 0.8434 - val_loss: 0.3609 - val_accuracy: 0.8424
Epoch 64/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3606 - accuracy: 0.8462 - val_loss: 0.3615 - val_accuracy: 0.8432
Epoch 65/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3657 - accuracy: 0.8449 - val_loss: 0.3614 - val_accuracy: 0.8419
Epoch 66/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3646 - accuracy: 0.8447 - val_loss: 0.3613 - val_accuracy: 0.8431
Epoch 67/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3653 - accuracy: 0.8433 - val_loss: 0.3615 - val_accuracy: 0.8412
Epoch 68/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3651 - accuracy: 0.8428 - val_loss: 0.3609 - val_accuracy: 0.8428
Epoch 69/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3649 - accuracy: 0.8445 - val_loss: 0.3604 - val_accuracy: 0.8422
Epoch 70/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3669 - accuracy: 0.8426 - val_loss: 0.3608 - val_accuracy: 0.8426
Epoch 71/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3644 - accuracy: 0.8447 - val_loss: 0.3615 - val_accuracy: 0.8431
Epoch 72/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3611 - accuracy: 0.8470 - val_loss: 0.3611 - val_accuracy: 0.8411
Epoch 73/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3630 - accuracy: 0.8445 - val_loss: 0.3617 - val_accuracy: 0.8408
Epoch 74/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3681 - accuracy: 0.8415 - val_loss: 0.3598 - val_accuracy: 0.8432
Epoch 75/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3638 - accuracy: 0.8443 - val_loss: 0.3596 - val_accuracy: 0.8432
Epoch 76/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3639 - accuracy: 0.8432 - val_loss: 0.3602 - val_accuracy: 0.8427
Epoch 77/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3635 - accuracy: 0.8446 - val_loss: 0.3597 - val_accuracy: 0.8423
Epoch 78/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3637 - accuracy: 0.8446 - val_loss: 0.3613 - val_accuracy: 0.8430
Epoch 79/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3659 - accuracy: 0.8427 - val_loss: 0.3647 - val_accuracy: 0.8428
Epoch 80/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3662 - accuracy: 0.8432 - val_loss: 0.3605 - val_accuracy: 0.8414
Epoch 81/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3662 - accuracy: 0.8429 - val_loss: 0.3592 - val_accuracy: 0.8431
Epoch 82/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3661 - accuracy: 0.8435 - val_loss: 0.3595 - val_accuracy: 0.8425
Epoch 83/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3634 - accuracy: 0.8445 - val_loss: 0.3592 - val_accuracy: 0.8434
Epoch 84/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3642 - accuracy: 0.8441 - val_loss: 0.3591 - val_accuracy: 0.8428
Epoch 85/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3608 - accuracy: 0.8457 - val_loss: 0.3595 - val_accuracy: 0.8429
Epoch 86/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3620 - accuracy: 0.8464 - val_loss: 0.3591 - val_accuracy: 0.8436
Epoch 87/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3653 - accuracy: 0.8430 - val_loss: 0.3604 - val_accuracy: 0.8432
Epoch 88/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3654 - accuracy: 0.8427 - val_loss: 0.3686 - val_accuracy: 0.8367
Epoch 89/100
3410/3410 [==============================] - 7s 2ms/step - loss: 0.3594 - accuracy: 0.8468 - val_loss: 0.3589 - val_accuracy: 0.8428
Epoch 90/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3636 - accuracy: 0.8446 - val_loss: 0.3588 - val_accuracy: 0.8436
Epoch 91/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3657 - accuracy: 0.8444 - val_loss: 0.3607 - val_accuracy: 0.8432
Epoch 92/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3628 - accuracy: 0.8441 - val_loss: 0.3588 - val_accuracy: 0.8437
Epoch 93/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3666 - accuracy: 0.8427 - val_loss: 0.3596 - val_accuracy: 0.8423
Epoch 94/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3659 - accuracy: 0.8421 - val_loss: 0.3594 - val_accuracy: 0.8441
Epoch 95/100
3410/3410 [==============================] - 6s 2ms/step - loss: 0.3623 - accuracy: 0.8459 - val_loss: 0.3583 - val_accuracy: 0.8438
Epoch 96/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3604 - accuracy: 0.8459 - val_loss: 0.3586 - val_accuracy: 0.8439
Epoch 97/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3633 - accuracy: 0.8436 - val_loss: 0.3588 - val_accuracy: 0.8415
Epoch 98/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3618 - accuracy: 0.8471 - val_loss: 0.3582 - val_accuracy: 0.8440
Epoch 99/100
3410/3410 [==============================] - 4s 1ms/step - loss: 0.3633 - accuracy: 0.8452 - val_loss: 0.3627 - val_accuracy: 0.8433
Epoch 100/100
3410/3410 [==============================] - 5s 1ms/step - loss: 0.3600 - accuracy: 0.8458 - val_loss: 0.3584 - val_accuracy: 0.8441

<tensorflow.python.keras.callbacks.History at 0x1afed021128>
  3. Visualizing the results
import matplotlib.pyplot as plt

acc      = model.history.history['accuracy']
val_acc  = model.history.history['val_accuracy']
loss     = model.history.history['loss']
val_loss = model.history.history['val_loss']
epochs_range = range(100)

plt.figure(figsize=(14, 4))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')

plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()

Output:

(figure: training/validation accuracy and loss curves)
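The classification_report and confusion_matrix imports at the top of the post are never actually used; a natural next step would be something like `preds = (model.predict(X_test) > 0.5).astype(int)` followed by the sketch below. The label/prediction arrays here are made up for illustration:

```python
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

# hypothetical stand-ins for y_test and the thresholded model predictions
y_true = np.array([0, 0, 1, 1, 0, 1])
y_pred = np.array([0, 1, 1, 1, 0, 0])

cm = confusion_matrix(y_true, y_pred)
print(cm)                                    # rows: true class, cols: predicted class
print(classification_report(y_true, y_pred))
```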
