Personal Medical Expense Prediction Project

Note: this article is reproduced from the professional AI community Venus AI.

For more AI material, see the original site ([www.aideeplearning.cn])

Project Background

As medical costs continue to rise, personal medical expenditure has become an important issue. Understanding the many factors that drive medical charges matters to health insurers, government agencies, and individuals alike. With data analysis and machine learning, we can predict and manage personal medical costs more effectively.

Project Goal

The main goal of this project is to build a model that accurately predicts individual medical charges. By analyzing the factors that influence those charges, such as age, sex, BMI, smoking status, and region of residence, we aim to give insurers and policy makers deeper insight so they can design more effective strategies and plans.

Project Applications

  • Insurance pricing: help insurers tailor premiums to a customer's personal health data.
  • Policy making: give governments and healthcare organizations data support for designing more effective healthcare policies.
  • Personal healthcare planning: help individuals plan future medical expenses based on their health status and lifestyle.

Dataset and Features

The dataset contains the following features:

  • Age (age): age of the primary beneficiary.
  • Sex (sex): sex of the policy holder, female or male.
  • BMI (bmi): body mass index, a measure of weight relative to height; the commonly quoted ideal range is 18.5 to 24.9 (see the short formula sketch after this list).
  • Children (children): number of children covered by the health insurance.
  • Smoker (smoker): whether the beneficiary smokes.
  • Region (region): the beneficiary's residential region in the US: northeast, southeast, southwest, or northwest.
  • Charges (charges): individual medical costs billed by the health insurance; this is the prediction target.
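For reference, BMI is weight in kilograms divided by the square of height in meters. The dataset already ships a bmi column, so the snippet below is only an illustrative sketch of the formula; the function name and example values are made up here, not part of the notebook.

# Illustrative only: the dataset already contains a bmi column
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

print(round(bmi(75, 1.80), 1))  # 23.1, inside the 18.5-24.9 ideal range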

Models and Dependencies

The project uses the following models and libraries (a compact comparison sketch follows the list):

  • Models:
    1. Linear Regression Model
    2. Random Forest Regression Model
    3. Support Vector Regression Model with GridSearchCV
    4. Gradient Boosting Model
    5. Simple Dense Neural Network
  • Libraries:
    • Data preprocessing and exploratory data analysis: pandas, seaborn, matplotlib, numpy
    • Model training: sklearn.linear_model, sklearn.tree, sklearn.ensemble, sklearn.svm, sklearn.model_selection, tensorflow
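As a compact preview of the workflow that the notebook walks through step by step below, here is a minimal sketch that loads insurance.csv, one-hot encodes it, and compares a few of the scikit-learn regressors on a shared train/test split. The variable names are illustrative, not the notebook's own, and the sketch assumes insurance.csv is available locally.

# Minimal comparison sketch (assumes insurance.csv in the working directory)
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import r2_score

df = pd.get_dummies(pd.read_csv('insurance.csv'), drop_first=True)
X, y = df.drop('charges', axis=1), df['charges']
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

scaler = StandardScaler()
X_train_s = scaler.fit_transform(X_train)   # fit the scaler on the training split only
X_test_s = scaler.transform(X_test)

models = {
    'linear': LinearRegression(),
    'random_forest': RandomForestRegressor(random_state=42),
    'gradient_boosting': GradientBoostingRegressor(random_state=42),
}
for name, model in models.items():
    model.fit(X_train_s, y_train)
    print(name, round(r2_score(y_test, model.predict(X_test_s)), 3))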

Code Implementation

Data Analysis

# Import dependencies
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
# Load the dataset
df = pd.read_csv('insurance.csv')
df.head()
   age     sex     bmi  children smoker     region      charges
0   19  female  27.900         0    yes  southwest  16884.92400
1   18    male  33.770         1     no  southeast   1725.55230
2   28    male  33.000         3     no  southeast   4449.46200
3   33    male  22.705         0     no  northwest  21984.47061
4   32    male  28.880         0     no  northwest   3866.85520
# Check for missing values
df.isnull().sum()
age         0
sex         0
bmi         0
children    0
smoker      0
region      0
charges     0
dtype: int64
# Scatter plot of charges against age, colored by smoker status
plt.figure(figsize = (14,7))
sns.scatterplot(x=df['age'] ,y=df['charges'] ,hue=df['smoker'] ,palette = 'bright' ,s=50)
plt.xticks(color='red' ,size=12)
plt.yticks(color='red' ,size= 12)
plt.xlabel('AGE' ,color='purple' ,size=17)
plt.ylabel('CHARGES' ,color='purple' ,size=17);

# Scatter plot of charges against BMI, colored by smoker status
plt.figure(figsize = (14,7))
sns.scatterplot(x=df['bmi'] ,y=df['charges'] ,hue=df['smoker'] ,palette = 'bright' ,s=50)
plt.xticks(color='red' ,size=12)
plt.yticks(color='red' ,size= 12)
plt.xlabel('BMI' ,color='purple' ,size=17)
plt.ylabel('CHARGES' ,color='purple' ,size=17);

df['region'].unique()
array(['southwest', 'southeast', 'northwest', 'northeast'], dtype=object)
# Compare charges across regions
plt.figure(dpi=150)
sns.boxplot(x=df['region'] ,y=df['charges'] )
plt.xticks(color='red' ,size=12)
plt.yticks(color='red' ,size= 12)
plt.xlabel('REGION' ,color='purple' ,size=17)
plt.ylabel('CHARGES' ,color='purple' ,size=17);

# Check the distribution of charges
plt.figure(dpi=100)
sns.histplot(x=df['charges'] ,color='green')
plt.xticks(color='red' ,size=8)
plt.yticks(color='red' ,size= 8);

# One-hot encode all categorical columns
df2 = pd.get_dummies(df ,drop_first = True)
df2.head()
   age     bmi  children      charges  sex_male  smoker_yes  region_northwest  region_southeast  region_southwest
0   19  27.900         0  16884.92400         0           1                 0                 0                 1
1   18  33.770         1   1725.55230         1           0                 0                 1                 0
2   28  33.000         3   4449.46200         1           0                 0                 1                 0
3   33  22.705         0  21984.47061         1           0                 1                 0                 0
4   32  28.880         0   3866.85520         1           0                 1                 0                 0
# Check correlations with charges
df2.corr()['charges']
age                 0.299008
bmi                 0.198341
children            0.067998
charges             1.000000
sex_male            0.057292
smoker_yes          0.787251
region_northwest   -0.039905
region_southeast    0.073982
region_southwest   -0.043210
Name: charges, dtype: float64
# Plot the correlations above as a heatmap
plt.figure(dpi=160)
sns.heatmap(np.round(df2.corr() ,2) ,annot=True ,cmap='viridis');

Modeling Experiments with 5 Different Models

# Define features and label
X = df2.drop('charges' ,axis=1)
y = df2['charges']
# Train/test split
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
len(X_train) ,len(y_test)
(1070, 268)
# Scale the features
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
scaled_X_train = scaler.fit_transform(X_train)
scaled_X_test = scaler.transform(X_test)
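Note that the scaler is fit on the training split only and then reused on the test split, which avoids data leakage. An equivalent way to express the same fit-on-train-only logic is a scikit-learn Pipeline; the sketch below is a hypothetical alternative, not part of the original notebook.

# Pipeline sketch: scaling and the model are fit together on the training split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression

pipe = Pipeline([
    ('scale', StandardScaler()),   # fitted on X_train only when pipe.fit is called
    ('model', LinearRegression()),
])
pipe.fit(X_train, y_train)          # X_train / y_train from the split above
print(pipe.score(X_test, y_test))   # same R^2 as scaling manually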

Model 1: Linear Regression

from sklearn.linear_model import LinearRegression
lr = LinearRegression()
lr.fit(scaled_X_train ,y_train)

LinearRegression()
# Check performance
from sklearn.metrics import mean_squared_error ,r2_score
lr.score(scaled_X_test ,y_test)
0.7835929767120722
r2_score(y_test ,lr.predict(scaled_X_test))
0.7835929767120722
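R² alone is hard to read in dollar terms. A small sketch, assuming the fitted lr and the same test split from above, of also reporting MAE and RMSE:

# Report error metrics in dollars alongside R^2
from sklearn.metrics import mean_absolute_error, mean_squared_error
import numpy as np

lr_preds = lr.predict(scaled_X_test)
print('MAE :', mean_absolute_error(y_test, lr_preds))          # average absolute error in $
print('RMSE:', np.sqrt(mean_squared_error(y_test, lr_preds)))  # penalizes large errors more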

Model 2: Random Forest Model

from sklearn.ensemble import RandomForestRegressor
rf = RandomForestRegressor(n_estimators=140, criterion='squared_error', random_state=42, n_jobs=-1)
rf.fit(scaled_X_train, y_train)

RandomForestRegressor(n_estimators=140, n_jobs=-1, random_state=42)
forest_test_pred = rf.predict(scaled_X_test)
r2_score(y_pred = forest_test_pred ,y_true = y_test)
0.8650776528213561

So the score improved from roughly 0.78 to 0.87.
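A side benefit of the random forest is its feature importances. A brief sketch, as a hypothetical addition that reuses the fitted rf and the columns of X:

# Inspect which features the forest relies on most
import pandas as pd

importances = pd.Series(rf.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False))
# smoker_yes, bmi and age typically dominate, consistent with the correlation table above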

Model 3: Support Vector Machine with GridSearchCV

from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV
svr = SVR()
# Define the parameter grid for GridSearchCV
param_grid = {'C': [0.001, 0.01, 0.1, 0.5, 1],
              'kernel': ['linear', 'rbf', 'poly'],
              'gamma': ['scale', 'auto'],
              'degree': [2, 3, 4],
              'epsilon': [0, 0.01, 0.1, 0.5, 1, 2]}
grid = GridSearchCV(svr,param_grid=param_grid)
grid.fit(scaled_X_train,y_train)

# Check the best parameters
grid.best_params_
{'C': 1, 'degree': 2, 'epsilon': 2, 'gamma': 'scale', 'kernel': 'linear'}
grid_preds = grid.predict(scaled_X_test)
r2_score(y_true = y_test ,y_pred=grid_preds)
0.019799220771840598

The SVR model performs very poorly. Rather than overfitting, this looks like severe underfitting: every candidate C in the grid is at most 1, which is a very strong penalty relative to the scale of the target (charges in the thousands of dollars), so the fitted model barely deviates from a near-constant prediction.
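One way to probe this, sketched below as a hypothetical follow-up rather than part of the original experiment, is to let GridSearchCV explore much larger C values so the SVR can actually reach the scale of the charges:

# Sketch: allow much weaker regularization (larger C) in the search
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

wide_grid = {'C': [1e2, 1e3, 1e4, 1e5],
             'kernel': ['rbf'],
             'gamma': ['scale'],
             'epsilon': [0.1, 1, 10]}
wide_search = GridSearchCV(SVR(), param_grid=wide_grid)
wide_search.fit(scaled_X_train, y_train)
print(wide_search.best_params_, wide_search.best_score_)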

Model 4: Gradient Boosting Model

from sklearn.ensemble import GradientBoostingRegressor
gb = GradientBoostingRegressor(random_state = 42)
gb.fit(scaled_X_train ,y_train)

GradientBoostingRegressor(random_state=42)
gb_preds = gb.predict(scaled_X_test)
r2_score(y_true = y_test ,y_pred = gb_preds)
0.8792571359795264

So the gradient boosting model reaches a score of about 0.88, the best result so far.
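The gradient boosting model above runs with default hyperparameters. A short hypothetical sketch, analogous to the SVR grid search, of tuning a few of them:

# Sketch: tune a few gradient boosting hyperparameters with GridSearchCV
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

gb_grid = {'n_estimators': [100, 300], 'learning_rate': [0.03, 0.1], 'max_depth': [2, 3]}
gb_search = GridSearchCV(GradientBoostingRegressor(random_state=42), param_grid=gb_grid)
gb_search.fit(scaled_X_train, y_train)
print(gb_search.best_params_)
print(r2_score(y_test, gb_search.predict(scaled_X_test)))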

Model 5: Dense Neural Network

import tensorflow as tf
scaled_X_train.shape
(1070, 8)
# Set the random seed for reproducibility
tf.random.set_seed(42)
# Define the model
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='relu'),  # hidden layer: 32 neurons, ReLU activation
    tf.keras.layers.Dense(1)                       # output layer: 1 neuron (regression)
])
# Compile the model
model.compile(
    loss=tf.keras.losses.MeanAbsoluteError(),               # Mean Absolute Error loss
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.1),  # Adam optimizer, learning rate 0.1
    metrics=['mae']                                         # track MAE as a metric
)
# Train the model
model.fit(scaled_X_train, y_train, epochs=500,
          validation_data=(scaled_X_test, y_test))
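The network trains for a fixed 500 epochs with the test split passed as validation data; the (long) training log follows. As a hypothetical alternative, not used in this run, one could let Keras stop once the validation MAE stops improving:

# Sketch only: stop when val_mae has not improved for 20 epochs and keep the best weights
early_stop = tf.keras.callbacks.EarlyStopping(monitor='val_mae', patience=20,
                                              restore_best_weights=True)
# model.fit(scaled_X_train, y_train, epochs=500,
#           validation_data=(scaled_X_test, y_test), callbacks=[early_stop])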
Epoch 1/500
34/34 [==============================] - 1s 5ms/step - loss: 13176.4053 - mae: 13176.4053 - val_loss: 12342.8604 - val_mae: 12342.8604
Epoch 2/500
34/34 [==============================] - 0s 1ms/step - loss: 11706.1855 - mae: 11706.1855 - val_loss: 9990.0645 - val_mae: 9990.0645
Epoch 3/500
34/34 [==============================] - 0s 2ms/step - loss: 8708.5283 - mae: 8708.5283 - val_loss: 6905.4297 - val_mae: 6905.4297
Epoch 4/500
34/34 [==============================] - 0s 2ms/step - loss: 5669.3076 - mae: 5669.3076 - val_loss: 4656.5684 - val_mae: 4656.5684
Epoch 5/500
34/34 [==============================] - 0s 2ms/step - loss: 4149.5723 - mae: 4149.5723 - val_loss: 3637.5278 - val_mae: 3637.5278
Epoch 6/500
34/34 [==============================] - 0s 2ms/step - loss: 3684.2610 - mae: 3684.2610 - val_loss: 3394.3816 - val_mae: 3394.3816
Epoch 7/500
34/34 [==============================] - 0s 2ms/step - loss: 3543.0925 - mae: 3543.0925 - val_loss: 3277.0593 - val_mae: 3277.0593
Epoch 8/500
34/34 [==============================] - 0s 2ms/step - loss: 3464.1858 - mae: 3464.1858 - val_loss: 3179.0710 - val_mae: 3179.0710
Epoch 9/500
34/34 [==============================] - 0s 2ms/step - loss: 3390.9805 - mae: 3390.9805 - val_loss: 3128.5288 - val_mae: 3128.5288
Epoch 10/500
34/34 [==============================] - 0s 2ms/step - loss: 3347.2905 - mae: 3347.2905 - val_loss: 3054.6538 - val_mae: 3054.6538
... (epochs 11-372: training MAE falls gradually from about 3300 to about 1950 and validation MAE from about 3020 to about 1750, with both largely plateauing after roughly epoch 150) ...
Epoch 373/500
34/34 [==============================] - 0s 2ms/step - loss: 1951.7825 - mae: 1951.7825 - val_loss: 1762.3896 - val_mae: 1762.3896
Epoch 374/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.2893 - mae: 1954.2893 - val_loss: 1760.6979 - val_mae: 1760.6979
Epoch 375/500
34/34 [==============================] - 0s 2ms/step - loss: 1957.0992 - mae: 1957.0992 - val_loss: 1765.6522 - val_mae: 1765.6522
Epoch 376/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.5472 - mae: 1950.5472 - val_loss: 1761.5393 - val_mae: 1761.5393
Epoch 377/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.2313 - mae: 1952.2313 - val_loss: 1764.2993 - val_mae: 1764.2993
Epoch 378/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.1935 - mae: 1950.1935 - val_loss: 1748.9039 - val_mae: 1748.9039
Epoch 379/500
34/34 [==============================] - 0s 2ms/step - loss: 1953.7526 - mae: 1953.7526 - val_loss: 1754.3691 - val_mae: 1754.3691
Epoch 380/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.4235 - mae: 1955.4235 - val_loss: 1756.1005 - val_mae: 1756.1005
Epoch 381/500
34/34 [==============================] - 0s 2ms/step - loss: 1957.8040 - mae: 1957.8040 - val_loss: 1751.6953 - val_mae: 1751.6953
Epoch 382/500
34/34 [==============================] - 0s 2ms/step - loss: 1958.1229 - mae: 1958.1229 - val_loss: 1747.8070 - val_mae: 1747.8070
Epoch 383/500
34/34 [==============================] - 0s 2ms/step - loss: 1946.6637 - mae: 1946.6637 - val_loss: 1752.1309 - val_mae: 1752.1309
Epoch 384/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.8477 - mae: 1954.8477 - val_loss: 1761.1697 - val_mae: 1761.1697
Epoch 385/500
34/34 [==============================] - 0s 2ms/step - loss: 1965.7804 - mae: 1965.7804 - val_loss: 1754.0254 - val_mae: 1754.0254
Epoch 386/500
34/34 [==============================] - 0s 2ms/step - loss: 1958.3236 - mae: 1958.3236 - val_loss: 1764.0514 - val_mae: 1764.0514
Epoch 387/500
34/34 [==============================] - 0s 2ms/step - loss: 1956.8267 - mae: 1956.8267 - val_loss: 1758.4872 - val_mae: 1758.4872
Epoch 388/500
34/34 [==============================] - 0s 2ms/step - loss: 1956.6726 - mae: 1956.6726 - val_loss: 1756.9443 - val_mae: 1756.9443
Epoch 389/500
34/34 [==============================] - 0s 2ms/step - loss: 1957.9005 - mae: 1957.9005 - val_loss: 1737.4865 - val_mae: 1737.4865
Epoch 390/500
34/34 [==============================] - 0s 2ms/step - loss: 1949.6439 - mae: 1949.6439 - val_loss: 1739.9377 - val_mae: 1739.9377
Epoch 391/500
34/34 [==============================] - 0s 2ms/step - loss: 1951.9272 - mae: 1951.9272 - val_loss: 1756.1089 - val_mae: 1756.1089
Epoch 392/500
34/34 [==============================] - 0s 2ms/step - loss: 1953.8655 - mae: 1953.8655 - val_loss: 1763.0947 - val_mae: 1763.0947
Epoch 393/500
34/34 [==============================] - 0s 2ms/step - loss: 1957.5458 - mae: 1957.5458 - val_loss: 1749.2146 - val_mae: 1749.2146
Epoch 394/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.1083 - mae: 1950.1083 - val_loss: 1746.1954 - val_mae: 1746.1954
Epoch 395/500
34/34 [==============================] - 0s 2ms/step - loss: 1966.8063 - mae: 1966.8063 - val_loss: 1763.3324 - val_mae: 1763.3324
Epoch 396/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.4954 - mae: 1952.4954 - val_loss: 1757.8846 - val_mae: 1757.8846
Epoch 397/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.4414 - mae: 1952.4414 - val_loss: 1754.0262 - val_mae: 1754.0262
Epoch 398/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.2065 - mae: 1948.2065 - val_loss: 1752.7214 - val_mae: 1752.7214
Epoch 399/500
34/34 [==============================] - 0s 2ms/step - loss: 1947.9043 - mae: 1947.9043 - val_loss: 1750.4845 - val_mae: 1750.4845
Epoch 400/500
34/34 [==============================] - 0s 2ms/step - loss: 1956.3918 - mae: 1956.3918 - val_loss: 1759.9824 - val_mae: 1759.9824
Epoch 401/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.9606 - mae: 1955.9606 - val_loss: 1747.9561 - val_mae: 1747.9561
Epoch 402/500
34/34 [==============================] - 0s 2ms/step - loss: 1941.3849 - mae: 1941.3849 - val_loss: 1754.4902 - val_mae: 1754.4902
Epoch 403/500
34/34 [==============================] - 0s 2ms/step - loss: 1943.3167 - mae: 1943.3167 - val_loss: 1754.0844 - val_mae: 1754.0844
Epoch 404/500
34/34 [==============================] - 0s 2ms/step - loss: 1949.5490 - mae: 1949.5490 - val_loss: 1754.5602 - val_mae: 1754.5601
Epoch 405/500
34/34 [==============================] - 0s 2ms/step - loss: 1953.7708 - mae: 1953.7708 - val_loss: 1757.1343 - val_mae: 1757.1343
Epoch 406/500
34/34 [==============================] - 0s 2ms/step - loss: 1944.7001 - mae: 1944.7001 - val_loss: 1754.2170 - val_mae: 1754.2170
Epoch 407/500
34/34 [==============================] - 0s 2ms/step - loss: 1945.4005 - mae: 1945.4005 - val_loss: 1758.1936 - val_mae: 1758.1936
Epoch 408/500
34/34 [==============================] - 0s 2ms/step - loss: 1953.3241 - mae: 1953.3241 - val_loss: 1759.7593 - val_mae: 1759.7593
Epoch 409/500
34/34 [==============================] - 0s 2ms/step - loss: 1944.3270 - mae: 1944.3270 - val_loss: 1756.6381 - val_mae: 1756.6381
Epoch 410/500
34/34 [==============================] - 0s 2ms/step - loss: 1957.6821 - mae: 1957.6821 - val_loss: 1749.7008 - val_mae: 1749.7008
Epoch 411/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.4100 - mae: 1955.4100 - val_loss: 1748.7892 - val_mae: 1748.7892
Epoch 412/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.8435 - mae: 1952.8435 - val_loss: 1758.9407 - val_mae: 1758.9407
Epoch 413/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.5355 - mae: 1954.5355 - val_loss: 1755.9673 - val_mae: 1755.9673
Epoch 414/500
34/34 [==============================] - 0s 2ms/step - loss: 1959.0613 - mae: 1959.0613 - val_loss: 1758.3146 - val_mae: 1758.3146
Epoch 415/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.1749 - mae: 1955.1749 - val_loss: 1741.0492 - val_mae: 1741.0492
Epoch 416/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.3885 - mae: 1954.3885 - val_loss: 1754.9452 - val_mae: 1754.9452
Epoch 417/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.5027 - mae: 1954.5027 - val_loss: 1756.1047 - val_mae: 1756.1047
Epoch 418/500
34/34 [==============================] - 0s 2ms/step - loss: 1951.7993 - mae: 1951.7993 - val_loss: 1751.6759 - val_mae: 1751.6759
Epoch 419/500
34/34 [==============================] - 0s 2ms/step - loss: 1956.9473 - mae: 1956.9473 - val_loss: 1756.7864 - val_mae: 1756.7864
Epoch 420/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.1915 - mae: 1948.1915 - val_loss: 1763.8118 - val_mae: 1763.8118
Epoch 421/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.2238 - mae: 1950.2238 - val_loss: 1746.9730 - val_mae: 1746.9730
Epoch 422/500
34/34 [==============================] - 0s 2ms/step - loss: 1947.8093 - mae: 1947.8093 - val_loss: 1752.2904 - val_mae: 1752.2904
Epoch 423/500
34/34 [==============================] - 0s 2ms/step - loss: 1944.5515 - mae: 1944.5515 - val_loss: 1771.6937 - val_mae: 1771.6937
Epoch 424/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.9481 - mae: 1950.9481 - val_loss: 1754.7682 - val_mae: 1754.7682
Epoch 425/500
34/34 [==============================] - 0s 2ms/step - loss: 1951.0114 - mae: 1951.0114 - val_loss: 1750.5623 - val_mae: 1750.5623
Epoch 426/500
34/34 [==============================] - 0s 2ms/step - loss: 1947.3135 - mae: 1947.3135 - val_loss: 1748.2422 - val_mae: 1748.2422
Epoch 427/500
34/34 [==============================] - 0s 2ms/step - loss: 1962.6624 - mae: 1962.6624 - val_loss: 1743.6954 - val_mae: 1743.6954
Epoch 428/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.7063 - mae: 1950.7063 - val_loss: 1759.4097 - val_mae: 1759.4097
Epoch 429/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.3193 - mae: 1955.3193 - val_loss: 1745.2053 - val_mae: 1745.2053
Epoch 430/500
34/34 [==============================] - 0s 2ms/step - loss: 1953.6218 - mae: 1953.6218 - val_loss: 1762.4940 - val_mae: 1762.4940
Epoch 431/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.9666 - mae: 1952.9666 - val_loss: 1760.6790 - val_mae: 1760.6790
Epoch 432/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.8594 - mae: 1955.8594 - val_loss: 1756.6204 - val_mae: 1756.6204
Epoch 433/500
34/34 [==============================] - 0s 2ms/step - loss: 1947.9142 - mae: 1947.9142 - val_loss: 1755.0618 - val_mae: 1755.0616
Epoch 434/500
34/34 [==============================] - 0s 2ms/step - loss: 1949.4448 - mae: 1949.4448 - val_loss: 1751.0273 - val_mae: 1751.0273
Epoch 435/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.0499 - mae: 1954.0499 - val_loss: 1770.4410 - val_mae: 1770.4410
Epoch 436/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.2693 - mae: 1950.2693 - val_loss: 1755.4609 - val_mae: 1755.4609
Epoch 437/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.0602 - mae: 1952.0602 - val_loss: 1758.6973 - val_mae: 1758.6973
Epoch 438/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.9181 - mae: 1952.9181 - val_loss: 1757.1066 - val_mae: 1757.1066
Epoch 439/500
34/34 [==============================] - 0s 2ms/step - loss: 1962.9031 - mae: 1962.9031 - val_loss: 1750.6654 - val_mae: 1750.6654
Epoch 440/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.3645 - mae: 1948.3645 - val_loss: 1755.0796 - val_mae: 1755.0796
Epoch 441/500
34/34 [==============================] - 0s 2ms/step - loss: 1964.3591 - mae: 1964.3591 - val_loss: 1745.6477 - val_mae: 1745.6477
Epoch 442/500
34/34 [==============================] - 0s 2ms/step - loss: 1944.2411 - mae: 1944.2411 - val_loss: 1750.2649 - val_mae: 1750.2649
Epoch 443/500
34/34 [==============================] - 0s 2ms/step - loss: 1945.4633 - mae: 1945.4633 - val_loss: 1761.6448 - val_mae: 1761.6448
Epoch 444/500
34/34 [==============================] - 0s 2ms/step - loss: 1944.9739 - mae: 1944.9739 - val_loss: 1758.9583 - val_mae: 1758.9583
Epoch 445/500
34/34 [==============================] - 0s 2ms/step - loss: 1958.4044 - mae: 1958.4044 - val_loss: 1759.6649 - val_mae: 1759.6649
Epoch 446/500
34/34 [==============================] - 0s 1ms/step - loss: 1969.0897 - mae: 1969.0897 - val_loss: 1753.5978 - val_mae: 1753.5978
Epoch 447/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.5836 - mae: 1948.5836 - val_loss: 1747.1993 - val_mae: 1747.1993
Epoch 448/500
34/34 [==============================] - 0s 2ms/step - loss: 1949.2635 - mae: 1949.2635 - val_loss: 1756.9629 - val_mae: 1756.9629
Epoch 449/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.5714 - mae: 1950.5714 - val_loss: 1752.6770 - val_mae: 1752.6770
Epoch 450/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.8744 - mae: 1955.8744 - val_loss: 1749.4503 - val_mae: 1749.4503
Epoch 451/500
34/34 [==============================] - 0s 2ms/step - loss: 1941.1742 - mae: 1941.1742 - val_loss: 1749.6925 - val_mae: 1749.6925
Epoch 452/500
34/34 [==============================] - 0s 2ms/step - loss: 1946.8116 - mae: 1946.8116 - val_loss: 1758.3607 - val_mae: 1758.3607
Epoch 453/500
34/34 [==============================] - 0s 2ms/step - loss: 1947.7488 - mae: 1947.7488 - val_loss: 1751.2559 - val_mae: 1751.2559
Epoch 454/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.8512 - mae: 1948.8512 - val_loss: 1754.0320 - val_mae: 1754.0320
Epoch 455/500
34/34 [==============================] - 0s 2ms/step - loss: 1956.5576 - mae: 1956.5576 - val_loss: 1764.3871 - val_mae: 1764.3871
Epoch 456/500
34/34 [==============================] - 0s 2ms/step - loss: 1949.4485 - mae: 1949.4485 - val_loss: 1762.3010 - val_mae: 1762.3010
Epoch 457/500
34/34 [==============================] - 0s 2ms/step - loss: 1959.4291 - mae: 1959.4291 - val_loss: 1761.3749 - val_mae: 1761.3749
Epoch 458/500
34/34 [==============================] - 0s 2ms/step - loss: 1959.7554 - mae: 1959.7554 - val_loss: 1767.1168 - val_mae: 1767.1168
Epoch 459/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.9146 - mae: 1952.9146 - val_loss: 1754.0270 - val_mae: 1754.0270
Epoch 460/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.2982 - mae: 1948.2982 - val_loss: 1753.8359 - val_mae: 1753.8359
Epoch 461/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.1692 - mae: 1955.1692 - val_loss: 1747.7649 - val_mae: 1747.7649
Epoch 462/500
34/34 [==============================] - 0s 2ms/step - loss: 1957.5964 - mae: 1957.5964 - val_loss: 1743.2943 - val_mae: 1743.2943
Epoch 463/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.3003 - mae: 1955.3003 - val_loss: 1736.1887 - val_mae: 1736.1887
Epoch 464/500
34/34 [==============================] - 0s 2ms/step - loss: 1947.2417 - mae: 1947.2417 - val_loss: 1753.8684 - val_mae: 1753.8684
Epoch 465/500
34/34 [==============================] - 0s 2ms/step - loss: 1967.0166 - mae: 1967.0166 - val_loss: 1772.1665 - val_mae: 1772.1665
Epoch 466/500
34/34 [==============================] - 0s 2ms/step - loss: 1960.2444 - mae: 1960.2444 - val_loss: 1755.9545 - val_mae: 1755.9545
Epoch 467/500
34/34 [==============================] - 0s 2ms/step - loss: 1957.8490 - mae: 1957.8490 - val_loss: 1753.0928 - val_mae: 1753.0928
Epoch 468/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.4043 - mae: 1955.4043 - val_loss: 1761.1515 - val_mae: 1761.1515
Epoch 469/500
34/34 [==============================] - 0s 2ms/step - loss: 1958.5554 - mae: 1958.5554 - val_loss: 1755.3362 - val_mae: 1755.3362
Epoch 470/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.7463 - mae: 1954.7463 - val_loss: 1760.5854 - val_mae: 1760.5854
Epoch 471/500
34/34 [==============================] - 0s 2ms/step - loss: 1963.0171 - mae: 1963.0171 - val_loss: 1750.7061 - val_mae: 1750.7061
Epoch 472/500
34/34 [==============================] - 0s 1ms/step - loss: 1958.8883 - mae: 1958.8883 - val_loss: 1761.9213 - val_mae: 1761.9213
Epoch 473/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.8362 - mae: 1950.8362 - val_loss: 1756.2852 - val_mae: 1756.2852
Epoch 474/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.9043 - mae: 1948.9043 - val_loss: 1747.6663 - val_mae: 1747.6663
Epoch 475/500
34/34 [==============================] - 0s 2ms/step - loss: 1953.0776 - mae: 1953.0776 - val_loss: 1761.8174 - val_mae: 1761.8174
Epoch 476/500
34/34 [==============================] - 0s 2ms/step - loss: 1957.2098 - mae: 1957.2098 - val_loss: 1746.7466 - val_mae: 1746.7466
Epoch 477/500
34/34 [==============================] - 0s 2ms/step - loss: 1953.1517 - mae: 1953.1517 - val_loss: 1756.0098 - val_mae: 1756.0098
Epoch 478/500
34/34 [==============================] - 0s 2ms/step - loss: 1960.5311 - mae: 1960.5311 - val_loss: 1751.1040 - val_mae: 1751.1040
Epoch 479/500
34/34 [==============================] - 0s 2ms/step - loss: 1958.0209 - mae: 1958.0209 - val_loss: 1756.1705 - val_mae: 1756.1705
Epoch 480/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.3047 - mae: 1954.3047 - val_loss: 1747.0914 - val_mae: 1747.0914
Epoch 481/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.5564 - mae: 1948.5564 - val_loss: 1754.9144 - val_mae: 1754.9144
Epoch 482/500
34/34 [==============================] - 0s 2ms/step - loss: 1949.6371 - mae: 1949.6371 - val_loss: 1742.7290 - val_mae: 1742.7290
Epoch 483/500
34/34 [==============================] - 0s 2ms/step - loss: 1955.9745 - mae: 1955.9745 - val_loss: 1759.7681 - val_mae: 1759.7681
Epoch 484/500
34/34 [==============================] - 0s 2ms/step - loss: 1962.0938 - mae: 1962.0938 - val_loss: 1748.1960 - val_mae: 1748.1960
Epoch 485/500
34/34 [==============================] - 0s 1ms/step - loss: 1946.9153 - mae: 1946.9153 - val_loss: 1759.8531 - val_mae: 1759.8531
Epoch 486/500
34/34 [==============================] - 0s 2ms/step - loss: 1954.7581 - mae: 1954.7581 - val_loss: 1766.8878 - val_mae: 1766.8878
Epoch 487/500
34/34 [==============================] - 0s 2ms/step - loss: 1952.4418 - mae: 1952.4418 - val_loss: 1759.1333 - val_mae: 1759.1333
Epoch 488/500
34/34 [==============================] - 0s 2ms/step - loss: 1951.7568 - mae: 1951.7568 - val_loss: 1749.5972 - val_mae: 1749.5972
Epoch 489/500
34/34 [==============================] - 0s 2ms/step - loss: 1941.7734 - mae: 1941.7734 - val_loss: 1757.6304 - val_mae: 1757.6304
Epoch 490/500
34/34 [==============================] - 0s 2ms/step - loss: 1949.5865 - mae: 1949.5865 - val_loss: 1770.8793 - val_mae: 1770.8793
Epoch 491/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.8700 - mae: 1950.8700 - val_loss: 1746.2600 - val_mae: 1746.2600
Epoch 492/500
34/34 [==============================] - 0s 2ms/step - loss: 1961.3124 - mae: 1961.3124 - val_loss: 1751.8602 - val_mae: 1751.8602
Epoch 493/500
34/34 [==============================] - 0s 1ms/step - loss: 1943.8804 - mae: 1943.8804 - val_loss: 1742.9180 - val_mae: 1742.9180
Epoch 494/500
34/34 [==============================] - 0s 2ms/step - loss: 1937.7426 - mae: 1937.7426 - val_loss: 1760.8480 - val_mae: 1760.8480
Epoch 495/500
34/34 [==============================] - 0s 2ms/step - loss: 1958.6722 - mae: 1958.6722 - val_loss: 1759.0151 - val_mae: 1759.0151
Epoch 496/500
34/34 [==============================] - 0s 2ms/step - loss: 1948.9408 - mae: 1948.9408 - val_loss: 1758.3007 - val_mae: 1758.3007
Epoch 497/500
34/34 [==============================] - 0s 2ms/step - loss: 1951.5358 - mae: 1951.5358 - val_loss: 1752.7567 - val_mae: 1752.7567
Epoch 498/500
34/34 [==============================] - 0s 1ms/step - loss: 1973.8839 - mae: 1973.8839 - val_loss: 1755.8190 - val_mae: 1755.8190
Epoch 499/500
34/34 [==============================] - 0s 2ms/step - loss: 1956.9613 - mae: 1956.9613 - val_loss: 1756.0472 - val_mae: 1756.0472
Epoch 500/500
34/34 [==============================] - 0s 2ms/step - loss: 1950.9912 - mae: 1950.9912 - val_loss: 1755.3545 - val_mae: 1755.3545
<keras.src.callbacks.History at 0x1dbd2dd4c10>
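The validation MAE hovers around 1,750 for the last few hundred epochs, so running the full 500 epochs adds little. As a minimal sketch (not part of the original code, and assuming the scaled training arrays are named `scaled_X_train` and `y_train` to match `scaled_X_test` below), an `EarlyStopping` callback could end training once `val_mae` stops improving:

```python
from tensorflow.keras.callbacks import EarlyStopping

# Stop when val_mae has not improved for 25 epochs and keep the best weights seen so far.
# scaled_X_train / y_train are assumed names for the scaled training split.
early_stop = EarlyStopping(monitor='val_mae', patience=25, restore_best_weights=True)

model.fit(scaled_X_train, y_train,
          validation_data=(scaled_X_test, y_test),
          epochs=500,
          callbacks=[early_stop])
```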
# Predict on the scaled test features with the trained network
model_preds = model.predict(scaled_X_test)
9/9 [==============================] - 0s 894us/step
# R² score of the neural network predictions on the test set
r2_score(y_true = y_test ,y_pred = model_preds)
0.8621520814130484
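To complement R², the error can also be reported in the units of `charges` (dollars), which ties back to the MAE values shown in the training log above. A small sketch using the existing `y_test` and `model_preds`:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

# Absolute-error metrics for the neural network predictions, in dollars
mae = mean_absolute_error(y_test, model_preds)
rmse = np.sqrt(mean_squared_error(y_test, model_preds))
print(f"Test MAE:  {mae:,.2f}")
print(f"Test RMSE: {rmse:,.2f}")
```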

The neural network scores about 0.86, roughly on par with the random forest, with linear regression close behind. The best-performing model is the GradientBoost model, and the worst is the Support Vector Regression model with GridSearchCV.
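As a hedged sketch of how the five test-set scores could be laid out side by side, the R² values can be collected into one table. The prediction arrays `lr_preds`, `rf_preds`, `svr_preds`, and `gb_preds` are hypothetical names for the earlier models' outputs; substitute whatever variables the notebook actually used:

```python
import pandas as pd
from sklearn.metrics import r2_score

# Collect test-set R² scores in one table; replace the *_preds names with your own variables.
scores = pd.DataFrame({
    'model': ['Linear Regression', 'Random Forest', 'SVR (GridSearchCV)',
              'GradientBoost', 'Dense Neural Network'],
    'r2_score': [r2_score(y_test, lr_preds),
                 r2_score(y_test, rf_preds),
                 r2_score(y_test, svr_preds),
                 r2_score(y_test, gb_preds),
                 r2_score(y_test, model_preds)],
}).sort_values('r2_score', ascending=False)
print(scores)
```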

Code and Dataset Download

For details, see the Personal Medical Expense Prediction project on VenusAI (aideeplearning.cn).
