Predicting Heart Disease with an RNN

This article is internal material from the 🔗365-day deep learning training camp.

Original author: K同学啊

I. Preparation

1. Importing the Data

import pandas as pd
from keras.optimizers import Adam
from matplotlib import pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from keras.models import Sequential
from keras.layers import Dense,SimpleRNN
import warnings
warnings.filterwarnings('ignore')

df = pd.read_csv('heart.csv')

2. Checking the Data

Check for missing values:

print(df.shape)
print(df.isnull().sum())
(303, 14)
age         0
sex         0
cp          0
trestbps    0
chol        0
fbs         0
restecg     0
thalach     0
exang       0
oldpeak     0
slope       0
ca          0
thal        0
target      0
dtype: int64

II. Data Preprocessing

1. Splitting the Data into Training and Test Sets

X = df.iloc[:,:-1]
y = df.iloc[:,-1]

X_train,X_test,y_train,y_test = train_test_split(X,y,test_size=0.1,random_state=14)

2. Standardizing the Data

sc = StandardScaler()
# Fit the scaler on the training set only, then apply the same statistics to the test set
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)
# Reshape to (samples, timesteps, features) so the SimpleRNN layer can consume the data
X_train = X_train.reshape(X_train.shape[0], X_train.shape[1], 1)
X_test = X_test.reshape(X_test.shape[0], X_test.shape[1], 1)
array([[[ 1.44626869],
        [ 0.54006172],
        [ 0.62321699],
        ...
        [ 1.09773445]],
       ...])

III. Building the RNN Model

model = Sequential()
model.add(SimpleRNN(200,input_shape=(X_train.shape[1],1),activation='relu'))
model.add(Dense(100,activation='relu'))
model.add(Dense(1,activation='sigmoid'))
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                Output Shape              Param #   
=================================================================
simple_rnn (SimpleRNN)      (None, 200)               40400     
dense (Dense)               (None, 100)               20100     
dense_1 (Dense)             (None, 1)                 101       
=================================================================
Total params: 60,601
Trainable params: 60,601
Non-trainable params: 0
_________________________________________________________________
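The parameter counts in the summary can be checked by hand: a SimpleRNN with n units and m input features per timestep has n*(n + m + 1) weights (recurrent kernel, input kernel, and bias), and a Dense layer with i inputs and o outputs has i*o + o. A quick sanity check:

```python
# SimpleRNN(200) on 1 feature per timestep: units * (units + features + bias)
rnn = 200 * (200 + 1 + 1)          # recurrent kernel 200*200, input kernel 200*1, bias 200
# Dense(100) after the 200-unit RNN: inputs*outputs + bias
d1 = 200 * 100 + 100
# Dense(1) output layer
d2 = 100 * 1 + 1
print(rnn, d1, d2, rnn + d1 + d2)  # 40400 20100 101 60601
```

The totals match the 40400, 20100, 101, and 60,601 reported by model.summary().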

IV. Compiling the Model

optimizer = Adam(learning_rate=1e-4)
# Binary cross-entropy suits this two-class task; use the optimizer defined above and track accuracy
model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=['accuracy'])

V. Training the Model

epochs = 100
history = model.fit(x=X_train, y=y_train, validation_data=(X_test,y_test), verbose=1, epochs=epochs, batch_size=128)

acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
Epoch 1/100
3/3 [==============================] - 1s 130ms/step - loss: 0.6872 - accuracy: 0.5551 - val_loss: 0.6884 - val_accuracy: 0.5806
Epoch 2/100
3/3 [==============================] - 0s 19ms/step - loss: 0.6763 - accuracy: 0.6250 - val_loss: 0.6848 - val_accuracy: 0.6129
Epoch 3/100
3/3 [==============================] - 0s 19ms/step - loss: 0.6660 - accuracy: 0.6912 - val_loss: 0.6814 - val_accuracy: 0.6452
Epoch 4/100
3/3 [==============================] - 0s 18ms/step - loss: 0.6562 - accuracy: 0.7426 - val_loss: 0.6781 - val_accuracy: 0.6452
Epoch 5/100
3/3 [==============================] - 0s 18ms/step - loss: 0.6467 - accuracy: 0.7647 - val_loss: 0.6751 - val_accuracy: 0.6129
Epoch 6/100
3/3 [==============================] - 0s 19ms/step - loss: 0.6375 - accuracy: 0.7941 - val_loss: 0.6722 - val_accuracy: 0.6452
Epoch 7/100
3/3 [==============================] - 0s 18ms/step - loss: 0.6285 - accuracy: 0.8051 - val_loss: 0.6694 - val_accuracy: 0.6129
Epoch 8/100
3/3 [==============================] - 0s 18ms/step - loss: 0.6193 - accuracy: 0.8015 - val_loss: 0.6666 - val_accuracy: 0.6129
Epoch 9/100
3/3 [==============================] - 0s 18ms/step - loss: 0.6094 - accuracy: 0.8125 - val_loss: 0.6635 - val_accuracy: 0.5806
Epoch 10/100
3/3 [==============================] - 0s 18ms/step - loss: 0.6002 - accuracy: 0.8162 - val_loss: 0.6602 - val_accuracy: 0.6129
Epoch 11/100
3/3 [==============================] - 0s 25ms/step - loss: 0.5903 - accuracy: 0.8125 - val_loss: 0.6565 - val_accuracy: 0.5806
Epoch 12/100
3/3 [==============================] - 0s 18ms/step - loss: 0.5795 - accuracy: 0.8125 - val_loss: 0.6526 - val_accuracy: 0.5806
Epoch 13/100
3/3 [==============================] - 0s 18ms/step - loss: 0.5686 - accuracy: 0.8125 - val_loss: 0.6484 - val_accuracy: 0.6129
Epoch 14/100
3/3 [==============================] - 0s 20ms/step - loss: 0.5571 - accuracy: 0.8125 - val_loss: 0.6436 - val_accuracy: 0.6452
Epoch 15/100
3/3 [==============================] - 0s 20ms/step - loss: 0.5451 - accuracy: 0.8125 - val_loss: 0.6377 - val_accuracy: 0.6452
Epoch 16/100
3/3 [==============================] - 0s 17ms/step - loss: 0.5322 - accuracy: 0.8125 - val_loss: 0.6315 - val_accuracy: 0.6452
Epoch 17/100
3/3 [==============================] - 0s 24ms/step - loss: 0.5190 - accuracy: 0.8199 - val_loss: 0.6251 - val_accuracy: 0.6452
Epoch 18/100
3/3 [==============================] - 0s 17ms/step - loss: 0.5053 - accuracy: 0.8199 - val_loss: 0.6190 - val_accuracy: 0.6774
Epoch 19/100
3/3 [==============================] - 0s 17ms/step - loss: 0.4910 - accuracy: 0.8162 - val_loss: 0.6132 - val_accuracy: 0.6774
Epoch 20/100
3/3 [==============================] - 0s 16ms/step - loss: 0.4765 - accuracy: 0.8199 - val_loss: 0.6076 - val_accuracy: 0.6774
Epoch 21/100
3/3 [==============================] - 0s 16ms/step - loss: 0.4616 - accuracy: 0.8235 - val_loss: 0.6007 - val_accuracy: 0.6774
Epoch 22/100
3/3 [==============================] - 0s 16ms/step - loss: 0.4470 - accuracy: 0.8125 - val_loss: 0.5943 - val_accuracy: 0.6774
Epoch 23/100
3/3 [==============================] - 0s 16ms/step - loss: 0.4345 - accuracy: 0.8162 - val_loss: 0.5906 - val_accuracy: 0.6774
Epoch 24/100
3/3 [==============================] - 0s 15ms/step - loss: 0.4219 - accuracy: 0.8162 - val_loss: 0.5901 - val_accuracy: 0.7419
Epoch 25/100
3/3 [==============================] - 0s 16ms/step - loss: 0.4116 - accuracy: 0.8162 - val_loss: 0.5921 - val_accuracy: 0.7742
Epoch 26/100
3/3 [==============================] - 0s 16ms/step - loss: 0.4056 - accuracy: 0.8272 - val_loss: 0.5990 - val_accuracy: 0.7419
Epoch 27/100
3/3 [==============================] - 0s 15ms/step - loss: 0.3983 - accuracy: 0.8309 - val_loss: 0.5970 - val_accuracy: 0.7097
Epoch 28/100
3/3 [==============================] - 0s 15ms/step - loss: 0.3920 - accuracy: 0.8309 - val_loss: 0.5914 - val_accuracy: 0.7097
Epoch 29/100
3/3 [==============================] - 0s 15ms/step - loss: 0.3860 - accuracy: 0.8235 - val_loss: 0.5863 - val_accuracy: 0.7097
Epoch 30/100
3/3 [==============================] - 0s 17ms/step - loss: 0.3802 - accuracy: 0.8235 - val_loss: 0.5724 - val_accuracy: 0.7097
Epoch 31/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3757 - accuracy: 0.8346 - val_loss: 0.5572 - val_accuracy: 0.7419
Epoch 32/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3766 - accuracy: 0.8272 - val_loss: 0.5545 - val_accuracy: 0.7419
Epoch 33/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3706 - accuracy: 0.8272 - val_loss: 0.5608 - val_accuracy: 0.7419
Epoch 34/100
3/3 [==============================] - 0s 17ms/step - loss: 0.3639 - accuracy: 0.8382 - val_loss: 0.5899 - val_accuracy: 0.7419
Epoch 35/100
3/3 [==============================] - 0s 16ms/step - loss: 0.3694 - accuracy: 0.8272 - val_loss: 0.6097 - val_accuracy: 0.7742
Epoch 36/100
3/3 [==============================] - 0s 16ms/step - loss: 0.3682 - accuracy: 0.8346 - val_loss: 0.5859 - val_accuracy: 0.7419
Epoch 37/100
3/3 [==============================] - 0s 17ms/step - loss: 0.3567 - accuracy: 0.8309 - val_loss: 0.5680 - val_accuracy: 0.7419
Epoch 38/100
3/3 [==============================] - 0s 17ms/step - loss: 0.3497 - accuracy: 0.8419 - val_loss: 0.5528 - val_accuracy: 0.7419
Epoch 39/100
3/3 [==============================] - 0s 16ms/step - loss: 0.3484 - accuracy: 0.8603 - val_loss: 0.5417 - val_accuracy: 0.7742
Epoch 40/100
3/3 [==============================] - 0s 22ms/step - loss: 0.3487 - accuracy: 0.8603 - val_loss: 0.5386 - val_accuracy: 0.6774
Epoch 41/100
3/3 [==============================] - 0s 22ms/step - loss: 0.3473 - accuracy: 0.8640 - val_loss: 0.5383 - val_accuracy: 0.7097
Epoch 42/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3422 - accuracy: 0.8676 - val_loss: 0.5425 - val_accuracy: 0.7742
Epoch 43/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3353 - accuracy: 0.8713 - val_loss: 0.5467 - val_accuracy: 0.7419
Epoch 44/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3318 - accuracy: 0.8787 - val_loss: 0.5565 - val_accuracy: 0.7419
Epoch 45/100
3/3 [==============================] - 0s 17ms/step - loss: 0.3289 - accuracy: 0.8750 - val_loss: 0.5572 - val_accuracy: 0.7419
Epoch 46/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3263 - accuracy: 0.8750 - val_loss: 0.5548 - val_accuracy: 0.7419
Epoch 47/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3227 - accuracy: 0.8787 - val_loss: 0.5520 - val_accuracy: 0.7419
Epoch 48/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3191 - accuracy: 0.8824 - val_loss: 0.5564 - val_accuracy: 0.7419
Epoch 49/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3172 - accuracy: 0.8713 - val_loss: 0.5539 - val_accuracy: 0.7419
Epoch 50/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3149 - accuracy: 0.8824 - val_loss: 0.5381 - val_accuracy: 0.7419
Epoch 51/100
3/3 [==============================] - 0s 17ms/step - loss: 0.3110 - accuracy: 0.8824 - val_loss: 0.5427 - val_accuracy: 0.7419
Epoch 52/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3084 - accuracy: 0.8787 - val_loss: 0.5510 - val_accuracy: 0.7419
Epoch 53/100
3/3 [==============================] - 0s 17ms/step - loss: 0.3069 - accuracy: 0.8750 - val_loss: 0.5571 - val_accuracy: 0.7419
Epoch 54/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3052 - accuracy: 0.8860 - val_loss: 0.5468 - val_accuracy: 0.7419
Epoch 55/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3024 - accuracy: 0.8787 - val_loss: 0.5347 - val_accuracy: 0.7419
Epoch 56/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3010 - accuracy: 0.8787 - val_loss: 0.5417 - val_accuracy: 0.7419
Epoch 57/100
3/3 [==============================] - 0s 21ms/step - loss: 0.3013 - accuracy: 0.8860 - val_loss: 0.5496 - val_accuracy: 0.7419
Epoch 58/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2975 - accuracy: 0.8824 - val_loss: 0.5355 - val_accuracy: 0.7419
Epoch 59/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2954 - accuracy: 0.8787 - val_loss: 0.5198 - val_accuracy: 0.7419
Epoch 60/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2970 - accuracy: 0.8787 - val_loss: 0.5148 - val_accuracy: 0.7419
Epoch 61/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2991 - accuracy: 0.8824 - val_loss: 0.5187 - val_accuracy: 0.7419
Epoch 62/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2958 - accuracy: 0.8787 - val_loss: 0.5376 - val_accuracy: 0.7419
Epoch 63/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2891 - accuracy: 0.8860 - val_loss: 0.5659 - val_accuracy: 0.7419
Epoch 64/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2923 - accuracy: 0.8824 - val_loss: 0.5777 - val_accuracy: 0.7419
Epoch 65/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2892 - accuracy: 0.8824 - val_loss: 0.5560 - val_accuracy: 0.7419
Epoch 66/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2848 - accuracy: 0.8934 - val_loss: 0.5405 - val_accuracy: 0.7419
Epoch 67/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2828 - accuracy: 0.8897 - val_loss: 0.5334 - val_accuracy: 0.7419
Epoch 68/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2810 - accuracy: 0.8934 - val_loss: 0.5332 - val_accuracy: 0.7419
Epoch 69/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2792 - accuracy: 0.8934 - val_loss: 0.5307 - val_accuracy: 0.7419
Epoch 70/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2780 - accuracy: 0.8934 - val_loss: 0.5370 - val_accuracy: 0.7419
Epoch 71/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2763 - accuracy: 0.8934 - val_loss: 0.5459 - val_accuracy: 0.7419
Epoch 72/100
3/3 [==============================] - 0s 21ms/step - loss: 0.2762 - accuracy: 0.8971 - val_loss: 0.5583 - val_accuracy: 0.7419
Epoch 73/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2759 - accuracy: 0.8971 - val_loss: 0.5676 - val_accuracy: 0.7419
Epoch 74/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2764 - accuracy: 0.8934 - val_loss: 0.5715 - val_accuracy: 0.7419
Epoch 75/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2747 - accuracy: 0.8934 - val_loss: 0.5540 - val_accuracy: 0.7419
Epoch 76/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2701 - accuracy: 0.8971 - val_loss: 0.5387 - val_accuracy: 0.7419
Epoch 77/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2689 - accuracy: 0.9044 - val_loss: 0.5308 - val_accuracy: 0.7419
Epoch 78/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2701 - accuracy: 0.9081 - val_loss: 0.5241 - val_accuracy: 0.7097
Epoch 79/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2716 - accuracy: 0.9007 - val_loss: 0.5241 - val_accuracy: 0.7097
Epoch 80/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2690 - accuracy: 0.9007 - val_loss: 0.5332 - val_accuracy: 0.7097
Epoch 81/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2650 - accuracy: 0.9154 - val_loss: 0.5418 - val_accuracy: 0.7419
Epoch 82/100
3/3 [==============================] - 0s 15ms/step - loss: 0.2631 - accuracy: 0.9118 - val_loss: 0.5434 - val_accuracy: 0.7419
Epoch 83/100
3/3 [==============================] - 0s 16ms/step - loss: 0.2620 - accuracy: 0.9154 - val_loss: 0.5406 - val_accuracy: 0.7419
Epoch 84/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2603 - accuracy: 0.9154 - val_loss: 0.5395 - val_accuracy: 0.7419
Epoch 85/100
3/3 [==============================] - 0s 26ms/step - loss: 0.2588 - accuracy: 0.9154 - val_loss: 0.5497 - val_accuracy: 0.7419
Epoch 86/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2562 - accuracy: 0.9081 - val_loss: 0.5687 - val_accuracy: 0.7419
Epoch 87/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2609 - accuracy: 0.8971 - val_loss: 0.5754 - val_accuracy: 0.7419
Epoch 88/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2569 - accuracy: 0.8971 - val_loss: 0.5555 - val_accuracy: 0.7419
Epoch 89/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2532 - accuracy: 0.9081 - val_loss: 0.5399 - val_accuracy: 0.7419
Epoch 90/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2545 - accuracy: 0.9191 - val_loss: 0.5361 - val_accuracy: 0.7419
Epoch 91/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2578 - accuracy: 0.9118 - val_loss: 0.5375 - val_accuracy: 0.7419
Epoch 92/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2572 - accuracy: 0.9118 - val_loss: 0.5507 - val_accuracy: 0.7419
Epoch 93/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2516 - accuracy: 0.9118 - val_loss: 0.5715 - val_accuracy: 0.7419
Epoch 94/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2487 - accuracy: 0.9118 - val_loss: 0.5705 - val_accuracy: 0.7419
Epoch 95/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2464 - accuracy: 0.9118 - val_loss: 0.5551 - val_accuracy: 0.7419
Epoch 96/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2454 - accuracy: 0.9191 - val_loss: 0.5480 - val_accuracy: 0.7419
Epoch 97/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2438 - accuracy: 0.9154 - val_loss: 0.5543 - val_accuracy: 0.7419
Epoch 98/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2447 - accuracy: 0.9118 - val_loss: 0.5534 - val_accuracy: 0.7419
Epoch 99/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2446 - accuracy: 0.9118 - val_loss: 0.5425 - val_accuracy: 0.7419
Epoch 100/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2434 - accuracy: 0.9118 - val_loss: 0.5213 - val_accuracy: 0.7742
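Behind the accuracy numbers in the log, the sigmoid output is a probability that Keras thresholds at 0.5 to produce a class label. A small NumPy sketch of that conversion, using made-up probabilities rather than the model's actual predictions:

```python
import numpy as np

# Hypothetical sigmoid outputs and true labels (NOT the model's real predictions)
probs = np.array([0.91, 0.12, 0.55, 0.40, 0.78])
y_true = np.array([1, 0, 1, 1, 1])

# Threshold at 0.5 to turn probabilities into class labels, then compare
y_pred = (probs >= 0.5).astype(int)
accuracy = (y_pred == y_true).mean()
print(y_pred, accuracy)  # [1 0 1 0 1] 0.8
```

The same thresholding applies when calling model.predict on new patients: values above 0.5 are read as "heart disease", values below as "no heart disease".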

VI. Visualizing the Results

epochs_range = range(epochs)
plt.figure(figsize=(14,4))
plt.subplot(1,2,1)
plt.plot(epochs_range,acc,label='training accuracy')
plt.plot(epochs_range,val_acc,label='validation accuracy')
plt.legend(loc='lower right')
plt.title('training and validation accuracy')

plt.subplot(1,2,2)
plt.plot(epochs_range,loss,label='training loss')
plt.plot(epochs_range,val_loss,label='validation loss')
plt.legend(loc='upper right')
plt.title('training and validation loss')
plt.show()

Summary:

1. Model input requirements
Many deep learning models, RNNs and LSTMs in particular, expect input of shape (samples, timesteps, features). This three-dimensional layout is what lets the model process sequences and learn dependencies across time.
2. Original data shape
After standardization, X_train and X_test have shape (samples, features). For example, if X_train holds 100 samples with 10 features each, its shape is (100, 10).
3. Purpose of the reshape
X_train.reshape(X_train.shape[0], X_train.shape[1], 1) changes the shape to (samples, features, 1). The trailing 1 is the feature dimension: each original feature is treated as one timestep carrying a single value. An array of shape (100, 10) becomes (100, 10, 1), i.e. 100 samples, each a sequence of 10 timesteps with 1 feature per step.
4. Matching the model architecture
With this reshape, the RNN can consume the data correctly and capture patterns in how values vary along the sequence.
