SHAP (6): Credit Card Fraud Detection with XGBoost and HyperOpt


This notebook walks through an implementation of an XGBoost classifier for the financial industry, specifically credit card fraud detection. After the XGBoost classifier is built, the HyperOpt library (an alternative to sklearn's GridSearchCV and RandomizedSearchCV) is used to tune various model parameters, with the goal of maximizing the f1-score for classifying normal versus fraudulent transactions. As part of model evaluation, the f1-score is computed, a confusion matrix is constructed, a classification report is generated, and a precision-recall curve is plotted. Finally, feature importances are computed and plotted using both XGBoost's built-in algorithm and the SHAP implementation of feature importance.

Source: https://github.com/albazahm/Credit_Card_Fraud_Detection_with_XGBoost_and_HyperOpt/tree/master

1. Loading Libraries and Data

#loading libraries
import numpy as np
import pandas as pd 
import matplotlib.pyplot as plt
from sklearn.metrics import f1_score, make_scorer, confusion_matrix, classification_report, precision_recall_curve, plot_precision_recall_curve, average_precision_score, auc
from sklearn.model_selection import train_test_split
import seaborn as sns
from hyperopt import hp, fmin, tpe, Trials, STATUS_OK
import xgboost as xgb
import shap
# Any results you write to the current directory are saved as output.
#loading the data into a dataframe
credit_df = pd.read_csv('./creditcard.csv')

2. Data Overview

#preview of the first 10 rows of data
credit_df.head(10)
(output: the first 10 rows of the DataFrame, showing columns Time, V1-V28, Amount and Class; 10 rows × 31 columns)

#displaying descriptive statistics
credit_df.describe()
(output: descriptive statistics (count, mean, std, min, 25%, 50%, 75%, max) for all 31 columns; 8 rows × 31 columns. Of note, the mean of Class is about 0.0017, so roughly 0.17% of the 284,807 transactions are fraudulent, and Amount ranges from 0 to 25,691.16)

#exploring datatypes and count of non-NULL rows for each feature
credit_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 284807 entries, 0 to 284806
Data columns (total 31 columns):
Time      284807 non-null float64
V1        284807 non-null float64
V2        284807 non-null float64
V3        284807 non-null float64
V4        284807 non-null float64
V5        284807 non-null float64
V6        284807 non-null float64
V7        284807 non-null float64
V8        284807 non-null float64
V9        284807 non-null float64
V10       284807 non-null float64
V11       284807 non-null float64
V12       284807 non-null float64
V13       284807 non-null float64
V14       284807 non-null float64
V15       284807 non-null float64
V16       284807 non-null float64
V17       284807 non-null float64
V18       284807 non-null float64
V19       284807 non-null float64
V20       284807 non-null float64
V21       284807 non-null float64
V22       284807 non-null float64
V23       284807 non-null float64
V24       284807 non-null float64
V25       284807 non-null float64
V26       284807 non-null float64
V27       284807 non-null float64
V28       284807 non-null float64
Amount    284807 non-null float64
Class     284807 non-null int64
dtypes: float64(30), int64(1)
memory usage: 67.4 MB

3. Data Preparation

Here we find and remove duplicate observations in the data, define the independent (X) and dependent (Y) variables for classification, and split off a validation set and a test set.

#checking for duplicated observations
credit_df.duplicated().value_counts()
False    283726
True       1081
dtype: int64
#dropping duplicated observations
credit_df = credit_df.drop_duplicates()
#defining independent (X) and dependent (Y) variables from dataframe
X = credit_df.drop(columns = 'Class')
Y = credit_df['Class'].values
#splitting a testing set from the data
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size = 0.20, stratify = Y, random_state = 42)
#splitting a validation set from the training set to tune parameters
X_train, X_val, Y_train, Y_val = train_test_split(X_train, Y_train, test_size = 0.20, stratify = Y_train, random_state = 42)
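
Fraudulent transactions account for only about 0.17% of the observations (mean of Class ≈ 0.0017 in the summary statistics above), which is why the f1-score rather than accuracy is used throughout. As an optional sanity check, the stratified splits can be verified to preserve that class ratio; a minimal sketch:

#optional check: stratification should keep the fraud rate nearly identical across splits
for name, y in [('train', Y_train), ('val', Y_val), ('test', Y_test)]:
    print('{}: {} rows, fraud rate = {:.4%}'.format(name, len(y), y.mean()))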

4. Model Set-Up and Training

In this section we create a scorer based on the f1 metric and define the parameter search space for the XGBoost model. We also define a function that wraps the classifier, extracts its predictions, computes the loss, and returns it to the optimizer. Finally, we initialize the optimizer with the desired settings, run it, and inspect the parameters and scores across the trials.

#creating a scorer from the f1-score metric
f1_scorer = make_scorer(f1_score)
# defining the space for hyperparameter tuning
space = {'eta': hp.uniform("eta", 0.1, 1),
        'max_depth': hp.quniform("max_depth", 3, 18, 1),
        'gamma': hp.uniform ('gamma', 1,9),
        'reg_alpha' : hp.quniform('reg_alpha', 50, 200, 1),
        'reg_lambda' : hp.uniform('reg_lambda', 0, 1),
        'colsample_bytree' : hp.uniform('colsample_bytree', 0.5, 1),
        'min_child_weight' : hp.quniform('min_child_weight', 0, 10, 1),
        'n_estimators': hp.quniform('n_estimators', 100, 200, 10)
        }
#defining function to optimize
def hyperparameter_tuning(space):
    clf = xgb.XGBClassifier(n_estimators = int(space['n_estimators']),       #number of trees to use
                            eta = space['eta'],                              #learning rate
                            max_depth = int(space['max_depth']),             #depth of trees
                            gamma = space['gamma'],                          #loss reduction required to further partition tree
                            reg_alpha = int(space['reg_alpha']),             #L1 regularization for weights
                            reg_lambda = space['reg_lambda'],                #L2 regularization for weights
                            min_child_weight = space['min_child_weight'],    #minimum sum of instance weight needed in child
                            colsample_bytree = space['colsample_bytree'],    #ratio of column sampling for each tree
                            nthread = -1)                                    #number of parallel threads used
    
    evaluation = [(X_train, Y_train), (X_val, Y_val)]
    
    clf.fit(X_train, Y_train,
            eval_set = evaluation,
            early_stopping_rounds = 10,
            verbose = False)

    pred = clf.predict(X_val)                        #predict() returns class labels (0/1)
    pred = [1 if i >= 0.5 else 0 for i in pred]      #thresholding at 0.5 (a pass-through for label output)
    f1 = f1_score(Y_val, pred)
    print ("SCORE:", f1)
    return {'loss': -f1, 'status': STATUS_OK }
# run the hyperparameter tuning
trials = Trials()
best = fmin(fn = hyperparameter_tuning,
            space = space,
            algo = tpe.suggest,
            max_evals = 100,
            trials = trials)

print (best)
SCORE: 0.7552447552447553
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.8169014084507042
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.6666666666666666
SCORE: 0.7737226277372262
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.8169014084507042
SCORE: 0.8169014084507042
SCORE: 0.8169014084507042
SCORE: 0.7891156462585034
SCORE: 0.7401574803149605
SCORE: 0.7737226277372262
SCORE: 0.7971014492753624
SCORE: 0.7499999999999999
SCORE: 0.0
SCORE: 0.7552447552447553
SCORE: 0.0
SCORE: 0.7883211678832117
SCORE: 0.7891156462585034
SCORE: 0.7737226277372262
SCORE: 0.782608695652174
SCORE: 0.8055555555555555
SCORE: 0.7401574803149605
SCORE: 0.0
SCORE: 0.0
SCORE: 0.7552447552447553
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.7737226277372262
SCORE: 0.7499999999999999
SCORE: 0.0
SCORE: 0.8085106382978723
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.7401574803149605
SCORE: 0.0
SCORE: 0.7972972972972973
SCORE: 0.608695652173913
SCORE: 0.7552447552447553
SCORE: 0.0
SCORE: 0.0
SCORE: 0.7384615384615385
SCORE: 0.8169014084507042
SCORE: 0.802919708029197
SCORE: 0.8169014084507042
SCORE: 0.8201438848920864
SCORE: 0.8201438848920864
SCORE: 0.8201438848920864
SCORE: 0.8085106382978723
SCORE: 0.8169014084507042
SCORE: 0.8085106382978723
SCORE: 0.7910447761194029
SCORE: 0.0
SCORE: 0.7819548872180451
SCORE: 0.802919708029197
SCORE: 0.8085106382978723
SCORE: 0.8169014084507042
SCORE: 0.7910447761194029
SCORE: 0.7910447761194029
SCORE: 0.0
SCORE: 0.0
SCORE: 0.0
SCORE: 0.7999999999999999
SCORE: 0.8085106382978723
SCORE: 0.8169014084507042
SCORE: 0.7692307692307692
SCORE: 0.7999999999999999
SCORE: 0.0
SCORE: 0.7737226277372262
SCORE: 0.0
SCORE: 0.0
SCORE: 0.7301587301587301
SCORE: 0.7786259541984732
SCORE: 0.7878787878787878
SCORE: 0.0
SCORE: 0.7878787878787878
SCORE: 0.7692307692307692
SCORE: 0.0
SCORE: 0.7499999999999999
SCORE: 0.8169014084507042
SCORE: 0.7910447761194029
100%|██████████| 100/100 [11:24<00:00,  6.84s/trial, best loss: -0.8201438848920864]
{'colsample_bytree': 0.9999995803500363, 'eta': 0.1316102455832729, 'gamma': 1.6313395777817137, 'max_depth': 5.0, 'min_child_weight': 3.0, 'n_estimators': 100.0, 'reg_alpha': 47.0, 'reg_lambda': 0.4901343161108276}
#plotting feature space and f1-scores for the different trials
parameters = space.keys()
cols = len(parameters)

f, axes = plt.subplots(nrows=1, ncols=cols, figsize=(20,5))
cmap = plt.cm.jet
for i, val in enumerate(parameters):
    xs = np.array([t['misc']['vals'][val] for t in trials.trials]).ravel()
    ys = [-t['result']['loss'] for t in trials.trials]
    xs, ys = zip(*sorted(zip(xs, ys)))
    axes[i].scatter(xs, ys, s=20, linewidth=0.01, alpha=0.25, c=cmap(float(i)/len(parameters)))
    axes[i].set_title(val)
    axes[i].grid()

(figure: scatter plots of the f1-score against each sampled hyperparameter value across the 100 trials)

#printing best model parameters
print(best)
{'colsample_bytree': 0.9999995803500363, 'eta': 0.1316102455832729, 'gamma': 1.6313395777817137, 'max_depth': 5.0, 'min_child_weight': 3.0, 'n_estimators': 100.0, 'reg_alpha': 47.0, 'reg_lambda': 0.4901343161108276}
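
Note that fmin returns the raw sampled values, so parameters drawn with hp.quniform come back as floats; the integer-valued ones (n_estimators, max_depth, reg_alpha) are cast to int before being passed to XGBoost in the next section. HyperOpt also provides space_eval to resolve the fmin result against the search space (mainly useful when the space contains hp.choice entries, which fmin reports as indices); a minimal sketch:

#optional: map the raw fmin result back onto the search space
from hyperopt import space_eval

best_params = space_eval(space, best)
print(best_params)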

5. Model Test and Evaluation

This section explores and visualizes the model's performance on the test data.

#initializing XGBoost Classifier with best model parameters
best_clf = xgb.XGBClassifier(n_estimators = int(best['n_estimators']), 
                            eta = best['eta'], 
                            max_depth = int(best['max_depth']), 
                            gamma = best['gamma'], 
                            reg_alpha = int(best['reg_alpha']), 
                            min_child_weight = best['min_child_weight'], 
                            colsample_bytree = best['colsample_bytree'], 
                            nthread = -1)
#fitting XGBoost Classifier with best model parameters to training data
best_clf.fit(X_train, Y_train)
XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
              colsample_bynode=1, colsample_bytree=0.9999995803500363,
              eta=0.1316102455832729, gamma=1.6313395777817137,
              learning_rate=0.1, max_delta_step=0, max_depth=5,
              min_child_weight=3.0, missing=None, n_estimators=100, n_jobs=1,
              nthread=-1, objective='binary:logistic', random_state=0,
              reg_alpha=47, reg_lambda=1, scale_pos_weight=1, seed=None,
              silent=None, subsample=1, verbosity=1)
#using the model to predict on the test set
Y_pred = best_clf.predict(X_test)
#printing f1 score of test set predictions
print('The f1-score on the test data is: {0:.2f}'.format(f1_score(Y_test, Y_pred)))
The f1-score on the test data is: 0.74
#creating a confusion matrix and labels
cm = confusion_matrix(Y_test, Y_pred)
labels = ['Normal', 'Fraud']
#plotting the confusion matrix
sns.heatmap(cm, annot = True, xticklabels = labels, yticklabels = labels, fmt = 'd')
plt.xlabel('Predicted')
plt.ylabel('Actual')
plt.title('Confusion Matrix for Credit Card Fraud Detection')
Text(0.5, 1.0, 'Confusion Matrix for Credit Card Fraud Detection')

(figure: confusion matrix heatmap for credit card fraud detection)

#printing classification report
print(classification_report(Y_test, Y_pred))
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56651
           1       0.87      0.64      0.74        95

    accuracy                           1.00     56746
   macro avg       0.94      0.82      0.87     56746
weighted avg       1.00      1.00      1.00     56746
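For reference, the fraud-class precision and recall reported above can be recovered directly from the confusion matrix computed earlier; a minimal sketch:

#deriving fraud-class precision, recall and f1 from the confusion matrix
tn, fp, fn, tp = cm.ravel()
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print('precision = {:.2f}, recall = {:.2f}, f1 = {:.2f}'.format(precision, recall, f1))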
Y_score = best_clf.predict_proba(X_test)[:, 1]
average_precision = average_precision_score(Y_test, Y_score)
fig = plot_precision_recall_curve(best_clf, X_test, Y_test)
fig.ax_.set_title('Precision-Recall Curve: AP={0:.2f}'.format(average_precision))
Text(0.5, 1.0, 'Precision-Recall Curve: AP=0.74')

(figure: precision-recall curve, AP = 0.74)
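
Note that plot_precision_recall_curve was deprecated in scikit-learn 1.0 and removed in 1.2. On newer versions, an equivalent plot can be produced with PrecisionRecallDisplay; a minimal sketch, assuming scikit-learn >= 1.0:

#equivalent precision-recall plot on newer scikit-learn versions
from sklearn.metrics import PrecisionRecallDisplay

disp = PrecisionRecallDisplay.from_estimator(best_clf, X_test, Y_test)
disp.ax_.set_title('Precision-Recall Curve: AP={0:.2f}'.format(average_precision))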

6. Feature Importances

This section presents two algorithms for visualizing feature importance, one from XGBoost and one from SHAP. Unfortunately, because the features of this dataset were encoded with principal component analysis (PCA), we cannot intuitively interpret how the model distinguishes normal from fraudulent transactions in real-world terms.

#extracting the booster from model
booster = best_clf.get_booster()

# scoring features based on information gain
importance = booster.get_score(importance_type = "gain")

#rounding importances to 2 decimal places
for key in importance.keys():
    importance[key] = round(importance[key],2)

# plotting feature importances
ax = xgb.plot_importance(importance, importance_type='gain', show_values=True)
plt.title('Feature Importances (Gain)')
plt.show()

(figure: XGBoost feature importances by gain)

#obtaining SHAP values for XGBoost Model
explainer = shap.TreeExplainer(best_clf)
shap_values = explainer.shap_values(X_train)
#plotting SHAP Values of Feature Importances
shap.summary_plot(shap_values, X_train)

(figure: SHAP summary plot of feature importances)
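
Beyond the beeswarm summary above, the same SHAP values can be viewed in other ways, for example a bar-style summary of mean absolute SHAP values, or a dependence plot for a single feature; a minimal sketch (the choice of V14 is purely illustrative):

#bar plot of mean(|SHAP value|) per feature
shap.summary_plot(shap_values, X_train, plot_type='bar')
#dependence plot for a single, illustrative feature
shap.dependence_plot('V14', shap_values, X_train)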
