[Python Machine Learning] Lab 04 (1): Multi-class Classification with Logistic Regression in Practice


Table of Contents

  • Multi-class classification and machine learning practice
    • How to classify multiple categories
      • 1.1 Data preprocessing
      • 1.2 Preparing the training data
      • 1.3 Define the hypothesis function, cost function, and gradient descent algorithm (copied from Lab 3)
      • 1.4 Run gradient descent to learn the parameters of the three classifiers
      • 1.5 Making predictions with the models
      • 1.6 Evaluating the model
      • 1.7 Trying sklearn
    • Lab 4 (1): Complete your first multi-class classification problem yourself. Good luck! Fill in the code below
      • 2.1 Reading the data
      • 2.2 Preparing the training data
      • 2.3 Define the hypothesis function, cost function, and gradient descent algorithm
      • 2.4 Train the four classifiers
      • 2.5 Making predictions with the models
      • 2.6 Computing the accuracy

Multi-class classification and machine learning practice

How to classify multiple categories

The Iris dataset is a classic benchmark for classification experiments, collected and organized by Fisher in 1936. Also known as the iris flower dataset, it is a multivariate dataset containing 150 samples in 3 classes, 50 per class, with 4 attributes per sample. The four attributes (sepal length, sepal width, petal length, petal width) can be used to predict which of the three species (Setosa, Versicolour, Virginica) an iris flower belongs to.

The dataset uses iris flower measurements as its features and is commonly used in classification tasks. It consists of 50 samples for each of 3 different iris species. One species is linearly separable from the other two; the other two are not linearly separable from each other.

The dataset contains 4 attributes:
Sepal.Length (sepal length), in cm;
Sepal.Width (sepal width), in cm;
Petal.Length (petal length), in cm;
Petal.Width (petal width), in cm.

Species: Iris Setosa, Iris Versicolour, and Iris Virginica.

1.1 Data preprocessing

import sklearn.datasets as datasets
import pandas as pd
import numpy as np
data=datasets.load_iris()
data
{'data': array([[5.1, 3.5, 1.4, 0.2],
        [4.9, 3. , 1.4, 0.2],
        [4.7, 3.2, 1.3, 0.2],
        [4.6, 3.1, 1.5, 0.2],
        [5. , 3.6, 1.4, 0.2],
        [5.4, 3.9, 1.7, 0.4],
        [4.6, 3.4, 1.4, 0.3],
        [5. , 3.4, 1.5, 0.2],
        [4.4, 2.9, 1.4, 0.2],
        [4.9, 3.1, 1.5, 0.1],
        [5.4, 3.7, 1.5, 0.2],
        [4.8, 3.4, 1.6, 0.2],
        [4.8, 3. , 1.4, 0.1],
        [4.3, 3. , 1.1, 0.1],
        [5.8, 4. , 1.2, 0.2],
        [5.7, 4.4, 1.5, 0.4],
        [5.4, 3.9, 1.3, 0.4],
        [5.1, 3.5, 1.4, 0.3],
        [5.7, 3.8, 1.7, 0.3],
        [5.1, 3.8, 1.5, 0.3],
        [5.4, 3.4, 1.7, 0.2],
        [5.1, 3.7, 1.5, 0.4],
        [4.6, 3.6, 1. , 0.2],
        [5.1, 3.3, 1.7, 0.5],
        [4.8, 3.4, 1.9, 0.2],
        [5. , 3. , 1.6, 0.2],
        [5. , 3.4, 1.6, 0.4],
        [5.2, 3.5, 1.5, 0.2],
        [5.2, 3.4, 1.4, 0.2],
        [4.7, 3.2, 1.6, 0.2],
        [4.8, 3.1, 1.6, 0.2],
        [5.4, 3.4, 1.5, 0.4],
        [5.2, 4.1, 1.5, 0.1],
        [5.5, 4.2, 1.4, 0.2],
        [4.9, 3.1, 1.5, 0.2],
        [5. , 3.2, 1.2, 0.2],
        [5.5, 3.5, 1.3, 0.2],
        [4.9, 3.6, 1.4, 0.1],
        [4.4, 3. , 1.3, 0.2],
        [5.1, 3.4, 1.5, 0.2],
        [5. , 3.5, 1.3, 0.3],
        [4.5, 2.3, 1.3, 0.3],
        [4.4, 3.2, 1.3, 0.2],
        [5. , 3.5, 1.6, 0.6],
        [5.1, 3.8, 1.9, 0.4],
        [4.8, 3. , 1.4, 0.3],
        [5.1, 3.8, 1.6, 0.2],
        [4.6, 3.2, 1.4, 0.2],
        [5.3, 3.7, 1.5, 0.2],
        [5. , 3.3, 1.4, 0.2],
        [7. , 3.2, 4.7, 1.4],
        [6.4, 3.2, 4.5, 1.5],
        [6.9, 3.1, 4.9, 1.5],
        [5.5, 2.3, 4. , 1.3],
        [6.5, 2.8, 4.6, 1.5],
        [5.7, 2.8, 4.5, 1.3],
        [6.3, 3.3, 4.7, 1.6],
        [4.9, 2.4, 3.3, 1. ],
        [6.6, 2.9, 4.6, 1.3],
        [5.2, 2.7, 3.9, 1.4],
        [5. , 2. , 3.5, 1. ],
        [5.9, 3. , 4.2, 1.5],
        [6. , 2.2, 4. , 1. ],
        [6.1, 2.9, 4.7, 1.4],
        [5.6, 2.9, 3.6, 1.3],
        [6.7, 3.1, 4.4, 1.4],
        [5.6, 3. , 4.5, 1.5],
        [5.8, 2.7, 4.1, 1. ],
        [6.2, 2.2, 4.5, 1.5],
        [5.6, 2.5, 3.9, 1.1],
        [5.9, 3.2, 4.8, 1.8],
        [6.1, 2.8, 4. , 1.3],
        [6.3, 2.5, 4.9, 1.5],
        [6.1, 2.8, 4.7, 1.2],
        [6.4, 2.9, 4.3, 1.3],
        [6.6, 3. , 4.4, 1.4],
        [6.8, 2.8, 4.8, 1.4],
        [6.7, 3. , 5. , 1.7],
        [6. , 2.9, 4.5, 1.5],
        [5.7, 2.6, 3.5, 1. ],
        [5.5, 2.4, 3.8, 1.1],
        [5.5, 2.4, 3.7, 1. ],
        [5.8, 2.7, 3.9, 1.2],
        [6. , 2.7, 5.1, 1.6],
        [5.4, 3. , 4.5, 1.5],
        [6. , 3.4, 4.5, 1.6],
        [6.7, 3.1, 4.7, 1.5],
        [6.3, 2.3, 4.4, 1.3],
        [5.6, 3. , 4.1, 1.3],
        [5.5, 2.5, 4. , 1.3],
        [5.5, 2.6, 4.4, 1.2],
        [6.1, 3. , 4.6, 1.4],
        [5.8, 2.6, 4. , 1.2],
        [5. , 2.3, 3.3, 1. ],
        [5.6, 2.7, 4.2, 1.3],
        [5.7, 3. , 4.2, 1.2],
        [5.7, 2.9, 4.2, 1.3],
        [6.2, 2.9, 4.3, 1.3],
        [5.1, 2.5, 3. , 1.1],
        [5.7, 2.8, 4.1, 1.3],
        [6.3, 3.3, 6. , 2.5],
        [5.8, 2.7, 5.1, 1.9],
        [7.1, 3. , 5.9, 2.1],
        [6.3, 2.9, 5.6, 1.8],
        [6.5, 3. , 5.8, 2.2],
        [7.6, 3. , 6.6, 2.1],
        [4.9, 2.5, 4.5, 1.7],
        [7.3, 2.9, 6.3, 1.8],
        [6.7, 2.5, 5.8, 1.8],
        [7.2, 3.6, 6.1, 2.5],
        [6.5, 3.2, 5.1, 2. ],
        [6.4, 2.7, 5.3, 1.9],
        [6.8, 3. , 5.5, 2.1],
        [5.7, 2.5, 5. , 2. ],
        [5.8, 2.8, 5.1, 2.4],
        [6.4, 3.2, 5.3, 2.3],
        [6.5, 3. , 5.5, 1.8],
        [7.7, 3.8, 6.7, 2.2],
        [7.7, 2.6, 6.9, 2.3],
        [6. , 2.2, 5. , 1.5],
        [6.9, 3.2, 5.7, 2.3],
        [5.6, 2.8, 4.9, 2. ],
        [7.7, 2.8, 6.7, 2. ],
        [6.3, 2.7, 4.9, 1.8],
        [6.7, 3.3, 5.7, 2.1],
        [7.2, 3.2, 6. , 1.8],
        [6.2, 2.8, 4.8, 1.8],
        [6.1, 3. , 4.9, 1.8],
        [6.4, 2.8, 5.6, 2.1],
        [7.2, 3. , 5.8, 1.6],
        [7.4, 2.8, 6.1, 1.9],
        [7.9, 3.8, 6.4, 2. ],
        [6.4, 2.8, 5.6, 2.2],
        [6.3, 2.8, 5.1, 1.5],
        [6.1, 2.6, 5.6, 1.4],
        [7.7, 3. , 6.1, 2.3],
        [6.3, 3.4, 5.6, 2.4],
        [6.4, 3.1, 5.5, 1.8],
        [6. , 3. , 4.8, 1.8],
        [6.9, 3.1, 5.4, 2.1],
        [6.7, 3.1, 5.6, 2.4],
        [6.9, 3.1, 5.1, 2.3],
        [5.8, 2.7, 5.1, 1.9],
        [6.8, 3.2, 5.9, 2.3],
        [6.7, 3.3, 5.7, 2.5],
        [6.7, 3. , 5.2, 2.3],
        [6.3, 2.5, 5. , 1.9],
        [6.5, 3. , 5.2, 2. ],
        [6.2, 3.4, 5.4, 2.3],
        [5.9, 3. , 5.1, 1.8]]),
 'target': array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
        0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
        0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
        1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
        1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
        2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
        2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2]),
 'frame': None,
 'target_names': array(['setosa', 'versicolor', 'virginica'], dtype='<U10'),
 'DESCR': '.. _iris_dataset:\n\nIris plants dataset\n--------------------\n\n**Data Set Characteristics:**\n\n    :Number of Instances: 150 (50 in each of three classes)\n    :Number of Attributes: 4 numeric, predictive attributes and the class\n    :Attribute Information:\n        - sepal length in cm\n        - sepal width in cm\n        - petal length in cm\n        - petal width in cm\n        - class:\n                - Iris-Setosa\n                - Iris-Versicolour\n                - Iris-Virginica\n                \n    :Summary Statistics:\n\n    ============== ==== ==== ======= ===== ====================\n                    Min  Max   Mean    SD   Class Correlation\n    ============== ==== ==== ======= ===== ====================\n    sepal length:   4.3  7.9   5.84   0.83    0.7826\n    sepal width:    2.0  4.4   3.05   0.43   -0.4194\n    petal length:   1.0  6.9   3.76   1.76    0.9490  (high!)\n    petal width:    0.1  2.5   1.20   0.76    0.9565  (high!)\n    ============== ==== ==== ======= ===== ====================\n\n    :Missing Attribute Values: None\n    :Class Distribution: 33.3% for each of 3 classes.\n    :Creator: R.A. Fisher\n    :Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov)\n    :Date: July, 1988\n\nThe famous Iris database, first used by Sir R.A. Fisher. The dataset is taken\nfrom Fisher\'s paper. Note that it\'s the same as in R, but not as in the UCI\nMachine Learning Repository, which has two wrong data points.\n\nThis is perhaps the best known database to be found in the\npattern recognition literature.  Fisher\'s paper is a classic in the field and\nis referenced frequently to this day.  (See Duda & Hart, for example.)  The\ndata set contains 3 classes of 50 instances each, where each class refers to a\ntype of iris plant.  One class is linearly separable from the other 2; the\nlatter are NOT linearly separable from each other.\n\n.. topic:: References\n\n   - Fisher, R.A. "The use of multiple measurements in taxonomic problems"\n     Annual Eugenics, 7, Part II, 179-188 (1936); also in "Contributions to\n     Mathematical Statistics" (John Wiley, NY, 1950).\n   - Duda, R.O., & Hart, P.E. (1973) Pattern Classification and Scene Analysis.\n     (Q327.D83) John Wiley & Sons.  ISBN 0-471-22361-1.  See page 218.\n   - Dasarathy, B.V. (1980) "Nosing Around the Neighborhood: A New System\n     Structure and Classification Rule for Recognition in Partially Exposed\n     Environments".  IEEE Transactions on Pattern Analysis and Machine\n     Intelligence, Vol. PAMI-2, No. 1, 67-71.\n   - Gates, G.W. (1972) "The Reduced Nearest Neighbor Rule".  IEEE Transactions\n     on Information Theory, May 1972, 431-433.\n   - See also: 1988 MLC Proceedings, 54-64.  Cheeseman et al"s AUTOCLASS II\n     conceptual clustering system finds 3 classes in the data.\n   - Many, many more ...',
 'feature_names': ['sepal length (cm)',
  'sepal width (cm)',
  'petal length (cm)',
  'petal width (cm)'],
 'filename': 'iris.csv',
 'data_module': 'sklearn.datasets.data'}
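Incidentally, if your scikit-learn version is 0.23 or newer (an assumption about the environment, not something the lab requires), load_iris can return a pandas DataFrame directly, which would shortcut the manual assembly done below:

#Alternative: get the features and target as one DataFrame
iris=datasets.load_iris(as_frame=True)
iris.frame.head()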
data_x=data["data"]
data_y=data["target"]
data_x.shape,data_y.shape
((150, 4), (150,))
data_y=data_y.reshape([len(data_y),1])
data_y
array([[0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2]])
#Method 1: concatenate the label column onto the features
data=np.hstack([data_x,data_y])
#Method 2: insert the label column (flatten data_y first; passing the
#(150,1) column as-is makes np.insert broadcast it into 150 new columns)
np.insert(data_x,data_x.shape[1],data_y.ravel(),axis=1)
array([[5.1, 3.5, 1.4, 0.2, 0. ],
       [4.9, 3. , 1.4, 0.2, 0. ],
       [4.7, 3.2, 1.3, 0.2, 0. ],
       ...,
       [6.5, 3. , 5.2, 2. , 2. ],
       [6.2, 3.4, 5.4, 2.3, 2. ],
       [5.9, 3. , 5.1, 1.8, 2. ]])
data=pd.DataFrame(data,columns=["F1","F2","F3","F4","target"])
data
       F1   F2   F3   F4  target
0     5.1  3.5  1.4  0.2     0.0
1     4.9  3.0  1.4  0.2     0.0
2     4.7  3.2  1.3  0.2     0.0
3     4.6  3.1  1.5  0.2     0.0
4     5.0  3.6  1.4  0.2     0.0
..    ...  ...  ...  ...     ...
145   6.7  3.0  5.2  2.3     2.0
146   6.3  2.5  5.0  1.9     2.0
147   6.5  3.0  5.2  2.0     2.0
148   6.2  3.4  5.4  2.3     2.0
149   5.9  3.0  5.1  1.8     2.0

150 rows × 5 columns

data.insert(0,"ones",1)
data
     ones   F1   F2   F3   F4  target
0       1  5.1  3.5  1.4  0.2     0.0
1       1  4.9  3.0  1.4  0.2     0.0
2       1  4.7  3.2  1.3  0.2     0.0
3       1  4.6  3.1  1.5  0.2     0.0
4       1  5.0  3.6  1.4  0.2     0.0
..    ...  ...  ...  ...  ...     ...
145     1  6.7  3.0  5.2  2.3     2.0
146     1  6.3  2.5  5.0  1.9     2.0
147     1  6.5  3.0  5.2  2.0     2.0
148     1  6.2  3.4  5.4  2.3     2.0
149     1  5.9  3.0  5.1  1.8     2.0

150 rows × 6 columns

data["target"]=data["target"].astype("int32")
data
     ones   F1   F2   F3   F4  target
0       1  5.1  3.5  1.4  0.2       0
1       1  4.9  3.0  1.4  0.2       0
2       1  4.7  3.2  1.3  0.2       0
3       1  4.6  3.1  1.5  0.2       0
4       1  5.0  3.6  1.4  0.2       0
..    ...  ...  ...  ...  ...     ...
145     1  6.7  3.0  5.2  2.3       2
146     1  6.3  2.5  5.0  1.9       2
147     1  6.5  3.0  5.2  2.0       2
148     1  6.2  3.4  5.4  2.3       2
149     1  5.9  3.0  5.1  1.8       2

150 rows × 6 columns

1.2 Preparing the training data

data_x
array([[5.1, 3.5, 1.4, 0.2],
       [4.9, 3. , 1.4, 0.2],
       [4.7, 3.2, 1.3, 0.2],
       [4.6, 3.1, 1.5, 0.2],
       [5. , 3.6, 1.4, 0.2],
       [5.4, 3.9, 1.7, 0.4],
       [4.6, 3.4, 1.4, 0.3],
       [5. , 3.4, 1.5, 0.2],
       [4.4, 2.9, 1.4, 0.2],
       [4.9, 3.1, 1.5, 0.1],
       [5.4, 3.7, 1.5, 0.2],
       [4.8, 3.4, 1.6, 0.2],
       [4.8, 3. , 1.4, 0.1],
       [4.3, 3. , 1.1, 0.1],
       [5.8, 4. , 1.2, 0.2],
       [5.7, 4.4, 1.5, 0.4],
       [5.4, 3.9, 1.3, 0.4],
       [5.1, 3.5, 1.4, 0.3],
       [5.7, 3.8, 1.7, 0.3],
       [5.1, 3.8, 1.5, 0.3],
       [5.4, 3.4, 1.7, 0.2],
       [5.1, 3.7, 1.5, 0.4],
       [4.6, 3.6, 1. , 0.2],
       [5.1, 3.3, 1.7, 0.5],
       [4.8, 3.4, 1.9, 0.2],
       [5. , 3. , 1.6, 0.2],
       [5. , 3.4, 1.6, 0.4],
       [5.2, 3.5, 1.5, 0.2],
       [5.2, 3.4, 1.4, 0.2],
       [4.7, 3.2, 1.6, 0.2],
       [4.8, 3.1, 1.6, 0.2],
       [5.4, 3.4, 1.5, 0.4],
       [5.2, 4.1, 1.5, 0.1],
       [5.5, 4.2, 1.4, 0.2],
       [4.9, 3.1, 1.5, 0.2],
       [5. , 3.2, 1.2, 0.2],
       [5.5, 3.5, 1.3, 0.2],
       [4.9, 3.6, 1.4, 0.1],
       [4.4, 3. , 1.3, 0.2],
       [5.1, 3.4, 1.5, 0.2],
       [5. , 3.5, 1.3, 0.3],
       [4.5, 2.3, 1.3, 0.3],
       [4.4, 3.2, 1.3, 0.2],
       [5. , 3.5, 1.6, 0.6],
       [5.1, 3.8, 1.9, 0.4],
       [4.8, 3. , 1.4, 0.3],
       [5.1, 3.8, 1.6, 0.2],
       [4.6, 3.2, 1.4, 0.2],
       [5.3, 3.7, 1.5, 0.2],
       [5. , 3.3, 1.4, 0.2],
       [7. , 3.2, 4.7, 1.4],
       [6.4, 3.2, 4.5, 1.5],
       [6.9, 3.1, 4.9, 1.5],
       [5.5, 2.3, 4. , 1.3],
       [6.5, 2.8, 4.6, 1.5],
       [5.7, 2.8, 4.5, 1.3],
       [6.3, 3.3, 4.7, 1.6],
       [4.9, 2.4, 3.3, 1. ],
       [6.6, 2.9, 4.6, 1.3],
       [5.2, 2.7, 3.9, 1.4],
       [5. , 2. , 3.5, 1. ],
       [5.9, 3. , 4.2, 1.5],
       [6. , 2.2, 4. , 1. ],
       [6.1, 2.9, 4.7, 1.4],
       [5.6, 2.9, 3.6, 1.3],
       [6.7, 3.1, 4.4, 1.4],
       [5.6, 3. , 4.5, 1.5],
       [5.8, 2.7, 4.1, 1. ],
       [6.2, 2.2, 4.5, 1.5],
       [5.6, 2.5, 3.9, 1.1],
       [5.9, 3.2, 4.8, 1.8],
       [6.1, 2.8, 4. , 1.3],
       [6.3, 2.5, 4.9, 1.5],
       [6.1, 2.8, 4.7, 1.2],
       [6.4, 2.9, 4.3, 1.3],
       [6.6, 3. , 4.4, 1.4],
       [6.8, 2.8, 4.8, 1.4],
       [6.7, 3. , 5. , 1.7],
       [6. , 2.9, 4.5, 1.5],
       [5.7, 2.6, 3.5, 1. ],
       [5.5, 2.4, 3.8, 1.1],
       [5.5, 2.4, 3.7, 1. ],
       [5.8, 2.7, 3.9, 1.2],
       [6. , 2.7, 5.1, 1.6],
       [5.4, 3. , 4.5, 1.5],
       [6. , 3.4, 4.5, 1.6],
       [6.7, 3.1, 4.7, 1.5],
       [6.3, 2.3, 4.4, 1.3],
       [5.6, 3. , 4.1, 1.3],
       [5.5, 2.5, 4. , 1.3],
       [5.5, 2.6, 4.4, 1.2],
       [6.1, 3. , 4.6, 1.4],
       [5.8, 2.6, 4. , 1.2],
       [5. , 2.3, 3.3, 1. ],
       [5.6, 2.7, 4.2, 1.3],
       [5.7, 3. , 4.2, 1.2],
       [5.7, 2.9, 4.2, 1.3],
       [6.2, 2.9, 4.3, 1.3],
       [5.1, 2.5, 3. , 1.1],
       [5.7, 2.8, 4.1, 1.3],
       [6.3, 3.3, 6. , 2.5],
       [5.8, 2.7, 5.1, 1.9],
       [7.1, 3. , 5.9, 2.1],
       [6.3, 2.9, 5.6, 1.8],
       [6.5, 3. , 5.8, 2.2],
       [7.6, 3. , 6.6, 2.1],
       [4.9, 2.5, 4.5, 1.7],
       [7.3, 2.9, 6.3, 1.8],
       [6.7, 2.5, 5.8, 1.8],
       [7.2, 3.6, 6.1, 2.5],
       [6.5, 3.2, 5.1, 2. ],
       [6.4, 2.7, 5.3, 1.9],
       [6.8, 3. , 5.5, 2.1],
       [5.7, 2.5, 5. , 2. ],
       [5.8, 2.8, 5.1, 2.4],
       [6.4, 3.2, 5.3, 2.3],
       [6.5, 3. , 5.5, 1.8],
       [7.7, 3.8, 6.7, 2.2],
       [7.7, 2.6, 6.9, 2.3],
       [6. , 2.2, 5. , 1.5],
       [6.9, 3.2, 5.7, 2.3],
       [5.6, 2.8, 4.9, 2. ],
       [7.7, 2.8, 6.7, 2. ],
       [6.3, 2.7, 4.9, 1.8],
       [6.7, 3.3, 5.7, 2.1],
       [7.2, 3.2, 6. , 1.8],
       [6.2, 2.8, 4.8, 1.8],
       [6.1, 3. , 4.9, 1.8],
       [6.4, 2.8, 5.6, 2.1],
       [7.2, 3. , 5.8, 1.6],
       [7.4, 2.8, 6.1, 1.9],
       [7.9, 3.8, 6.4, 2. ],
       [6.4, 2.8, 5.6, 2.2],
       [6.3, 2.8, 5.1, 1.5],
       [6.1, 2.6, 5.6, 1.4],
       [7.7, 3. , 6.1, 2.3],
       [6.3, 3.4, 5.6, 2.4],
       [6.4, 3.1, 5.5, 1.8],
       [6. , 3. , 4.8, 1.8],
       [6.9, 3.1, 5.4, 2.1],
       [6.7, 3.1, 5.6, 2.4],
       [6.9, 3.1, 5.1, 2.3],
       [5.8, 2.7, 5.1, 1.9],
       [6.8, 3.2, 5.9, 2.3],
       [6.7, 3.3, 5.7, 2.5],
       [6.7, 3. , 5.2, 2.3],
       [6.3, 2.5, 5. , 1.9],
       [6.5, 3. , 5.2, 2. ],
       [6.2, 3.4, 5.4, 2.3],
       [5.9, 3. , 5.1, 1.8]])
data_x=np.insert(data_x,0,1,axis=1)
data_x.shape,data_y.shape
((150, 5), (150, 1))
#Features and labels of the training data
data_x,data_y
(array([[1. , 5.1, 3.5, 1.4, 0.2],
        [1. , 4.9, 3. , 1.4, 0.2],
        [1. , 4.7, 3.2, 1.3, 0.2],
        [1. , 4.6, 3.1, 1.5, 0.2],
        [1. , 5. , 3.6, 1.4, 0.2],
        [1. , 5.4, 3.9, 1.7, 0.4],
        [1. , 4.6, 3.4, 1.4, 0.3],
        [1. , 5. , 3.4, 1.5, 0.2],
        [1. , 4.4, 2.9, 1.4, 0.2],
        [1. , 4.9, 3.1, 1.5, 0.1],
        [1. , 5.4, 3.7, 1.5, 0.2],
        [1. , 4.8, 3.4, 1.6, 0.2],
        [1. , 4.8, 3. , 1.4, 0.1],
        [1. , 4.3, 3. , 1.1, 0.1],
        [1. , 5.8, 4. , 1.2, 0.2],
        [1. , 5.7, 4.4, 1.5, 0.4],
        [1. , 5.4, 3.9, 1.3, 0.4],
        [1. , 5.1, 3.5, 1.4, 0.3],
        [1. , 5.7, 3.8, 1.7, 0.3],
        [1. , 5.1, 3.8, 1.5, 0.3],
        [1. , 5.4, 3.4, 1.7, 0.2],
        [1. , 5.1, 3.7, 1.5, 0.4],
        [1. , 4.6, 3.6, 1. , 0.2],
        [1. , 5.1, 3.3, 1.7, 0.5],
        [1. , 4.8, 3.4, 1.9, 0.2],
        [1. , 5. , 3. , 1.6, 0.2],
        [1. , 5. , 3.4, 1.6, 0.4],
        [1. , 5.2, 3.5, 1.5, 0.2],
        [1. , 5.2, 3.4, 1.4, 0.2],
        [1. , 4.7, 3.2, 1.6, 0.2],
        [1. , 4.8, 3.1, 1.6, 0.2],
        [1. , 5.4, 3.4, 1.5, 0.4],
        [1. , 5.2, 4.1, 1.5, 0.1],
        [1. , 5.5, 4.2, 1.4, 0.2],
        [1. , 4.9, 3.1, 1.5, 0.2],
        [1. , 5. , 3.2, 1.2, 0.2],
        [1. , 5.5, 3.5, 1.3, 0.2],
        [1. , 4.9, 3.6, 1.4, 0.1],
        [1. , 4.4, 3. , 1.3, 0.2],
        [1. , 5.1, 3.4, 1.5, 0.2],
        [1. , 5. , 3.5, 1.3, 0.3],
        [1. , 4.5, 2.3, 1.3, 0.3],
        [1. , 4.4, 3.2, 1.3, 0.2],
        [1. , 5. , 3.5, 1.6, 0.6],
        [1. , 5.1, 3.8, 1.9, 0.4],
        [1. , 4.8, 3. , 1.4, 0.3],
        [1. , 5.1, 3.8, 1.6, 0.2],
        [1. , 4.6, 3.2, 1.4, 0.2],
        [1. , 5.3, 3.7, 1.5, 0.2],
        [1. , 5. , 3.3, 1.4, 0.2],
        [1. , 7. , 3.2, 4.7, 1.4],
        [1. , 6.4, 3.2, 4.5, 1.5],
        [1. , 6.9, 3.1, 4.9, 1.5],
        [1. , 5.5, 2.3, 4. , 1.3],
        [1. , 6.5, 2.8, 4.6, 1.5],
        [1. , 5.7, 2.8, 4.5, 1.3],
        [1. , 6.3, 3.3, 4.7, 1.6],
        [1. , 4.9, 2.4, 3.3, 1. ],
        [1. , 6.6, 2.9, 4.6, 1.3],
        [1. , 5.2, 2.7, 3.9, 1.4],
        [1. , 5. , 2. , 3.5, 1. ],
        [1. , 5.9, 3. , 4.2, 1.5],
        [1. , 6. , 2.2, 4. , 1. ],
        [1. , 6.1, 2.9, 4.7, 1.4],
        [1. , 5.6, 2.9, 3.6, 1.3],
        [1. , 6.7, 3.1, 4.4, 1.4],
        [1. , 5.6, 3. , 4.5, 1.5],
        [1. , 5.8, 2.7, 4.1, 1. ],
        [1. , 6.2, 2.2, 4.5, 1.5],
        [1. , 5.6, 2.5, 3.9, 1.1],
        [1. , 5.9, 3.2, 4.8, 1.8],
        [1. , 6.1, 2.8, 4. , 1.3],
        [1. , 6.3, 2.5, 4.9, 1.5],
        [1. , 6.1, 2.8, 4.7, 1.2],
        [1. , 6.4, 2.9, 4.3, 1.3],
        [1. , 6.6, 3. , 4.4, 1.4],
        [1. , 6.8, 2.8, 4.8, 1.4],
        [1. , 6.7, 3. , 5. , 1.7],
        [1. , 6. , 2.9, 4.5, 1.5],
        [1. , 5.7, 2.6, 3.5, 1. ],
        [1. , 5.5, 2.4, 3.8, 1.1],
        [1. , 5.5, 2.4, 3.7, 1. ],
        [1. , 5.8, 2.7, 3.9, 1.2],
        [1. , 6. , 2.7, 5.1, 1.6],
        [1. , 5.4, 3. , 4.5, 1.5],
        [1. , 6. , 3.4, 4.5, 1.6],
        [1. , 6.7, 3.1, 4.7, 1.5],
        [1. , 6.3, 2.3, 4.4, 1.3],
        [1. , 5.6, 3. , 4.1, 1.3],
        [1. , 5.5, 2.5, 4. , 1.3],
        [1. , 5.5, 2.6, 4.4, 1.2],
        [1. , 6.1, 3. , 4.6, 1.4],
        [1. , 5.8, 2.6, 4. , 1.2],
        [1. , 5. , 2.3, 3.3, 1. ],
        [1. , 5.6, 2.7, 4.2, 1.3],
        [1. , 5.7, 3. , 4.2, 1.2],
        [1. , 5.7, 2.9, 4.2, 1.3],
        [1. , 6.2, 2.9, 4.3, 1.3],
        [1. , 5.1, 2.5, 3. , 1.1],
        [1. , 5.7, 2.8, 4.1, 1.3],
        [1. , 6.3, 3.3, 6. , 2.5],
        [1. , 5.8, 2.7, 5.1, 1.9],
        [1. , 7.1, 3. , 5.9, 2.1],
        [1. , 6.3, 2.9, 5.6, 1.8],
        [1. , 6.5, 3. , 5.8, 2.2],
        [1. , 7.6, 3. , 6.6, 2.1],
        [1. , 4.9, 2.5, 4.5, 1.7],
        [1. , 7.3, 2.9, 6.3, 1.8],
        [1. , 6.7, 2.5, 5.8, 1.8],
        [1. , 7.2, 3.6, 6.1, 2.5],
        [1. , 6.5, 3.2, 5.1, 2. ],
        [1. , 6.4, 2.7, 5.3, 1.9],
        [1. , 6.8, 3. , 5.5, 2.1],
        [1. , 5.7, 2.5, 5. , 2. ],
        [1. , 5.8, 2.8, 5.1, 2.4],
        [1. , 6.4, 3.2, 5.3, 2.3],
        [1. , 6.5, 3. , 5.5, 1.8],
        [1. , 7.7, 3.8, 6.7, 2.2],
        [1. , 7.7, 2.6, 6.9, 2.3],
        [1. , 6. , 2.2, 5. , 1.5],
        [1. , 6.9, 3.2, 5.7, 2.3],
        [1. , 5.6, 2.8, 4.9, 2. ],
        [1. , 7.7, 2.8, 6.7, 2. ],
        [1. , 6.3, 2.7, 4.9, 1.8],
        [1. , 6.7, 3.3, 5.7, 2.1],
        [1. , 7.2, 3.2, 6. , 1.8],
        [1. , 6.2, 2.8, 4.8, 1.8],
        [1. , 6.1, 3. , 4.9, 1.8],
        [1. , 6.4, 2.8, 5.6, 2.1],
        [1. , 7.2, 3. , 5.8, 1.6],
        [1. , 7.4, 2.8, 6.1, 1.9],
        [1. , 7.9, 3.8, 6.4, 2. ],
        [1. , 6.4, 2.8, 5.6, 2.2],
        [1. , 6.3, 2.8, 5.1, 1.5],
        [1. , 6.1, 2.6, 5.6, 1.4],
        [1. , 7.7, 3. , 6.1, 2.3],
        [1. , 6.3, 3.4, 5.6, 2.4],
        [1. , 6.4, 3.1, 5.5, 1.8],
        [1. , 6. , 3. , 4.8, 1.8],
        [1. , 6.9, 3.1, 5.4, 2.1],
        [1. , 6.7, 3.1, 5.6, 2.4],
        [1. , 6.9, 3.1, 5.1, 2.3],
        [1. , 5.8, 2.7, 5.1, 1.9],
        [1. , 6.8, 3.2, 5.9, 2.3],
        [1. , 6.7, 3.3, 5.7, 2.5],
        [1. , 6.7, 3. , 5.2, 2.3],
        [1. , 6.3, 2.5, 5. , 1.9],
        [1. , 6.5, 3. , 5.2, 2. ],
        [1. , 6.2, 3.4, 5.4, 2.3],
        [1. , 5.9, 3. , 5.1, 1.8]]),
 array([[0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [0],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [1],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2],
        [2]]))

Since there are three classes, the training data must be split by class: for each class we build a one-vs-rest training set that labels that class as 1 and the other two classes as 0, then train one binary classifier per class.
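As a compact alternative to the three copy-and-relabel cells below, a small helper (my own sketch, not part of the original lab) can produce the one-vs-rest labels for any class k:

#A sketch: build the one-vs-rest copy of the data for class k
#(1 = the chosen class, 0 = all other classes)
def make_ovr_labels(df, k):
    ovr = df.copy()
    ovr["target"] = (df["target"] == k).astype("int32")
    return ovr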

data1=data.copy()
data1
     ones   F1   F2   F3   F4  target
0       1  5.1  3.5  1.4  0.2       0
1       1  4.9  3.0  1.4  0.2       0
2       1  4.7  3.2  1.3  0.2       0
3       1  4.6  3.1  1.5  0.2       0
4       1  5.0  3.6  1.4  0.2       0
..    ...  ...  ...  ...  ...     ...
145     1  6.7  3.0  5.2  2.3       2
146     1  6.3  2.5  5.0  1.9       2
147     1  6.5  3.0  5.2  2.0       2
148     1  6.2  3.4  5.4  2.3       2
149     1  5.9  3.0  5.1  1.8       2

150 rows × 6 columns

#Data for the first class, i.e. the first classifier (class 0 vs the rest)
data1.loc[data["target"]!=0,"target"]=0
data1.loc[data["target"]==0,"target"]=1
data1
     ones   F1   F2   F3   F4  target
0       1  5.1  3.5  1.4  0.2       1
1       1  4.9  3.0  1.4  0.2       1
2       1  4.7  3.2  1.3  0.2       1
3       1  4.6  3.1  1.5  0.2       1
4       1  5.0  3.6  1.4  0.2       1
..    ...  ...  ...  ...  ...     ...
145     1  6.7  3.0  5.2  2.3       0
146     1  6.3  2.5  5.0  1.9       0
147     1  6.5  3.0  5.2  2.0       0
148     1  6.2  3.4  5.4  2.3       0
149     1  5.9  3.0  5.1  1.8       0

150 rows × 6 columns

data1_x=data1.iloc[:,:data1.shape[1]-1].values
data1_y=data1.iloc[:,data1.shape[1]-1].values
data1_x.shape,data1_y.shape
((150, 5), (150,))
#Data for the second class, i.e. the second classifier (class 1 vs the rest)
data2=data.copy()
data2.loc[data["target"]==1,"target"]=1
data2.loc[data["target"]!=1,"target"]=0
data2["target"]==0
0      True
1      True
2      True
3      True
4      True
       ... 
145    True
146    True
147    True
148    True
149    True
Name: target, Length: 150, dtype: bool
data2.shape[1]
6
data2.iloc[50:55,:]
    ones   F1   F2   F3   F4  target
50     1  7.0  3.2  4.7  1.4       1
51     1  6.4  3.2  4.5  1.5       1
52     1  6.9  3.1  4.9  1.5       1
53     1  5.5  2.3  4.0  1.3       1
54     1  6.5  2.8  4.6  1.5       1
data2_x=data2.iloc[:,:data2.shape[1]-1].values
data2_y=data2.iloc[:,data2.shape[1]-1].values
#Data for the third class, i.e. the third classifier (class 2 vs the rest)
data3=data.copy()
data3.loc[data["target"]==2,"target"]=1
data3.loc[data["target"]!=2,"target"]=0
data3
     ones   F1   F2   F3   F4  target
0       1  5.1  3.5  1.4  0.2       0
1       1  4.9  3.0  1.4  0.2       0
2       1  4.7  3.2  1.3  0.2       0
3       1  4.6  3.1  1.5  0.2       0
4       1  5.0  3.6  1.4  0.2       0
..    ...  ...  ...  ...  ...     ...
145     1  6.7  3.0  5.2  2.3       1
146     1  6.3  2.5  5.0  1.9       1
147     1  6.5  3.0  5.2  2.0       1
148     1  6.2  3.4  5.4  2.3       1
149     1  5.9  3.0  5.1  1.8       1

150 rows × 6 columns

data3_x=data3.iloc[:,:data3.shape[1]-1].values
data3_y=data3.iloc[:,data3.shape[1]-1].values

1.3 Define the hypothesis function, cost function, and gradient descent algorithm (copied from Lab 3)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))
#Hypothesis: h(X,w) = sigmoid(Xw)
def h(X,w):
    z=X@w
    h=sigmoid(z)
    return h
#Construct the cost function
def cost(X,w,y):
    #Shapes: X (m,n+1), y (m,), w (n+1,1)
    y_hat=sigmoid(X@w)
    right=np.multiply(y.ravel(),np.log(y_hat).ravel())+np.multiply((1-y).ravel(),np.log(1-y_hat).ravel())
    cost=-np.sum(right)/X.shape[0]
    return cost
#Batch gradient descent
def gradient_descent(X,y,iter_num,alpha):
    y=y.reshape((X.shape[0],1))    #make y a column vector (m,1)
    w=np.zeros((X.shape[1],1))     #initialize the weights to zeros (n+1,1)
    cost_lst=[]
    for i in range(iter_num):
        y_pred=h(X,w)-y            #residual h(x)-y for every sample
        temp=np.zeros((X.shape[1],1))
        for j in range(X.shape[1]):
            #partial derivative w.r.t. w_j: (1/m) * sum((h(x)-y) * x_j)
            right=np.multiply(y_pred.ravel(),X[:,j])
            gradient=1/(X.shape[0])*(np.sum(right))
            temp[j,0]=w[j,0]-alpha*gradient
        w=temp                     #simultaneous update of all weights
        cost_lst.append(cost(X,w,y.ravel()))
    return w,cost_lst
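For reference, these are the formulas the code above implements (standard binary logistic regression, matching Lab 3), in LaTeX notation:

h_w(x) = \sigma(w^\top x) = \frac{1}{1+e^{-w^\top x}}

J(w) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log h_w(x^{(i)}) + (1-y^{(i)})\log\left(1-h_w(x^{(i)})\right)\right]

\frac{\partial J}{\partial w_j} = \frac{1}{m}\sum_{i=1}^{m}\left(h_w(x^{(i)})-y^{(i)}\right)x_j^{(i)}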

1.4 Run gradient descent to learn the parameters of the three classifiers

#Initialize the hyperparameters
iter_num,alpha=600000,0.001
#Train the first model
w1,cost_lst1=gradient_descent(data1_x,data1_y,iter_num,alpha)
import matplotlib.pyplot as plt
plt.plot(range(iter_num),cost_lst1,"b-o")
[<matplotlib.lines.Line2D at 0x2562630b100>]

[Figure 1: cost curve of the first model over 600,000 iterations]
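An aside: with 600,000 iterations, plotting every cost value with circle markers renders very slowly. Downsampling the curve (a sketch, not part of the original notebook) keeps its shape and draws almost instantly:

#Plot every 1000th cost value instead of all 600,000 points
step=1000
plt.plot(range(0,iter_num,step),cost_lst1[::step],"b-")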

#Train the second model
w2,cost_lst2=gradient_descent(data2_x,data2_y,iter_num,alpha)
import matplotlib.pyplot as plt
plt.plot(range(iter_num),cost_lst2,"b-o")
[<matplotlib.lines.Line2D at 0x25628114280>]

[Figure 2: cost curve of the second model]

#Train the third model
w3,cost_lst3=gradient_descent(data3_x,data3_y,iter_num,alpha)
w3
array([[-3.22437049],
       [-3.50214058],
       [-3.50286355],
       [ 5.16580317],
       [ 5.89898368]])
import matplotlib.pyplot as plt
plt.plot(range(iter_num),cost_lst3,"b-o")
[<matplotlib.lines.Line2D at 0x2562e0f81c0>]

[Figure 3: cost curve of the third model]

1.5 Making predictions with the models

h(data_x,w3)
array([[1.48445441e-11],
       [1.72343968e-10],
       [1.02798153e-10],
       [5.81975546e-10],
       [1.48434710e-11],
       [1.95971176e-11],
       [2.18959639e-10],
       [5.01346874e-11],
       [1.40930075e-09],
       [1.12830635e-10],
       [4.31888744e-12],
       [1.69308343e-10],
       [1.35613372e-10],
       [1.65858883e-10],
       [7.89880725e-14],
       [4.23224675e-13],
       [2.48199140e-12],
       [2.67766642e-11],
       [5.39314286e-12],
       [1.56935848e-11],
       [3.47096426e-11],
       [4.01827075e-11],
       [7.63005509e-12],
       [8.26864773e-10],
       [7.97484594e-10],
       [3.41189783e-10],
       [2.73442178e-10],
       [1.75314894e-11],
       [1.48456174e-11],
       [4.84204982e-10],
       [4.84239990e-10],
       [4.01914238e-11],
       [1.18813180e-12],
       [3.14985611e-13],
       [2.03524473e-10],
       [2.14461446e-11],
       [2.18189955e-12],
       [1.16799745e-11],
       [5.92281641e-10],
       [3.53217554e-11],
       [2.26727669e-11],
       [8.74004884e-09],
       [2.93949962e-10],
       [6.26783110e-10],
       [2.23513465e-10],
       [4.41246960e-10],
       [1.45841303e-11],
       [2.44584721e-10],
       [6.13010507e-12],
       [4.24539165e-11],
       [1.64123143e-03],
       [8.55503211e-03],
       [1.65105645e-02],
       [9.87814122e-02],
       [3.97290777e-02],
       [1.11076040e-01],
       [4.19003715e-02],
       [2.88426221e-03],
       [6.27161978e-03],
       [7.67020481e-02],
       [2.27204861e-02],
       [2.08212169e-02],
       [4.58067633e-03],
       [9.90450665e-02],
       [1.19419048e-03],
       [1.41462060e-03],
       [2.22638069e-01],
       [2.68940904e-03],
       [3.66014737e-01],
       [6.97791873e-03],
       [5.78803255e-01],
       [2.32071970e-03],
       [5.28941621e-01],
       [4.57649874e-02],
       [2.69208900e-03],
       [2.84603646e-03],
       [2.20421076e-02],
       [2.07507605e-01],
       [9.10460936e-02],
       [2.44824946e-04],
       [8.37509821e-03],
       [2.78543808e-03],
       [3.11283202e-03],
       [8.89831833e-01],
       [3.65880536e-01],
       [3.03993844e-02],
       [1.18930239e-02],
       [4.99150151e-02],
       [1.10252946e-02],
       [5.15923462e-02],
       [1.43653056e-01],
       [4.41610209e-02],
       [7.37513950e-03],
       [2.88447014e-03],
       [5.07366744e-02],
       [7.24617687e-03],
       [1.83460602e-02],
       [5.40874928e-03],
       [3.87210511e-04],
       [1.55791816e-02],
       [9.99862942e-01],
       [9.89637526e-01],
       [9.86183040e-01],
       [9.83705644e-01],
       [9.98410187e-01],
       [9.97834502e-01],
       [9.84208537e-01],
       [9.85434538e-01],
       [9.94141336e-01],
       [9.94561329e-01],
       [7.20333384e-01],
       [9.70431293e-01],
       [9.62754456e-01],
       [9.96609064e-01],
       [9.99222270e-01],
       [9.83684437e-01],
       [9.26437633e-01],
       [9.83486260e-01],
       [9.99950496e-01],
       [9.39002061e-01],
       [9.88043323e-01],
       [9.88637702e-01],
       [9.98357641e-01],
       [7.65848930e-01],
       [9.73006160e-01],
       [8.76969899e-01],
       [6.61137141e-01],
       [6.97324053e-01],
       [9.97185846e-01],
       [6.11033594e-01],
       [9.77494647e-01],
       [6.58573810e-01],
       [9.98437920e-01],
       [5.24529693e-01],
       [9.70465066e-01],
       [9.87624920e-01],
       [9.97236435e-01],
       [9.26432706e-01],
       [6.61104746e-01],
       [8.84442100e-01],
       [9.96082862e-01],
       [8.40940308e-01],
       [9.89637526e-01],
       [9.96974990e-01],
       [9.97386310e-01],
       [9.62040470e-01],
       [9.52214579e-01],
       [8.96902215e-01],
       [9.90200940e-01],
       [9.28785160e-01]])
#Feed the data into all three models and inspect the results
multi_pred=pd.DataFrame(zip(h(data_x,w1).ravel(),h(data_x,w2).ravel(),h(data_x,w3).ravel()))
multi_pred
            0         1             2
0    0.999297  0.108037  1.484454e-11
1    0.997061  0.270814  1.723440e-10
2    0.998633  0.164710  1.027982e-10
3    0.995774  0.231910  5.819755e-10
4    0.999415  0.085259  1.484347e-11
..        ...       ...           ...
145  0.000007  0.127574  9.620405e-01
146  0.000006  0.496389  9.522146e-01
147  0.000010  0.234745  8.969022e-01
148  0.000006  0.058444  9.902009e-01
149  0.000014  0.284295  9.287852e-01

150 rows × 3 columns
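The predicted class for each sample is simply the column with the largest probability. A small wrapper (my own sketch, not from the lab handout) makes this one-vs-rest decision rule explicit and works for any number of classifiers:

#Apply each binary model and pick the class with the highest probability
def predict_ovr(X, ws):
    probs = np.column_stack([h(X, w).ravel() for w in ws])  #(m, n_classes)
    return np.argmax(probs, axis=1)

#Usage: predict_ovr(data_x, [w1, w2, w3]) reproduces the argmax used below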

multi_pred.values[:3]
array([[9.99297209e-01, 1.08037473e-01, 1.48445441e-11],
       [9.97060801e-01, 2.70813780e-01, 1.72343968e-10],
       [9.98632728e-01, 1.64709623e-01, 1.02798153e-10]])
#Predicted class for each sample
np.argmax(multi_pred.values,axis=1)
array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
       2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 2, 2,
       2, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2], dtype=int64)
#True class for each sample
data_y
array([[0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2],
       [2]])

1.6 Evaluating the model

np.argmax(multi_pred.values,axis=1)==data_y.ravel()
array([ True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True, False,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True, False, False,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True, False,  True,  True,  True, False,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True])
np.sum(np.argmax(multi_pred.values,axis=1)==data_y.ravel())
145
np.sum(np.argmax(multi_pred.values,axis=1)==data_y.ravel())/len(data)
0.9666666666666667
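The three one-vs-rest models classify 145 of the 150 samples correctly, an accuracy of about 96.7%. To see which classes the 5 errors fall into, a quick confusion matrix helps; this is a pure-numpy sketch, not part of the original lab:

#Rows = true class, columns = predicted class
y_hat=np.argmax(multi_pred.values,axis=1)
conf_mat=np.zeros((3,3),dtype=int)
for t,p in zip(data_y.ravel(),y_hat):
    conf_mat[t,p]+=1
print(conf_mat)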

1.7 Trying sklearn

from sklearn.linear_model import LogisticRegression
#Build the first model
clf1=LogisticRegression()
clf1.fit(data1_x,data1_y)
#Build the second model
clf2=LogisticRegression()
clf2.fit(data2_x,data2_y)
#Build the third model
clf3=LogisticRegression()
clf3.fit(data3_x,data3_y)
LogisticRegression()
y_pred1=clf1.predict(data_x)
y_pred2=clf2.predict(data_x)
y_pred3=clf3.predict(data_x)
#Show each model's predictions side by side
multi_pred=pd.DataFrame(zip(y_pred1,y_pred2,y_pred3),columns=["model1","model2","model3"])
multi_pred
     model1  model2  model3
0         1       0       0
1         1       0       0
2         1       0       0
3         1       0       0
4         1       0       0
..      ...     ...     ...
145       0       0       1
146       0       1       1
147       0       0       1
148       0       0       1
149       0       0       1

150 rows × 3 columns

#Determine the predicted class
np.argmax(multi_pred.values,axis=1)
array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0,
       0, 1, 1, 1, 2, 0, 1, 1, 0, 0, 0, 2, 0, 1, 1, 1, 0, 1, 0, 0, 0, 1,
       0, 1, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0, 2, 2, 2, 2, 2, 2, 1, 1, 1, 2,
       2, 2, 2, 1, 2, 2, 2, 2, 1, 1, 2, 2, 1, 2, 2, 2, 2, 2, 2, 2, 1, 2,
       2, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 2, 2, 2], dtype=int64)
data_y.ravel()
array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
       2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
       2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2])
#Compute the accuracy
np.sum(np.argmax(multi_pred.values,axis=1)==data_y.ravel())/data.shape[0]
0.7333333333333333
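The accuracy here is much lower (73.3%) because `predict` returns hard 0/1 labels rather than probabilities: whenever several models output 0 at once (or more than one outputs 1), `argmax` breaks the tie by picking the first column, which is often wrong. Using each model's probability for its positive class, as in section 1.5, restores the expected behavior; a sketch:

#Use predict_proba's positive-class column instead of hard 0/1 labels
probs=np.column_stack([clf1.predict_proba(data_x)[:,1],
                       clf2.predict_proba(data_x)[:,1],
                       clf3.predict_proba(data_x)[:,1]])
np.sum(np.argmax(probs,axis=1)==data_y.ravel())/data.shape[0]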

Lab 4 (1): Complete your first multi-class classification problem yourself. Good luck! Fill in the code below

2.1 Reading the data

data_x,data_y=datasets.make_blobs(n_samples=200, n_features=6,  centers=4,random_state=0)
data_x.shape,data_y.shape
((200, 6), (200,))

2.2 Preparing the training data

data=np.insert(data_x,data_x.shape[1],data_y,axis=1)
data=pd.DataFrame(data,columns=["F1","F2","F3","F4","F5","F6","target"])
data
           F1        F2         F3        F4         F5        F6  target
0    2.116632  7.972800  -9.328969 -8.224605 -12.178429  5.498447     2.0
1    1.886449  4.621006   2.841595  0.431245  -2.471350  2.507833     0.0
2    2.391329  6.464609  -9.805900 -7.289968  -9.650985  6.388460     2.0
3   -1.034776  6.626886   9.031235 -0.812908   5.449855  0.134062     1.0
4   -0.481593  8.191753   7.504717 -1.975688   6.649021  0.636824     1.0
..        ...       ...        ...       ...        ...       ...     ...
195  5.434893  7.128471   9.789546  6.061382   0.634133  5.757024     3.0
196 -0.406625  7.586001   9.322750 -1.837333   6.477815 -0.992725     1.0
197  2.031462  7.804427  -8.539512 -9.824409 -10.046935  6.918085     2.0
198  4.081889  6.127685  11.091126  4.812011  -0.005915  5.342211     3.0
199  0.985744  7.285737  -8.395940 -6.586471  -9.651765  6.651012     2.0

200 rows × 7 columns

data["target"]=data["target"].astype("int32")
data
           F1        F2         F3        F4         F5        F6  target
0    2.116632  7.972800  -9.328969 -8.224605 -12.178429  5.498447       2
1    1.886449  4.621006   2.841595  0.431245  -2.471350  2.507833       0
2    2.391329  6.464609  -9.805900 -7.289968  -9.650985  6.388460       2
3   -1.034776  6.626886   9.031235 -0.812908   5.449855  0.134062       1
4   -0.481593  8.191753   7.504717 -1.975688   6.649021  0.636824       1
..        ...       ...        ...       ...        ...       ...     ...
195  5.434893  7.128471   9.789546  6.061382   0.634133  5.757024       3
196 -0.406625  7.586001   9.322750 -1.837333   6.477815 -0.992725       1
197  2.031462  7.804427  -8.539512 -9.824409 -10.046935  6.918085       2
198  4.081889  6.127685  11.091126  4.812011  -0.005915  5.342211       3
199  0.985744  7.285737  -8.395940 -6.586471  -9.651765  6.651012       2

200 rows × 7 columns

data.insert(0,"ones",1)
data
     ones        F1        F2         F3        F4         F5        F6  target
0       1  2.116632  7.972800  -9.328969 -8.224605 -12.178429  5.498447       2
1       1  1.886449  4.621006   2.841595  0.431245  -2.471350  2.507833       0
2       1  2.391329  6.464609  -9.805900 -7.289968  -9.650985  6.388460       2
3       1 -1.034776  6.626886   9.031235 -0.812908   5.449855  0.134062       1
4       1 -0.481593  8.191753   7.504717 -1.975688   6.649021  0.636824       1
..    ...       ...       ...        ...       ...        ...       ...     ...
195     1  5.434893  7.128471   9.789546  6.061382   0.634133  5.757024       3
196     1 -0.406625  7.586001   9.322750 -1.837333   6.477815 -0.992725       1
197     1  2.031462  7.804427  -8.539512 -9.824409 -10.046935  6.918085       2
198     1  4.081889  6.127685  11.091126  4.812011  -0.005915  5.342211       3
199     1  0.985744  7.285737  -8.395940 -6.586471  -9.651765  6.651012       2

200 rows × 8 columns

#Data for the first class (class 0 vs the rest)
data1=data.copy()
data1.loc[data["target"]==0,"target"]=1
data1.loc[data["target"]!=0,"target"]=0
data1
     ones        F1        F2         F3        F4         F5        F6  target
0       1  2.116632  7.972800  -9.328969 -8.224605 -12.178429  5.498447       0
1       1  1.886449  4.621006   2.841595  0.431245  -2.471350  2.507833       1
2       1  2.391329  6.464609  -9.805900 -7.289968  -9.650985  6.388460       0
3       1 -1.034776  6.626886   9.031235 -0.812908   5.449855  0.134062       0
4       1 -0.481593  8.191753   7.504717 -1.975688   6.649021  0.636824       0
..    ...       ...       ...        ...       ...        ...       ...     ...
195     1  5.434893  7.128471   9.789546  6.061382   0.634133  5.757024       0
196     1 -0.406625  7.586001   9.322750 -1.837333   6.477815 -0.992725       0
197     1  2.031462  7.804427  -8.539512 -9.824409 -10.046935  6.918085       0
198     1  4.081889  6.127685  11.091126  4.812011  -0.005915  5.342211       0
199     1  0.985744  7.285737  -8.395940 -6.586471  -9.651765  6.651012       0

200 rows × 8 columns

data1_x=data1.iloc[:,:data1.shape[1]-1].values
data1_y=data1.iloc[:,data1.shape[1]-1].values
data1_x.shape,data1_y.shape
((200, 7), (200,))
#Data for the second class (class 1 vs the rest)
data2=data.copy()
data2.loc[data["target"]==1,"target"]=1
data2.loc[data["target"]!=1,"target"]=0
data2
     ones        F1        F2         F3        F4         F5        F6  target
0       1  2.116632  7.972800  -9.328969 -8.224605 -12.178429  5.498447       0
1       1  1.886449  4.621006   2.841595  0.431245  -2.471350  2.507833       0
2       1  2.391329  6.464609  -9.805900 -7.289968  -9.650985  6.388460       0
3       1 -1.034776  6.626886   9.031235 -0.812908   5.449855  0.134062       1
4       1 -0.481593  8.191753   7.504717 -1.975688   6.649021  0.636824       1
..    ...       ...       ...        ...       ...        ...       ...     ...
195     1  5.434893  7.128471   9.789546  6.061382   0.634133  5.757024       0
196     1 -0.406625  7.586001   9.322750 -1.837333   6.477815 -0.992725       1
197     1  2.031462  7.804427  -8.539512 -9.824409 -10.046935  6.918085       0
198     1  4.081889  6.127685  11.091126  4.812011  -0.005915  5.342211       0
199     1  0.985744  7.285737  -8.395940 -6.586471  -9.651765  6.651012       0

200 rows × 8 columns

data2_x=data2.iloc[:,:data2.shape[1]-1].values
data2_y=data2.iloc[:,data2.shape[1]-1].values
#Data for the third class (class 2 vs the rest)
data3=data.copy()
data3.loc[data["target"]==2,"target"]=1
data3.loc[data["target"]!=2,"target"]=0
data3
     ones        F1        F2         F3        F4         F5        F6  target
0       1  2.116632  7.972800  -9.328969 -8.224605 -12.178429  5.498447       1
1       1  1.886449  4.621006   2.841595  0.431245  -2.471350  2.507833       0
2       1  2.391329  6.464609  -9.805900 -7.289968  -9.650985  6.388460       1
3       1 -1.034776  6.626886   9.031235 -0.812908   5.449855  0.134062       0
4       1 -0.481593  8.191753   7.504717 -1.975688   6.649021  0.636824       0
..    ...       ...       ...        ...       ...        ...       ...     ...
195     1  5.434893  7.128471   9.789546  6.061382   0.634133  5.757024       0
196     1 -0.406625  7.586001   9.322750 -1.837333   6.477815 -0.992725       0
197     1  2.031462  7.804427  -8.539512 -9.824409 -10.046935  6.918085       1
198     1  4.081889  6.127685  11.091126  4.812011  -0.005915  5.342211       0
199     1  0.985744  7.285737  -8.395940 -6.586471  -9.651765  6.651012       1

200 rows × 8 columns

data3_x=data3.iloc[:,:data3.shape[1]-1].values
data3_y=data3.iloc[:,data3.shape[1]-1].values
#Data for the fourth class (class 3 vs the rest)
data4=data.copy()
data4.loc[data["target"]==3,"target"]=1
data4.loc[data["target"]!=3,"target"]=0
data4
     ones        F1        F2         F3        F4         F5        F6  target
0       1  2.116632  7.972800  -9.328969 -8.224605 -12.178429  5.498447       0
1       1  1.886449  4.621006   2.841595  0.431245  -2.471350  2.507833       0
2       1  2.391329  6.464609  -9.805900 -7.289968  -9.650985  6.388460       0
3       1 -1.034776  6.626886   9.031235 -0.812908   5.449855  0.134062       0
4       1 -0.481593  8.191753   7.504717 -1.975688   6.649021  0.636824       0
..    ...       ...       ...        ...       ...        ...       ...     ...
195     1  5.434893  7.128471   9.789546  6.061382   0.634133  5.757024       1
196     1 -0.406625  7.586001   9.322750 -1.837333   6.477815 -0.992725       0
197     1  2.031462  7.804427  -8.539512 -9.824409 -10.046935  6.918085       0
198     1  4.081889  6.127685  11.091126  4.812011  -0.005915  5.342211       1
199     1  0.985744  7.285737  -8.395940 -6.586471  -9.651765  6.651012       0

200 rows × 8 columns

data4_x=data4.iloc[:,:data4.shape[1]-1].values
data4_y=data4.iloc[:,data4.shape[1]-1].values
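The four copy-and-relabel cells above could equally be generated in a loop, reusing the make_ovr_labels helper sketched in section 1:

#Build all four one-vs-rest training sets at once (a sketch)
xs,ys=[],[]
for k in range(4):
    dk=make_ovr_labels(data,k)
    xs.append(dk.iloc[:,:-1].values)   #features including the ones column
    ys.append(dk.iloc[:,-1].values)    #0/1 labels for class k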

2.3 Define the hypothesis function, cost function, and gradient descent algorithm

def sigmoid(z):
    return 1 / (1 + np.exp(-z))
def h(X,w):
    z=X@w
    h=sigmoid(z)
    return h
#Construct the cost function
def cost(X,w,y):
    #Shapes: X (m,n+1), y (m,), w (n+1,1)
    y_hat=sigmoid(X@w)
    right=np.multiply(y.ravel(),np.log(y_hat).ravel())+np.multiply((1-y).ravel(),np.log(1-y_hat).ravel())
    cost=-np.sum(right)/X.shape[0]
    return cost
#Batch gradient descent (identical to section 1.3)
def gradient_descent(X,y,iter_num,alpha):
    y=y.reshape((X.shape[0],1))
    w=np.zeros((X.shape[1],1))
    cost_lst=[]
    for i in range(iter_num):
        y_pred=h(X,w)-y
        temp=np.zeros((X.shape[1],1))
        for j in range(X.shape[1]):
            right=np.multiply(y_pred.ravel(),X[:,j])
            gradient=1/(X.shape[0])*(np.sum(right))
            temp[j,0]=w[j,0]-alpha*gradient
        w=temp
        cost_lst.append(cost(X,w,y.ravel()))
    return w,cost_lst

2.4 学习这四个分类模型

import matplotlib.pyplot as plt
#Initialize the hyperparameters
iter_num,alpha=600000,0.001
#Train the 1st model
w1,cost_lst1=gradient_descent(data1_x,data1_y,iter_num,alpha)
plt.plot(range(iter_num),cost_lst1,"b-o")
[<matplotlib.lines.Line2D at 0x25624eb08e0>]

[Figure 4: cost curve of the first model]

#Train the 2nd model
w2,cost_lst2=gradient_descent(data2_x,data2_y,iter_num,alpha)
plt.plot(range(iter_num),cost_lst2,"b-o")
[<matplotlib.lines.Line2D at 0x25631b87a60>]

[Figure 5: cost curve of the second model]

#Train the 3rd model
w3,cost_lst3=gradient_descent(data3_x,data3_y,iter_num,alpha)
plt.plot(range(iter_num),cost_lst3,"b-o")
[<matplotlib.lines.Line2D at 0x2562bcdfac0>]

[Figure 6: cost curve of the third model]

#Train the 4th model
w4,cost_lst4=gradient_descent(data4_x,data4_y,iter_num,alpha)
plt.plot(range(iter_num),cost_lst4,"b-o")
[<matplotlib.lines.Line2D at 0x25631ff4ee0>]

[Figure 7: cost curve of the fourth model]

2.5 Making predictions with the models

data_x
array([[ 2.11663151e+00,  7.97280013e+00, -9.32896918e+00,
        -8.22460526e+00, -1.21784287e+01,  5.49844655e+00],
       [ 1.88644899e+00,  4.62100554e+00,  2.84159548e+00,
         4.31244563e-01, -2.47135027e+00,  2.50783257e+00],
       [ 2.39132949e+00,  6.46460915e+00, -9.80590050e+00,
        -7.28996786e+00, -9.65098460e+00,  6.38845956e+00],
       ...,
       [ 2.03146167e+00,  7.80442707e+00, -8.53951210e+00,
        -9.82440872e+00, -1.00469351e+01,  6.91808489e+00],
       [ 4.08188906e+00,  6.12768483e+00,  1.10911262e+01,
         4.81201082e+00, -5.91530191e-03,  5.34221079e+00],
       [ 9.85744105e-01,  7.28573657e+00, -8.39593964e+00,
        -6.58647097e+00, -9.65176507e+00,  6.65101187e+00]])
data_x=np.insert(data_x,0,1,axis=1)
data_x.shape
(200, 7)
w3.shape
(7, 1)
multi_pred=pd.DataFrame(zip(h(data_x,w1).ravel(),h(data_x,w2).ravel(),h(data_x,w3).ravel(),h(data_x,w4).ravel()))
multi_pred
            0             1             2             3
0    0.020436  4.556248e-15  9.999975e-01  2.601227e-27
1    0.820488  4.180906e-05  3.551499e-05  5.908691e-05
2    0.109309  7.316201e-14  9.999978e-01  7.091713e-24
3    0.036608  9.999562e-01  1.048562e-09  5.724854e-03
4    0.003075  9.999292e-01  2.516742e-09  6.423038e-05
..        ...           ...           ...           ...
195  0.017278  3.221293e-06  3.753372e-14  9.999943e-01
196  0.003369  9.999966e-01  6.673394e-10  2.281428e-03
197  0.000606  1.118174e-13  9.999941e-01  1.780212e-28
198  0.013072  4.999118e-05  9.811154e-14  9.996689e-01
199  0.151548  1.329623e-13  9.999447e-01  2.571989e-24

200 rows × 4 columns

2.6 Computing the accuracy

np.sum(np.argmax(multi_pred.values,axis=1)==data_y.ravel())/len(data)
1.0
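All 200 samples are classified correctly. This is expected: the four make_blobs clusters are well separated, so the problem is easy. As an optional cross-check (a sketch, not required by the lab), scikit-learn's built-in multi-class handling reaches the same training accuracy:

#Cross-check with scikit-learn (drop the bias column; sklearn fits its own intercept)
from sklearn.linear_model import LogisticRegression
clf=LogisticRegression(max_iter=1000)
clf.fit(data_x[:,1:],data_y)
clf.score(data_x[:,1:],data_y)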
