Time Series Forecasting | MATLAB Implementation of SSA-BiLSTM and BiLSTM: Sparrow Search Algorithm-Optimized Bidirectional Long Short-Term Memory Network for Time Series Forecasting (with Before/After Optimization Comparison)
Contents
- Time Series Forecasting | MATLAB Implementation of SSA-BiLSTM and BiLSTM: Sparrow Search Algorithm-Optimized Bidirectional Long Short-Term Memory Network for Time Series Forecasting (with Before/After Optimization Comparison)
- Prediction Results
- Basic Introduction
- Program Design
- References
Prediction Results
Basic Introduction
MATLAB implementation of SSA-BiLSTM and BiLSTM: the sparrow search algorithm (SSA) optimizes a bidirectional long short-term memory network for time series forecasting, compared against the unoptimized BiLSTM (complete program and data included).
Univariate time series forecasting; requires MATLAB 2018 or later.
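The post does not show its preprocessing code, so the following is a minimal, hypothetical sketch of how a univariate series is commonly windowed and normalized for BiLSTM training in MATLAB; the file name data.xlsx, the window length, and the 80/20 split are assumptions rather than details of the distributed program.

% Hypothetical preprocessing sketch (not the distributed program):
% turn a univariate series into lagged input/target pairs and normalize.
data = xlsread('data.xlsx');             % assumed single-column data file
data = data(:).';                        % force row vector
lag  = 5;                                % assumed window length
mu   = mean(data);  sig = std(data);
z    = (data - mu) / sig;                % z-score normalization

n = numel(z) - lag;
X = cell(n,1);  Y = zeros(n,1);
for k = 1:n
    X{k} = z(k:k+lag-1);                 % 1-by-lag input sequence
    Y(k) = z(k+lag);                     % next value as regression target
end

nTrain = floor(0.8*n);                   % assumed 80/20 train/test split
XTrain = X(1:nTrain);      YTrain = Y(1:nTrain);
XTest  = X(nTrain+1:end);  YTest  = Y(nTrain+1:end);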
The optimal hyperparameters found by SSA-BiLSTM are:
SSA-BiLSTM optimized number of hidden units: 9
SSA-BiLSTM optimized maximum training epochs: 19
SSA-BiLSTM optimized InitialLearnRate: 0.095122
SSA-BiLSTM optimized L2Regularization: 0.15231
SSA-BiLSTM results:
SSA-BiLSTM training-set MSE: 0.014575
SSA-BiLSTM test-set MSE: 0.033147
BiLSTM results:
BiLSTM training-set MSE: 0.00040462
BiLSTM test-set MSE: 0.15018
The unoptimized BiLSTM attains a much lower training error but a far higher test error than SSA-BiLSTM, i.e. it overfits, while the SSA-tuned hyperparameters generalize considerably better.
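As a rough illustration (not the author's released script), the SSA-found hyperparameters above would typically be plugged into a Deep Learning Toolbox BiLSTM regression network as follows; the variables XTrain, YTrain, XTest and YTest are assumed to come from a windowing step such as the one sketched earlier.

% Hypothetical sketch: plug the SSA-optimized hyperparameters into a
% sequence-to-one BiLSTM regression network (Deep Learning Toolbox).
numHidden = 9;           % optimized number of hidden units
maxEpochs = 19;          % optimized maximum training epochs
initLR    = 0.095122;    % optimized InitialLearnRate
l2Reg     = 0.15231;     % optimized L2Regularization

layers = [ ...
    sequenceInputLayer(1)                        % univariate input
    bilstmLayer(numHidden,'OutputMode','last')   % bidirectional LSTM
    fullyConnectedLayer(1)
    regressionLayer];

options = trainingOptions('adam', ...
    'MaxEpochs',maxEpochs, ...
    'InitialLearnRate',initLR, ...
    'L2Regularization',l2Reg, ...
    'Shuffle','every-epoch', ...
    'Verbose',false);

net     = trainNetwork(XTrain,YTrain,layers,options);
YPred   = predict(net,XTest);
mseTest = mean((YPred - YTest).^2);              % test-set MSE, as reported above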
Program Design
- Complete program and data, download option 1 (direct download from the attached resource): MATLAB Implementation of SSA-BiLSTM and BiLSTM: Sparrow Search Algorithm-Optimized Bidirectional Long Short-Term Memory Network for Time Series Forecasting (with Before/After Optimization Comparison)
- Complete program and data, download option 2 (subscribe to the《LSTM长短期记忆神经网络》column, which also gives access to everything collected in that column; message me after subscribing to obtain the data): MATLAB Implementation of SSA-BiLSTM and BiLSTM: Sparrow Search Algorithm-Optimized Bidirectional Long Short-Term Memory Network for Time Series Forecasting (with Before/After Optimization Comparison)
- Complete program and data, download option 3 (subscribe to the《智能学习》column, which includes the 6 programs collected in that column; message me after subscribing to obtain the data): MATLAB Implementation of SSA-BiLSTM and BiLSTM: Sparrow Search Algorithm-Optimized Bidirectional Long Short-Term Memory Network for Time Series Forecasting (with Before/After Optimization Comparison)
%_________________________________________________________________________%
%                     Sparrow Search Algorithm (SSA)                       %
%_________________________________________________________________________%
function [Best_pos,Best_score,curve,BestNet]=SSA(pop,Max_iter,lb,ub,dim,fobj)
% SSA  Sparrow search algorithm.
%   pop      - population size
%   Max_iter - maximum number of iterations
%   lb, ub   - lower / upper bounds of the search space (scalar or 1-by-dim)
%   dim      - number of decision variables
%   fobj     - objective handle, called as [fitness, net] = fobj(x)
% Returns the best position, its fitness, the convergence curve, and the
% network trained at the best position.
disp('Sparrow search algorithm started...')
ST = 0.6;                   % safety (alarm) threshold
PD = 0.7;                   % proportion of producers (discoverers); the rest are scroungers (joiners)
SD = 0.2;                   % proportion of sparrows aware of danger (scouts)
PDNumber = round(pop*PD);   % number of producers
SDNumber = round(pop*SD);   % number of danger-aware sparrows
if(max(size(ub)) == 1)      % expand scalar bounds to 1-by-dim vectors
    ub = ub.*ones(1,dim);
    lb = lb.*ones(1,dim);
end
net = cell(1,pop);          % trained network of each individual
% Initialize the population
X0 = initialization(pop,dim,ub,lb);
X = X0;
% Evaluate the initial fitness of each individual
fitness = zeros(1,pop);
for i = 1:pop
    [fitness(i),net{i}] = fobj(X(i,:));
end
[fitness, index] = sort(fitness);   % sort ascending (best individual first)
BestF = fitness(1);
WorstF = fitness(end);
GBestF = fitness(1);                % global best fitness
X = X0(index,:);                    % reorder population by fitness
net = net(index);                   % keep networks aligned with their individuals
curve = zeros(1,Max_iter);          % convergence curve
GBestX = X(1,:);                    % global best position
X_new = X;
BestNet = net{1};
curve(1) = GBestF;
for i = 2:Max_iter
    disp(['Iteration ', num2str(i)]);
    BestF = fitness(1);
    WorstF = fitness(end);
    R2 = rand(1);                                   % alarm value
    % Producer (discoverer) position update
    for j = 1:PDNumber
        if(R2<ST)
            X_new(j,:) = X(j,:).*exp(-j/(rand(1)*Max_iter));
        else
            X_new(j,:) = X(j,:) + randn()*ones(1,dim);
        end
    end
    % Scrounger (joiner) position update
    for j = PDNumber+1:pop
        if(j>(pop - PDNumber)/2 + PDNumber)
            % worse scroungers fly elsewhere in search of food
            X_new(j,:) = randn().*exp((X(end,:) - X(j,:))/j^2);
        else
            % better scroungers move towards the current best position
            A = ones(1,dim);                        % random vector of -1 / +1 entries
            for a = 1:dim
                if(rand()>0.5)
                    A(a) = -1;
                end
            end
            AA = A'*inv(A*A');
            X_new(j,:) = X(1,:) + abs(X(j,:) - X(1,:)).*AA';
        end
    end
    % Danger-aware sparrows (scouts): a random subset reacts to danger
    Temp = randperm(pop);
    SDchooseIndex = Temp(1:SDNumber);
    for j = 1:SDNumber
        if(fitness(SDchooseIndex(j))>BestF)
            % sparrows at the edge of the group move towards the best position
            X_new(SDchooseIndex(j),:) = X(1,:) + randn().*abs(X(SDchooseIndex(j),:) - X(1,:));
        elseif(fitness(SDchooseIndex(j))== BestF)
            % sparrows in the middle of the group move towards their neighbours
            K = 2*rand() - 1;
            X_new(SDchooseIndex(j),:) = X(SDchooseIndex(j),:) + K.*(abs( X(SDchooseIndex(j),:) - X(end,:))./(fitness(SDchooseIndex(j)) - fitness(end) + 10^-8));
        end
    end
    % Boundary control: clamp each variable to [lb, ub] and replace NaN values
    for j = 1:pop
        for a = 1:dim
            if(X_new(j,a)>ub(a)||isnan(X_new(j,a)))
                X_new(j,a) = ub(a);
            end
            if(X_new(j,a)<lb(a)||isnan(X_new(j,a)))
                X_new(j,a) = lb(a);
            end
        end
    end
    % Evaluate the new positions and update the global best
    fitness_new = zeros(1,pop);
    for j = 1:pop
        [fitness_new(j),net{j}] = fobj(X_new(j,:));
    end
    for j = 1:pop
        if(fitness_new(j) < GBestF)
            GBestF = fitness_new(j);
            GBestX = X_new(j,:);
            BestNet = net{j};
        end
    end
    X = X_new;
    fitness = fitness_new;
    % Re-sort the population (and its networks) by fitness for the next iteration
    [fitness, index] = sort(fitness);
    BestF = fitness(1);
    WorstF = fitness(end);
    X = X(index,:);
    net = net(index);
    curve(i) = GBestF;          % record the best fitness found so far
end
Best_pos = GBestX;              % best hyperparameter vector found
Best_score = curve(end);        % corresponding best fitness value
end
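The SSA routine above only needs an objective handle fobj that maps a candidate position vector to a fitness value and a trained network, plus the initialization helper it calls. A hypothetical driver sketch of both is given below; the search bounds, the use of training MSE as fitness, and the name bilstmFitness are assumptions, and the distributed program may define them differently.

% Hypothetical driver sketch (assumed interface, not the distributed program).
% The 4 decision variables are [hidden units, max epochs, InitialLearnRate, L2Regularization].
lb  = [5, 10, 0.001, 0.0001];                   % assumed lower bounds
ub  = [50, 50, 0.1, 0.2];                       % assumed upper bounds
pop = 10;  maxIter = 20;  dim = 4;
fobj = @(x) bilstmFitness(x, XTrain, YTrain);   % XTrain/YTrain from the preprocessing sketch
[Best_pos, Best_score, curve, BestNet] = SSA(pop, maxIter, lb, ub, dim, fobj);

% Hypothetical objective: decode a position vector, train a BiLSTM with those
% hyperparameters, and return its training MSE together with the trained network.
function [fitness, net] = bilstmFitness(x, XTrain, YTrain)
    layers = [ ...
        sequenceInputLayer(1)
        bilstmLayer(round(x(1)),'OutputMode','last')
        fullyConnectedLayer(1)
        regressionLayer];
    options = trainingOptions('adam', ...
        'MaxEpochs',round(x(2)), ...
        'InitialLearnRate',x(3), ...
        'L2Regularization',x(4), ...
        'Verbose',false);
    net = trainNetwork(XTrain, YTrain, layers, options);
    fitness = mean((predict(net, XTrain) - YTrain).^2);
end

% Assumed helper matching the initialization(pop,dim,ub,lb) call inside SSA:
% uniform random positions within [lb, ub]. In the full program this would
% live in its own file, initialization.m, so that SSA can call it.
function X = initialization(pop, dim, ub, lb)
    X = repmat(lb, pop, 1) + rand(pop, dim).*repmat(ub - lb, pop, 1);
end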
References
[1] https://blog.csdn.net/article/details/126072792?spm=1001.2014.3001.5502
[2] https://blog.csdn.net/article/details/126044265?spm=1001.2014.3001.5502
[3] https://blog.csdn.net/article/details/126043107?spm=1001.2014.3001.5502