My neural network in MATLAB outputs nothing but NaN, can some expert please take a look and tell me where the problem is?!
function main()
clc
clear all;
close all;

trainnum      = 500;
testsamnum    = 500;
forcastnum    = 500;
hiddenunitnum = 8;   %% number of hidden-layer nodes (tentative)
indim         = 3;   %% network input dimension
outdim        = 1;   %% network output dimension

% read in the data
data = load('g:\1.txt');
data = data';
p = data(1:indim, 1:trainnum);
t = data(indim+1, 1:trainnum);
[trainin, minp, maxp, tn, mint, maxt] = premnmx(p, t);

rand('state', sum(100*clock));
voisevar = 0.01;
voise    = voisevar*rand(1, trainnum);
trainout = voise + tn;
testin   = trainin;
testout  = trainout;

maxepochs = 50000;
lr = 0.035;
e0 = 0.05*10^(-3);

w1 = 0.05*rand(hiddenunitnum, indim) - 0.1;
b1 = 0.05*rand(hiddenunitnum, 1) - 0.1;
w2 = 0.05*rand(outdim, hiddenunitnum) - 0.1;
b2 = 0.05*rand(outdim, 1) - 0.1;

errhistory = [];
for i = 1:maxepochs
    hiddenout  = logsig(w1*trainin + repmat(b1, 1, trainnum));
    networkout = w2*hiddenout + repmat(b2, 1, trainnum);
    error = trainout - networkout;
    sse   = sumsqr(error);
    errhistory = [errhistory sse];
    if sse < e0, break, end
    delta2 = error;
    delta1 = w2'*delta2.*hiddenout.*(1-hiddenout);
    dw2 = delta2*hiddenout';
    db2 = delta2*ones(trainnum, 1);
    dw1 = delta1*trainin';
    db1 = delta1*ones(trainnum, 1);
    w2 = w2 + lr*dw2;
    b2 = b2 + lr*db2;
    w1 = w1 + lr*dw1;
    b1 = b2 + lr*db1;
end

hiddenout  = logsig(w1*trainin + repmat(b1, 1, testsamnum));
networkout = w2*hiddenout + repmat(b2, 1, testsamnum);
a = postmnmx(networkout, mint, maxt);
x = 1:500;
newk = a(1, :);
%figure;
%plot(x, newk, 'g', x, t, '*k');

pnew  = load('g:\2.txt');
pnew  = [91; 2; 204.49];
pnewn = tramnmx(pnew, minp, maxp);
hiddenout = logsig(w1*pnewn + repmat(b1, 1, 1));
anewn = w2*hiddenout + repmat(b2, 1, 1);
anew  = tramnmx(anewn, mint, maxt);

fid = fopen('g:\3.txt', 'wt');
fprintf(fid, '%g\n', anew);
fclose(fid);
There must be an error in your formulas: NaN means "not a number", so some expression in the code is being computed incorrectly somewhere. But since you are already programming in MATLAB, why not just use the Neural Network Toolbox? The attachment is a toolbox example, and a rough sketch of the same approach is given after the parameter list below.
The syntax of newff is:
net = newff(PR, [S1 S2 ... SN], {TF1 TF2 ... TFN}, BTF, BLF, PF). The function newff creates a trainable feed-forward network. Its input arguments are:
PR: an R-by-2 matrix giving the minimum and maximum of each of the R input variables;
Si: the number of neurons in layer i;
TFi: the transfer function of layer i, default tansig;
BTF: the training function, default trainlm;
BLF: the weight/bias learning function, default learngdm;
PF: the performance function, default mse.
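For reference, here is a minimal sketch of the toolbox route described above, assuming the same data layout as in the question (file g:\1.txt with 3 input rows and 1 target row after transposing, 500 training samples, 8 hidden nodes). The training settings (epochs, goal) are illustrative assumptions and are not taken from the original attachment.

% Minimal sketch of the toolbox approach (assumed settings, not the original attachment)
data = load('g:\1.txt');                 % same data file as in the question
data = data';
p = data(1:3, 1:500);                    % 3 input rows, 500 training samples
t = data(4,   1:500);                    % 1 target row
[pn, minp, maxp, tn, mint, maxt] = premnmx(p, t);   % scale inputs/targets to [-1, 1]

% 8 hidden tansig neurons, 1 linear output neuron, Levenberg-Marquardt training
net = newff(minmax(pn), [8 1], {'tansig', 'purelin'}, 'trainlm');
net.trainParam.epochs = 1000;            % assumed stopping criteria
net.trainParam.goal   = 1e-4;
net = train(net, pn, tn);

% predict for a new input and undo the normalisation
pnew  = [91; 2; 204.49];
pnewn = tramnmx(pnew, minp, maxp);
anewn = sim(net, pnewn);
anew  = postmnmx(anewn, mint, maxt);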
This answer was accepted by the asker and other users.