
### Bengali character recognition


```Hi,
we are doing a Bengali character recognition project.

there are 69 character patterns.
we have formed 16x16 skeleton images of all the characters,
so our input is P = 256x69 matrix
and the target is T = 69x69 matrix (by eye(69))

now here is our neural network code....

S1 = 69;
[R,Q] = size(input_pattern);
[S2,Q] = size(target);
P = input_pattern; % 256X69 matrix
T = target; % target =eye(69);

net = newff(P,T,[S1 S2],{'logsig' 'logsig' },'traingdx');
net=init(net);

net.performFcn = 'sse';
net.trainParam.goal = 0.000001;
net.trainParam.epochs = 700;
net.trainParam.max_fail=374;

net = train(net, P, T);

[row col]=size(P);
for i=1:10
P = P+ randn(row,col);
net = train(net, P, T); % training with errored value
end

% examine with the training data to check network is trained or not
for i=1:Q   % Q = number of training patterns (69)
P(:,i)
A2 = sim(net,P(:,i));
A2 = compet(A2);
end

after executing this code we find that it does not even work for the training data. if it cannot recognize the training data, how will it recognize random data?

```

```% "asdf " <ohh_i_am_undone@yahoo.com> wrote in message
% <jutqlg\$2e2\$1@newscl01ah.mathworks.com>...
% > Hi,
% > we are doing a Bengali character recognition project.
% >
% > there are 69 character patterns.
% > we have formed 16X16 skeleton images of all characters.
% > so our input P= 256X69 matrix
% > Target is T= 69X69 matrix (by eye(69))
% >
% > now here is our neural network code....

>   S1 = 69;

% Why?

[ I N ] = size(P)           % [ 256 69 ]

[ O N ] = size(T)           % [ 69 69 ]

classindex = vec2ind(T)     % 1:69

% For an I-H-O net, the number of unknown weights is

Nw = (I+1)*H+(H+1)*O     % with H = S1 = 69: 257*69+70*69 = 22563

% but the Number of training equations is only

Neq = N*O      % = 69*69 =  4761  ~  Nw / 4.7

Therefore you need one or more of:
1. Fewer hidden nodes
2. More data (adding random noise has merit)
3. Validation stopping
4. Regularization (trainbr)
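
% As a rough sketch, the largest hidden layer size Hub that keeps
% Neq >= Nw (i.e., no more unknown weights than training equations)
% follows from Nw = H*(I+1+O) + O <= Neq :

Hub = floor( (Neq - O) / (I + O + 1) )   % = floor(4692/326) = 14

% So, without more data, H <= 14 is a more defensible starting point
% than H = S1 = 69.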

>   [R,Q] = size(input_pattern);
>   [S2,Q] = size(target);
>   P = input_pattern; % 256X69 matrix
>   T = target; % target =eye(69);
>
>   net = newff(P,T,[S1 S2],{'logsig' 'logsig' },'traingdx');

% No. S2 is automatically obtained from T;

net = newff(P,T,S1,{'tansig' 'softmax' },'trainscg'); % Best for classification

>   net=init(net);

% Unnecessary. Newff weight initialization is automatic

> net.performFcn = 'sse';
> net.trainParam.goal = 0.000001;
> net.trainParam.epochs = 700;
> net.trainParam.max_fail=374;

% No. Validation stop default is 6
%
% Why not delete the last four statements and rely on defaults?

> net = train(net, P, T);

[net,tr] = train(net,P,T); % tr contains the training results

%For all data (train/val/test)

Y = net( P )
classes = vec2ind(Y)
numerr = sum(classes~=classindex)
Pcterr0 = 100*numerr/N

% For separate train/val/test results, use tr
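
% For example (a sketch, assuming the default 'dividerand' split;
% tr.trainInd / tr.valInd / tr.testInd hold the indices of the splits):

trnerr = sum( classes(tr.trainInd) ~= classindex(tr.trainInd) );
valerr = sum( classes(tr.valInd)   ~= classindex(tr.valInd) );
tsterr = sum( classes(tr.testInd)  ~= classindex(tr.testInd) );
Pcterrtst = 100*tsterr/numel(tr.testInd)   % generalization error estimate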

> [row col]=size(P);
>  for i=1:10
>      P = P+ randn(row,col);
>        net = train(net, P, T); % training with errored value

[ net tr ] = train(net, P, T)   % Need tr

% Need to calculate error rates Pcterr(i) for each loop

>  end
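
% A sketch of the noisy retraining with per-pass error rates. Note that
% the original loop adds noise to the already-noisy P, so the corruption
% accumulates and the final "training data" check runs on corrupted
% patterns; adding noise to a saved clean copy P0 avoids both problems:

P0 = P;                                % keep a clean copy of the patterns
[row,col] = size(P0);
for i = 1:10
    Pn = P0 + randn(row,col);          % fresh noise each pass, not cumulative
    [net,tr] = train(net, Pn, T);
    classes   = vec2ind(compet(net(Pn)));
    Pcterr(i) = 100*sum(classes ~= classindex)/N;
end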

> % examine with the training data to check network is trained or not

No.
See classes and vec2ind above

Hope this helps.

Greg
```
heath (3990) 8/11/2012 10:28:08 PM
