Python (and neural nets)

(OP)
I've just started learning Python, and a question on one of the discussion boards set me off looking for neural net software that was (a) free and (b) easy to use. If you don't want to do any programming then Orange is the answer, but I wanted to program it myself.

Anyway, without further ado, here is the result, in which we teach the neural net to multiply two numbers between -0.5 and +0.5 together.



Here's the Python:

CODE --> python

import numpy as np
import neurolab as nl
import pylab

# Create training samples and an independent test set
N = 100
train_in = np.random.uniform(-0.5, 0.5, (N, 2))
train_target = (train_in[:, 0] * train_in[:, 1]).reshape(N, 1)
test_in = np.random.uniform(-0.5, 0.5, (N, 2))
test_target = (test_in[:, 0] * test_in[:, 1]).reshape(N, 1)

# Create a feed-forward network with 2 inputs, 5 neurons in the hidden layer
# and 1 in the output layer
net = nl.net.newff([[-0.5, 0.5], [-0.5, 0.5]], [5, 1])

# Train
err = net.train(train_in, train_target, show=15)

# Simulate the trained net on both sets (net.sim accepts a whole array,
# so no per-sample loop is needed)
train_out = net.sim(train_in)
test_out = net.sim(test_in)
print('(0.2*0.1)=', net.sim([[0.2, 0.1]]))

# Plot the training error against epoch
pylab.plot(np.log(err))
pylab.xlabel('Epoch number')
pylab.ylabel('Log(error (default SSE))')
pylab.grid()
pylab.show()

# Plot predicted vs. true products for both sets
figsize = (8, 6)  # fig size in inches (width, height)
figure = pylab.figure(figsize=figsize)
pylab.plot(train_target, train_out, '*')
pylab.plot(test_target, test_out, 'ok')
pylab.plot([test_out.max(), test_out.min()], [test_out.max(), test_out.min()], 'r')
pylab.grid()
pylab.legend(['training set', 'testing set', 'ideal'])
pylab.title('Multiplication of two random numbers via neural net')
pylab.show()
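For anyone without neurolab installed, the same idea can be sketched from scratch in plain NumPy. This is my own toy code, not what neurolab does internally: a 2-5-1 network with a tanh hidden layer and linear output, trained by full-batch gradient descent on the same multiplication task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Same task as above: learn to multiply two numbers in [-0.5, 0.5]
N = 200
X = rng.uniform(-0.5, 0.5, (N, 2))
y = (X[:, 0] * X[:, 1]).reshape(N, 1)

# 2-5-1 network: tanh hidden layer, linear output
W1 = rng.normal(0.0, 0.5, (2, 5))
b1 = rng.normal(0.0, 0.5, 5)   # non-zero biases so the net can form even functions like x*y
W2 = rng.normal(0.0, 0.5, (5, 1))
b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.5
_, out = forward(X)
mse0 = np.mean((out - y) ** 2)   # error before any training

for epoch in range(5000):
    h, out = forward(X)
    g = 2.0 * (out - y) / N               # dMSE/d(output)
    gW2, gb2 = h.T @ g, g.sum(0)
    gh = (g @ W2.T) * (1.0 - h ** 2)      # back-propagate through tanh
    gW1, gb1 = X.T @ gh, gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, out = forward(X)
mse = np.mean((out - y) ** 2)
print(mse0, mse)   # training error should end well below where it started
```

No claims of efficiency; the point is just that the whole mechanism fits in a page of NumPy.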



Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

RE: Python (and neural nets)

An interesting technology.

I can't find the reference, so this is from memory.

There was reportedly a case where a large, complex, computer based neural net was taught to recognize a certain type of object in images. After a vast effort, it seemed to work with their test data. When deployed, it was discovered that the system had latched onto some other extraneous feature in the training image set, and was dangerously useless in the real world.

A made-up example might be a self-driving car that only recognizes children on rainy days if they're wearing rain coats and rubber boots, because most stock images of children playing outside on rainy days show them in rain coats and rubber boots.
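That failure mode is easy to reproduce even without a neural net. Here is a toy sketch of my own (not from the report described above): if a spurious feature happens to be perfectly correlated with the real one in the training set, the model has no way to know which one matters, and it falls apart once the correlation breaks in deployment.

```python
import numpy as np

rng = np.random.default_rng(42)

# Training set: the label is exactly feature A, but a spurious feature B
# happens to equal A on every training example
A_train = rng.normal(size=100)
X_train = np.column_stack([A_train, A_train])  # columns [A, B], with B == A
y_train = A_train

# Minimum-norm least squares splits the weight between the two columns
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
print(w)  # roughly [0.5, 0.5]: half the credit goes to the spurious feature

# Deployment: B is now unrelated to A, and accuracy collapses
A_test = rng.normal(size=100)
B_test = rng.normal(size=100)
X_test = np.column_stack([A_test, B_test])
train_mse = np.mean((X_train @ w - y_train) ** 2)
test_mse = np.mean((X_test @ w - A_test) ** 2)
print(train_mse, test_mse)  # essentially zero on training, large in deployment
```

A linear model makes the arithmetic transparent, but a neural net trained on the same data has exactly the same ambiguity to exploit.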

Perhaps in the future the scientists and engineers involved will be able to analyze the trained synapse parameters and formally prove that it's learned correctly.

RE: Python (and neural nets)

(OP)
Yes, that is definitely an issue. For instance, in my code above both the training and the testing sets are generated using the same random number generator, so if there were a fault in the RNG I wouldn't really have tested the algorithm properly.
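One cheap way around that worry (my suggestion, not part of the original code) is to make the test set independent of the RNG entirely, e.g. by evaluating on a deterministic grid of input pairs over the training domain:

```python
import numpy as np

# Deterministic test grid over [-0.5, 0.5] x [-0.5, 0.5]: no RNG involved,
# so a flaw in the random generator can't hide in both sets at once
a = np.linspace(-0.5, 0.5, 21)
b = np.linspace(-0.5, 0.5, 21)
A, B = np.meshgrid(a, b)
grid_in = np.column_stack([A.ravel(), B.ravel()])       # (441, 2) input pairs
grid_target = (grid_in[:, 0] * grid_in[:, 1]).reshape(-1, 1)
print(grid_in.shape, grid_target.shape)
```

`grid_in` can then be fed to `net.sim(grid_in)` in place of the second random sample.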

The reason this came up is that someone on a physics course asked why we always jump straight into differential equations, so I pointed out that you can derive models without assuming a DE explains a given behaviour; but since I hadn't played with neural nets, I didn't know whether they were any good in general.


My answer now is that they are dangerous in untutored hands such as mine. Here's a fit for (a*b)^4 with a and b between -1 and +1. Woohoo.




And here's the same model trained and run over the range -0.5 to +0.5; maybe the problem is just the small values.
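One plausible culprit (my speculation): over ±0.5 the target (a*b)^4 never exceeds 0.5^8 ≈ 0.004, so the SSE the trainer minimizes is minute from the outset and training can stall. A standard remedy is to standardize the targets before training and invert the scaling on the simulated outputs; a sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100

# Compare target magnitudes over the two ranges
for lo, hi in [(-1.0, 1.0), (-0.5, 0.5)]:
    x = rng.uniform(lo, hi, (N, 2))
    t = (x[:, 0] * x[:, 1]) ** 4
    print(lo, hi, t.max(), t.std())   # targets over +/-0.5 are bounded by 0.5**8

# Standardize targets before training, and undo the scaling afterwards
x = rng.uniform(-0.5, 0.5, (N, 2))
t = ((x[:, 0] * x[:, 1]) ** 4).reshape(N, 1)
mu, sigma = t.mean(), t.std()
t_scaled = (t - mu) / sigma       # feed t_scaled to net.train(...)
t_back = t_scaled * sigma + mu    # apply to net.sim(...) outputs to recover the fit
```

The network then trains on targets with roughly unit variance regardless of the input range.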





Cheers

Greg Locock



RE: Python (and neural nets)

The whole concept of NNs is that the human brain is ostensibly a massive neural net, and that we are therefore exploiting the same architecture. However, the promised performance has repeatedly fallen short, in both NN and classical approaches, particularly in the quagmire of "training" a system to differentiate the "features" that distinguish desired objects from undesired ones. While classical approaches rely on a human to choose the features, NNs are essentially allowed to find their own, based on the segmentation dictated by the training set. Therein lies the rub: the segmentation of the training set is human-determined, and the selection of the objects is also human-determined, so unexpected segmentations can arise from hidden features, or from hidden biases on the part of the humans involved.

We've been trying to do this for going on 30+ years, and it still seems to be a crap-shoot, at best.

TTFN
FAQ731-376: Eng-Tips.com Forum Policies

Need help writing a question or understanding a reply? forum1529: Translation Assistance for Engineers


Of course I can. I can do anything. I can do absolutely anything. I'm an expert!
There is a homework forum hosted by engineering.com: http://www.engineering.com/AskForum/aff/32.aspx
