I managed to program a network which actually does what I want, but it doesn't learn. My XOR-Network just... tries every possbile configuration of weights and then pics the right one, actually, it's not more intelligent than a fucking normal program.

I'll jsut post it for fun,
'Simple "Neural Network" by psygate
'Solving the XOR-Problem, which can't be solved by a perceptron.
'At the moment... It's a very very stupid network and doesn't learn anything.
'
'Activation function is just a binary one. 0 or 1.
'Weights are chosen by random.
'2 input and 1 output neuron.
'Sorry for the shitty code!
randomize timer
option explicit
dim as integer a,b,c,d,t
const numofneuro=5
type neuron
pot as double 'Activation potencial
weight(numofneuro) as double 'Weight of Synapses to other neurons
threshold as integer 'Activation Threshold
end type
declare function process(a as ubyte,b as ubyte) as ubyte
dim shared as neuron n(numofneuro)
while t<>1
for a=1 to numofneuro 'Initialize every neuron
with n(a)
.pot=0 'Set potencial to 0
.threshold=.5 'Set an appropriate threshold
end with
for b=1 to numofneuro
if b<>a then
n(a).weight(b)=(-1)^int(rnd*2)*int(rnd*3) 'Initialize synapses weight.
else
n(a).weight(b)=0 'Prohibits loops from neuron1 to neuron1
end if
next b
next a
if process(0,0)=0 and process(1,0)=1 and process(0,1)=1 and process(1,1)=0 _
then t=1
wend
for a=1 to numofneuro
for b=1 to numofneuro-1
if b<>a then print n(a).weight(b)
next
next
sleep
function process(a as ubyte,b as ubyte) as ubyte
dim as integer l,m
n(1).pot=a
n(2).pot=b
for l=1 to numofneuro
for m=1 to numofneuro
n(m).pot+=n(l).weight(m)*n(l).pot
next
next
return n(numofneuro).pot
end function