Neural Computation 0368-4149-01
Prof. Nathan Intrator
TA: Yehudit Hasson
Tuesday 16:00-19:00, Dan David 111
Office hours: Wed 4-5
[email protected]
Neural Computation
Neuroscience
  The objective is to understand the human brain
  Biologically realistic models of neurons
  Biologically realistic connection topologies
Neural computation
  The objective is to develop learning, representation and computation methods
  Novel architectures for data representation and processing
The goals of neural computation
To understand how the brain actually works
  It's big and very complicated and made of yukky stuff that dies when you poke it around
To understand a new style of computation
  Inspired by neurons and their adaptive connections
  Very different style from sequential computation
  Should be good for things that brains are good at (e.g. vision)
  Should be bad for things that brains are bad at (e.g. 23 x 71)
To solve practical problems by using novel learning algorithms
  Learning algorithms can be very useful even if they have nothing to do with how the brain works
The Brain
"The brain - that's my second most favorite organ!" - Woody Allen
The Brain: Fundamental Questions
What kind of information is extracted from the environment?
How is information represented, e.g. visual?
How is information stored?
How is information altered (learning & memory)?
How is information processed and manipulated?
The Brain: Simpler Questions
How is 3D information stored?
How is relational information stored:
  The child is on the floor
  The book is in the bag
How are verbs associated with adjectives?
How is information bound together:
  Collections of items which are on the table
  Collections of edges which form an object
Physiological experiments help us learn how a new scene is analyzed; in particular, eye movements are used to learn about the analysis strategy.
In an unseen set of images, it takes a very long time to detect the changes between the bear and the microscope. How do we observe changes in familiar scenes so fast?
Man versus Machine (hardware)
Numbers                    Human brain               Von Neumann computer
# elements                 10^10 - 10^12 neurons     10^7 - 10^8 transistors
# connections / element    10^3 - 10^4               10
switching frequency        10^3 Hz                   10^9 Hz
energy / operation         10^-16 Joule              10^-6 Joule
power consumption          10 Watt                   100 - 500 Watt
reliability of elements    low                       reasonable
reliability of system      high                      reasonable
Man versus Machine (information processing)
No memory management, no hardware/software/data distinction
Features                 Human Brain     Von Neumann computer
Data representation      analog          digital
Memory localization      distributed     localized
Control                  distributed     localized
Processing               parallel        sequential
Skill acquisition        learning        programming
Brain Performance
Flies have a better stabilizing mechanism than a Boeing 747
Their gyroscope is being studied in a wind tunnel
http://www.kyb.mpg.de/publications/pdfs/pdf340.pdf
The bat's external ears pick up both the emitted sounds and the returning echoes, serving as the receiving antennas.
Echo delay estimation: 20 nanoseconds!!
Movies: Navigation
DARPA Robot Race
Dolphin sonar properties
Discriminate between alloys of aluminum
See a tennis ball from 75 meters
Distinguish between a penny and a dime from 3 meters
Detect fish buried 0.5 meter underground
Excellent shape discrimination (same material)
Send up to 200 clicks per second!
Frequency range 15 kHz to 120 kHz
Excellent sensor array (whole face)
W. W. L. Au (1993). The Sonar of Dolphins. Springer.
Brief Outline
Unsupervised Learning
  Short bio motivation
  Unsupervised Neuronal Model
  Connection with Projection Pursuit and advanced feature extraction
Supervised Learning Schemes
  Perceptron and Multi Layer Perceptron
  RBF, SVM, Trees
  Training and optimization
Model Selection and Validation (advanced training methods)
  Cross Validation, Regularization, Noise injection
  Ensembles
Brain Machine Interface
  EEG, fMRI modalities
  Brain state interpretation based on machine learning model
  Recent Research in BMI
Introduction to the BrainBy: Geoffrey Hinton
www.cs.toronto.edu/~hinton/csc321/notes/lec1.ppt
A typical cortical neuron
Gross physical structure:
  There is one axon that branches
  There is a dendritic tree that collects input from other neurons
Axons typically contact dendritic trees at synapses
  A spike of activity in the axon causes charge to be injected into the post-synaptic neuron
Spike generation:
  There is an axon hillock that generates outgoing spikes whenever enough charge has flowed in at synapses to depolarize the cell membrane
[Figure: neuron with axon, cell body, and dendritic tree labeled]
A Neuron
The synaptic junction
Synapses, Ca influx, release of neurotransmitter, opening of post-synaptic channels
Some relevant terms
Axon, dendrite
Ion channels
Membrane rest potential
Action potential, refractory period
The Biological Neuron
10 billion neurons in human brain
Summation of input stimuli
  Spatial (signals)
  Temporal (pulses)
Threshold over composed inputs
Constant firing strength
Biological Neural Networks
10,000 synapses per neuron
Computational power = connectivity
Plasticity
  new connections (?)
  strength of connections modified
Neural Dynamics
Action potential: ~100 mV
Activation threshold: 20-30 mV above rest
Rest potential: -65 mV
Spike time: 1-2 ms
Refractory time: 10-20 ms
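These dynamics can be sketched as a toy leaky integrate-and-fire simulation using the slide's numbers (rest at -65 mV, threshold about 30 mV above rest, ~10 ms refractory time). The function name, the time constant, and the constant input current below are illustrative choices, not part of the lecture.

```python
# Minimal leaky integrate-and-fire sketch (illustrative parameters).
def simulate_lif(input_current, dt=1.0, v_rest=-65.0, v_thresh=-35.0,
                 tau=10.0, refractory=10.0):
    """Return (voltages, spike_times) for a simple LIF neuron."""
    v = v_rest
    refractory_left = 0.0
    voltages, spikes = [], []
    for step, i_in in enumerate(input_current):
        t = step * dt
        if refractory_left > 0:          # cannot fire again yet
            refractory_left -= dt
            v = v_rest
        else:
            # leak toward rest, plus injected current
            v += dt * (-(v - v_rest) + i_in) / tau
            if v >= v_thresh:            # threshold crossed: emit a spike
                spikes.append(t)
                v = v_rest               # reset after the action potential
                refractory_left = refractory
        voltages.append(v)
    return voltages, spikes

voltages, spikes = simulate_lif([40.0] * 100)   # constant drive for 100 ms
print(f"{len(spikes)} spikes, first at t={spikes[0]:.0f} ms")
```

With constant drive the neuron fires regularly; the inter-spike interval is set by the integration time to threshold plus the refractory period.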
[Chart: membrane potential (mV) vs time (ms), showing the membrane trace, the rest potential at -65 mV, and the activation threshold at -35 mV. Embedded spreadsheet data omitted.]
The Artificial Neuron
Stimulus: u_i(t) = u_rest + sum_j w_ij x_j(t)
  u_rest = resting potential
  x_j(t) = output of neuron j at time t
  w_ij = connection strength between neuron i and neuron j
  u(t) = total stimulus at time t
Response: y_i(t)
[Figure: neuron i receives inputs x_1(t)..x_5(t) through weights w_i1..w_i5 and produces response y_i(t)]
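The stimulus/response split above can be sketched in a few lines: the total stimulus is the weighted sum of the other neurons' outputs, and the response applies some function f to it. The specific weights, inputs, and choice of f below are made up for illustration.

```python
# Sketch of the artificial neuron: weighted-sum stimulus, then a response.
def total_stimulus(weights, inputs, u_rest=0.0):
    """u_i(t) = u_rest + sum_j w_ij * x_j(t)."""
    return u_rest + sum(w * x for w, x in zip(weights, inputs))

def response(u, f=lambda u: max(0.0, u)):
    """y_i(t) = f(u_i(t)); f is an arbitrary activation here."""
    return f(u)

w_i = [0.5, -1.0, 0.25, 0.0, 2.0]    # w_i1..w_i5 (illustrative)
x_t = [1.0, 0.5, 4.0, 3.0, 0.5]      # x_1(t)..x_5(t) (illustrative)
u = total_stimulus(w_i, x_t)          # 0.5 - 0.5 + 1.0 + 0.0 + 1.0 = 2.0
print(response(u))                    # 2.0
```

The neuron models on the following slides differ only in which f they plug into the response.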
Artificial Neural Models
McCulloch Pitt-type Neurons (static)
  Digital neurons: activation state interpretation (snapshot of the system each time a unit fires)
  Analog neurons: firing rate interpretation (activation of units equal to firing rate)
  Activation of neurons encodes information
Spiking Neurons (dynamic)
  Firing pattern interpretation (spike trains of units)
  Timing of spike trains encodes information (time to first spike, phase of signal, correlation and synchronicity)
Binary Neurons
Hard threshold: the response switches from "off" to "on" when the stimulus crosses the threshold θ
ex: Perceptrons, Hopfield NNs, Boltzmann Machines
Main drawbacks: can only map binary functions, biologically implausible.
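A hard threshold with the -1/+1 ("off"/"on") outputs used in the course charts can be written as a one-line decision. The choice of firing exactly at the threshold, and the example values, are mine.

```python
# Hard-threshold (binary) neuron with -1/+1 outputs.
def binary_neuron(u, theta=0.0):
    """Return +1 ("on") if the stimulus reaches the threshold, else -1 ("off")."""
    return 1 if u >= theta else -1

print([binary_neuron(u) for u in (-2.0, -0.1, 0.0, 3.5)])
```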
[Chart: hard-threshold (Heaviside) activation; output switches from "off" (-1) to "on" (+1) at the threshold. Embedded spreadsheet data omitted.]
Analog Neurons
Soft threshold, e.g. the sigmoid 2/(1+exp(-u)) - 1: a smooth transition from "off" to "on"
ex: MLPs, Recurrent NNs, RBF NNs...
Main drawbacks: difficult to process time patterns, biologically implausible.
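The soft threshold plotted in the course charts is 2/(1+exp(-u)) - 1, which squashes any stimulus smoothly into the (-1, 1) "off"/"on" range:

```python
# The sigmoid soft threshold from the course charts.
import math

def soft_threshold(u):
    return 2.0 / (1.0 + math.exp(-u)) - 1.0

print(round(soft_threshold(0.0), 4))   # 0.0
print(round(soft_threshold(1.0), 4))   # 0.4621
```

Unlike the hard threshold, this function is differentiable everywhere, which is what makes gradient-based training of MLPs possible.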
[Chart: sigmoid activation 2/(1+exp(-x))-1; output rises smoothly from "off" (-1) to "on" (+1) as the input goes from -10 to 10. Embedded spreadsheet data omitted.]
Spiking Neurons
  η = spike and after-spike potential
  u_rest = resting potential
  ε(t, u(t)) = trace at time t of input at time t
  θ = threshold
  x_j(t) = output of neuron j at time t
  w_ij = efficacy of synapse from neuron i to neuron j
  u(t) = input stimulus at time t
Spiking Neuron Dynamics
Hebb's Postulate of Learning
"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
Hebb's Postulate: revisited
Stent (1973), and Changeux and Danchin (1976), have expanded Hebb's rule such that it also models inhibitory synapses:
If two neurons on either side of a synapse are activated simultaneously (synchronously), then the strength of that synapse is selectively increased.
If two neurons on either side of a synapse are activated asynchronously, then that synapse is selectively weakened or eliminated.
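Both cases of the expanded rule can be sketched in one multiplicative update: with -1/+1 activations, the product of pre- and post-synaptic activity is positive exactly when the two neurons agree (synchronous), so the synapse is strengthened, and negative otherwise, so it is weakened. The function name and learning rate are illustrative.

```python
# One-step Hebbian update covering both the strengthening and weakening cases.
def hebb_update(w, x_pre, x_post, lr=0.1):
    """Return the updated weight: up if pre and post agree, down if they disagree."""
    return w + lr * x_pre * x_post

w = 0.5
w = hebb_update(w, +1, +1)   # synchronous: strengthened
w = hebb_update(w, +1, -1)   # asynchronous: weakened back
print(round(w, 10))
```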
Synapses
When a spike travels along an axon and arrives at a synapse, it causes vesicles of transmitter chemical to be released.
There are several kinds of transmitter.
The transmitter molecules diffuse across the synaptic cleft and bind to receptor molecules in the membrane of the post-synaptic neuron, thus changing their shape. This opens up holes that allow specific ions in or out.
The effectiveness of the synapse can be changed:
  vary the number of vesicles of transmitter
  vary the number of receptor molecules
Synapses are slow, but they have advantages over RAM:
  Very small
  They adapt using locally available signals (but how?)
How the brain works
- Each neuron receives inputs from other neurons; some neurons also connect to receptors.
- Cortical neurons use spikes to communicate, and the timing of spikes is important.
- The effect of each input line on the neuron is controlled by a synaptic weight, which can be positive or negative.
- The synaptic weights adapt so that the whole network learns to perform useful computations: recognizing objects, understanding language, making plans, controlling the body.
- You have about 10^11 neurons, each with about 10^3 weights, so a huge number of weights can affect the computation in a very short time: much better bandwidth than a Pentium.
Modularity and the brain
- Different bits of the cortex do different things: local damage to the brain has specific effects, and specific tasks increase the blood flow to specific regions.
- But cortex looks pretty much the same all over, and early brain damage makes functions relocate.
- So cortex is made of general-purpose stuff that has the ability to turn into special-purpose hardware in response to experience. This gives rapid parallel computation plus flexibility.
- Conventional computers get flexibility by having stored programs, but this requires very fast central processors to perform large computations.
Idealized neurons
- To model things we have to idealize them (e.g. atoms). Idealization removes complicated details that are not essential for understanding the main principles, and allows us to apply mathematics and to make analogies to other, familiar systems.
- Once we understand the basic principles, it's easy to add complexity to make the model more faithful.
- It is often worth understanding models that are known to be wrong (but we mustn't forget that they are wrong!), e.g. neurons that communicate real values rather than discrete spikes of activity.
Linear neurons
These are simple but computationally limited. If we can make them learn, we may get insight into more complicated neurons.

y = b + Σ_i x_i w_i

where y is the output, b is the bias, i is the index over the input connections, x_i is the i-th input, and w_i is the weight on the i-th input.
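The linear neuron above can be sketched in a couple of lines; the particular inputs, weights, and bias below are made-up illustrative values.

```python
def linear_neuron(x, w, b):
    """Linear neuron output: y = b + sum_i x_i * w_i."""
    return b + sum(xi * wi for xi, wi in zip(x, w))

# Two inputs with illustrative weights and bias.
print(linear_neuron(x=[1.0, 2.0], w=[0.5, -0.25], b=0.1))
```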
Binary threshold neurons
McCulloch-Pitts (1943): influenced von Neumann!
- First compute a weighted sum of the inputs from other neurons.
- Then send out a fixed-size spike of activity if the weighted sum exceeds a threshold.
- Maybe each spike is like the truth value of a proposition, and each neuron combines truth values to compute the truth value of another proposition!
z = Σ_i x_i w_i
y = 1 if z ≥ θ (the threshold), 0 otherwise
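A McCulloch-Pitts unit can be sketched in a few lines. Choosing weights [1, 1] and threshold 2 (illustrative values) makes the unit compute logical AND, echoing the propositional reading above.

```python
def binary_threshold(x, w, theta):
    """Emit a fixed-size spike (1) iff the weighted sum reaches the threshold."""
    z = sum(xi * wi for xi, wi in zip(x, w))
    return 1 if z >= theta else 0

# With weights [1, 1] and threshold 2, the unit computes AND.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, binary_threshold([a, b], [1, 1], theta=2))
```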
Linear threshold neurons
These have a confusing name: they compute a linear weighted sum of their inputs, but the output is a non-linear function of the total input.

z = Σ_i x_i w_i
y = z if z > θ (the threshold), 0 otherwise
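A linear threshold (rectified) unit differs from the binary threshold unit only in what it outputs above threshold: the weighted sum itself rather than a fixed-size spike. A sketch, assuming an illustrative threshold of 0:

```python
def linear_threshold(x, w, theta=0.0):
    """Output the weighted sum itself when it exceeds the threshold, else 0."""
    z = sum(xi * wi for xi, wi in zip(x, w))
    return z if z > theta else 0.0

print(linear_threshold([1.0, 1.0], [0.5, 0.5]))   # z above threshold: pass z through
print(linear_threshold([1.0, 1.0], [-0.5, 0.5]))  # z at threshold: clipped to 0
```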
Sigmoid neurons
These give a real-valued output that is a smooth and bounded function of their total input. Typically they use the logistic function

y = 1 / (1 + e^(-z))

which rises from 0 through 0.5 (at z = 0) towards 1. They have nice derivatives, which make learning easy (see lecture 4). If we treat y as the probability of producing a spike, we get stochastic binary neurons.
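A sketch of the logistic unit together with its derivative dy/dz = y(1 - y), the property that makes learning easy; inputs, weights, and bias are illustrative.

```python
import math

def sigmoid_neuron(x, w, b):
    """Logistic output y = 1/(1 + e^(-z)) and its derivative dy/dz = y(1-y)."""
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    y = 1.0 / (1.0 + math.exp(-z))
    return y, y * (1.0 - y)

y, dy_dz = sigmoid_neuron(x=[0.0], w=[1.0], b=0.0)
print(y, dy_dz)   # at z = 0, the output is 0.5 and the slope is maximal (0.25)
```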
Types of connectivity
- Feedforward networks: these compute a series of transformations. Typically, the first layer is the input and the last layer is the output.
- Recurrent networks: these have directed cycles in their connection graph. They can have complicated dynamics, and are more biologically realistic.
[Figure: input units → hidden units → output units.]
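The feedforward case can be sketched as a series of transformations, input to hidden to output, with no cycles. The layer sizes and weight values below are arbitrary illustrative numbers.

```python
import math

def layer(x, W, b):
    """Apply one feedforward layer: logistic(W x + b), with no libraries."""
    return [1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(row, x)) + bi)))
            for row, bi in zip(W, b)]

x = [1.0, 0.0]                                            # input units
h = layer(x, W=[[0.5, -0.5], [1.0, 1.0]], b=[0.0, 0.0])   # hidden units
y = layer(h, W=[[1.0, -1.0]], b=[0.0])                    # output unit
print(h, y)
```

A recurrent network would instead feed some outputs back as inputs on the next time step, which is what gives it its more complicated dynamics.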
Types of learning task
- Supervised learning: learn to predict an output when given an input vector. Who provides the correct answer?
- Reinforcement learning: learn actions to maximize payoff. There is not much information in a payoff signal, and the payoff is often delayed.
- Unsupervised learning: create an internal representation of the input, e.g. form clusters, extract features. How do we know if a representation is good?
[Speaker notes] Two main streams. The course at BMT will in time move more toward side 1. AI involves more: knowledge representation, reasoning, learning.
The numbers for the von Neumann computer change as technology advances (Moore and Shannon).
Insight into how information processing in the brain works is still in its infancy; in any case, there is a layered structure. [Add here a picture of the dolphin head sensor array.] 1,000-100,000 inputs per neuron; Na, K, Cl and Ca channels. Action potentials enable cell-to-cell communication, unlike sub-threshold membrane potentials, which attenuate over small distances.
The soma is the cell body. Hebb's verbal postulate captures the plasticity of the synapses, but it does not capture inhibitory synapses; the mathematical form does. A simple form of Hebb's rule is the product rule F(y, x) = η · y · x.
Synchrony and asynchrony have to do with the firing frequency.