New ideas for hnn
Thomas Bereknyei
tomberek at gmail.com
Sat Mar 6 02:42:20 EST 2010
Hello,
I've been working on some more ideas. It's a long post, but I'm
trying to explain some of my thoughts. The idea is to be able to
compose transformations on neural networks. There is some monadic
code in Data.Graph.Inductive.Monad, but I haven't figured out how to
use it yet.
In any case, I believe that evaluations of a neural network are
actually comonadic, and perhaps the learning algorithms are too. So
far I have only been able to get the types to work out; I don't fully
comprehend the exact meaning of each (insert category theory term
here). My goal is to hide all of the inner workings so that the end
user can do something like this:
data SimpleNeuron = SimpleNeuron * sum sigmoid
data NN a b = NN (Context a b) (Gr a b)
randomGraph = mkGraph ...blah...blah...
myNN = NN outputContext randomGraph
step :: Inputs -> Context a b -> a
step is c@(i,n,a,o) = (neuronAction a) is i
-- just specifying that the inputs are routed through the input edges
-- and evaluated by a composition of the functions in SimpleNeuron...
-- I'm sure there's a clearer way to do this.
extend step myNN -- and this gives a single timestep advance of
                 -- signals in the neural network
So, an outline (sketched concretely just below):
1. Specify the functions of the edges and nodes.
2. Specify the graph.
3. Specify an evaluation/learning strategy.
4. Apply the strategy.
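In the literate code that follows, those steps line up roughly like
this (a sketch using the definitions further down):

  1. functions of edges and nodes  ->  Dendrite and Axon values
  2. the graph                     ->  t = mkGraph nz ez
  3. a strategy                    ->  e.g. doubleWeights
  4. apply it                      ->  pextend doubleWeights t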
So I figured I would play around with making an NN and making it an
instance of various typeclasses. Here it is in literate Haskell:
>{-# LANGUAGE GADTs,TypeSynonymInstances,MultiParamTypeClasses #-}
>
>module AI.HNN where
>
>import Data.Graph.Inductive.Graph
>import Data.Graph.Inductive.Query.Monad (mapFst,mapSnd)
>import Data.Graph.Inductive.Basic
>import Data.Graph.Inductive.PatriciaTree
>import Control.Comonad
>import Control.Comonad.Parameterized
>import Control.Category.Hask
>import Control.Functor
>
>import Data.Graph.Inductive.Query.DFS (topsort)
>import Text.PrettyPrint.Boxes -- (render,text,(<+>),vcat)
>import Data.List (transpose)
>import Data.Maybe (fromJust)
OK, done with the imports. Now we have the types. These are pretty
simple, but they help us organize. NN was made Edge x Graph to allow
pointing at a node and an edge at the same time; the convention is
that the pointed node is the first node of the edge tuple.
>type Neuron a b = ([(Dendrite a b,Node)]
> ,Node
> ,Axon a b
> ,[(Dendrite a b,Node)])
>
>data Dendrite a b = Dendrite {
> weightFunc :: b -> a -> a
> ,weight :: b
> }
>data Axon a b where
> Axon :: ([a] -> a) -> a -> Axon a b
>
>data NN a b = NN Edge (Gr a b)
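Concretely, a single connection and a single cell body might look
like this; a minimal sketch (exDendrite and exAxon are illustrative
names, and sig is the squashing function defined further down):

>exDendrite :: Dendrite Double Double
>exDendrite = Dendrite (*) 0.5  -- scale the incoming signal by a weight of 0.5
>
>exAxon :: Axon Double Double
>exAxon = Axon (sig . sum) 0.0  -- sum the inputs and squash; current activity 0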
This simply makes NN an instance of Graph, where each operation
ignores the Edge component.
>instance Graph NN where
> empty = NN undefined empty --TODO careful: no pointed edge yet!
> isEmpty (NN _ net) = isEmpty net
> match n (NN edge net) = (mc, NN edge g)
>  where (mc,g) = match n net
> mkGraph ns es = NN (firstTwo $ head es) (mkGraph ns es) --point at the first edge
>  where firstTwo (f,s,_) = (f,s)
> labNodes (NN _ net) = labNodes net
> nodeRange (NN _ net) = nodeRange net
>instance DynGraph NN where
> c & (NN edge net) = NN edge (c & net)
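As a quick sanity check that the generic FGL machinery now works on
NN directly, here is a minimal sketch (nodeCount is an illustrative
name):

>nodeCount :: NN a b -> Int
>nodeCount = length . nodes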
A utility function that lists every context in a graph.
>contexts :: Graph gr => gr a b -> [Context a b]
>contexts = gsel (const True)
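For example, pulling something out of every node is then one map
away; a minimal sketch (nodeLabels' is an illustrative name):

>nodeLabels' :: Graph gr => gr a b -> [a]
>nodeLabels' = map (\(_,_,a,_) -> a) . contexts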
Some instances defining (NN a) as a comonad. This only works over the
edge type.
>instance Functor (NN a) where
> fmap = liftW
>instance Copointed (NN a) where
> extract (NN edge net) = result where
>  (_,_,result) = head $ filter (\(i,o,_) -> edge==(i,o)) (labEdges net)
>instance Comonad (NN a) where --extend maps over every edge
> extend f (NN edge net) = NN edge (gmap convert net) where
>  convert c@(i,n,a,o) = (edgesConv edgeConv i, n, a, edgesConv edgeConv' o)
>   where
>   edgeConv' = \x y -> ((f . flip NN net) (x,y), y)
>   edgeConv  = \x y -> ((f . flip NN net) (y,x), y)
>   edgesConv ec = map (ec n . snd)
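With those instances in hand, transforming every edge label is a
single extend. A minimal sketch, assuming a network whose edge labels
are numeric (scaleEdges is an illustrative name):

>scaleEdges :: Num b => NN a b -> NN a b
>scaleEdges = extend ((*2) . extract)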
Instances to make NN a parameterized comonad. These work on the nodes.
>instance PFunctor NN Hask Hask where
> first = nmap
>instance QFunctor NN Hask Hask where
> second = emap
>instance Bifunctor NN Hask Hask Hask where
> bimap r s = gmap fc
>  where fc c@(i,n,a,o) = (edgeMap i, n, r a, edgeMap o)
>        edgeMap = map (mapFst s)
>instance PCopointed NN where
> pextract (NN edge net) = fromJust $ lab net (fst edge)
>instance PComonad NN where
> pextend f nn@(NN edge net) = NN edge (gmap convert net) where
>  convert c@(i,n,a,o) = (i, n, f $ NN (n, snd $ head o) net, o)
Here is an example neural network. While working on doubleWeights I
started to think that a better choice for NN would be something like
NN = Context x Graph.
>doubleWeights :: Num a => PseudounipolarNN a b -> Axon a b
>doubleWeights (NN edge net) = Axon f (2*w)
> where Axon f w = fromJust $ lab net (fst edge)
>
>type PseudounipolarNN a b = NN (Axon a b) (Dendrite a b)
>
>t = mkGraph nz ez :: PseudounipolarNN Double Double
>sig :: Double -> Double
>sig a = 1.0 / (1 + exp (-a)) - 0.5
>
>nz = zip ns nns
> where ns  = [1..23]
>       nns = map (Axon $ sig.sum) is
>       is  = [-1,-0.9..0] ++ repeat 0
>ez = let
>  a2layer = [(x,y) | x<-[1..10], y<-[11..20]
>                   , or [x+10==y, x*2==y, x*3==y] ]
>  a3layer = a2layer ++ [(x,y) | x<-[11..20], y<-[21,22]
>                              , or [ and [y==22, x>15], and [y==21, x<=15] ] ]
>  a4layer = a3layer ++ [(21,23),(22,23)]
>  zipit (x,y) z = (x,y,z)
>  ss = zipWith Dendrite (repeat (*)) [-2,-1.8..]
> in
>  zipWith zipit a4layer ss
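Putting the pieces together, "apply strategy" from the outline is now
one call to pextend, which runs doubleWeights at every node and
stores each result back as that node's label. A minimal sketch
(doubledT is an illustrative name):

>doubledT :: PseudounipolarNN Double Double
>doubledT = pextend doubleWeights t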
The code to print an NN.
>instance Show b => Show (Dendrite a b)
> where show s = take 4 $ show $ weight s
>
>instance Show a => Show (Axon a b)
> where show (Axon _ activity) = take 4 $ show activity
>
>instance (Show a, Show b) => Show (NN a b) where
> show (NN edge net)
>   | isEmpty net = "empty"
>   | otherwise   = render $ hcat top $ map (vcat left) cList
>   where
>    cList = transpose $ map (showC . context net) (topsort net)
>    showC (i,n,a,o) = (case n of
>                         n | n == fst edge -> [text "*"]
>                           | n == snd edge -> [text "+"]
>                           | otherwise     -> [text " "])
>                      ++ [alignHoriz right 5 $ text (show n ++ ": ")
>                         ,alignHoriz right 4 $ text $ show a
>                         ,text " --> "
>                         ,text $ show es
>                         ,text " with weights: "
>                         ,text $ show ws]
>     where (ws,es) = unzip o
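Loading this file in ghci and evaluating t renders the network as one
column per node in topological order, with the source of the pointed
edge marked "*" and its target marked "+".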