Haskell Neural Network

Alp Mestanogullari alpmestan at gmail.com
Sat Jan 14 10:03:53 GMT 2012


Hey Thomas!

Hope you're doing well.

Kiet actually has a working feed-forward implementation:
https://github.com/ktklam9/HaskellNN
I think I'll just try to bring something to that project.
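
For concreteness, here is roughly the kind of vector/matrix feed-forward
representation I keep mentioning below, sketched with hmatrix. This is just an
illustrative sketch under made-up names, not HNN's or Kiet's actual API:

import Numeric.LinearAlgebra

-- One layer = a weight matrix plus a bias vector.
data Layer = Layer { weights :: Matrix Double
                   , biases  :: Vector Double
                   }

-- A feed-forward network is just a list of layers.
type Network = [Layer]

-- Logistic activation, applied element-wise.
sigmoid :: Vector Double -> Vector Double
sigmoid = cmap (\x -> 1 / (1 + exp (-x)))

-- One layer of propagation: activation of (W x + b).
forwardLayer :: Layer -> Vector Double -> Vector Double
forwardLayer (Layer w b) x = sigmoid (w #> x + b)

-- Run an input through the whole network, layer by layer.
forward :: Network -> Vector Double -> Vector Double
forward net x = foldl (flip forwardLayer) x net

The point is that the whole network then lives in a few contiguous arrays,
which is where the vector/hmatrix performance tricks can actually pay off.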

On Sat, Jan 14, 2012 at 8:02 AM, Thomas Bereknyei <tomberek at gmail.com> wrote:

> As Alp said, the idea was to be general, then specialize and optimize.
> So of course we'd have a representation that took advantage of matrix
> optimizations. I always pushed to have the general graph background
> because I have an implementation in mind that requires it. It also
> allows a simple conceptual understanding: build the graph, then run it
> as a neural network. It subsumes any discussion of hidden nodes,
> layers, etc.
>
> Time as always is an issue, though I'd be willing to pull time from
> some other projects if there were some fresh ideas or some progress to
> be made.
>
> I've also learned a lot since then, GHC has gotten better, etc.
>
> -Tom
>
> On Fri, Jan 13, 2012 at 14:28, Alp Mestanogullari <alpmestan at gmail.com>
> wrote:
> > Hi,
> >
> > Actually, the project has been pretty dead for a while. I have been quite
> > busy, and I guess the others have been too.
> > Last time we talked about the project, we wanted to go in two directions:
> >
> > - I've always found it pretty reasonable to have a special
> > representation (using vectors and matrices) for feed-forward neural
> > networks, so I have wanted (and still want) to rewrite HNN some day
> > using the 'vector' package, probably together with hmatrix, using all
> > the performance tricks I have learned since I started Haskell.
> >
> > - There were a few other people interested in writing a more general,
> > graph-based neural networks framework, so that they would be able to
> > experiment with much more complicated models (like real-time feedback
> > for general neural networks, etc.).
> >
> > I may actually have some time for rewriting a good and viable
> > feed-forward implementation, but definitely not for the latter.
> >
> > Would you be interested in contributing in some way?
> >
> > On Fri, Jan 13, 2012 at 7:35 PM, Kiet Lam <ktklam9 at gmail.com> wrote:
> >>
> >> Hey, I don't know how active the HNN project is, but after looking at
> >> your code draft, I have a few suggestions.
> >>
> >> You guys are making it way too hard on yourselves by going to a graph
> >> representation.
> >> This will no doubt have a performance impact, and it has already
> >> introduced (and will keep introducing) needless complexity to your code.
> >>
> >> From the Wiki page of HNN, it seems like you are interested in a
> >> feed-forward neural network. From this,
> >> I suggest that you take a step back and go to a vector/matrix
> >> representation.
> >>
> >> This will reduce a lot of the complexity in your code.
> >>
> >> Moreover, many of the advanced training algorithms (Conjugate Gradient,
> >> BFGS, L-BFGS) require the weight parameters to be represented in
> >> vector form. If the HNN project continues to use a graph
> >> representation, it would be costly to convert back and forth between
> >> graph and vector to use the advanced training algorithms.
> >>
> >> I understand that you guys want to make the most general representation
> >> possible so that the project would be scalable, but I think it would
> >> behoove you to re-evaluate your representation of the neural network.
> >>
> >> Thanks.
> >>
> >> Kiet Lam
> >>
> >
> >
> >
> > --
> > Alp Mestanogullari
> > http://alpmestan.wordpress.com/
> > http://alp.developpez.com/
> >
>
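
And regarding Kiet's point about CG/BFGS/L-BFGS wanting the weights as one
flat vector: with a matrix-based representation that conversion is cheap.
Another purely illustrative sketch, reusing the made-up Layer type from above:

import Numeric.LinearAlgebra

-- Same illustrative Layer type as in the sketch above.
data Layer = Layer { weights :: Matrix Double
                   , biases  :: Vector Double
                   }

-- Pack every layer's (row-major) weight matrix and bias vector into the
-- single flat vector that CG/BFGS/L-BFGS routines work on.
flattenParams :: [Layer] -> Vector Double
flattenParams layers =
  vjoin [ vjoin [flatten (weights l), biases l] | l <- layers ]

-- Rebuild the layers from a flat vector, given each layer's
-- (rows, cols) shape recorded beforehand.
unflattenParams :: [(Int, Int)] -> Vector Double -> [Layer]
unflattenParams [] _ = []
unflattenParams ((r, c) : shapes) v =
  let nW   = r * c
      w    = subVector 0 nW v
      b    = subVector nW r v
      rest = subVector (nW + r) (size v - nW - r) v
  in Layer (reshape c w) b : unflattenParams shapes rest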



-- 
Alp Mestanogullari
http://alpmestan.wordpress.com/
http://alp.developpez.com/

