[SciPy-user] Academic Question?
conor.robinson at gmail.com
Tue Aug 29 11:34:20 CDT 2006
I have both of those resources; they are good, but I'm really looking
for a study comparing encoding schemes. I've been combing the
University of California libraries, but most studies either give a
"hand wave" gesture or don't mention how they encode at all. I've
developed a basic technique for compressing 1ofC, but I would like to
see what others have done. In Bishop, around p. 230, he notes you can
use cross-entropy with sigmoid outputs for probabilities.
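For concreteness, here is a minimal NumPy sketch of 1ofC ("one-hot") encoding and of the cross-entropy error for a single sigmoid output unit, roughly as in Bishop (1995); the labels and class count are made up for illustration:

```python
import numpy as np

# 1ofC encoding: each of C classes becomes a length-C indicator vector.
labels = np.array([0, 2, 1, 2])  # hypothetical class labels
C = 3
one_hot = np.eye(C)[labels]      # shape (4, 3); row i is the 1ofC code of labels[i]

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def cross_entropy(y, t, eps=1e-12):
    """Cross-entropy for a sigmoid output y against target t in {0, 1}."""
    y = np.clip(y, eps, 1 - eps)  # guard against log(0)
    return -np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))
```

With zero activation the sigmoid gives 0.5, so the cross-entropy against a target of 1 is -log(0.5), as expected.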
On 8/28/06, Aarre Laakso <aarre at pair.com> wrote:
> Conor Robinson wrote:
> > If one is using 1ofC encoding, what does PCA
> > applied to 1ofC result in? 1ofC is nice if you're trying to get the
> > posterior probability distribution as output (sigmoid single-output-unit
> > feedforward network),
> > but would this still hold true after applying PCA to reduce input
> > dimensionality? Does this even make "sense" for 1ofC? Furthermore,
> > is there any solid literature reviewing encoding schemes for nnets,
> > etc.?
> From what I've been told, it doesn't make sense to apply PCA to
> categorical variables, although it is a common practice.
> If you want the posterior probability distribution at the outputs, then
> I believe you want to use softmax.
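The softmax suggested above can be sketched in a few lines of NumPy; the input activations are hypothetical:

```python
import numpy as np

def softmax(a):
    """Map C output activations to a posterior distribution over C classes."""
    a = a - np.max(a)  # subtract the max for numerical stability
    e = np.exp(a)
    return e / e.sum()

# Example: three hypothetical output-unit activations.
p = softmax(np.array([2.0, 1.0, 0.1]))
# p is non-negative and sums to 1, so it can be read as class posteriors.
```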
> The neural nets FAQ http://www.faqs.org/faqs/ai-faq/neural-nets/ has
> some practical advice about encoding, with citations to the literature
> (part 2, "How should categories be encoded?"), as well as some
> recommendations on the literature in general (part 4). If you haven't
> read Bishop (1995), I highly recommend it.
> Aarre Laakso