## Neural Computation

Let *N* be the class of functions realizable by feedforward linear threshold nets with *n* input units, two hidden units each of zero threshold, and an output unit. This class is essentially equivalent to the class of intersections of two open half-spaces whose bounding hyperplanes pass through the origin. We give an algorithm that probably approximately correctly (PAC) learns this class from examples and membership queries. The algorithm runs in time polynomial in *n*, 1/∊ (where ∊ is the accuracy parameter), and 1/δ (where δ is the confidence parameter). If only examples are allowed, but not membership queries, we give an algorithm that learns *N* in polynomial time provided that the probability distribution *D* from which examples are drawn satisfies *D*(*x*) = *D*(−*x*) ∀*x*. The algorithm yields a hypothesis net with two hidden units, one a linear threshold unit and the other a quadratic threshold unit.
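As a concrete illustration of the target class *N*, the following sketch (with arbitrary, illustrative weight vectors — not drawn from the paper) shows a net with two zero-threshold linear threshold hidden units whose output is their conjunction, i.e. membership in the intersection of two open half-spaces bounded by hyperplanes through the origin:

```python
import numpy as np

def two_halfspace_net(w1, w2):
    """A function in the class N: two zero-threshold linear threshold
    hidden units with weight vectors w1, w2, and an output unit that
    fires iff both hidden units fire. Equivalently, f(x) = 1 iff x lies
    in the intersection of the open half-spaces {x : w1.x > 0} and
    {x : w2.x > 0}, both bounded by hyperplanes through the origin."""
    def f(x):
        h1 = np.dot(w1, x) > 0  # first hidden unit (zero threshold)
        h2 = np.dot(w2, x) > 0  # second hidden unit (zero threshold)
        return int(h1 and h2)   # output unit: AND of the hidden units
    return f

# Illustrative example in n = 2 dimensions: the positive quadrant.
f = two_halfspace_net(np.array([1.0, 0.0]), np.array([0.0, 1.0]))
print(f(np.array([1.0, 1.0])))   # inside both half-spaces -> 1
print(f(np.array([1.0, -1.0])))  # outside the second half-space -> 0
```

Note that because both thresholds are zero, the class is closed under the sign flip *x* ↦ −*x* in the sense that *f*(−*x*) tests the intersection of the opposite half-spaces, which is what makes the symmetry condition *D*(*x*) = *D*(−*x*) natural for the examples-only algorithm.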