I want to make a simple neural network which uses the ReLU function. Can someone give me a clue about how I can implement the function using numpy?
You can do it in a much easier way:
def ReLU(x):
    # Element-wise ReLU: keep positive values, zero out the rest
    return x * (x > 0)

def dReLU(x):
    # Derivative of ReLU: 1.0 where x > 0, 0.0 elsewhere
    return 1. * (x > 0)
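Both functions work element-wise on numpy arrays, because multiplying by the boolean mask (x > 0) zeroes out the non-positive entries without any explicit loop. A minimal usage sketch (the input values below are just an arbitrary example):

import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])  # arbitrary test input

print(ReLU(x))   # negatives become 0, positives pass through unchanged
print(dReLU(x))  # 0.0 where x <= 0, 1.0 where x > 0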