NumPy - What is broadcasting?

不思量自难忘° 2020-12-03 16:10

I don't understand broadcasting. The documentation explains the rules of broadcasting but doesn't seem to define it in English. My guess is that broadcasting is when NumPy

5 Answers
  •  北海茫月
    2020-12-03 16:52

    1. What is broadcasting? Broadcasting is a tensor operation, widely used in neural networks (ML/AI).

    2. What is broadcasting used for?

    • Without broadcasting, only tensors of identical shape can be added.

    • Broadcasting gives us the flexibility to add two tensors of different dimensions.

      For example: adding a 2D tensor to a 1D tensor is not possible without broadcasting.

    Run the Python example code below to understand the concept:

    import numpy as np

    x = np.array([1, 3, 5, 6, 7, 8])
    y = np.array([2, 4, 5])
    X = x.reshape(2, 3)
    

    x is reshaped into a 2D tensor X of shape (2, 3); adding X to the 1D tensor y of shape (3,) yields a 2D tensor z of shape (2, 3):

    print("X =", X)
    print("\n y =", y)
    z = X + y
    print("X + y =", z)
    

    You are almost right about the smaller tensor: there is no ambiguity, the smaller tensor is broadcast to match the shape of the larger one. (The small vector is conceptually repeated; it is not padded with dummy data or zeros to match the larger shape.)
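    In fact, NumPy does not even copy the repeated rows in memory. A small sketch using `np.broadcast_to` (the variable names here are illustrative) shows that the broadcast result is a read-only view whose row stride is 0, i.e. every row points at the same underlying data:

    ```python
    import numpy as np

    y = np.array([2, 4, 5])
    b = np.broadcast_to(y, (2, 3))  # a view, not a copy

    print(b)          # the row [2 4 5] appears twice
    print(b.strides)  # first stride is 0: both rows reuse the same memory
    ```

    This is why broadcasting is cheap: the "repetition" is virtual, done by stride tricks rather than by allocating a larger array.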

    3. How does broadcasting happen? Broadcasting consists of two steps:

    1. Broadcast axes are added to the smaller tensor to match the ndim of the larger tensor.

    2. The smaller tensor is repeated along these new axes to match the full shape of the larger tensor.
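    The two steps above can be reproduced by hand (a sketch only; `np.repeat` actually materializes the copies that NumPy's real broadcasting avoids):

    ```python
    import numpy as np

    X = np.arange(6).reshape(2, 3)   # larger tensor, shape (2, 3)
    y = np.array([2, 4, 5])          # smaller tensor, shape (3,)

    # Step 1: add a broadcast axis so y's ndim matches X's
    y2 = y[np.newaxis, :]            # shape (1, 3)

    # Step 2: repeat along the new axis to match X's full shape
    y_full = np.repeat(y2, 2, axis=0)  # shape (2, 3)

    # The manual expansion gives the same result as broadcasting
    print((X + y == X + y_full).all())  # True
    ```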

    4. Why is broadcasting not happening in your code? Both of your tensors are 1-dimensional, but their shapes differ, and the broadcasting rules cannot match two 1D tensors of different lengths: the trailing dimensions are neither equal nor 1. Change the dimension (shape) of one of the tensors, and you will see broadcasting happen.
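    A quick sketch of the failure and the fix, using the arrays from the example above: adding shapes (6,) and (3,) raises a ValueError, while reshaping one operand to (2, 3) makes the trailing dimensions compatible so y broadcasts:

    ```python
    import numpy as np

    x = np.array([1, 3, 5, 6, 7, 8])  # shape (6,)
    y = np.array([2, 4, 5])           # shape (3,)

    try:
        x + y                          # shapes (6,) and (3,) are incompatible
    except ValueError as e:
        print("broadcast error:", e)

    # Reshaping x to (2, 3) makes the trailing dimension match y's, so y broadcasts
    z = x.reshape(2, 3) + y
    print(z)                           # [[ 3  7 10] [ 8 11 13]]
    ```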

    5. Going deeper. Broadcasting (the repetition of the smaller tensor) happens along the broadcast axes, but since both of your tensors are 1-dimensional, there is no broadcast axis. Don't confuse a tensor's number of dimensions (ndim) with its shape; tensor dimensions are not the same as matrix dimensions.
