Custom layer with two parameters function on Core ML

Submitted by 倖福魔咒の on 2021-02-19 07:33:09

Question


Thanks to this great article (http://machinethink.net/blog/coreml-custom-layers/), I understood how to write a conversion using coremltools for a Keras custom layer built with Lambda. But I don't understand how to handle a function that takes two parameters.

#python
def scaling(x, scale):
    return x * scale

The Keras layer looks like this:

#python
up = conv2d_bn(mixed,
                   K.int_shape(x)[channel_axis],
                   1,
                   activation=None,
                   use_bias=True,
                   name=name_fmt('Conv2d_1x1'))
x = Lambda(scaling, # HERE !!
           output_shape=K.int_shape(up)[1:],
           arguments={'scale': scale})(up)
x = add([x, up])
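The key is the `arguments={'scale': scale}` entry: Keras forwards that dictionary to the wrapped function as keyword arguments, so `scale` arrives as an ordinary keyword parameter. A minimal illustration in plain Python (no Keras required):

```python
def scaling(x, scale):
    return x * scale

# Keras effectively calls: scaling(input_tensor, **arguments)
args = {'scale': 3.0}
print(scaling(2.0, **args))  # -> 6.0
```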

In this situation, how should I write func evaluate(inputs: [MLMultiArray], outputs: [MLMultiArray]) in a custom MLCustomLayer class in Swift? I only understand the one-parameter case, like this:

#swift 
func evaluate(inputs: [MLMultiArray], outputs: [MLMultiArray]) throws {
  for i in 0..<inputs.count {
    let input = inputs[i]
    let output = outputs[i]

    for j in 0..<input.count {
      let x = input[j].floatValue
      let y = x / (1 + exp(-x))
      output[j] = NSNumber(value: y)
    }
  }  
}

How do I handle a two-parameter function, like x * scale?

The full code is here:

  • Converting to Core ML model with custom layer https://github.com/osmszk/dla_team14/blob/master/facenet/coreml/CoremlTest.ipynb
  • Network model by Keras https://github.com/osmszk/dla_team14/blob/master/facenet/code/facenet_keras_v2.py

Thank you.


Answer 1:


It looks like scale is a hyperparameter, not a learnable parameter. Is that correct?

In that case, you need to add scale to the parameters dictionary for the custom layer. Then in your Swift class, scale will also be inside the parameters dictionary that is passed into your init(parameters) function. Store it in a property, and read that property again in evaluate(inputs, outputs).

My blog post actually shows how to do this. ;-)




Answer 2:


I solved the problem this way, thanks to hollance's blog. In the conversion function (convert_lambda in this case), I needed to add the scale parameter to the custom layer's parameters.

Python code (conversion to Core ML):

import coremltools
from coremltools.proto import NeuralNetwork_pb2

def convert_lambda(layer):
    if layer.function == scaling:
        params = NeuralNetwork_pb2.CustomLayerParams()

        params.className = "scaling"
        params.description = "scaling input"

        # HERE!! This is important.
        params.parameters["scale"].doubleValue = layer.arguments['scale']

        return params
    else:
        return None

coreml_model = coremltools.converters.keras.convert(
    model,
    input_names="image",
    image_input_names="image",
    output_names="output",
    add_custom_layers=True,
    custom_conversion_functions={ "Lambda": convert_lambda })
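To see the dispatch logic in isolation, here is a framework-free sketch of the conversion function. The `FakeLayer` class and the plain-dict `params` are illustrative stand-ins, not coremltools API; the real converter builds a `NeuralNetwork_pb2.CustomLayerParams()` protobuf message instead:

```python
def scaling(x, scale):
    return x * scale

class FakeLayer:
    # Stand-in for a Keras Lambda layer: carries the wrapped
    # function and its extra arguments.
    def __init__(self, function, arguments):
        self.function = function
        self.arguments = arguments

def convert_lambda_sketch(layer):
    if layer.function == scaling:
        # In the real converter this is a CustomLayerParams protobuf;
        # a dict is used here so the sketch runs anywhere.
        return {
            "className": "scaling",
            "description": "scaling input",
            "parameters": {"scale": layer.arguments["scale"]},
        }
    return None

layer = FakeLayer(scaling, {"scale": 0.17})
params = convert_lambda_sketch(layer)
print(params["parameters"]["scale"])  # -> 0.17
```

Only layers whose wrapped function is `scaling` get parameters attached; any other Lambda falls through and returns None, just as in the real converter above.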

Swift code (custom layer):

//custom MLCustomLayer `scaling` class
let scale: Float

required init(parameters: [String : Any]) throws {
    // The value stored via doubleValue on the Python side arrives
    // boxed as an NSNumber, so read it through NSNumber rather than
    // casting directly to Float (which can fail and fall back to 1.0).
    if let scale = parameters["scale"] as? NSNumber {
        self.scale = scale.floatValue
    } else {
        self.scale = 1.0
    }
    print(#function, parameters, self.scale)
    super.init()
}

func evaluate(inputs: [MLMultiArray], outputs: [MLMultiArray]) throws {

    for i in 0..<inputs.count {
        let input = inputs[i]
        let output = outputs[i]

        for j in 0..<input.count {
            let x = input[j].floatValue
            let y = x * self.scale
            output[j] = NSNumber(value: y)
        }
        //faster alternative using vDSP (requires import Accelerate)
        /*
        let count = input.count
        let inputPointer = UnsafeMutablePointer<Float>(OpaquePointer(input.dataPointer))
        let outputPointer = UnsafeMutablePointer<Float>(OpaquePointer(output.dataPointer))
        var scale = self.scale
        vDSP_vsmul(inputPointer, 1, &scale, outputPointer, 1, vDSP_Length(count))
        */
    }
}
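For reference, the per-element work that the Swift evaluate method performs can be sketched in Python, treating each MLMultiArray as a flat list of floats (the list-of-lists representation is an illustrative stand-in):

```python
def evaluate(inputs, outputs, scale):
    # Mirror of the Swift loop: multiply every element of each
    # input array by the stored scale and write it to the output.
    for inp, out in zip(inputs, outputs):
        for j, x in enumerate(inp):
            out[j] = x * scale

inputs = [[1.0, 2.0, 3.0]]
outputs = [[0.0, 0.0, 0.0]]
evaluate(inputs, outputs, scale=2.0)
print(outputs)  # -> [[2.0, 4.0, 6.0]]
```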

Thank you.



Source: https://stackoverflow.com/questions/47987777/custom-layer-with-two-parameters-function-on-core-ml
