Tool/software:
Hello TI,
We have a .onnx model (HRNet). We used tidl_model_import.out to convert it into two bin files. Below is our import config file:
modelType = 2
numParamBits = 8
numFeatureBits = 8
#quantizationStyle = 3
quantizationStyle = 2
inputNetFile = "../../test/testvecs/sh_model/keypoint/model/mymodel.onnx"
outputNetFile = "../../test/testvecs/sh_model/keypoint/out/tidl_net_mymodel.bin"
outputParamsFile = "../../test/testvecs/sh_model/keypoint/out/tidl_io_mymodel_"
inData = "../../test/testvecs/sh_model/keypoint/in/detection_list.txt"
perfSimConfig = ../../test/testvecs/config/import/device_config.cfg
#perfSimConfig = ../../test/testvecs/sh_model/keypoint/in/device_config.cfg
inDataNorm = 1
inMean = 123.675 116.28 103.53
inScale = 0.017125 0.017507 0.017429
inWidth = 256
inHeight = 256
inNumChannels = 3
inDataFormat = 1
inResizeType = 1
numFrames = 1
inElementType = 0
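
As a side note on the config above: with inDataNorm = 1, TIDL applies the per-channel mean/scale normalization to the input. A minimal sketch of the preprocessing we expect it to perform (assuming the convention out = (in - mean) * scale, applied per channel, which is our reading of inMean/inScale; this is an illustration, not TIDL source):

```python
import numpy as np

# Per-channel values copied from the import config.
MEAN = np.array([123.675, 116.28, 103.53], dtype=np.float32)
SCALE = np.array([0.017125, 0.017507, 0.017429], dtype=np.float32)

def normalize(img_hwc):
    """Normalize an HxWx3 image the way inDataNorm/inMean/inScale describe.

    Assumed convention: out = (in - mean) * scale, per channel.
    """
    return (np.asarray(img_hwc, dtype=np.float32) - MEAN) * SCALE
```

A pixel equal to the channel mean maps to 0; this is handy for sanity-checking that the host-side preprocessing matches what was configured at import time.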
After the import we get the two bin files. We then wrote an application; the following is our graph.
We find that the result from PC emulation is correct, but the result is wrong when we run on the target.
We use the same bin files in both emulation mode and target mode. Our model is HRNet, used for face detection.
So we would like to know: is emulation mode different from target mode when using TIDL?
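
While waiting for an answer, one way we could localize the mismatch is to dump the output (or intermediate) tensors from both the emulation run and the target run and diff them offline. A minimal sketch, assuming both modes can dump raw float32 buffers of the same size and layout (the file paths are hypothetical placeholders):

```python
import numpy as np

def compare_dumps(ref_path, tgt_path):
    """Compare two raw float32 dumps (e.g. emulation vs. target output).

    Returns (max absolute difference, mean absolute difference).
    Assumes both files hold the same number of float32 values in the
    same layout; adjust dtype if the dumps are fixed-point.
    """
    ref = np.fromfile(ref_path, dtype=np.float32)
    tgt = np.fromfile(tgt_path, dtype=np.float32)
    assert ref.size == tgt.size, "dump sizes differ - layouts may not match"
    diff = np.abs(ref - tgt)
    return float(diff.max()), float(diff.mean())
```

Comparing layer by layer (rather than only the final output) would show at which layer the emulation and target results start to diverge.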
Regards,