TensorFlow version - 1.14.0
Barracuda version - 1.0.0
ML-Agents version - 1.0.2
I am trying to convert a custom-trained TensorFlow model (based on ssd_mobilenet_v1_coco) to the .nn format so I can use it in Unity. I ran tensorflow_to_barracuda.py on both the frozen_inference_graph.pb and the saved_model.pb, and these are the errors I got.
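For reference, I invoked the converter roughly like this in both cases (output file name is just a placeholder for my actual path):

python barracuda-release-release-1.0.0/Tools/tensorflow_to_barracuda.py saved_model.pb model.nn
python barracuda-release-release-1.0.0/Tools/tensorflow_to_barracuda.py frozen_inference_graph.pb model.nn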
When converting saved_model.pb
Traceback (most recent call last):
  File "barracuda-release-release-1.0.0/Tools/tensorflow_to_barracuda.py", line 21, in <module>
    tf2bc.convert(args.source_file, args.target_file, args.trim_unused_by_output, args)
  File "C:\Users\Devi\devi\tensorflow\models\research\object_detection\inference_graph-cpu-1.14\barracuda-release-release-1.0.0\Tools\tensorflow_to_barracuda.py", line 1351, in convert
    i_model.ParseFromString(f.read())
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\message.py", line 199, in ParseFromString
    return self.MergeFromString(serialized)
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\python_message.py", line 1134, in MergeFromString
    if self._InternalParse(serialized, 0, length) != length:
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\python_message.py", line 1201, in InternalParse
    pos = field_decoder(buffer, new_pos, end, self, field_dict)
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\decoder.py", line 738, in DecodeField
    if value._InternalParse(buffer, pos, new_pos) != new_pos:
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\python_message.py", line 1201, in InternalParse
    pos = field_decoder(buffer, new_pos, end, self, field_dict)
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\decoder.py", line 717, in DecodeRepeatedField
    if value.add()._InternalParse(buffer, pos, new_pos) != new_pos:
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\python_message.py", line 1201, in InternalParse
    pos = field_decoder(buffer, new_pos, end, self, field_dict)
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\decoder.py", line 872, in DecodeMap
    if submsg._InternalParse(buffer, pos, new_pos) != new_pos:
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\python_message.py", line 1188, in InternalParse
    buffer, new_pos, wire_type)  # pylint: disable=protected-access
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\decoder.py", line 973, in _DecodeUnknownField
    (data, pos) = _DecodeUnknownFieldSet(buffer, pos)
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\decoder.py", line 952, in _DecodeUnknownFieldSet
    (data, pos) = _DecodeUnknownField(buffer, pos, wire_type)
  File "C:\Users\Devi\.conda\envs\tensorflow1\lib\site-packages\google\protobuf\internal\decoder.py", line 977, in _DecodeUnknownField
    raise _DecodeError('Wrong wire type in tag.')
google.protobuf.message.DecodeError: Wrong wire type in tag.
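From reading the converter source, convert() seems to parse the input file directly as a GraphDef, and saved_model.pb is a SavedModel protobuf rather than a GraphDef, which I assume is why the parse fails. In case it is relevant, this is the kind of thing I was planning to try to turn the SavedModel into a plain frozen GraphDef first. This is only a rough sketch, and the output node names are my guesses for an Object Detection API SSD export, so they may be wrong:

# Rough sketch (TF 1.14): load the SavedModel and freeze it into a single GraphDef.
# NOTE: the output node names below are guesses for an Object Detection API SSD
# export and need to be checked against the actual graph.
import tensorflow as tf

export_dir = "saved_model"  # folder that contains saved_model.pb
output_nodes = ["detection_boxes", "detection_scores",
                "detection_classes", "num_detections"]

with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), output_nodes)

tf.io.write_graph(frozen_graph_def, ".", "frozen_from_saved_model.pb", as_text=False)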
When converting frozen_inference_graph.pb
Sorting model, may take a while… Done!
IGNORED: Cast unknown layer
IGNORED: Shape unknown layer
IGNORED: TensorArrayV3 unknown layer
IGNORED: Shape unknown layer
IGNORED: Range unknown layer
IGNORED: TensorArrayScatterV3 unknown layer
IGNORED: TensorArrayV3 unknown layer
IGNORED: TensorArrayV3 unknown layer
IGNORED: Enter unknown layer
IGNORED: Enter unknown layer
IGNORED: Enter unknown layer
IGNORED: Enter unknown layer
WARNING: rank unknown for tensor Preprocessor/map/while/Switch:1 while processing node Preprocessor/map/while/Identity
Traceback (most recent call last):
  File "barracuda-release-release-1.0.0/Tools/tensorflow_to_barracuda.py", line 21, in <module>
    tf2bc.convert(args.source_file, args.target_file, args.trim_unused_by_output, args)
  File "C:\Users\Devi\devi\tensorflow\models\research\object_detection\inference_graph-cpu-1.14\barracuda-release-release-1.0.0\Tools\tensorflow_to_barracuda.py", line 1364, in convert
    process_model(i_model, args)
  File "C:\Users\Devi\devi\tensorflow\models\research\object_detection\inference_graph-cpu-1.14\barracuda-release-release-1.0.0\Tools\tensorflow_to_barracuda.py", line 1226, in process_model
    process_layer(node, o_context, args)
  File "C:\Users\Devi\devi\tensorflow\models\research\object_detection\inference_graph-cpu-1.14\barracuda-release-release-1.0.0\Tools\tensorflow_to_barracuda.py", line 1095, in process_layer
    assert(-1 not in input_ranks)  # for rank() lambda all input ranks have to be known (not -1)
AssertionError
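Judging by the IGNORED lines, the converter seems to stumble over the Preprocessor's while-loop / TensorArray ops rather than over the SSD network itself. One idea I have been considering (I am not sure it is the right approach) is cutting the graph down to the core network before conversion, roughly like the sketch below. The destination node names are placeholders I would still have to look up in the real graph, e.g. with Netron:

# Rough sketch: keep only the core SSD subgraph and drop the Preprocessor /
# Postprocessor control-flow ops that the converter reports as unknown layers.
# NOTE: "concat" / "concat_1" are placeholder node names for the raw box and
# class outputs; the real names must be taken from the actual frozen graph.
import tensorflow as tf
from tensorflow.python.framework import graph_util

graph_def = tf.GraphDef()
with tf.gfile.GFile("frozen_inference_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

core_graph = graph_util.extract_sub_graph(graph_def, ["concat", "concat_1"])
tf.io.write_graph(core_graph, ".", "ssd_core.pb", as_text=False)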
I have searched for a solution but couldn't find one. Please help me. @xiaomaogy @Mantas-Puida