| Project Name | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Scs4onnx | 43 | | 1 | | 7 months ago | 19 | May 25, 2022 | | mit | Python | A very simple tool that compresses the overall size of the ONNX model by aggregating duplicate constant values as much as possible. |
| Onnx2json | 18 | | | | 2 months ago | | | | mit | Python | Exports the ONNX file to a JSON file and JSON dict. |
| Mlflow Redisai | 18 | | | | 2 years ago | 4 | | | apache-2.0 | Python | RedisAI integration for MLflow. |
| Json2onnx | 13 | | | | 7 months ago | | | | mit | Python | Converts a JSON file to an ONNX file. |
| Scc4onnx | 13 | | 1 | | 7 months ago | 5 | May 25, 2022 | | mit | Python | Very simple NCHW/NHWC conversion tool for ONNX. Changes each input OP to the specified input order and also converts the channel order between RGB and BGR. Simple Channel Converter for ONNX. |
| Sne4onnx | 11 | | | | 3 months ago | | | | mit | Python | A very simple tool for situations where optimization with onnx-simplifier would exceed the Protocol Buffers upper file size limit of 2 GB, or simply to split ONNX files into any size you want. |
| Snc4onnx | 8 | | | | a month ago | | | | mit | Python | Simple tool to combine (merge) ONNX models. Simple Network Combine Tool for ONNX. |
| Onnx Opcounter | 8 | | | | 2 years ago | | | | apache-2.0 | Python | Counts the number of parameters / MACs / FLOPs for ONNX models. |
| Sio4onnx | 7 | | | | 7 months ago | | | | mit | Python | Simple tool to change the INPUT and OUTPUT shapes of an ONNX model. |
| Snd4onnx | 6 | | | | 7 months ago | | | | mit | Python | Simple node deletion tool for ONNX. |
A very simple tool for situations where optimization with onnx-simplifier would exceed the Protocol Buffers upper file size limit of 2 GB, or simply to split ONNX files into any size you want. Simple Network Extraction for ONNX.

Part of PINTO0309/simple-onnx-processing-tools.
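The 2 GB figure above is the Protocol Buffers serialization limit that ONNX models are subject to. As a quick orientation check (not part of sne4onnx; the file name is illustrative), the on-disk size of a self-contained .onnx file tells you how close a model is to that limit:

```python
import os

import onnx

MODEL_PATH = 'input.onnx'  # illustrative path, not a file shipped with sne4onnx

# The 2 GB limit applies to the serialized protobuf message, so the on-disk
# size of a self-contained .onnx file is a good first approximation.
size_gib = os.path.getsize(MODEL_PATH) / (1 << 30)
print(f'{MODEL_PATH}: {size_gib:.2f} GiB (protobuf limit is 2 GiB)')

# onnx.load() fails on models whose serialized protobuf exceeds the limit
# unless the weights are stored as external data.
model = onnx.load(MODEL_PATH)
print('graph nodes:', len(model.graph.node))
```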
I do not use `onnx.utils.extractor.extract_model` because it is very slow, and I implement my own model separation logic instead.
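For comparison, the stock `onnx.utils.extract_model` helper in the onnx package covers the same extraction use case. A minimal sketch of it, using the same placeholder file and OP names as the examples below:

```python
import onnx.utils

# Standard ONNX model extraction; sne4onnx avoids this path because it is
# slow on large graphs. File and OP names are the placeholders used below.
onnx.utils.extract_model(
    'input.onnx',           # source model
    'extracted.onnx',       # destination model (the sne4onnx CLI default name)
    ['aaa', 'bbb', 'ccc'],  # names that become the new graph inputs
    ['ddd', 'eee', 'fff'],  # names that become the new graph outputs
)
```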
```bash
### option
$ echo export PATH="~/.local/bin:$PATH" >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U sne4onnx
```
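To confirm that the install worked (assuming the package landed on the interpreter you intend to use), importing the entry point used throughout this README is enough:

```bash
$ python3 -c "from sne4onnx import extraction; print('sne4onnx is importable')"
```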
```
$ sne4onnx -h

usage:
    sne4onnx [-h]
    -if INPUT_ONNX_FILE_PATH
    -ion INPUT_OP_NAMES
    -oon OUTPUT_OP_NAMES
    [-of OUTPUT_ONNX_FILE_PATH]
    [-n]

optional arguments:
  -h, --help
        show this help message and exit

  -if INPUT_ONNX_FILE_PATH, --input_onnx_file_path INPUT_ONNX_FILE_PATH
        Input onnx file path.

  -ion INPUT_OP_NAMES [INPUT_OP_NAMES ...], --input_op_names INPUT_OP_NAMES [INPUT_OP_NAMES ...]
        List of OP names to specify for the input layer of the model.
        e.g. --input_op_names aaa bbb ccc

  -oon OUTPUT_OP_NAMES [OUTPUT_OP_NAMES ...], --output_op_names OUTPUT_OP_NAMES [OUTPUT_OP_NAMES ...]
        List of OP names to specify for the output layer of the model.
        e.g. --output_op_names ddd eee fff

  -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
        Output onnx file path. If not specified, extracted.onnx is output.

  -n, --non_verbose
        Do not show all information logs. Only error logs are displayed.
```
```
$ python
>>> from sne4onnx import extraction
>>> help(extraction)

Help on function extraction in module sne4onnx.onnx_network_extraction:

extraction(
    input_op_names: List[str],
    output_op_names: List[str],
    input_onnx_file_path: Union[str, NoneType] = '',
    onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None,
    output_onnx_file_path: Union[str, NoneType] = '',
    non_verbose: Optional[bool] = False
) -> onnx.onnx_ml_pb2.ModelProto

    Parameters
    ----------
    input_op_names: List[str]
        List of OP names to specify for the input layer of the model.
        e.g. ['aaa','bbb','ccc']

    output_op_names: List[str]
        List of OP names to specify for the output layer of the model.
        e.g. ['ddd','eee','fff']

    input_onnx_file_path: Optional[str]
        Input onnx file path.
        Either input_onnx_file_path or onnx_graph must be specified.
        If onnx_graph is specified, input_onnx_file_path is ignored and onnx_graph is processed.

    onnx_graph: Optional[onnx.ModelProto]
        onnx.ModelProto.
        Either input_onnx_file_path or onnx_graph must be specified.
        If onnx_graph is specified, input_onnx_file_path is ignored and onnx_graph is processed.

    output_onnx_file_path: Optional[str]
        Output onnx file path.
        If not specified, no ONNX file is written out.
        Default: ''

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    extracted_graph: onnx.ModelProto
        Extracted onnx ModelProto
```
CLI execution:

```bash
$ sne4onnx \
--input_onnx_file_path input.onnx \
--input_op_names aaa bbb ccc \
--output_op_names ddd eee fff \
--output_onnx_file_path output.onnx
```
In-script execution (ONNX file input):

```python
from sne4onnx import extraction

extracted_graph = extraction(
    input_op_names=['aaa', 'bbb', 'ccc'],
    output_op_names=['ddd', 'eee', 'fff'],
    input_onnx_file_path='input.onnx',
    output_onnx_file_path='output.onnx',
)
```
In-script execution (onnx.ModelProto input):

```python
import onnx
from sne4onnx import extraction

# Load (or otherwise obtain) the onnx.ModelProto to be processed.
graph = onnx.load('input.onnx')

extracted_graph = extraction(
    input_op_names=['aaa', 'bbb', 'ccc'],
    output_op_names=['ddd', 'eee', 'fff'],
    onnx_graph=graph,
    output_onnx_file_path='output.onnx',
)
```
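Because `extraction()` returns a plain `onnx.ModelProto`, the standard onnx utilities can be applied to the result. A minimal follow-up sketch, reusing the placeholder file and OP names from the examples above (the checker call assumes the extracted model still fits within the 2 GB protobuf limit):

```python
import onnx
from sne4onnx import extraction

# Same placeholder file and OP names as the examples above.
extracted_graph = extraction(
    input_op_names=['aaa', 'bbb', 'ccc'],
    output_op_names=['ddd', 'eee', 'fff'],
    input_onnx_file_path='input.onnx',
)

# Structurally validate and save the returned ModelProto with the onnx package.
onnx.checker.check_model(extracted_graph)
onnx.save(extracted_graph, 'output.onnx')
print('graph outputs:', [o.name for o in extracted_graph.graph.output])
```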
Sample CLI execution:

```bash
$ sne4onnx \
--input_onnx_file_path hitnet_sf_finalpass_720x1280.onnx \
--input_op_names 0 1 \
--output_op_names 497 785 \
--output_onnx_file_path hitnet_sf_finalpass_720x960_head.onnx
```
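The OP names in this sample ('0', '1', '497', '785') are specific to that HITNet model. A small sketch for listing candidate names in your own model before extraction (the file name is taken from the sample above):

```python
import onnx

model = onnx.load('hitnet_sf_finalpass_720x1280.onnx')

# Graph-level inputs and outputs are the usual extraction boundaries.
print('graph inputs :', [i.name for i in model.graph.input])
print('graph outputs:', [o.name for o in model.graph.output])

# Node names, op types and output tensor names, for picking intermediate
# boundaries such as '497' and '785' in the sample above.
for node in model.graph.node:
    print(node.name, node.op_type, list(node.output))
```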
Issues: https://github.com/PINTO0309/simple-onnx-processing-tools/issues