Command to generate submission folder gives error #229

Open
csemasthan opened this issue Feb 16, 2025 · 2 comments
csemasthan commented Feb 16, 2025

After running mlc find cache --tags=get,mlperf,inference,results,dir | xargs tree

the tree listing reported approximately 2792 directories and 125496 files.

I then ran the command below, but it fails with an error.

mlcuser@cf7c81bf5ff6:~$ mlcr generate,inference,submission --clean --preprocess_submission=yes --run_checker=yes --submitter=MLCommons --division=closed --env.MLC_DETERMINE_MEMORY_CONFIGURATION=yes --quiet
[2025-02-16 09:14:22,657 module.py:560 INFO] - * mlcr generate,inference,submission
[2025-02-16 09:14:22,664 module.py:560 INFO] -   * mlcr get,python3
[2025-02-16 09:14:22,667 module.py:1274 INFO] -        ! load /home/mlcuser/MLC/repos/local/cache/get-python3_bfd25e5f/mlc-cached-state.json
[2025-02-16 09:14:22,667 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-16 09:14:22,668 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-16 09:14:22,684 module.py:560 INFO] -   * mlcr mlcommons,inference,src
[2025-02-16 09:14:22,687 module.py:1274 INFO] -        ! load /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_128307d7/mlc-cached-state.json
[2025-02-16 09:14:22,695 module.py:560 INFO] -   * mlcr get,sut,system-description
[2025-02-16 09:14:22,702 module.py:560 INFO] -     * mlcr detect,os
[2025-02-16 09:14:22,707 module.py:5334 INFO] -            ! cd /home/mlcuser
[2025-02-16 09:14:22,707 module.py:5335 INFO] -            ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-02-16 09:14:22,731 module.py:5481 INFO] -            ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-02-16 09:14:22,747 module.py:560 INFO] -     * mlcr detect,cpu
[2025-02-16 09:14:22,757 module.py:560 INFO] -       * mlcr detect,os
[2025-02-16 09:14:22,761 module.py:5334 INFO] -              ! cd /home/mlcuser
[2025-02-16 09:14:22,761 module.py:5335 INFO] -              ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-02-16 09:14:22,777 module.py:5481 INFO] -              ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-02-16 09:14:22,787 module.py:5334 INFO] -            ! cd /home/mlcuser
[2025-02-16 09:14:22,787 module.py:5335 INFO] -            ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-02-16 09:14:22,827 module.py:5481 INFO] -            ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-02-16 09:14:22,839 module.py:560 INFO] -     * mlcr get,python3
[2025-02-16 09:14:22,840 module.py:1274 INFO] -          ! load /home/mlcuser/MLC/repos/local/cache/get-python3_bfd25e5f/mlc-cached-state.json
[2025-02-16 09:14:22,841 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-16 09:14:22,841 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-16 09:14:22,851 module.py:560 INFO] -     * mlcr get,compiler
[2025-02-16 09:14:22,853 module.py:1274 INFO] -          ! load /home/mlcuser/MLC/repos/local/cache/get-llvm_c9c0a8fd/mlc-cached-state.json
[2025-02-16 09:14:22,859 module.py:560 INFO] -     * mlcr detect,sudo
[2025-02-16 09:14:22,899 module.py:5334 INFO] -            ! cd /home/mlcuser
[2025-02-16 09:14:22,899 module.py:5335 INFO] -            ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-sudo/run.sh from tmp-run.sh
[2025-02-16 09:14:22,903 module.py:5481 INFO] -            ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-sudo/customize.py
[2025-02-16 09:14:22,959 module.py:560 INFO] -     * mlcr get,generic-python-lib,_package.dmiparser
[2025-02-16 09:14:22,969 module.py:560 INFO] -       * mlcr get,python3
[2025-02-16 09:14:22,970 module.py:1274 INFO] -            ! load /home/mlcuser/MLC/repos/local/cache/get-python3_bfd25e5f/mlc-cached-state.json
[2025-02-16 09:14:22,970 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-16 09:14:22,970 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-16 09:14:22,971 module.py:5334 INFO] -            ! cd /home/mlcuser
[2025-02-16 09:14:22,971 module.py:5335 INFO] -            ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-02-16 09:14:23,037 module.py:5481 INFO] -            ! call "detect_version" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 5.1
[2025-02-16 09:14:23,050 module.py:560 INFO] -       * mlcr get,python3
[2025-02-16 09:14:23,051 module.py:1274 INFO] -            ! load /home/mlcuser/MLC/repos/local/cache/get-python3_bfd25e5f/mlc-cached-state.json
[2025-02-16 09:14:23,051 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-16 09:14:23,051 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-16 09:14:23,052 module.py:1274 INFO] -          ! load /home/mlcuser/MLC/repos/local/cache/get-generic-python-lib_9f67505c/mlc-cached-state.json
[2025-02-16 09:14:23,058 module.py:560 INFO] -     * mlcr get,cache,dir,_name.mlperf-inference-sut-descriptions
[2025-02-16 09:14:23,060 module.py:1274 INFO] -          ! load /home/mlcuser/MLC/repos/local/cache/get-cache-dir_c0252835/mlc-cached-state.json
Generating SUT description file for cf7c81bf5ff6
[2025-02-16 09:14:23,065 module.py:5334 INFO] -          ! cd /home/mlcuser
[2025-02-16 09:14:23,065 module.py:5335 INFO] -          ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-sut-description/detect_memory.sh from tmp-run.sh
/dev/mem: No such file or directory
[2025-02-16 09:14:23,100 module.py:5481 INFO] -          ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-sut-description/customize.py
[2025-02-16 09:14:23,110 module.py:560 INFO] -   * mlcr install,pip-package,for-mlc-python,_package.tabulate
[2025-02-16 09:14:23,112 module.py:1274 INFO] -        ! load /home/mlcuser/MLC/repos/local/cache/install-pip-package-for-mlc-python_63e717fc/mlc-cached-state.json
[2025-02-16 09:14:23,117 module.py:560 INFO] -   * mlcr get,mlperf,inference,utils
[2025-02-16 09:14:23,135 module.py:560 INFO] -     * mlcr get,mlperf,inference,src
[2025-02-16 09:14:23,136 module.py:1274 INFO] -          ! load /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_128307d7/mlc-cached-state.json
[2025-02-16 09:14:23,141 module.py:5481 INFO] -          ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils/customize.py
[2025-02-16 09:14:23,152 module.py:560 INFO] -   * mlcr get,mlperf,results,dir,local
[2025-02-16 09:14:23,194 module.py:1274 INFO] -        ! load /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733/mlc-cached-state.json
[2025-02-16 09:14:23,203 module.py:560 INFO] -   * mlcr get,mlperf,submission,dir
[2025-02-16 09:14:23,212 module.py:1274 INFO] -        ! load /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_5a0bdd1f/mlc-cached-state.json
[2025-02-16 09:14:23,234 module.py:5481 INFO] -        ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-submission/customize.py
=================================================
Cleaning /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_5a0bdd1f/mlperf-inference-submission ...
=================================================
* MLPerf inference submission dir: /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_5a0bdd1f/mlperf-inference-submission
* MLPerf inference results dir: /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733/valid_results
* MLPerf inference division: closed
* MLPerf inference submitter: MLCommons
The SUT folder name for submission generation is: 014f429ca342-reference-cpu-onnxruntime-v1.20.1-default_config
* MLPerf inference model: resnet50
WARNING: system_meta.json was not found in the SUT root or mode directory inside the results folder. CM is automatically creating one using the system defaults. Please modify them as required.
{'accelerator_frequency': '', 'accelerator_host_interconnect': 'N/A', 'accelerator_interconnect': 'N/A', 'accelerator_interconnect_topology': '', 'accelerator_memory_capacity': 'N/A', 'accelerator_memory_configuration': 'N/A', 'accelerator_model_name': 'N/A', 'accelerator_on-chip_memories': '', 'accelerators_per_node': '0', 'cooling': 'air', 'division': 'closed', 'framework': 'pytorch', 'host_memory_capacity': '7.8G', 'host_memory_configuration': 'undefined', 'host_network_card_count': '1', 'host_networking': 'Gig Ethernet', 'host_networking_topology': 'N/A', 'host_processor_caches': 'L1d cache: 384 KiB (8 instances), L1i cache: 256 KiB (8 instances), L2 cache: 10 MiB (8 instances), L3 cache: 12 MiB (1 instance)', 'host_processor_core_count': '8', 'host_processor_frequency': 'undefined', 'host_processor_interconnect': '', 'host_processor_model_name': '12th Gen Intel(R) Core(TM) i5-1240P', 'host_processors_per_node': '1', 'host_storage_capacity': '2.1T', 'host_storage_type': 'SSD', 'hw_notes': '', 'number_of_nodes': '1', 'operating_system': 'Ubuntu 22.04 (linux-5.15.167.4-microsoft-standard-WSL2-glibc2.35)', 'other_software_stack': 'Python: 3.10.12, LLVM-15.0.6, Using Docker ', 'status': 'available', 'submitter': 'MLCommons', 'sw_notes': '', 'system_name': 'cf7c81bf5ff6 (auto detected)', 'system_type': 'edge', 'system_type_detail': 'edge server'}
Traceback (most recent call last):
  File "/home/mlcuser/venv/mlc/bin/mlcr", line 8, in <module>
    sys.exit(mlcr())
  File "/home/mlcuser/venv/mlc/lib/python3.10/site-packages/mlc/main.py", line 1631, in mlcr
    main()
  File "/home/mlcuser/venv/mlc/lib/python3.10/site-packages/mlc/main.py", line 1713, in main
    res = method(run_args)
  File "/home/mlcuser/venv/mlc/lib/python3.10/site-packages/mlc/main.py", line 1472, in run
    return self.call_script_module_function("run", run_args)
  File "/home/mlcuser/venv/mlc/lib/python3.10/site-packages/mlc/main.py", line 1462, in call_script_module_function
    raise ScriptExecutionError(f"Script {function_name} execution failed. Error : {error}")
mlc.main.ScriptExecutionError: Script run execution failed. Error : user.conf missing in both paths: /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733/valid_results/014f429ca342-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/performance/run_1/user.conf and /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733/valid_results/014f429ca342-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/user.conf
@arjunsuresh
Collaborator

/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733/valid_results/014f429ca342-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/performance/run_1/user.conf

The above directory is supposed to contain a user.conf file. It looks like the /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733/valid_results/ directory holds partial results from a previous Docker run. You can delete the stale directories inside /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733/valid_results/ and rerun the submission generation command.
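As a minimal sketch (assuming the cache hash and the offline-scenario layout from the error above; substitute your own paths), you could list the result directories that are missing a user.conf before deleting anything:

RESULTS=/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733/valid_results
# report offline performance run directories that are missing user.conf
for d in "$RESULTS"/*/*/offline/performance/run_1; do
  [ -d "$d" ] || continue          # skip if the glob matched nothing
  [ -f "$d/user.conf" ] || echo "missing user.conf: $d"
done
# after reviewing the output, delete the corresponding stale SUT folder, e.g.:
# rm -rf "$RESULTS/014f429ca342-reference-cpu-onnxruntime-v1.20.1-default_config"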

@anandhu-eng
Contributor

Hi @csemasthan, is the issue resolved?
