
Add precond_time for preconditioning and allow multiple randrw workloads within the same YAML #326

Open · wants to merge 4 commits into master

Conversation

@lee-j-sanders commented Feb 12, 2025

This PR only changes the functionality of workloads.

If workloads are in use and there is a preconditioning step within the YAML, users will often want to run preconditioning for, say, 10 minutes, whereas every other workload runs for a shorter period. This PR allows the user to override time with precond_time, and to specify precond within the preconditioning step so that step uses the different time attribute.
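As a hedged sketch of the intended usage (workload names and values are illustrative, not taken from the PR; only time, precond_time, and precond are described above):

```yaml
# Illustrative test plan fragment (hypothetical workload names/values).
time: 60            # regular runtime for each workload, in seconds
precond_time: 600   # longer runtime used only by the preconditioning step
workloads:
  precondition:
    precond: true        # opt this step in to precond_time instead of time
    mode: randwrite
  mixed_test:
    mode: randrw         # runs for the regular 60 s
    rwmixread: 70
```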

Additionally, randrw workloads previously only let you specify one mixed workload per YAML, because the same directory name was used for each subsequent randrw workload, so later tests overwrote the earlier results in the archive directory. This PR adds the read/write ratio to the directory name when the mode is randrw.

The directory name for 100% read and 100% write tests are unaffected.
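A minimal Python sketch of the naming scheme described above (the function and name pattern are hypothetical, not the PR's actual code):

```python
def results_dir_name(mode: str, op_size: int, rwmixread: int) -> str:
    """Build a per-workload archive directory name.

    Hypothetical sketch: only randrw workloads embed the read/write
    ratio, so two mixed workloads with the same block size no longer
    collide; 100% read and 100% write names are unchanged.
    """
    name = f"op_size-{op_size}_mode-{mode}"
    if mode == "randrw":
        name += f"_rwmixread-{rwmixread}"
    return name


# Two randrw workloads with the same block size now get distinct dirs:
print(results_dir_name("randrw", 4096, 70))    # op_size-4096_mode-randrw_rwmixread-70
print(results_dir_name("randrw", 4096, 30))    # op_size-4096_mode-randrw_rwmixread-30
print(results_dir_name("randread", 4096, 70))  # op_size-4096_mode-randread
```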

Chris Harris and others added 3 commits February 11, 2025 18:13
    - Specify a precond_time in the preconditioning workload of an RBD YAML
      that is using the workloads feature, so you can have a different
      preconditioning duration from the regular runtime (time in the
      YAML).
    - Fixes an issue where, if you specified more than one workload with
      the same block size but different rwmixread ratios, the 2nd workload
      would overwrite the 1st workload's results log file.

Signed-off-by: lee-j-sanders <[email protected]>
@lee-j-sanders lee-j-sanders self-assigned this Feb 12, 2025
@lee-j-sanders (Author) commented Feb 12, 2025

Note: at the moment this is based on #324 due to changes in the same area. At the time of writing, #324 is unmerged, hence its changes also appear in this review.

@lee-j-sanders (Author) commented Feb 12, 2025

================================================================================================== test session starts ==================================================================================================
platform linux -- Python 3.9.21, pytest-8.3.4, pluggy-1.5.0
rootdir: /cbt.lee
collected 331 items

tests/test_benchmarkfactory.py ... [ 0%]
tests/test_bm_cephtestrados.py .................. [ 6%]
tests/test_bm_fio.py ......................................... [ 18%]
tests/test_bm_getput.py .............................. [ 27%]
tests/test_bm_hsbench.py .............................. [ 36%]
tests/test_bm_kvmrbdfio.py ................................... [ 47%]
tests/test_bm_librbdfio.py ..................................................... [ 63%]
tests/test_bm_nullbench.py ............ [ 67%]
tests/test_bm_radosbench.py ............................... [ 76%]
tests/test_bm_rawfio.py ................................ [ 86%]
tests/test_bm_rbdfio.py .................................... [ 96%]
tests/test_common.py .sss [ 98%]
tests/test_common_output_formatter.py ...... [100%]

=================================================================================================== warnings summary ====================================================================================================
post_processing/formatter/test_run_result.py:21
/cbt.lee/post_processing/formatter/test_run_result.py:21: PytestCollectionWarning: cannot collect test class 'TestRunResult' because it has a init constructor (from: tests/test_common_output_formatter.py)
class TestRunResult:

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
======================================================================================= 328 passed, 3 skipped, 1 warning in 0.46s =======================================================================================

The warning looks to be a problem in master with the common output formatter stuff, not anything I've introduced.

I will look into this with @harriscr before merging.

@perezjosibm (Contributor) left a comment

I don't think the new precond_time option is needed. I invite the submitter to try the suggested experiment without the proposed change, to convince himself.

@@ -29,6 +29,8 @@ def __init__(self, archive_dir, cluster, config):
self.recov_test_type = config.get('recov_test_type', 'blocking')
self.data_pool_profile = config.get('data_pool_profile', None)
self.time = config.get('time', None)
self.precond_time = config.get('precond_time', None)
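Downstream, the new attribute would presumably act as an opt-in override of the regular runtime, along these lines (a hypothetical sketch, not the PR's actual code):

```python
# Hypothetical helper: choose the runtime for a workload step.  A
# preconditioning step uses precond_time when one was configured;
# everything else (and a missing precond_time) falls back to time.
def effective_runtime(time: int, precond_time, is_precond: bool) -> int:
    if is_precond and precond_time is not None:
        return precond_time
    return time


print(effective_runtime(60, 600, True))    # 600: precondition runs longer
print(effective_runtime(60, None, True))   # 60: no precond_time configured
print(effective_runtime(60, 600, False))   # 60: regular workloads unaffected
```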

If you look at the way workloads work, they temporarily overwrite the global FIO options defined in the test plan .yaml (normally located before the workloads section). So within a precondition step one can set a time value that only affects the precondition, after which the global value is restored. Another workload step can then do the same.

In other words, there is no actual need to introduce yet another 'precond_time'!

But you can do the experiment: with the existing code (that is, without this change), define two time values in a precond step and a workload step, different from the global one, to convince yourself.
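The suggested experiment might look like the following sketch, based on the override behaviour described above (workload names and values are illustrative):

```yaml
# Illustrative test plan fragment (hypothetical names/values).
time: 60                 # global runtime, in seconds
workloads:
  precondition:
    time: 600            # overrides the global time for this step only
    mode: randwrite
  mixed_test:
    mode: randrw         # the global time (60 s) applies again here
    rwmixread: 70
```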


And this is also mentioned in the documentation: https://github.com/ceph/cbt/blob/master/docs/Workloads.md
