[Type error] cache_requests parameter type mismatch #493

Open
minimi-kei opened this issue Jan 9, 2025 · 1 comment

Comments

@minimi-kei

The type of cache_requests in main and the type of cache_requests in simple_evaluate are different.

In main, cache_requests is a string, while in simple_evaluate it is a boolean. Please check which one is correct.

If the string type used in main is correct, please also explain what each of the allowed values means.

main.py

parser.add_argument(
    "--cache_requests",
    type=str,
    default=None,
    choices=["true", "refresh", "delete"],
    help="Speed up evaluation by caching the building of dataset requests. None if not caching.",
)

evaluator.py

def simple_evaluate(
    model,
    model_args: Optional[Union[str, dict]] = None,
    tasks: Optional[List[Union[str, dict, object]]] = None,
    num_fewshot: Optional[int] = None,
    batch_size: Optional[Union[int, str]] = None,
    max_batch_size: Optional[int] = None,
    device: Optional[str] = None,
    use_cache: Optional[str] = None,
    cache_requests: bool = False,
    rewrite_requests_cache: bool = False,
    delete_requests_cache: bool = False,
    limit: Optional[Union[int, float]] = None,
    bootstrap_iters: int = 100000,
    check_integrity: bool = False,
    write_out: bool = False,
    log_samples: bool = True,
    evaluation_tracker: Optional[EvaluationTracker] = None,
    system_instruction: Optional[str] = None,
    apply_chat_template: bool = False,
    fewshot_as_multiturn: bool = False,
    gen_kwargs: Optional[str] = None,
    task_manager: Optional[TaskManager] = None,
    verbosity: str = "INFO",
    predict_only: bool = False,
    random_seed: int = 0,
    numpy_random_seed: int = 1234,
    torch_random_seed: int = 1234,
    fewshot_random_seed: int = 1234,
    datetime_str: str = get_datetime_str(),
    cli_args=None,
):
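For reference, here is a minimal sketch of how the string flag could be mapped onto the three boolean parameters of simple_evaluate before the call. The helper name cache_requests_to_kwargs and the exact mapping are assumptions for illustration, not code from this framework:

from typing import Optional


def cache_requests_to_kwargs(cache_requests: Optional[str]) -> dict:
    # Assumed mapping (hypothetical, not taken from the repository):
    #   "true"    -> use the request cache
    #   "refresh" -> use the cache but rewrite (rebuild) it
    #   "delete"  -> delete the existing cache and do not cache
    #   None      -> no caching at all
    return {
        "cache_requests": cache_requests in ("true", "refresh"),
        "rewrite_requests_cache": cache_requests == "refresh",
        "delete_requests_cache": cache_requests == "delete",
    }


# Example usage with the argparse value:
# results = simple_evaluate(model=model, tasks=tasks,
#                           **cache_requests_to_kwargs(args.cache_requests))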

@kcz358
Collaborator

kcz358 commented Jan 17, 2025

Hi, I think cache_requests has not been tested in this framework and was simply migrated from the original lm-eval.
