
Trainer.parse_argparser does not yield sensible default for default_root_dir #1916

@jeremyjordan

Description


🐛 Bug

Trainer.parse_argparser returns True for default_root_dir; however, a string (a directory path) is expected.

To Reproduce

Steps to reproduce the behavior:

>>> from pytorch_lightning import Trainer
>>> from argparse import ArgumentParser, Namespace
>>> parser = ArgumentParser(add_help=False)
>>> parser = Trainer.add_argparse_args(parent_parser=parser)
>>> args = Trainer.parse_argparser(parser)
>>> args
Namespace(accumulate_grad_batches=1, amp_level='O1', auto_lr_find=False, auto_scale_batch_size=False, auto_select_gpus=False, benchmark=False, check_val_every_n_epoch=1, checkpoint_callback=True, default_root_dir=True, deterministic=False, distributed_backend=True, early_stop_callback=False, fast_dev_run=False, gpus=<function Trainer._arg_default at 0x1219efdd0>, gradient_clip_val=0, log_gpu_memory=True, log_save_interval=100, logger=True, max_epochs=1000, max_steps=True, min_epochs=1, min_steps=True, num_nodes=1, num_processes=1, num_sanity_val_steps=2, overfit_pct=0.0, precision=32, print_nan_grads=False, process_position=0, profiler=True, progress_bar_callback=True, progress_bar_refresh_rate=1, reload_dataloaders_every_epoch=False, replace_sampler_ddp=True, resume_from_checkpoint=True, row_log_interval=10, terminate_on_nan=False, test_percent_check=1.0, tpu_cores=True, track_grad_norm=-1, train_percent_check=1.0, truncated_bptt_steps=True, val_check_interval=1.0, val_percent_check=1.0, weights_save_path=True, weights_summary='full')
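The same symptom can be reproduced with plain argparse, independent of Lightning: argparse only applies the type converter to values supplied on the command line, never to the default, so if add_argparse_args populates the default with True instead of a path, parse_args() passes the bool through unchanged. A minimal sketch (the argument names mirror the issue; the "fixed" default of os.getcwd() is only an assumed sensible value, not necessarily what Lightning chooses):

```python
import os
from argparse import ArgumentParser

# Buggy setup: a non-string default on a str-typed argument. argparse
# does not run `type` on defaults, so the bool leaks into the Namespace.
buggy = ArgumentParser()
buggy.add_argument('--default_root_dir', type=str, default=True)
assert buggy.parse_args([]).default_root_dir is True

# Expected behavior: the default should itself be a path string,
# e.g. the current working directory.
fixed = ArgumentParser()
fixed.add_argument('--default_root_dir', type=str, default=os.getcwd())
assert isinstance(fixed.parse_args([]).default_root_dir, str)
```

The other True values in the Namespace above (distributed_backend, max_steps, resume_from_checkpoint, tpu_cores, ...) suggest the same default-population path is responsible for every affected argument, not just default_root_dir.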


Labels

bug (Something isn't working), help wanted (Open to be worked on)
