How do I specify what parameters I can accept in my command line interpreter?

I am working on a bootcamp assignment and I have reached a point where I feel stuck.
Basically, I have built a command line interface for an Airbnb clone, and the question I am stuck on requires me to update my command interpreter to accept a command that retrieves all instances of a class.

In the same project I have built another method that does this, but it takes its input in this form:

all <class_name>

This is my code for that:

def do_all(self, arg):
    """Print the string representation of all instances."""
    args = shlex.split(arg)
    if len(args) > 0 and args[0] not in classes:
        print("** class doesn't exist **")
    else:
        obj_dict = []
        # storage is the project's FileStorage instance
        for obj in storage.all().values():
            # Check if the class name matches
            if len(args) > 0 and args[0] == obj.__class__.__name__:
                # Append string representation of object
                obj_dict.append(str(obj))
            elif len(args) == 0:  # For all objects
                obj_dict.append(str(obj))
        # print(obj_dict) -- Rollback to this if not working
        # Added this feature to print each object on a new line
        for line in obj_dict:
            print(line)

Let me know if you want to see the whole class.
Any help will be appreciated.


Hey Jamal,

I would recommend exploring the argparse module from the standard library.


I agree; argparse is an excellent choice and is part of the standard library.
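For a standalone-script version of the same `all <class_name>` command, a sketch might look like this; the class names here are illustrative, and `choices` gives you the "class doesn't exist" check for free:

```python
import argparse

# Illustrative class registry; substitute the project's real one
classes = ["BaseModel", "User", "Place"]

parser = argparse.ArgumentParser(description="Airbnb-clone style CLI")
parser.add_argument("command", choices=["all", "show", "create"],
                    help="action to perform")
parser.add_argument("class_name", nargs="?", choices=classes,
                    help="optional class name to filter on")

# Normally parser.parse_args() reads sys.argv[1:]
args = parser.parse_args(["all", "User"])
```

With an unknown class name, argparse exits with a usage error on its own, so the validation branch disappears from your code.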


I don’t like argparse. It’s too dynamic, which means it doesn’t play nicely with static analyzers.

I prefer to use sys.argv directly, especially in large or complex projects where a static analyzer is vital.

For example:
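A hand-rolled parser over `sys.argv` converts every value explicitly, so a checker sees concrete types end to end. The option names below are made up for illustration:

```python
import sys

def parse_args(argv: list[str]) -> tuple[str, int, bool]:
    """Parse: prog [--verbose] [--retries N] <name>  (illustrative options)."""
    name = ""
    retries = 0
    verbose = False
    args = iter(argv[1:])
    for arg in args:
        if arg == "--verbose":
            verbose = True
        elif arg == "--retries":
            retries = int(next(args))  # explicit str -> int conversion
        elif arg.startswith("-"):
            raise SystemExit(f"unknown option: {arg}")
        else:
            name = arg
    return name, retries, verbose

# In real use: name, retries, verbose = parse_args(sys.argv)
name, retries, verbose = parse_args(["prog", "--retries", "3", "job1"])
```

Every variable has an exact, non-optional static type here, with no dynamic namespace object in sight.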

This doesn’t have to be the case. In my recent projects, I define typed dataclasses for argument groups, with a mixin that can siphon values from the argparse namespace object based on the dataclass fields. I do this conversion almost immediately, so the application proper never sees anything except the typed objects.


The easiest way to do this is by using typer Arguments with string Enums as type annotations.

It sounds interesting. Do you have some publicly available example code?

I find it hard to believe that using argparse will prevent you from using static typing. The whole point of Python’s gradual typing system is that if it can’t check the one or two functions, that doesn’t stop you from checking the rest of the program.

But in any case, if static typing is stopping you from using a well-known, effective and powerful library, you should start to consider whether or not the benefits of static typing are really worth the costs it imposes. Static typing can only find the smallest fraction of potential errors: only static errors (TypeError and AttributeError, and maybe a few others like NameError). Errors which, for the most part, are obvious the first time you try to run the program.

As we know from the old joke about certain corporations whose software testing consists of “It compiles? Quick, ship it!”, passing your static testing doesn’t come even within the same galaxy of proving that the program is correct.

You need to test the correctness of your command line argument processing code, for logic errors and other correctness problems that static analysis can’t find. argparse is heavily tested: it has more than 4500 SLOC of tests. Will your attempt to reinvent the wheel come even close to being as well tested?

In any case, if you don’t like argparse, you can use getopt for a simpler library that might suit you better.
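To illustrate, getopt keeps the parse loop explicit but handles the option/value splitting for you; the option names here are invented for the example:

```python
import getopt

def parse(argv: list[str]) -> tuple[bool, str, list[str]]:
    """Parse [-v|--verbose] [-o FILE|--output FILE] positionals..."""
    opts, rest = getopt.getopt(argv[1:], "vo:", ["verbose", "output="])
    verbose = False
    output = ""
    for opt, val in opts:
        if opt in ("-v", "--verbose"):
            verbose = True
        elif opt in ("-o", "--output"):
            output = val
    return verbose, output, rest

flags = parse(["prog", "-v", "--output", "run.log", "User"])
```

An unrecognized option raises `getopt.GetoptError`, which you handle yourself, so control flow stays visible to your tooling.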

I am amused that your idea of a “large or complex project” where static type checking is “vital” is a straight-forward script of less than sloc and completely lacking any static type hints.


It sounds interesting. Do you have some publicly available example code?

Sure, in rough outline, first a dataclass for some option group:

from dataclasses import dataclass
from pathlib import Path

@dataclass(frozen=True)
class Logging:
    user_logging_levels: str | None
    logdir: Path
    log_to_file: bool
    keep_logs: bool

    def __post_init__(self) -> None:
        # fixup logdir to be a real Path; have to use __setattr__ because frozen
        if self.logdir:
            object.__setattr__(self, "logdir", Path(self.logdir))
A tiny helper function to initialize any dataclass from an object (e.g. an argparse namespace):

from dataclasses import Field
from typing import Any, Protocol, Type, TypeVar

# This seems like it ought to be in the stdlib
class DataclassProtocol(Protocol):
    __dataclass_fields__: dict[str, Field[Any]]

T = TypeVar("T", bound=DataclassProtocol)

def object_to_dataclass(obj: object, typ: Type[T]) -> T:
    kws = {name: getattr(obj, name) for name in typ.__dataclass_fields__}
    return typ(**kws)

Then I have a Config that accepts an argv and initializes all the sub-groups of options (just one shown here, but there are usually several)

class Config:

    def __init__(self, argv: ArgList) -> None:
        args, extra = parser.parse_known_args(argv[1:])

        self.user_opts = tuple(extra)

        # there are several of these in real use
        self.logging = object_to_dataclass(args, Logging)

So you need some tests to maintain and verify that the Config(sys.argv) step does everything it needs to. I have basic tests for each dataclass and “spy” tests to confirm that initializing a Config calls object_to_dataclass for everything it is supposed to, and probably good to have some tests passing real “argv” lists to Config. But all in all much more comforting and less work than the thought of reinventing argparse from scratch.
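Here is a compressed, self-contained demo of the siphon step, using simplified field names so it runs on its own:

```python
from argparse import Namespace
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class Logging:
    log_to_file: bool
    keep_logs: bool

def object_to_dataclass(obj, typ):
    # Copy only the attributes the dataclass declares, ignoring the rest
    kws = {f.name: getattr(obj, f.name) for f in fields(typ)}
    return typ(**kws)

# An argparse-style namespace with an extra, unrelated attribute
ns = Namespace(log_to_file=True, keep_logs=False, unrelated=42)
logging_opts = object_to_dataclass(ns, Logging)
```

The extra `unrelated` attribute never reaches the dataclass, which is what keeps each option group narrow and fully typed.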

The larger and/or more complex the project, the more static analysis becomes important. Argparse pulls you in the opposite direction. It does not completely prevent static analysis, nor did I say it did. But you would know that if you gave it a moment of thought, so why am I bothering to explain?

argparse is a relatively recent tool, and Python’s gradual typing even more so. I seem to have offended you somehow. Did you perhaps contribute to argparse? It will never match the power of parsing sys.argv, of course. Argparse is more analogous to a DSL, like SQL or HTML. Parsing sys.argv gives you the full power of Python.

I didn’t say anything that would come close to “it passes the static analyzer, quick ship it.” Why are you setting up strawperson arguments?

Actually, static analysis can check types very well if you parse sys.argv directly, because you are presented with strs and you convert them explicitly to what you need in a way that just about any static analyzer can follow. This is simply about making the best use of your tools. Again, here you are dragging my words in the direction of something which simply isn’t true. What is your true goal here? Did I offend you somehow?

getopt is similarly dynamic, but again, I think you know that. You neglected to trot out optparse - that would be a similar strawperson argument.

As to the example of a large or complex project, you are again setting up a strawperson argument. I gave an illustration of the general approach, not an example of a large or complex project - a large or complex project wouldn’t be a very good example of the approach, would it? You almost certainly know that though.

Do you, for some odd reason, require an example of a large or complex project from me? It has nothing to do with the point I was making, but I can provide them if you’re continuing to feel ornery.

Before you reply, perhaps you would do well to take a deep breath, and think to yourself “Is this so easy to refute I may as well not post it?” and “Does this have anything to do with the post I’m replying to?” Your arguments so far seem to border on the ad hominem.

That was supposed to be less than 150 sloc. Oops, sorry :slight_smile:

FWIW the real use case for my approach above is for a driver program to correctly launch a custom Python interpreter for automated distributed computing. This top-level driver has to string together:

  • a process launcher (e.g. mpirun or jsrun) and its command line args
  • a shell script that binds CPU/GPU/memory affinity and its command line args
  • a few different CPU and/or GPU profilers and their args
  • a readline wrapper
  • the custom Python interpreter itself and its args
  • the actual script that the user wants to run with that custom interpreter, and any command line args the user wants their script to have
  • finally, profiling log processors on exit

All of that complicated orchestration rests entirely on (very) many command line arguments that the top-level driver has to handle. I inherited a monolithic script with no tests or structure. This technique was the foundation of taming it for both mypy and (thousands of) unit tests, without reinventing wheels.

Hey guys,

Can someone please summarise why argparse is not recommended? I use it heavily in my projects.

It is not ‘not recommended’ by the core developers. One person above does not like it for the reason given.


Alright, got it. Thank you.


Use it if you want to. It’s brief.

If you want to get the most out of your static analysis, don’t use it. There’s a somewhat wordy way around this, but it probably belongs in argparse itself.

argparse was added to the standard library more than 12 years ago. In Python-land, that’s not ‘relatively recent’ :slight_smile:


Relative to what? I started with Python 29 years ago. :slight_smile:
