syntax – Can someone explain __all__ in Python?


Linked to, but not explicitly mentioned here, is exactly when __all__ is used. It is a list of strings defining what symbols in a module will be exported when from <module> import * is used on the module.

For example, the following code in a foo.py explicitly exports the symbols bar and baz:

__all__ = ['bar', 'baz']

waz = 5
bar = 10
def baz(): return 'baz'

These symbols can then be imported like so:

from foo import *

print(bar)
print(baz)

# The following raises a NameError, as waz is not exported by the module
print(waz)

If the __all__ above is commented out, this code will then execute to completion, as the default behaviour of import * is to import all symbols that do not begin with an underscore from the given namespace.

Reference: https://docs.python.org/tutorial/modules.html#importing-from-a-package

NOTE: __all__ affects the from <module> import * behavior only. Members that are not mentioned in __all__ are still accessible from outside the module and can be imported with from <module> import <member>.
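
For instance, with the foo.py above, an explicit import still reaches the unexported name:

from foo import waz   # works even though waz is not listed in __all__
print(waz)            # 5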

It's a list of public objects of that module, as interpreted by import *. It overrides the default of hiding everything that begins with an underscore.


Explain __all__ in Python?

I keep seeing the variable __all__ set in different __init__.py files.

What does this do?

What does __all__ do?

It declares the semantically public names from a module. If there is a name in __all__, users are expected to use it, and they can have the expectation that it will not change.

It also will have programmatic effects:

import *

__all__ in a module, e.g. module.py:

__all__ = ['foo', 'Bar']

means that when you import * from the module, only those names in the __all__ are imported:

from module import *               # imports foo and Bar

Documentation tools

Documentation and code autocompletion tools may (in fact, should) also inspect the __all__ to determine what names to show as available from a module.
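
As a rough sketch of what such a tool might do (module here is just the module.py from above; the fallback branch mirrors the default underscore convention):

import module

public_names = getattr(module, '__all__', None)
if public_names is None:
    # No __all__: fall back to every name that does not start with an underscore
    public_names = [name for name in dir(module) if not name.startswith('_')]

print(public_names)   # ['foo', 'Bar']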

__init__.py makes a directory a Python package

From the docs:

The __init__.py files are required to make Python treat the directories as containing packages; this is done to prevent directories with a common name, such as string, from unintentionally hiding valid modules that occur later on the module search path.

In the simplest case, __init__.py can just be an empty file, but it can also execute initialization code for the package or set the __all__ variable.

So the __init__.py can declare the __all__ for a package.

Managing an API:

A package is typically made up of modules that may import one another, but that are necessarily tied together with an __init__.py file. That file is what makes the directory an actual Python package. For example, say you have the following files in a package:

package
├── __init__.py
├── module_1.py
└── module_2.py

Let's create these files with Python so you can follow along – you could paste the following into a Python 3 shell:

from pathlib import Path

package = Path('package')
package.mkdir()

(package / '__init__.py').write_text("""
from .module_1 import *
from .module_2 import *
""")

package_module_1 = package / 'module_1.py'
package_module_1.write_text("""
__all__ = ['foo']
imp_detail1 = imp_detail2 = imp_detail3 = None
def foo(): pass
""")

package_module_2 = package / 'module_2.py'
package_module_2.write_text("""
__all__ = ['Bar']
imp_detail1 = imp_detail2 = imp_detail3 = None
class Bar: pass
""")

And now you have presented a complete API that someone else can use when they import your package, like so:

import package
package.foo()
package.Bar()

And the package won't have all the other implementation details you used when creating your modules cluttering up the package namespace.
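
As a quick sanity check (assuming the files created above, in a fresh interpreter), the details stay tucked away in the submodules:

import package

package.foo()                   # exported through module_1's __all__
package.module_1.imp_detail1    # still reachable, but only via the submodule
# package.imp_detail1          # would raise AttributeError: not re-exported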

__all__ in __init__.py

After more work, maybe you've decided that the modules are too big (like many thousands of lines?) and need to be split up. So you do the following:

package
├── __init__.py
├── module_1
│   ├── foo_implementation.py
│   └── __init__.py
└── module_2
    ├── Bar_implementation.py
    └── __init__.py

First make the subpackage directories with the same names as the modules:

subpackage_1 = package / 'module_1'
subpackage_1.mkdir()
subpackage_2 = package / 'module_2'
subpackage_2.mkdir()

Move the implementations:

package_module_1.rename(subpackage_1 / 'foo_implementation.py')
package_module_2.rename(subpackage_2 / 'Bar_implementation.py')

Create an __init__.py for each subpackage that declares its __all__:

(subpackage_1 / '__init__.py').write_text("""
from .foo_implementation import *
__all__ = ['foo']
""")
(subpackage_2 / '__init__.py').write_text("""
from .Bar_implementation import *
__all__ = ['Bar']
""")

And now you still have the API provisioned at the package level:

>>> import package
>>> package.foo()
>>> package.Bar()
<package.module_2.Bar_implementation.Bar object at 0x7f0c2349d210>

And you can easily add things to your API that you can manage at the subpackage level instead of the subpackage's module level. If you want to add a new name to the API, you simply update the __init__.py, e.g. in module_2:

from .Bar_implementation import *
from .Baz_implementation import *
__all__ = ['Bar', 'Baz']

And if you're not ready to publish Baz in the top level API, in your top level __init__.py you could have:

from .module_1 import *       # also constrained by __all__s
from .module_2 import *       # in the __init__.pys
__all__ = ['foo', 'Bar']      # further constraining the names advertised

and if your users are aware of the availability of Baz, they can use it:

import package
package.Baz()

but if they don't know about it, other tools (like pydoc) won't inform them.

You can later change that when Baz is ready for prime time:

from .module_1 import *
from .module_2 import *
__all__ = ['foo', 'Bar', 'Baz']

Prefixing _ versus __all__:

By default, Python will export all names that do not start with an _. You certainly could rely on this mechanism. Some packages in the Python standard library, in fact, do rely on this, but to do so, they alias their imports, for example, in ctypes/__init__.py:

import os as _os, sys as _sys

Using the _ convention can be more elegant because it removes the redundancy of naming the names again. But it adds the redundancy for imports (if you have a lot of them) and it is easy to forget to do this consistently – and the last thing you want is to have to indefinitely support something you intended to only be an implementation detail, just because you forgot to prefix an _ when naming a function.
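
A small sketch of that convention in a hypothetical module (not taken from ctypes):

import os as _os   # aliased so "from module import *" will not re-export os

_cache = {}        # private by convention; skipped by import *

def public_function():
    return _internal_helper()

def _internal_helper():   # hidden from import * by the leading underscore
    return _os.getcwd()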

I personally write an __all__ early in my development lifecycle for modules so that others who might use my code know what they should use and not use.

Most packages in the standard library also use __all__.

When avoiding __all__ makes sense

It makes sense to stick to the _ prefix convention in lieu of __all__ when:

  • You're still in early development mode, have no users, and are constantly tweaking your API.
  • Maybe you do have users, but you have unit tests that cover the API, and you're still actively adding to the API and tweaking in development.

An export decorator

The downside of using __all__ is that you have to write the names of functions and classes being exported twice – and the information is kept separate from the definitions. We could use a decorator to solve this problem.

I got the idea for such an export decorator from David Beazley's talk on packaging. This implementation seems to work well in CPython's traditional importer. If you have a special import hook or system, I do not guarantee it, but if you adopt it, it is fairly trivial to back out – you'll just need to manually add the names back into the __all__.

So in, for example, a utility library, you would define the decorator:

import sys

def export(fn):
    mod = sys.modules[fn.__module__]
    if hasattr(mod, '__all__'):
        mod.__all__.append(fn.__name__)
    else:
        mod.__all__ = [fn.__name__]
    return fn

and then, where you would define an __all__, you do this:

$ cat > main.py
from lib import export
__all__ = [] # optional - we create a list if __all__ is not there.

@export
def foo(): pass

@export
def bar():
    'bar'

def main():
    print('main')

if __name__ == '__main__':
    main()

And this works fine whether run as main or imported by another module.

$ cat > run.py
import main
main.main()

$ python run.py
main

And API provisioning with import * will work too:

$ cat > run.py
from main import *
foo()
bar()
main() # expected to error here, not exported

$ python run.py
Traceback (most recent call last):
  File "run.py", line 4, in <module>
    main() # expected to error here, not exported
NameError: name 'main' is not defined
