Frequently asked questions
##########################

"ImportError: dynamic module does not define init function"
===========================================================

1. Make sure that the module name specified in ``py::module`` and
   ``PYBIND11_PLUGIN`` is consistent and identical to the filename of the
   extension library (see the sketch below). The filename should not contain
   any extra prefixes (e.g. ``test.so`` instead of ``libtest.so``).

2. If the above did not fix your issue, then you are likely using an
   incompatible version of Python (for instance, the extension library was
   compiled against Python 2, while the interpreter is running on top of some
   version of Python 3, or vice versa).
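
As a concrete illustration (a minimal sketch that mirrors the build-time
example later in this document), a plugin declared as follows must be
compiled into a shared library named ``example.so`` (``example.pyd`` on
Windows):

.. code-block:: cpp

    PYBIND11_PLUGIN(example) {
        py::module m("example", "pybind example plugin");
        /* ... bindings ... */
        return m.ptr();
    }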

"Symbol not found: ``__Py_ZeroStruct`` / ``_PyInstanceMethod_Type``"
========================================================================

See item 2 of the first answer.

"SystemError: dynamic module not initialized properly"
======================================================

See item 2 of the first answer.

The Python interpreter immediately crashes when importing my module
===================================================================

See item 2 of the first answer.

CMake doesn't detect the right Python version
=============================================

The CMake-based build system will try to automatically detect the installed
version of Python and link against that. When this fails, or when there are
multiple versions of Python and it finds the wrong one, delete
``CMakeCache.txt`` and then invoke CMake as follows:

.. code-block:: bash

    cmake -DPYTHON_EXECUTABLE:FILEPATH=<path-to-python-executable> .

Limitations involving reference arguments
=========================================

In C++, it's fairly common to pass arguments using mutable references or
mutable pointers, which allows both read and write access to the value
supplied by the caller. This is sometimes done for efficiency reasons, or to
implement functions that have multiple return values. Here are two very basic
examples:

.. code-block:: cpp

    void increment(int &i) { i++; }
    void increment_ptr(int *i) { (*i)++; }

In Python, all arguments are passed by reference, so there is no general
issue in binding such code from Python.

However, certain basic Python types (like ``str``, ``int``, ``bool``,
``float``, etc.) are **immutable**. This means that the following attempt
to port the function to Python doesn't have the same effect on the value
provided by the caller -- in fact, it does nothing at all.

.. code-block:: python

    def increment(i):
        i += 1 # nope..

pybind11 is also affected by such language-level conventions, which means that
binding ``increment`` or ``increment_ptr`` will also create Python functions
that don't modify their arguments.

Although inconvenient, one workaround is to encapsulate the immutable types in
a custom type that does allow modifications.
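
For instance, a minimal sketch of such a wrapper (the ``IntBox`` type and its
binding below are purely illustrative and not part of pybind11) could look
like this:

.. code-block:: cpp

    struct IntBox { int value = 0; };                // mutable box around an int
    void increment_box(IntBox &box) { box.value++; }

with the corresponding binding code

.. code-block:: cpp

    py::class_<IntBox>(m, "IntBox")
        .def(py::init<>())
        .def_readwrite("value", &IntBox::value);
    m.def("increment_box", &increment_box);          // modifies box.value in place

Python code can then pass an ``IntBox`` instance to ``increment_box`` and read
the updated ``value`` attribute afterwards.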

Another alternative involves binding a small wrapper lambda function that
returns a tuple with all output arguments (see the remainder of the
documentation for examples on binding lambda functions). An example:

.. code-block:: cpp

    int foo(int &i) { i++; return 123; }

and the binding code

.. code-block:: cpp

   m.def("foo", [](int i) { int rv = foo(i); return std::make_tuple(rv, i); });


How can I reduce the build time?
================================

It's good practice to split binding code over multiple files, as in the
following example:

:file:`example.cpp`:

.. code-block:: cpp

    void init_ex1(py::module &);
    void init_ex2(py::module &);
    /* ... */

    PYBIND11_PLUGIN(example) {
        py::module m("example", "pybind example plugin");

        init_ex1(m);
        init_ex2(m);
        /* ... */

        return m.ptr();
    }

:file:`ex1.cpp`:

.. code-block:: cpp

    void init_ex1(py::module &m) {
        m.def("add", [](int a, int b) { return a + b; });
    }

:file:`ex2.cpp`:

.. code-block:: cpp

    void init_ex2(py::module &m) {
        m.def("sub", [](int a, int b) { return a - b; });
    }

:command:`python`:

.. code-block:: pycon

    >>> import example
    >>> example.add(1, 2)
    3
    >>> example.sub(1, 1)
    0

As shown above, the various ``init_ex`` functions should be contained in
separate files that can be compiled independently from one another, and then
linked together into the same final shared object.  Following this approach
will:

1. reduce memory requirements per compilation unit.

2. enable parallel builds (if desired).

3. allow for faster incremental builds. For instance, when a single class
   definition is changed, only a subset of the binding code will generally need
   to be recompiled.

"recursive template instantiation exceeded maximum depth of 256"
================================================================

If you receive an error about excessively deep recursive template
instantiation, try specifying a larger value, e.g. ``-ftemplate-depth=1024``
on GCC/Clang. The culprit is generally the generation of function signatures
at compile time using C++14 template metaprogramming.


.. _`faq:symhidden`:

How can I create smaller binaries?
==================================

To do its job, pybind11 extensively relies on a programming technique known as
*template metaprogramming*, which is a way of performing computation at compile
time using type information. Template metaprogramming usually instantiates code
involving significant numbers of deeply nested types that are either completely
removed or reduced to just a few instructions during the compiler's optimization
phase. However, due to the nested nature of these types, the resulting symbol
names in the compiled extension library can be extremely long. For instance,
the included test suite contains the following symbol:

.. only:: html

    .. code-block:: none

        _​_​Z​N​8​p​y​b​i​n​d​1​1​1​2​c​p​p​_​f​u​n​c​t​i​o​n​C​1​I​v​8​E​x​a​m​p​l​e​2​J​R​N​S​t​3​_​_​1​6​v​e​c​t​o​r​I​N​S​3​_​1​2​b​a​s​i​c​_​s​t​r​i​n​g​I​w​N​S​3​_​1​1​c​h​a​r​_​t​r​a​i​t​s​I​w​E​E​N​S​3​_​9​a​l​l​o​c​a​t​o​r​I​w​E​E​E​E​N​S​8​_​I​S​A​_​E​E​E​E​E​J​N​S​_​4​n​a​m​e​E​N​S​_​7​s​i​b​l​i​n​g​E​N​S​_​9​i​s​_​m​e​t​h​o​d​E​A​2​8​_​c​E​E​E​M​T​0​_​F​T​_​D​p​T​1​_​E​D​p​R​K​T​2​_

.. only:: not html

    .. code-block:: cpp

        __ZN8pybind1112cpp_functionC1Iv8Example2JRNSt3__16vectorINS3_12basic_stringIwNS3_11char_traitsIwEENS3_9allocatorIwEEEENS8_ISA_EEEEEJNS_4nameENS_7siblingENS_9is_methodEA28_cEEEMT0_FT_DpT1_EDpRKT2_

which is the mangled form of the following function type:

.. code-block:: cpp

    pybind11::cpp_function::cpp_function<void, Example2, std::__1::vector<std::__1::basic_string<wchar_t, std::__1::char_traits<wchar_t>, std::__1::allocator<wchar_t> >, std::__1::allocator<std::__1::basic_string<wchar_t, std::__1::char_traits<wchar_t>, std::__1::allocator<wchar_t> > > >&, pybind11::name, pybind11::sibling, pybind11::is_method, char [28]>(void (Example2::*)(std::__1::vector<std::__1::basic_string<wchar_t, std::__1::char_traits<wchar_t>, std::__1::allocator<wchar_t> >, std::__1::allocator<std::__1::basic_string<wchar_t, std::__1::char_traits<wchar_t>, std::__1::allocator<wchar_t> > > >&), pybind11::name const&, pybind11::sibling const&, pybind11::is_method const&, char const (&) [28])

The memory needed to store just the mangled name of this function (196 bytes)
is larger than the actual piece of code (111 bytes) it represents! On the other
hand, it's silly to even give this function a name -- after all, it's just a
tiny cog in a bigger piece of machinery that is not exposed to the outside
world. So we'll generally only want to export symbols for those functions which
are actually called from the outside.

This can be achieved by specifying the parameter ``-fvisibility=hidden`` to GCC
and Clang, which sets the default symbol visibility to *hidden*. It's best to
do this only for release builds, since the symbol names can be helpful in
debugging sessions. On Visual Studio, symbols are already hidden by default, so
nothing needs to be done there. Needless to say, this has a tremendous impact
on the final binary size of the resulting extension library.

Another aspect that can require a fair bit of space is the set of function
signature descriptions. pybind11 automatically generates human-readable
function signatures for docstrings, e.g.:

.. code-block:: none

     |  __init__(...)
     |      __init__(*args, **kwargs)
     |      Overloaded function.
     |
     |      1. __init__(example.Example1) -> NoneType
     |
     |      Docstring for overload #1 goes here
     |
     |      2. __init__(example.Example1, int) -> NoneType
     |
     |      Docstring for overload #2 goes here
     |
     |      3. __init__(example.Example1, example.Example1) -> NoneType
     |
     |      Docstring for overload #3 goes here


In C++11 mode, these are generated at run time using string concatenation,
which can amount to 10-20% of the size of the resulting binary. If you can,
enable C++14 language features (using ``-std=c++14`` for GCC/Clang), in which
case signatures are efficiently pre-generated at compile time. Unfortunately,
Visual Studio's C++14 support (``constexpr``) is not good enough as of April
2016, so it always uses the more expensive run-time approach.

Working with ancient Visual Studio 2009 builds on Windows
=========================================================

The official Windows distributions of Python are compiled using truly
ancient versions of Visual Studio that lack good C++11 support. Some users
implicitly assume that it would be impossible to load a plugin built with
Visual Studio 2015 into a Python distribution that was compiled using Visual
Studio 2009. However, no such issue exists: it's perfectly legitimate to
interface DLLs that are built with different compilers and/or C libraries.
Common gotchas to watch out for involve not ``free()``-ing memory regions
that were ``malloc()``-ed in another shared library, using data structures
with incompatible ABIs, and so on. pybind11 is very careful not to make
these types of mistakes.