Is it possible to make setuptools set a binary as executable when installed - python

I have three binary executables, compiled for OSX, Windows and Linux, which are going to be called by a Python script. My problem is that the package is built on a Windows machine, which strips the executable file attributes from the binaries for OSX and Linux. What I'm looking for is a way of making setuptools set the executable attribute on the files when it copies them into the Python package.
So what I have for installing the package for now is a MANIFEST.in file containing:
recursive-include foo/bar/lib *
which includes three folders Linux, OSX and Windows
Then the setup.py:
from setuptools import setup, find_packages
I've also used
'lib': ['lib/*'],
in package_data before, but I moved over to the manifest as I want it to work with both bdist and sdist.
So what I'm looking for is a way to make sure that the executable (+x) flag is set on the binary files which are installed when calling pip install . inside the folder, as this is not distributed as a pip package.

You cannot, no way. pip is a rather simplistic package manager intended to install Python libraries and accompanying Python scripts.
People try tricks but they don't work.
You need a real package manager.

Adding os.chmod(path_to_executable, <executable mode>) to setup.py worked for me, at least.
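A sketch of how that chmod call can be wired into setup.py via a custom install command. The helper name, tool names, and path layout below are assumptions for illustration, not the asker's actual code:

```python
import os
import stat
from setuptools.command.install import install

def make_executable(path):
    # add the owner/group/other execute bits to the file's current mode
    mode = os.stat(path).st_mode
    os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

class InstallAndChmod(install):
    """Run the normal install, then mark the bundled binaries executable."""
    def run(self):
        install.run(self)
        # hypothetical layout: adjust to wherever your binaries land
        for name in ('Linux/mytool', 'OSX/mytool'):
            target = os.path.join(self.install_lib, 'foo', 'bar', 'lib', name)
            if os.path.exists(target):
                make_executable(target)

# then pass cmdclass={'install': InstallAndChmod} to setup()
```

Note this only helps for installs that actually run the install command; a wheel built on Windows still won't record POSIX permission bits.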


“python setup.py install” command doesn't install libs in install_requires list

My goal is creating a .deb package from a Python package and distributing my Python scripts at the end. I have 2 questions about this process:
1- I am able to create a Python package by following the steps here. My setup.py looks like this:
from distutils.core import setup
# Application name:
# Version number (initial):
# Application author details:
author="name surname",
# Packages
# Include additional files into the package
# Details
# license="LICENSE.txt",
description="Useful towel-related stuff.",
# long_description=open("README.txt").read(),
# Dependent packages (distributions)
Things start to be different in the install_requires part. I know those libs can be installed via pip; in this case, after I create the Python package it produces a tar.gz of the package. The
python setup.py install command doesn't install the libs in the install_requires list, but if I install the package tar.gz with pip install name_of_the_package.tar.gz it does install the libs in the list. So why doesn't the python setup.py install command install the libs?
2- Then I am creating the .deb package from my Python package using stdeb. When I install the .deb package on my system, I expect the libs in the install_requires list to get installed, but they don't.
I feel like I am skipping a step, but I don't know which one.
The install_requires keyword argument to the setup() function is a setuptools feature, it's not supported as such in plain distutils.
If you instead import the setup function from setuptools using
from setuptools import setup
it might just work.
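For reference, a minimal setuptools-based setup.py might look like this. All names, versions, and dependencies below are placeholders, not the asker's real project:

```python
from setuptools import setup, find_packages

setup(
    name='towelstuff',                 # placeholder project name
    version='0.1.0',
    author='name surname',
    description='Useful towel-related stuff.',
    packages=find_packages(),
    # with setuptools (unlike plain distutils) these are resolved at install time
    install_requires=[
        'requests>=2.0',               # example dependency
    ],
)
```

With this import in place, both pip install of the sdist and python setup.py install will pull in the install_requires entries.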
That guide on DigitalOcean seems incorrect and outdated. I would recommend following the official Python Packaging User Guide instead.

How to move my python scripts to the right path when installing

I have been reusing several Python projects, and when I execute python setup.py install all the code gets copied into /usr/lib/python2.7/dist-packages. Now I am trying to create my own Python project and this is my setup.py:
from setuptools import setup, find_packages
name = "My project",
version = "0.1",
license = "BSD",
However, that copies all my scripts into a build/ directory which gets created in the directory where the setup.py is. How can I change that behaviour so that the scripts go to /usr/lib/python2.7/dist-packages instead?
You have to build the package first, and that package ends up in dist/. Once you have the package file you have to install it with pip install. It will then be copied to /usr/lib/python2.7/dist-packages.
The purpose of creating packages is for distributing your code, so it assumes that you won't just be using it yourself and builds a package that can then be installed.
You won't be allowed to directly write to /usr/lib/python2.7/dist-packages unless you use sudo because it is a system directory.
Also direct installation is not recommended because you might get something wrong and have to uninstall. It is always a safer and better practice to build the package and test it in a virtual environment before you actually install it.

Tox can't copy a non-Python file while installing the module

This is the tree structure of the module I'm writing the setup.py for:
ls .
I configured my setup.py as follows:
from setuptools import setup, find_packages
# [...]
# [...]
('license', ['LICENSE']),
# [...]
# could also include long_description, download_url, classifiers, etc.
If I install the package from my python environment (also a virtualenv)
pip install .
the LICENSE file gets correctly installed.
But running tox:
envlist = py27, py35
deps =
commands = py.test \
I get this error:
running install_data
creating build/bdist.macosx-10.11-x86_64/wheel/
error: can't copy 'LICENSE': doesn't exist or not a regular file
Removing the data_files part from the setup.py makes tox run correctly.
Your issue here is that setuptools is not able to find the LICENSE file among the files that have been included for building the source distribution. You have two options to tell setuptools to include that file:
Add a MANIFEST.in file (with a line like include LICENSE).
Use include_package_data=True in your setup.py file.
Using a MANIFEST.in is often simpler and easier to verify, since it makes it possible to use automation to check that things are indeed correct (if you use a VCS like Git or SVN).
pip install . builds a wheel using python setup.py bdist_wheel, which is installed by simply unpacking it appropriately, as defined in the wheel specification.
tox builds a source distribution using python setup.py sdist, which is then unpacked and installed using python setup.py install.
That might be the reason for the difference in behavior you are seeing.
I have some resource files inside my packages which I use during execution. To make setup store them in the package alongside the Python code, I use include_package_data=True, and I access them at runtime using importlib.resources. On Python versions older than 3.7 you can use the importlib_resources backport or another library.
Before each release I have a script which verifies that all the files I need are placed inside the bdist wheel, to be sure that everything is in place.
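A runnable sketch of that lookup pattern. The package name and data file here are fabricated on the fly purely for demonstration; in a real project the package and its resource ship inside your wheel:

```python
import pathlib
import sys
import tempfile
from importlib.resources import files  # Python 3.9+; older versions can use the importlib_resources backport

# build a throwaway package containing a data file
root = pathlib.Path(tempfile.mkdtemp())
pkg = root / 'mypkg'
pkg.mkdir()
(pkg / '__init__.py').write_text('')
(pkg / 'data.txt').write_text('hello')
sys.path.insert(0, str(root))

# look the resource up through the import system, not via __file__ paths
text = (files('mypkg') / 'data.txt').read_text()
print(text)  # hello
```

Going through the import system means the lookup keeps working even when the package is installed from a wheel rather than run from a source checkout.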

python setup.py install is not installing dependencies listed in install_requires of setuptools

I have written a Python module that depends on openpyxl. I want openpyxl to be installed as a dependency automatically using setuptools.
I read that the proper way to do this is to include the following in the setup.py script:
version=find_version("lala", ""),
author='Jonathan T',
'openpxyl = 2.3.3',
So I packaged up my module with python setup.py sdist, took the *.tar.gz file, unzipped it, and then ran python setup.py install, and openpyxl is NOT installing!!!
What am I doing wrong here?
Try providing your dependency both in install_requires and setup_requires.
The following is from setuptools' documentation:
A string or list of strings specifying what other distributions need to be present in order for the setup script to run.
setuptools will attempt to obtain these (even going so far as to
download them using EasyInstall) before processing the rest of the
setup script or commands. This argument is needed if you are using
distutils extensions as part of your build process; for example,
extensions that process setup() arguments and turn them into EGG-INFO
metadata files.
(Note: projects listed in setup_requires will NOT be automatically
installed on the system where the setup script is being run. They are
simply downloaded to the ./.eggs directory if they’re not locally
available already. If you want them to be installed, as well as being
available when the setup script is run, you should add them to
install_requires and setup_requires.)
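Putting the quoted advice together, the setup.py would declare the dependency in both lists. Note that the names and versions below are illustrative, and that the single = in the question's 'openpxyl = 2.3.3' is not a valid requirement specifier (it should be ==, and the package name is spelled openpyxl):

```python
from setuptools import setup

setup(
    name='lala',                          # placeholder
    version='0.1.0',
    install_requires=['openpyxl==2.3.3'],
    setup_requires=['openpyxl==2.3.3'],
)
```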
I noticed this happens when you override 'install' with a 'cmdclass' key. The pattern below also left me with uninstalled dependencies.
def run(self):
# some custom commands
Adding the dependencies into setup_requires didn't work for me, so in the end I just did my own pip install in the custom install command:
import subprocess, sys
def pip_install(package_name):
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', package_name])
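Wired into setup.py, that helper might be used like this. The dependency name and class name are illustrative assumptions:

```python
import subprocess
import sys
from setuptools.command.install import install

def pip_install(package_name):
    # install a single package into the current interpreter's environment
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', package_name])

class CustomInstall(install):
    def run(self):
        for dep in ('openpyxl',):   # illustrative dependency list
            pip_install(dep)
        install.run(self)           # then run the normal install steps

# pass cmdclass={'install': CustomInstall} to setup()
```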

How to specify the location where entry_points scripts are installed in python package?

I have a python package with a setup.py file like so:
packages=find_packages(exclude=['ez_setup', 'examples', 'tests']),
sound-run =
sound-resume =
# Other setuptools stuff
I am trying to make it such that when installed as a debian package, the scripts (sound-run, sound-resume) binaries end up in a destination I specify. Currently when I make the debian package the scripts end up in /usr/local/bin/ but I would like to have them install into /usr/local/myfolder/myfolder1.
Any suggestions?
python setup.py install --install-scripts destination. See the --help output of the install command.
Note that this is provided by distutils, which setuptools extends; I assume that setuptools entry-points-generated scripts also respect that option.
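If you want to persist that option instead of passing it on the command line each time, distutils also reads per-command options from a setup.cfg file next to setup.py; the path below is the one from the question:

```
[install]
install-scripts=/usr/local/myfolder/myfolder1
```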
Regarding your use case, Debian has specific helpers that will install things to the right locations so that the Debian tools can finish the job. I’d advise to look into using dh with pybuild and dh_python2.