Compare commits


24 Commits
master...f27

Author SHA1 Message Date
Charalampos Stratakis 3ef90dfc66 Update to 3.6.5
Rebased patches: 102, 111, 262

Removed patches due to being upstreamed:
264, 273, 298

Update pip version to 9.0.3
2018-04-04 12:51:55 +02:00
Miro Hrončok 3a8efff052 Fix shebangs of the GDB hooks
Also, use -p (preserve timestamp) and -n (don't create backup files)
with pathfix.py.

Resolves https://bugzilla.redhat.com/show_bug.cgi?id=1560295
2018-04-04 12:51:55 +02:00
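For context on the pathfix.py flags mentioned above (pathfix.py is CPython's Tools/scripts/pathfix.py), here is a minimal Python sketch of what "-p -n" style shebang fixing amounts to: rewrite the first line in place, keep the original timestamps, and write no backup file. This is an illustration only, not the actual pathfix.py code, and the file name in the example call is hypothetical.

import os

def fix_shebang(path, interpreter="/usr/bin/python3"):
    # Remember the original access/modification times so they can be
    # restored afterwards (the effect of pathfix.py's -p flag).
    st = os.stat(path)
    with open(path, "r", encoding="utf-8") as f:
        lines = f.readlines()
    if lines and lines[0].startswith("#!"):
        lines[0] = "#!{}\n".format(interpreter)
        # Write back in place without creating a .bak file (the -n flag).
        with open(path, "w", encoding="utf-8") as f:
            f.writelines(lines)
        os.utime(path, (st.st_atime, st.st_mtime))

# Hypothetical usage: fix_shebang("python-gdb.py")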
Miro Hrončok 8dd12915a9 rpmlintrc: Filter macro-in-comment %{_pyconfig(32|64)_h} 2018-04-04 12:51:55 +02:00
Miro Hrončok f124ad4c93 Fix broken macro invocation and broken building of C Python extensions
Revert "Use %% for actual % in spec"

This reverts commit 90512a5a1b.

Resolves https://bugzilla.redhat.com/show_bug.cgi?id=1560103
2018-04-04 12:51:55 +02:00
Miro Hrončok 23b7cdc7ee rpmlintrc: Do not filter library-without-ldconfig-post on F < 28 2018-04-04 12:51:43 +02:00
Miro Hrončok 1c084246f6 Add rpmlintrc file
Filter all the errors and warnings. This allows us to actually read the rpmlint
output to get new information. From now on, we can rely on this information
when pushing updates.

Resolves https://bugzilla.redhat.com/show_bug.cgi?id=1548683

Backport of https://src.fedoraproject.org/rpms/python37/pull-request/10
2018-03-29 16:45:01 +02:00
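For readers unfamiliar with rpmlint configuration: an rpmlintrc is a plain Python file that rpmlint executes, and filters are registered by calling the addFilter() helper that rpmlint injects into that file's namespace. The lines below are illustrative only and paraphrase warnings mentioned in these commits; the exact patterns in the rpmlintrc added here may differ.

# Illustrative rpmlintrc content; addFilter() is provided by rpmlint when it
# loads this file, so this snippet only runs in that context.
addFilter(r'library-without-ldconfig-post')
addFilter(r'macro-in-comment %\{_pyconfig(32|64)_h\}')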
Miro Hrončok e73306ff64 Use %% for actual % in spec
rpmlint reports this as a macro in a comment; however, it was left as is because it
is neither a macro nor a comment. On the other hand, % should be escaped as %%.
All it takes for this to blow up is somebody defining a macro called _pyconfig64_h.
2018-03-29 16:17:44 +02:00
Miro Hrončok b0edf85387 Add -n option for pathfix.py (#1546990) 2018-03-29 16:17:42 +02:00
Miro Hrončok 3324e84bc3 Fix the py_byte_compile macro to work on Python 2
See https://bugzilla.redhat.com/show_bug.cgi?id=1484993

Inspired by Terje Røsten's workaround from that bugzilla
2018-03-29 16:09:04 +02:00
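The py_byte_compile macro itself lives in the spec file; as a rough illustration of what a byte-compilation helper has to do to work with either major Python version, the sketch below drives the target interpreter's own compileall module instead of assuming Python 3 semantics. The helper name and paths are hypothetical, not the macro's actual implementation.

import subprocess

def byte_compile(interpreter, directory):
    # compileall ships with both Python 2 and Python 3, so delegating to the
    # target interpreter works regardless of which one the package needs.
    subprocess.check_call([interpreter, "-m", "compileall", directory])

# Hypothetical usage:
# byte_compile("/usr/bin/python2", "/usr/lib/python2.7/site-packages/foo")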
Charalampos Stratakis bc228b8ca2 Do not send IP addresses in SNI TLS extension 2018-03-13 17:00:59 +01:00
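The idea behind that change, sketched in plain Python rather than the patched ssl module code: SNI (RFC 6066) carries DNS host names only, so a connection made to an IP literal should not advertise that address as server_hostname.

import ipaddress

def should_send_sni(host):
    """Return True for host names (send SNI), False for IP literals."""
    try:
        ipaddress.ip_address(host)
    except ValueError:
        return True   # not parseable as an address, treat it as a host name
    return False

assert should_send_sni("example.org") is True
assert should_send_sni("203.0.113.7") is False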
Charalampos Stratakis 6e02335726 Fix the name macro in the description 2018-02-08 14:55:14 +01:00
Michal Cyprian 956845fa5b Remove sys.executable check from change-user-install-location patch
Resolves: rhbz#1532287
2018-02-08 14:52:09 +01:00
Charalampos Stratakis 73123677e8 Define TLS cipher suite on build time 2018-02-01 11:23:20 +01:00
Charalampos Stratakis d20afa1807 Restore the PyExc_RecursionErrorInst public symbol 2018-01-23 17:22:24 +01:00
Charalampos Stratakis d3a063dd35 Properly add patch 273 2018-01-19 17:46:46 +01:00
Charalampos Stratakis 08e5703d68 Fix localeconv() encoding for LC_NUMERIC 2018-01-19 17:36:21 +01:00
Igor Gnatenko ec05ee2814 R: gdbm-devel → R: gdbm for python3-libs
Signed-off-by: Igor Gnatenko <ignatenkobrain@fedoraproject.org>
2018-01-19 12:46:06 +01:00
Miro Hrončok 2787c85b78 Require large enough gdbm (fixup for previous commit) 2018-01-17 12:10:59 +01:00
Charalampos Stratakis e7bbd26b13 Rebuild for reverted gdbm 1:1.13 on F27 2018-01-16 20:37:17 +01:00
Charalampos Stratakis 9109dafcdb Update to version 3.6.4
Rebased patches: 189, 262

Dropped patches due to being upstreamed: 277, 279
2018-01-15 15:51:53 +01:00
Charalampos Stratakis bfc0c338e5 Remove a ppc64 segfault workaround that provided a larger stack for that
arch, as the issue no longer seems to affect the build.
2017-12-04 18:06:06 +01:00
Charalampos Stratakis 8b736574c2 Mask two macros in comments that were being expanded.
Remove the commented-out file for the time shared library.
2017-12-04 18:06:00 +01:00
Charalampos Stratakis b5bbd7e7a9 Remove python-gdb.py source file as it now gets installed from the upstream sources 2017-12-04 18:05:53 +01:00
Charalampos Stratakis 6eb19770a6 Remove our downstream SystemTap instrumentation, as upstream now provides
dtrace hooks.
2017-12-04 18:05:43 +01:00
31 changed files with 2679 additions and 12806 deletions

.gitignore

@ -1,3 +1 @@
/*.tar.*
/*.src.rpm
/results_python3*


@ -1,18 +1,7 @@
From bf01d6c367d9cb8f6594afa87c16f0498ae7321f Mon Sep 17 00:00:00 2001
From: David Malcolm <dmalcolm@redhat.com>
Date: Wed, 13 Jan 2010 21:25:18 +0000
Subject: [PATCH] 00001: Fixup distutils/unixccompiler.py to remove standard
library path from rpath Was Patch0 in ivazquez' python3000 specfile
---
Lib/distutils/unixccompiler.py | 9 +++++++++
1 file changed, 9 insertions(+)
diff --git a/Lib/distutils/unixccompiler.py b/Lib/distutils/unixccompiler.py
index d10a78da31..4df4b67810 100644
--- a/Lib/distutils/unixccompiler.py
+++ b/Lib/distutils/unixccompiler.py
@@ -82,6 +82,15 @@ class UnixCCompiler(CCompiler):
diff -up Python-3.1.1/Lib/distutils/unixccompiler.py.rpath Python-3.1.1/Lib/distutils/unixccompiler.py
--- Python-3.1.1/Lib/distutils/unixccompiler.py.rpath 2009-09-04 17:29:34.000000000 -0400
+++ Python-3.1.1/Lib/distutils/unixccompiler.py 2009-09-04 17:49:54.000000000 -0400
@@ -141,6 +141,15 @@ class UnixCCompiler(CCompiler):
if sys.platform == "cygwin":
exe_extension = ".exe"
@ -28,6 +17,3 @@ index d10a78da31..4df4b67810 100644
def preprocess(self, source, output_file=None, macros=None,
include_dirs=None, extra_preargs=None, extra_postargs=None):
fixed_args = self._fix_compile_args(None, macros, include_dirs)
--
2.24.1


@ -1,39 +1,5 @@
From 96580364051672475607c88cdb31ec875cea6e97 Mon Sep 17 00:00:00 2001
From: David Malcolm <dmalcolm@redhat.com>
Date: Wed, 13 Jan 2010 21:25:18 +0000
Subject: [PATCH] 00102: Change the various install paths to use /usr/lib64/
instead of /usr/lib/
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Only used when "%{_lib}" == "lib64".
Co-authored-by: David Malcolm <dmalcolm@redhat.com>
Co-authored-by: Thomas Spura <tomspur@fedoraproject.org>
Co-authored-by: Slavek Kabrda <bkabrda@redhat.com>
Co-authored-by: Matej Stuchlik <mstuchli@redhat.com>
Co-authored-by: Tomas Orsava <torsava@redhat.com>
Co-authored-by: Charalampos Stratakis <cstratak@redhat.com>
Co-authored-by: Petr Viktorin <pviktori@redhat.com>
Co-authored-by: Miro Hrončok <miro@hroncok.cz>
Co-authored-by: Iryna Shcherbina <shcherbina.iryna@gmail.com>
---
Lib/distutils/command/install.py | 4 ++--
Lib/distutils/sysconfig.py | 6 +++++-
Lib/distutils/tests/test_install.py | 3 ++-
Lib/site.py | 4 ++++
Lib/sysconfig.py | 12 ++++++------
Lib/test/test_site.py | 4 ++--
Makefile.pre.in | 2 +-
Modules/getpath.c | 6 +++---
configure | 4 ++--
configure.ac | 4 ++--
setup.py | 6 +++---
11 files changed, 32 insertions(+), 23 deletions(-)
diff --git a/Lib/distutils/command/install.py b/Lib/distutils/command/install.py
index c625c95bf7..ae4f915669 100644
index 9474e9c..c0ce4c6 100644
--- a/Lib/distutils/command/install.py
+++ b/Lib/distutils/command/install.py
@@ -30,14 +30,14 @@ WINDOWS_SCHEME = {
@ -54,10 +20,10 @@ index c625c95bf7..ae4f915669 100644
'scripts': '$base/bin',
'data' : '$base',
diff --git a/Lib/distutils/sysconfig.py b/Lib/distutils/sysconfig.py
index b51629eb94..9a4892a737 100644
index 026cca7..6d3e077 100644
--- a/Lib/distutils/sysconfig.py
+++ b/Lib/distutils/sysconfig.py
@@ -146,8 +146,12 @@ def get_python_lib(plat_specific=0, standard_lib=0, prefix=None):
@@ -132,8 +132,12 @@ def get_python_lib(plat_specific=0, standard_lib=0, prefix=None):
prefix = plat_specific and EXEC_PREFIX or PREFIX
if os.name == "posix":
@ -71,11 +37,10 @@ index b51629eb94..9a4892a737 100644
if standard_lib:
return libpython
else:
diff --git a/Lib/distutils/tests/test_install.py b/Lib/distutils/tests/test_install.py
index 287ab1989e..d4c05e0ab1 100644
diff a/Lib/distutils/tests/test_install.py b/Lib/distutils/tests/test_install.py
--- a/Lib/distutils/tests/test_install.py
+++ b/Lib/distutils/tests/test_install.py
@@ -57,8 +57,9 @@ class InstallTestCase(support.TempdirManager,
@@ -57,8 +57,9 @@
self.assertEqual(got, expected)
libdir = os.path.join(destination, "lib", "python")
@ -87,10 +52,10 @@ index 287ab1989e..d4c05e0ab1 100644
check_path(cmd.install_headers,
os.path.join(destination, "include", "python", "foopkg"))
diff --git a/Lib/site.py b/Lib/site.py
index a065ab0b5d..22d53fa562 100644
index a84e3bb..ba0d3ea 100644
--- a/Lib/site.py
+++ b/Lib/site.py
@@ -335,11 +335,15 @@ def getsitepackages(prefixes=None):
@@ -303,11 +303,15 @@ def getsitepackages(prefixes=None):
seen.add(prefix)
if os.sep == '/':
@ -104,10 +69,10 @@ index a065ab0b5d..22d53fa562 100644
sitepackages.append(prefix)
+ sitepackages.append(os.path.join(prefix, "lib64", "site-packages"))
sitepackages.append(os.path.join(prefix, "lib", "site-packages"))
return sitepackages
if sys.platform == "darwin":
# for framework builds *only* we add the standard Apple
diff --git a/Lib/sysconfig.py b/Lib/sysconfig.py
index b9e2fafbc0..0ae6d35b69 100644
index b9bbfe5..2a5f29c 100644
--- a/Lib/sysconfig.py
+++ b/Lib/sysconfig.py
@@ -20,10 +20,10 @@ __all__ = [
@ -124,7 +89,7 @@ index b9e2fafbc0..0ae6d35b69 100644
'include':
'{installed_base}/include/python{py_version_short}{abiflags}',
'platinclude':
@@ -62,10 +62,10 @@ _INSTALL_SCHEMES = {
@@ -61,10 +61,10 @@ _INSTALL_SCHEMES = {
'data': '{userbase}',
},
'posix_user': {
@ -139,13 +104,13 @@ index b9e2fafbc0..0ae6d35b69 100644
'scripts': '{userbase}/bin',
'data': '{userbase}',
diff --git a/Lib/test/test_site.py b/Lib/test/test_site.py
index 41c4229919..543c88432a 100644
index f698927..bc977b5 100644
--- a/Lib/test/test_site.py
+++ b/Lib/test/test_site.py
@@ -266,8 +266,8 @@ class HelperFunctionsTests(unittest.TestCase):
dirs = site.getsitepackages()
if os.sep == '/':
# OS X, Linux, FreeBSD, etc
@@ -248,8 +248,8 @@ class HelperFunctionsTests(unittest.TestCase):
self.assertEqual(dirs[1], wanted)
elif os.sep == '/':
# OS X non-framework builds, Linux, FreeBSD, etc
- self.assertEqual(len(dirs), 1)
- wanted = os.path.join('xoxo', 'lib',
+ self.assertEqual(len(dirs), 2)
@ -154,10 +119,10 @@ index 41c4229919..543c88432a 100644
'site-packages')
self.assertEqual(dirs[0], wanted)
diff --git a/Makefile.pre.in b/Makefile.pre.in
index a914a9c70f..406a441082 100644
index 8fa7934..a693917 100644
--- a/Makefile.pre.in
+++ b/Makefile.pre.in
@@ -143,7 +143,7 @@ LIBDIR= @libdir@
@@ -126,7 +126,7 @@ LIBDIR= @libdir@
MANDIR= @mandir@
INCLUDEDIR= @includedir@
CONFINCLUDEDIR= $(exec_prefix)/include
@ -167,95 +132,71 @@ index a914a9c70f..406a441082 100644
# Detailed destination directories
diff --git a/Modules/getpath.c b/Modules/getpath.c
index b727f66953..a0c5fb6139 100644
index 65b47a3..eaa756c 100644
--- a/Modules/getpath.c
+++ b/Modules/getpath.c
@@ -730,7 +730,7 @@ calculate_exec_prefix(PyCalculatePath *calculate, _PyPathConfig *pathconfig,
if (safe_wcscpy(exec_prefix, calculate->exec_prefix, exec_prefix_len) < 0) {
return PATHLEN_ERR();
}
- status = joinpath(exec_prefix, L"lib/lib-dynload", exec_prefix_len);
+ status = joinpath(exec_prefix, L"lib64/lib-dynload", exec_prefix_len);
if (_PyStatus_EXCEPTION(status)) {
return status;
}
@@ -1067,7 +1067,7 @@ calculate_zip_path(PyCalculatePath *calculate, const wchar_t *prefix,
return PATHLEN_ERR();
}
}
- status = joinpath(zip_path, L"lib/python00.zip", zip_path_len);
+ status = joinpath(zip_path, L"lib64/python00.zip", zip_path_len);
if (_PyStatus_EXCEPTION(status)) {
return status;
}
@@ -1197,7 +1197,7 @@ calculate_init(PyCalculatePath *calculate, const PyConfig *config)
if (!calculate->exec_prefix) {
return DECODE_LOCALE_ERR("EXEC_PREFIX define", len);
}
- calculate->lib_python = Py_DecodeLocale("lib/python" VERSION, &len);
+ calculate->lib_python = Py_DecodeLocale("lib64/python" VERSION, &len);
if (!calculate->lib_python) {
return DECODE_LOCALE_ERR("EXEC_PREFIX define", len);
}
diff --git a/configure b/configure
index a979363acf..b89ae1be3c 100755
--- a/configure
+++ b/configure
@@ -15188,9 +15188,9 @@ fi
@@ -494,7 +494,7 @@ calculate_path(void)
_pythonpath = Py_DecodeLocale(PYTHONPATH, NULL);
_prefix = Py_DecodeLocale(PREFIX, NULL);
_exec_prefix = Py_DecodeLocale(EXEC_PREFIX, NULL);
- lib_python = Py_DecodeLocale("lib/python" VERSION, NULL);
+ lib_python = Py_DecodeLocale("lib64/python" VERSION, NULL);
if test x$PLATFORM_TRIPLET = x; then
- LIBPL='$(prefix)'"/lib/python${VERSION}/config-${LDVERSION}"
+ LIBPL='$(prefix)'"/lib64/python${VERSION}/config-${LDVERSION}"
else
- LIBPL='$(prefix)'"/lib/python${VERSION}/config-${LDVERSION}-${PLATFORM_TRIPLET}"
+ LIBPL='$(prefix)'"/lib64/python${VERSION}/config-${LDVERSION}-${PLATFORM_TRIPLET}"
fi
diff --git a/configure.ac b/configure.ac
index e57ef7c38b..c59cbc223f 100644
--- a/configure.ac
+++ b/configure.ac
@@ -4674,9 +4674,9 @@ fi
dnl define LIBPL after ABIFLAGS and LDVERSION is defined.
AC_SUBST(PY_ENABLE_SHARED)
if test x$PLATFORM_TRIPLET = x; then
- LIBPL='$(prefix)'"/lib/python${VERSION}/config-${LDVERSION}"
+ LIBPL='$(prefix)'"/lib64/python${VERSION}/config-${LDVERSION}"
else
- LIBPL='$(prefix)'"/lib/python${VERSION}/config-${LDVERSION}-${PLATFORM_TRIPLET}"
+ LIBPL='$(prefix)'"/lib64/python${VERSION}/config-${LDVERSION}-${PLATFORM_TRIPLET}"
fi
AC_SUBST(LIBPL)
if (!_pythonpath || !_prefix || !_exec_prefix || !lib_python) {
Py_FatalError(
@@ -683,7 +683,7 @@ calculate_path(void)
}
else
wcsncpy(zip_path, _prefix, MAXPATHLEN);
- joinpath(zip_path, L"lib/python00.zip");
+ joinpath(zip_path, L"lib64/python00.zip");
bufsz = wcslen(zip_path); /* Replace "00" with version */
zip_path[bufsz - 6] = VERSION[0];
zip_path[bufsz - 5] = VERSION[2];
@@ -695,7 +695,7 @@ calculate_path(void)
fprintf(stderr,
"Could not find platform dependent libraries <exec_prefix>\n");
wcsncpy(exec_prefix, _exec_prefix, MAXPATHLEN);
- joinpath(exec_prefix, L"lib/lib-dynload");
+ joinpath(exec_prefix, L"lib64/lib-dynload");
}
/* If we found EXEC_PREFIX do *not* reduce it! (Yet.) */
diff --git a/setup.py b/setup.py
index 20d7f35652..024a1035c0 100644
index 0f2dfc4..da37896 100644
--- a/setup.py
+++ b/setup.py
@@ -649,7 +649,7 @@ class PyBuildExt(build_ext):
@@ -492,7 +492,7 @@ class PyBuildExt(build_ext):
# directories (i.e. '.' and 'Include') must be first. See issue
# 10520.
if not CROSS_COMPILING:
if not cross_compiling:
- add_dir_to_list(self.compiler.library_dirs, '/usr/local/lib')
+ add_dir_to_list(self.compiler.library_dirs, '/usr/local/lib64')
add_dir_to_list(self.compiler.include_dirs, '/usr/local/include')
# only change this for cross builds for 3.3, issues on Mageia
if CROSS_COMPILING:
@@ -953,11 +953,11 @@ class PyBuildExt(build_ext):
if cross_compiling:
@@ -780,11 +780,11 @@ class PyBuildExt(build_ext):
elif curses_library:
readline_libs.append(curses_library)
elif self.compiler.find_library_file(self.lib_dirs +
elif self.compiler.find_library_file(lib_dirs +
- ['/usr/lib/termcap'],
+ ['/usr/lib64/termcap'],
'termcap'):
readline_libs.append('termcap')
self.add(Extension('readline', ['readline.c'],
- library_dirs=['/usr/lib/termcap'],
+ library_dirs=['/usr/lib64/termcap'],
extra_link_args=readline_extra_link_args,
libraries=readline_libs))
exts.append( Extension('readline', ['readline.c'],
- library_dirs=['/usr/lib/termcap'],
+ library_dirs=['/usr/lib64/termcap'],
extra_link_args=readline_extra_link_args,
libraries=readline_libs) )
else:
--
2.24.1
@@ -821,8 +821,8 @@ class PyBuildExt(build_ext):
if krb5_h:
ssl_incs += krb5_h
ssl_libs = find_library_file(self.compiler, 'ssl',lib_dirs,
- ['/usr/local/ssl/lib',
- '/usr/contrib/ssl/lib/'
+ ['/usr/local/ssl/lib64',
+ '/usr/contrib/ssl/lib64/'
] )
if (ssl_incs is not None and


@ -1,61 +1,45 @@
From fb93392b0f4975a02775a608611dc9ceb20c06ad Mon Sep 17 00:00:00 2001
From: David Malcolm <dmalcolm@redhat.com>
Date: Mon, 18 Jan 2010 17:59:07 +0000
Subject: [PATCH] 00111: Don't try to build a libpythonMAJOR.MINOR.a
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Downstream only: not appropriate for upstream.
See https://bugzilla.redhat.com/show_bug.cgi?id=556092
Co-authored-by: David Malcolm <dmalcolm@redhat.com>
Co-authored-by: Bohuslav Kabrda <bkabrda@redhat.com>
Co-authored-by: Matej Stuchlik <mstuchli@redhat.com>
Co-authored-by: Robert Kuska <rkuska@redhat.com>
Co-authored-by: Charalampos Stratakis <cstratak@redhat.com>
Co-authored-by: Miro Hrončok <miro@hroncok.cz>
---
Makefile.pre.in | 21 ++-------------------
1 file changed, 2 insertions(+), 19 deletions(-)
diff --git a/Makefile.pre.in b/Makefile.pre.in
index 406a441082..917303dd92 100644
index 4b093e3..1088435 100644
--- a/Makefile.pre.in
+++ b/Makefile.pre.in
@@ -562,7 +562,7 @@ clinic: check-clean-src $(srcdir)/Modules/_blake2/blake2s_impl.c
$(PYTHON_FOR_REGEN) $(srcdir)/Tools/clinic/clinic.py --make --srcdir $(srcdir)
@@ -543,7 +543,7 @@ clinic: check-clean-src $(srcdir)/Modules/_blake2/blake2s_impl.c
$(PYTHON_FOR_REGEN) ./Tools/clinic/clinic.py --make
# Build the interpreter
-$(BUILDPYTHON): Programs/python.o $(LIBRARY) $(LDLIBRARY) $(PY3LIBRARY)
+$(BUILDPYTHON): Programs/python.o $(LDLIBRARY) $(PY3LIBRARY)
$(LINKCC) $(PY_CORE_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/python.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS)
$(LINKCC) $(PY_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/python.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) $(LDLAST)
platform: $(BUILDPYTHON) pybuilddir.txt
@@ -610,12 +610,6 @@ sharedmods: $(BUILDPYTHON) pybuilddir.txt Modules/_math.o
_TCLTK_INCLUDES='$(TCLTK_INCLUDES)' _TCLTK_LIBS='$(TCLTK_LIBS)' \
@@ -588,18 +588,6 @@ sharedmods: $(BUILDPYTHON) pybuilddir.txt Modules/_math.o
$(PYTHON_FOR_BUILD) $(srcdir)/setup.py $$quiet build
-
-# Build static library
-# avoid long command lines, same as LIBRARY_OBJS
-$(LIBRARY): $(LIBRARY_OBJS)
- -rm -f $@
- $(AR) $(ARFLAGS) $@ $(LIBRARY_OBJS)
- $(AR) $(ARFLAGS) $@ Modules/getbuildinfo.o
- $(AR) $(ARFLAGS) $@ $(PARSER_OBJS)
- $(AR) $(ARFLAGS) $@ $(OBJECT_OBJS)
- $(AR) $(ARFLAGS) $@ $(PYTHON_OBJS) Python/frozen.o
- $(AR) $(ARFLAGS) $@ $(MODULE_OBJS)
- $(AR) $(ARFLAGS) $@ $(MODOBJS)
- $(RANLIB) $@
-
libpython$(LDVERSION).so: $(LIBRARY_OBJS) $(DTRACE_OBJS)
libpython$(LDVERSION).so: $(LIBRARY_OBJS)
if test $(INSTSONAME) != $(LDLIBRARY); then \
$(BLDSHARED) -Wl,-h$(INSTSONAME) -o $(INSTSONAME) $(LIBRARY_OBJS) $(MODLIBS) $(SHLIBS) $(LIBC) $(LIBM); \
@@ -693,7 +687,7 @@ Makefile Modules/config.c: Makefile.pre \
@echo "The Makefile was updated, you may need to re-run make."
$(BLDSHARED) -Wl,-h$(INSTSONAME) -o $(INSTSONAME) $(LIBRARY_OBJS) $(MODLIBS) $(SHLIBS) $(LIBC) $(LIBM) $(LDLAST); \
@@ -689,7 +677,7 @@ Modules/Setup: $(srcdir)/Modules/Setup.dist
echo "-----------------------------------------------"; \
fi
-Programs/_testembed: Programs/_testembed.o $(LIBRARY) $(LDLIBRARY) $(PY3LIBRARY)
+Programs/_testembed: Programs/_testembed.o $(LDLIBRARY) $(PY3LIBRARY)
$(LINKCC) $(PY_CORE_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/_testembed.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS)
$(LINKCC) $(PY_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/_testembed.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) $(LDLAST)
############################################################################
@@ -1557,17 +1551,6 @@ libainstall: @DEF_MAKE_RULE@ python-config
@@ -1425,18 +1413,6 @@ libainstall: @DEF_MAKE_RULE@ python-config
else true; \
fi; \
done
@ -65,6 +49,7 @@ index 406a441082..917303dd92 100644
- $(INSTALL_DATA) $(LDLIBRARY) $(DESTDIR)$(LIBPL) ; \
- else \
- $(INSTALL_DATA) $(LIBRARY) $(DESTDIR)$(LIBPL)/$(LIBRARY) ; \
- $(RANLIB) $(DESTDIR)$(LIBPL)/$(LIBRARY) ; \
- fi; \
- else \
- echo Skip install of $(LIBRARY) - use make frameworkinstall; \
@ -73,6 +58,3 @@ index 406a441082..917303dd92 100644
$(INSTALL_DATA) Modules/config.c $(DESTDIR)$(LIBPL)/config.c
$(INSTALL_DATA) Programs/python.o $(DESTDIR)$(LIBPL)/python.o
$(INSTALL_DATA) $(srcdir)/Modules/config.c.in $(DESTDIR)$(LIBPL)/config.c.in
--
2.24.1


@ -0,0 +1,46 @@
diff -up Python-3.2.2/Lib/unittest/case.py.add-rpmbuild-hooks-to-unittest Python-3.2.2/Lib/unittest/case.py
--- Python-3.2.2/Lib/unittest/case.py.add-rpmbuild-hooks-to-unittest 2011-09-03 12:16:44.000000000 -0400
+++ Python-3.2.2/Lib/unittest/case.py 2011-09-09 06:35:16.365568382 -0400
@@ -3,6 +3,7 @@
import sys
import functools
import difflib
+import os
import logging
import pprint
import re
@@ -101,5 +102,21 @@ def expectedFailure(func):
raise self.test_case.failureException(msg)
+# Non-standard/downstream-only hooks for handling issues with specific test
+# cases:
+
+def _skipInRpmBuild(reason):
+ """
+ Non-standard/downstream-only decorator for marking a specific unit test
+ to be skipped when run within the %check of an rpmbuild.
+
+ Specifically, this takes effect when WITHIN_PYTHON_RPM_BUILD is set within
+ the environment, and has no effect otherwise.
+ """
+ if 'WITHIN_PYTHON_RPM_BUILD' in os.environ:
+ return skip(reason)
+ else:
+ return _id
+
class _AssertRaisesBaseContext(_BaseTestCaseContext):
def __init__(self, expected, test_case, expected_regex=None):
diff -up Python-3.2.2/Lib/unittest/__init__.py.add-rpmbuild-hooks-to-unittest Python-3.2.2/Lib/unittest/__init__.py
--- Python-3.2.2/Lib/unittest/__init__.py.add-rpmbuild-hooks-to-unittest 2011-09-03 12:16:44.000000000 -0400
+++ Python-3.2.2/Lib/unittest/__init__.py 2011-09-09 06:35:16.366568382 -0400
@@ -57,7 +57,8 @@ __unittest = True
from .result import TestResult
from .case import (TestCase, FunctionTestCase, SkipTest, skip, skipIf,
- skipUnless, expectedFailure)
+ skipUnless, expectedFailure,
+ _skipInRpmBuild)
from .suite import BaseTestSuite, TestSuite
from .loader import (TestLoader, defaultTestLoader, makeSuite, getTestCaseNames,
findTestCases)


@ -0,0 +1,15 @@
diff -up Python-3.2.3/Lib/ctypes/__init__.py.rhbz814391 Python-3.2.3/Lib/ctypes/__init__.py
--- Python-3.2.3/Lib/ctypes/__init__.py.rhbz814391 2012-04-20 15:12:49.017867692 -0400
+++ Python-3.2.3/Lib/ctypes/__init__.py 2012-04-20 15:15:09.501111408 -0400
@@ -275,11 +275,6 @@ def _reset_cache():
# _SimpleCData.c_char_p_from_param
POINTER(c_char).from_param = c_char_p.from_param
_pointer_type_cache[None] = c_void_p
- # XXX for whatever reasons, creating the first instance of a callback
- # function is needed for the unittests on Win64 to succeed. This MAY
- # be a compiler bug, since the problem occurs only when _ctypes is
- # compiled with the MS SDK compiler. Or an uninitialized variable?
- CFUNCTYPE(c_int)(lambda: None)
def create_unicode_buffer(init, size=None):
"""create_unicode_buffer(aString) -> character array


@ -0,0 +1,11 @@
diff -up cpython-59223da36dec/Lib/test/test_posix.py.disable-test_fs_holes-in-rpm-build cpython-59223da36dec/Lib/test/test_posix.py
--- cpython-59223da36dec/Lib/test/test_posix.py.disable-test_fs_holes-in-rpm-build 2012-08-07 17:15:59.000000000 -0400
+++ cpython-59223da36dec/Lib/test/test_posix.py 2012-08-07 17:16:53.528330330 -0400
@@ -973,6 +973,7 @@ class PosixTester(unittest.TestCase):
posix.RTLD_GLOBAL
posix.RTLD_LOCAL
+ @unittest._skipInRpmBuild('running kernel may not match kernel in chroot')
@unittest.skipUnless(hasattr(os, 'SEEK_HOLE'),
"test needs an OS that reports file holes")
def test_fs_holes(self):


@ -0,0 +1,11 @@
diff -up Python-3.3.0b1/Lib/test/test_socket.py.disable-test_socket-in-rpm-builds Python-3.3.0b1/Lib/test/test_socket.py
--- Python-3.3.0b1/Lib/test/test_socket.py.disable-test_socket-in-rpm-builds 2012-07-24 15:02:30.823355067 -0400
+++ Python-3.3.0b1/Lib/test/test_socket.py 2012-07-24 15:08:13.021354999 -0400
@@ -2188,6 +2188,7 @@ class RecvmsgGenericStreamTests(RecvmsgG
# Tests which require a stream socket and can use either recvmsg()
# or recvmsg_into().
+ @unittest._skipInRpmBuild('fails intermittently when run within Koji')
def testRecvmsgEOF(self):
# Receive end-of-stream indicator (b"", peer socket closed).
msg, ancdata, flags, addr = self.doRecvmsg(self.serv_sock, 1024)

00170-gc-assertions.patch

@ -0,0 +1,310 @@
diff --git a/Include/object.h b/Include/object.h
index 0c88603..e3413e8 100644
--- a/Include/object.h
+++ b/Include/object.h
@@ -1059,6 +1059,49 @@ PyAPI_FUNC(void)
_PyObject_DebugTypeStats(FILE *out);
#endif /* ifndef Py_LIMITED_API */
+/*
+ Define a pair of assertion macros.
+
+ These work like the regular C assert(), in that they will abort the
+ process with a message on stderr if the given condition fails to hold,
+ but compile away to nothing if NDEBUG is defined.
+
+ However, before aborting, Python will also try to call _PyObject_Dump() on
+ the given object. This may be of use when investigating bugs in which a
+ particular object is corrupt (e.g. a buggy tp_visit method in an extension
+ module breaking the garbage collector), to help locate the broken objects.
+
+ The WITH_MSG variant allows you to supply an additional message that Python
+ will attempt to print to stderr, after the object dump.
+*/
+#ifdef NDEBUG
+/* No debugging: compile away the assertions: */
+#define PyObject_ASSERT_WITH_MSG(obj, expr, msg) ((void)0)
+#else
+/* With debugging: generate checks: */
+#define PyObject_ASSERT_WITH_MSG(obj, expr, msg) \
+ ((expr) \
+ ? (void)(0) \
+ : _PyObject_AssertFailed((obj), \
+ (msg), \
+ (__STRING(expr)), \
+ (__FILE__), \
+ (__LINE__), \
+ (__PRETTY_FUNCTION__)))
+#endif
+
+#define PyObject_ASSERT(obj, expr) \
+ PyObject_ASSERT_WITH_MSG(obj, expr, NULL)
+
+/*
+ Declare and define the entrypoint even when NDEBUG is defined, to avoid
+ causing compiler/linker errors when building extensions without NDEBUG
+ against a Python built with NDEBUG defined
+*/
+PyAPI_FUNC(void) _PyObject_AssertFailed(PyObject *, const char *,
+ const char *, const char *, int,
+ const char *);
+
#ifdef __cplusplus
}
#endif
diff --git a/Lib/test/test_gc.py b/Lib/test/test_gc.py
index e727499..6efcafb 100644
--- a/Lib/test/test_gc.py
+++ b/Lib/test/test_gc.py
@@ -1,10 +1,11 @@
import unittest
from test.support import (verbose, refcount_test, run_unittest,
strip_python_stderr, cpython_only, start_threads,
- temp_dir, requires_type_collecting)
+ temp_dir, import_module, requires_type_collecting)
from test.support.script_helper import assert_python_ok, make_script
import sys
+import sysconfig
import time
import gc
import weakref
@@ -50,6 +51,8 @@ class GC_Detector(object):
# gc collects it.
self.wr = weakref.ref(C1055820(666), it_happened)
+BUILD_WITH_NDEBUG = ('-DNDEBUG' in sysconfig.get_config_vars()['PY_CFLAGS'])
+
@with_tp_del
class Uncollectable(object):
"""Create a reference cycle with multiple __del__ methods.
@@ -862,6 +865,50 @@ class GCCallbackTests(unittest.TestCase):
self.assertEqual(len(gc.garbage), 0)
+ @unittest.skipIf(BUILD_WITH_NDEBUG,
+ 'built with -NDEBUG')
+ def test_refcount_errors(self):
+ self.preclean()
+ # Verify the "handling" of objects with broken refcounts
+ import_module("ctypes") #skip if not supported
+
+ import subprocess
+ code = '''if 1:
+ a = []
+ b = [a]
+
+ # Simulate the refcount of "a" being too low (compared to the
+ # references held on it by live data), but keeping it above zero
+ # (to avoid deallocating it):
+ import ctypes
+ ctypes.pythonapi.Py_DecRef(ctypes.py_object(a))
+
+ # The garbage collector should now have a fatal error when it reaches
+ # the broken object:
+ import gc
+ gc.collect()
+ '''
+ p = subprocess.Popen([sys.executable, "-c", code],
+ stdout=subprocess.PIPE,
+ stderr=subprocess.PIPE)
+ stdout, stderr = p.communicate()
+ p.stdout.close()
+ p.stderr.close()
+ # Verify that stderr has a useful error message:
+ self.assertRegex(stderr,
+ b'Modules/gcmodule.c:[0-9]+: visit_decref: Assertion "\(\(gc\)->gc.gc_refs >> \(1\)\) != 0" failed.')
+ self.assertRegex(stderr,
+ b'refcount was too small')
+ self.assertRegex(stderr,
+ b'object : \[\]')
+ self.assertRegex(stderr,
+ b'type : list')
+ self.assertRegex(stderr,
+ b'refcount: 1')
+ self.assertRegex(stderr,
+ b'address : 0x[0-9a-f]+')
+
+
class GCTogglingTests(unittest.TestCase):
def setUp(self):
gc.enable()
diff --git a/Modules/gcmodule.c b/Modules/gcmodule.c
index 0c6f444..87edd5a 100644
--- a/Modules/gcmodule.c
+++ b/Modules/gcmodule.c
@@ -341,7 +341,8 @@ update_refs(PyGC_Head *containers)
{
PyGC_Head *gc = containers->gc.gc_next;
for (; gc != containers; gc = gc->gc.gc_next) {
- assert(_PyGCHead_REFS(gc) == GC_REACHABLE);
+ PyObject_ASSERT(FROM_GC(gc),
+ _PyGCHead_REFS(gc) == GC_REACHABLE);
_PyGCHead_SET_REFS(gc, Py_REFCNT(FROM_GC(gc)));
/* Python's cyclic gc should never see an incoming refcount
* of 0: if something decref'ed to 0, it should have been
@@ -361,7 +362,8 @@ update_refs(PyGC_Head *containers)
* so serious that maybe this should be a release-build
* check instead of an assert?
*/
- assert(_PyGCHead_REFS(gc) != 0);
+ PyObject_ASSERT(FROM_GC(gc),
+ _PyGCHead_REFS(gc) != 0);
}
}
@@ -376,7 +378,9 @@ visit_decref(PyObject *op, void *data)
* generation being collected, which can be recognized
* because only they have positive gc_refs.
*/
- assert(_PyGCHead_REFS(gc) != 0); /* else refcount was too small */
+ PyObject_ASSERT_WITH_MSG(FROM_GC(gc),
+ _PyGCHead_REFS(gc) != 0,
+ "refcount was too small"); /* else refcount was too small */
if (_PyGCHead_REFS(gc) > 0)
_PyGCHead_DECREF(gc);
}
@@ -436,9 +440,10 @@ visit_reachable(PyObject *op, PyGC_Head *reachable)
* If gc_refs == GC_UNTRACKED, it must be ignored.
*/
else {
- assert(gc_refs > 0
- || gc_refs == GC_REACHABLE
- || gc_refs == GC_UNTRACKED);
+ PyObject_ASSERT(FROM_GC(gc),
+ gc_refs > 0
+ || gc_refs == GC_REACHABLE
+ || gc_refs == GC_UNTRACKED);
}
}
return 0;
@@ -480,7 +485,7 @@ move_unreachable(PyGC_Head *young, PyGC_Head *unreachable)
*/
PyObject *op = FROM_GC(gc);
traverseproc traverse = Py_TYPE(op)->tp_traverse;
- assert(_PyGCHead_REFS(gc) > 0);
+ PyObject_ASSERT(op, _PyGCHead_REFS(gc) > 0);
_PyGCHead_SET_REFS(gc, GC_REACHABLE);
(void) traverse(op,
(visitproc)visit_reachable,
@@ -543,7 +548,7 @@ move_legacy_finalizers(PyGC_Head *unreachable, PyGC_Head *finalizers)
for (gc = unreachable->gc.gc_next; gc != unreachable; gc = next) {
PyObject *op = FROM_GC(gc);
- assert(IS_TENTATIVELY_UNREACHABLE(op));
+ PyObject_ASSERT(op, IS_TENTATIVELY_UNREACHABLE(op));
next = gc->gc.gc_next;
if (has_legacy_finalizer(op)) {
@@ -619,7 +624,7 @@ handle_weakrefs(PyGC_Head *unreachable, PyGC_Head *old)
PyWeakReference **wrlist;
op = FROM_GC(gc);
- assert(IS_TENTATIVELY_UNREACHABLE(op));
+ PyObject_ASSERT(op, IS_TENTATIVELY_UNREACHABLE(op));
next = gc->gc.gc_next;
if (! PyType_SUPPORTS_WEAKREFS(Py_TYPE(op)))
@@ -640,9 +645,9 @@ handle_weakrefs(PyGC_Head *unreachable, PyGC_Head *old)
* the callback pointer intact. Obscure: it also
* changes *wrlist.
*/
- assert(wr->wr_object == op);
+ PyObject_ASSERT(wr->wr_object, wr->wr_object == op);
_PyWeakref_ClearRef(wr);
- assert(wr->wr_object == Py_None);
+ PyObject_ASSERT(wr->wr_object, wr->wr_object == Py_None);
if (wr->wr_callback == NULL)
continue; /* no callback */
@@ -676,7 +681,7 @@ handle_weakrefs(PyGC_Head *unreachable, PyGC_Head *old)
*/
if (IS_TENTATIVELY_UNREACHABLE(wr))
continue;
- assert(IS_REACHABLE(wr));
+ PyObject_ASSERT(op, IS_REACHABLE(wr));
/* Create a new reference so that wr can't go away
* before we can process it again.
@@ -685,7 +690,8 @@ handle_weakrefs(PyGC_Head *unreachable, PyGC_Head *old)
/* Move wr to wrcb_to_call, for the next pass. */
wrasgc = AS_GC(wr);
- assert(wrasgc != next); /* wrasgc is reachable, but
+ PyObject_ASSERT(op, wrasgc != next);
+ /* wrasgc is reachable, but
next isn't, so they can't
be the same */
gc_list_move(wrasgc, &wrcb_to_call);
@@ -701,11 +707,11 @@ handle_weakrefs(PyGC_Head *unreachable, PyGC_Head *old)
gc = wrcb_to_call.gc.gc_next;
op = FROM_GC(gc);
- assert(IS_REACHABLE(op));
- assert(PyWeakref_Check(op));
+ PyObject_ASSERT(op, IS_REACHABLE(op));
+ PyObject_ASSERT(op, PyWeakref_Check(op));
wr = (PyWeakReference *)op;
callback = wr->wr_callback;
- assert(callback != NULL);
+ PyObject_ASSERT(op, callback != NULL);
/* copy-paste of weakrefobject.c's handle_callback() */
temp = PyObject_CallFunctionObjArgs(callback, wr, NULL);
@@ -822,12 +828,14 @@ check_garbage(PyGC_Head *collectable)
for (gc = collectable->gc.gc_next; gc != collectable;
gc = gc->gc.gc_next) {
_PyGCHead_SET_REFS(gc, Py_REFCNT(FROM_GC(gc)));
- assert(_PyGCHead_REFS(gc) != 0);
+ PyObject_ASSERT(FROM_GC(gc),
+ _PyGCHead_REFS(gc) != 0);
}
subtract_refs(collectable);
for (gc = collectable->gc.gc_next; gc != collectable;
gc = gc->gc.gc_next) {
- assert(_PyGCHead_REFS(gc) >= 0);
+ PyObject_ASSERT(FROM_GC(gc),
+ _PyGCHead_REFS(gc) >= 0);
if (_PyGCHead_REFS(gc) != 0)
return -1;
}
diff --git a/Objects/object.c b/Objects/object.c
index 559794f..a47d47f 100644
--- a/Objects/object.c
+++ b/Objects/object.c
@@ -2022,6 +2022,35 @@ _PyTrash_thread_destroy_chain(void)
}
}
+PyAPI_FUNC(void)
+_PyObject_AssertFailed(PyObject *obj, const char *msg, const char *expr,
+ const char *file, int line, const char *function)
+{
+ fprintf(stderr,
+ "%s:%d: %s: Assertion \"%s\" failed.\n",
+ file, line, function, expr);
+ if (msg) {
+ fprintf(stderr, "%s\n", msg);
+ }
+
+ fflush(stderr);
+
+ if (obj) {
+ /* This might succeed or fail, but we're about to abort, so at least
+ try to provide any extra info we can: */
+ _PyObject_Dump(obj);
+ }
+ else {
+ fprintf(stderr, "NULL object\n");
+ }
+
+ fflush(stdout);
+ fflush(stderr);
+
+ /* Terminate the process: */
+ abort();
+}
+
#ifndef Py_TRACE_REFS
/* For Py_LIMITED_API, we need an out-of-line version of _Py_Dealloc.
Define this here, so we can undefine the macro. */


@ -0,0 +1,30 @@
diff -r 39b9b05c3085 Lib/distutils/sysconfig.py
--- a/Lib/distutils/sysconfig.py Wed Apr 10 00:27:23 2013 +0200
+++ b/Lib/distutils/sysconfig.py Wed Apr 10 10:14:18 2013 +0200
@@ -362,7 +362,10 @@
done[n] = item = ""
if found:
after = value[m.end():]
- value = value[:m.start()] + item + after
+ value = value[:m.start()]
+ if item.strip() not in value:
+ value += item
+ value += after
if "$" in after:
notdone[name] = value
else:
diff -r 39b9b05c3085 Lib/sysconfig.py
--- a/Lib/sysconfig.py Wed Apr 10 00:27:23 2013 +0200
+++ b/Lib/sysconfig.py Wed Apr 10 10:14:18 2013 +0200
@@ -296,7 +296,10 @@
if found:
after = value[m.end():]
- value = value[:m.start()] + item + after
+ value = value[:m.start()]
+ if item.strip() not in value:
+ value += item
+ value += after
if "$" in after:
notdone[name] = value
else:


@ -0,0 +1,233 @@
diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py
index d69e09f..5cb12df 100644
--- a/Lib/ensurepip/__init__.py
+++ b/Lib/ensurepip/__init__.py
@@ -1,8 +1,10 @@
import os
import os.path
import pkgutil
+import shutil
import sys
import tempfile
+from ensurepip import rewheel
__all__ = ["version", "bootstrap"]
@@ -25,6 +27,8 @@ def _run_pip(args, additional_paths=None):
# Install the bundled software
import pip
+ if args[0] in ["install", "list", "wheel"]:
+ args.append('--pre')
return pip.main(args)
@@ -88,20 +92,39 @@ def _bootstrap(*, root=None, upgrade=False, user=False,
# omit pip and easy_install
os.environ["ENSUREPIP_OPTIONS"] = "install"
+ whls = []
+ rewheel_dir = None
+ # try to see if we have system-wide versions of _PROJECTS
+ dep_records = rewheel.find_system_records([p[0] for p in _PROJECTS])
+ # TODO: check if system-wide versions are the newest ones
+ # if --upgrade is used?
+ if all(dep_records):
+ # if we have all _PROJECTS installed system-wide, we'll recreate
+ # wheels from them and install those
+ rewheel_dir = tempfile.TemporaryDirectory()
+ for dr in dep_records:
+ new_whl = rewheel.rewheel_from_record(dr, rewheel_dir.name)
+ whls.append(os.path.join(rewheel_dir.name, new_whl))
+ else:
+ # if we don't have all the _PROJECTS installed system-wide,
+ # let's just fall back to bundled wheels
+ for project, version in _PROJECTS:
+ whl = os.path.join(
+ os.path.dirname(__file__),
+ "_bundled",
+ "{}-{}-py2.py3-none-any.whl".format(project, version)
+ )
+ whls.append(whl)
+
with tempfile.TemporaryDirectory() as tmpdir:
# Put our bundled wheels into a temporary directory and construct the
# additional paths that need added to sys.path
additional_paths = []
- for project, version in _PROJECTS:
- wheel_name = "{}-{}-py2.py3-none-any.whl".format(project, version)
- whl = pkgutil.get_data(
- "ensurepip",
- "_bundled/{}".format(wheel_name),
- )
- with open(os.path.join(tmpdir, wheel_name), "wb") as fp:
- fp.write(whl)
-
- additional_paths.append(os.path.join(tmpdir, wheel_name))
+ for whl in whls:
+ shutil.copy(whl, tmpdir)
+ additional_paths.append(os.path.join(tmpdir, os.path.basename(whl)))
+ if rewheel_dir:
+ rewheel_dir.cleanup()
# Construct the arguments to be passed to the pip command
args = ["install", "--no-index", "--find-links", tmpdir]
diff -Nur Python-3.4.1/Lib/ensurepip/rewheel/__init__.py Python-3.4.1-rewheel/Lib/ensurepip/rewheel/__init__.py
--- Python-3.4.1/Lib/ensurepip/rewheel/__init__.py 1970-01-01 01:00:00.000000000 +0100
+++ Python-3.4.1-rewheel/Lib/ensurepip/rewheel/__init__.py 2014-08-21 10:11:22.560320121 +0200
@@ -0,0 +1,143 @@
+import argparse
+import codecs
+import csv
+import email.parser
+import os
+import io
+import re
+import site
+import subprocess
+import sys
+import zipfile
+
+def run():
+ parser = argparse.ArgumentParser(description='Recreate wheel of package with given RECORD.')
+ parser.add_argument('record_path',
+ help='Path to RECORD file')
+ parser.add_argument('-o', '--output-dir',
+ help='Dir where to place the wheel, defaults to current working dir.',
+ dest='outdir',
+ default=os.path.curdir)
+
+ ns = parser.parse_args()
+ retcode = 0
+ try:
+ print(rewheel_from_record(**vars(ns)))
+ except BaseException as e:
+ print('Failed: {}'.format(e))
+ retcode = 1
+ sys.exit(1)
+
+def find_system_records(projects):
+ """Return list of paths to RECORD files for system-installed projects.
+
+ If a project is not installed, the resulting list contains None instead
+ of a path to its RECORD
+ """
+ records = []
+ # get system site-packages dirs
+ sys_sitepack = site.getsitepackages([sys.base_prefix, sys.base_exec_prefix])
+ sys_sitepack = [sp for sp in sys_sitepack if os.path.exists(sp)]
+ # try to find all projects in all system site-packages
+ for project in projects:
+ path = None
+ for sp in sys_sitepack:
+ dist_info_re = os.path.join(sp, project) + r'-[^\{0}]+\.dist-info'.format(os.sep)
+ candidates = [os.path.join(sp, p) for p in os.listdir(sp)]
+ # filter out candidate dirs based on the above regexp
+ filtered = [c for c in candidates if re.match(dist_info_re, c)]
+ # if we have 0 or 2 or more dirs, something is wrong...
+ if len(filtered) == 1:
+ path = filtered[0]
+ if path is not None:
+ records.append(os.path.join(path, 'RECORD'))
+ else:
+ records.append(None)
+ return records
+
+def rewheel_from_record(record_path, outdir):
+ """Recreates a whee of package with given record_path and returns path
+ to the newly created wheel."""
+ site_dir = os.path.dirname(os.path.dirname(record_path))
+ record_relpath = record_path[len(site_dir):].strip(os.path.sep)
+ to_write, to_omit = get_records_to_pack(site_dir, record_relpath)
+ new_wheel_name = get_wheel_name(record_path)
+ new_wheel_path = os.path.join(outdir, new_wheel_name + '.whl')
+
+ new_wheel = zipfile.ZipFile(new_wheel_path, mode='w', compression=zipfile.ZIP_DEFLATED)
+ # we need to write a new record with just the files that we will write,
+ # e.g. not binaries and *.pyc/*.pyo files
+ new_record = io.StringIO()
+ writer = csv.writer(new_record)
+
+ # handle files that we can write straight away
+ for f, sha_hash, size in to_write:
+ new_wheel.write(os.path.join(site_dir, f), arcname=f)
+ writer.writerow([f, sha_hash,size])
+
+ # rewrite the old wheel file with a new computed one
+ writer.writerow([record_relpath, '', ''])
+ new_wheel.writestr(record_relpath, new_record.getvalue())
+
+ new_wheel.close()
+
+ return new_wheel.filename
+
+def get_wheel_name(record_path):
+ """Return proper name of the wheel, without .whl."""
+
+ wheel_info_path = os.path.join(os.path.dirname(record_path), 'WHEEL')
+ with codecs.open(wheel_info_path, encoding='utf-8') as wheel_info_file:
+ wheel_info = email.parser.Parser().parsestr(wheel_info_file.read())
+
+ metadata_path = os.path.join(os.path.dirname(record_path), 'METADATA')
+ with codecs.open(metadata_path, encoding='utf-8') as metadata_file:
+ metadata = email.parser.Parser().parsestr(metadata_file.read())
+
+ # construct name parts according to wheel spec
+ distribution = metadata.get('Name')
+ version = metadata.get('Version')
+ build_tag = '' # nothing for now
+ lang_tag = []
+ for t in wheel_info.get_all('Tag'):
+ lang_tag.append(t.split('-')[0])
+ lang_tag = '.'.join(lang_tag)
+ abi_tag, plat_tag = wheel_info.get('Tag').split('-')[1:3]
+ # leave out build tag, if it is empty
+ to_join = filter(None, [distribution, version, build_tag, lang_tag, abi_tag, plat_tag])
+ return '-'.join(list(to_join))
+
+def get_records_to_pack(site_dir, record_relpath):
+ """Accepts path of sitedir and path of RECORD file relative to it.
+ Returns two lists:
+ - list of files that can be written to new RECORD straight away
+ - list of files that shouldn't be written or need some processing
+ (pyc and pyo files, scripts)
+ """
+ record_file_path = os.path.join(site_dir, record_relpath)
+ with codecs.open(record_file_path, encoding='utf-8') as record_file:
+ record_contents = record_file.read()
+ # temporary fix for https://github.com/pypa/pip/issues/1376
+ # we need to ignore files under ".data" directory
+ data_dir = os.path.dirname(record_relpath).strip(os.path.sep)
+ data_dir = data_dir[:-len('dist-info')] + 'data'
+
+ to_write = []
+ to_omit = []
+ for l in record_contents.splitlines():
+ spl = l.split(',')
+ if len(spl) == 3:
+ # new record will omit (or write differently):
+ # - abs paths, paths with ".." (entry points),
+ # - pyc+pyo files
+ # - the old RECORD file
+ # TODO: is there any better way to recognize an entry point?
+ if os.path.isabs(spl[0]) or spl[0].startswith('..') or \
+ spl[0].endswith('.pyc') or spl[0].endswith('.pyo') or \
+ spl[0] == record_relpath or spl[0].startswith(data_dir):
+ to_omit.append(spl)
+ else:
+ to_write.append(spl)
+ else:
+ pass # bad RECORD or empty line
+ return to_write, to_omit
diff -Nur Python-3.4.1/Makefile.pre.in Python-3.4.1-rewheel/Makefile.pre.in
--- Python-3.4.1/Makefile.pre.in 2014-08-21 10:49:31.512695040 +0200
+++ Python-3.4.1-rewheel/Makefile.pre.in 2014-08-21 10:10:41.961341722 +0200
@@ -1145,7 +1145,7 @@
test/test_asyncio \
collections concurrent concurrent/futures encodings \
email email/mime test/test_email test/test_email/data \
- ensurepip ensurepip/_bundled \
+ ensurepip ensurepip/_bundled ensurepip/rewheel \
html json test/test_json http dbm xmlrpc \
sqlite3 sqlite3/test \
logging csv wsgiref urllib \
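A hypothetical interactive use of the rewheel helper introduced by this patch, based on the functions shown above; it only works on an interpreter that carries the patched ensurepip, and the output directory is an example.

from ensurepip import rewheel

# Locate RECORD files of system-installed setuptools and pip; entries are
# None for projects that are not installed system-wide.
records = rewheel.find_system_records(["setuptools", "pip"])
for record in records:
    if record is not None:
        # Rebuild a wheel from the installed files and print its path.
        print(rewheel.rewheel_from_record(record, "/tmp"))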


@ -1,70 +0,0 @@
From 72d6cb277804f58b660bf96d8f5efff78d88491c Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Miro=20Hron=C4=8Dok?= <miro@hroncok.cz>
Date: Wed, 15 Aug 2018 15:36:29 +0200
Subject: [PATCH] 00189: Instead of bundled wheels, use our RPM packaged wheels
We keep them in /usr/share/python-wheels
---
Lib/ensurepip/__init__.py | 32 ++++++++++++++++++++++----------
1 file changed, 22 insertions(+), 10 deletions(-)
diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py
index fc0edec6e3..731817a3f0 100644
--- a/Lib/ensurepip/__init__.py
+++ b/Lib/ensurepip/__init__.py
@@ -1,16 +1,31 @@
+import distutils.version
+import glob
import os
import os.path
-import pkgutil
import sys
import tempfile
__all__ = ["version", "bootstrap"]
+_WHEEL_DIR = "/usr/share/python-wheels/"
-_SETUPTOOLS_VERSION = "41.2.0"
+_wheels = {}
-_PIP_VERSION = "19.2.3"
+def _get_most_recent_wheel_version(pkg):
+ prefix = os.path.join(_WHEEL_DIR, "{}-".format(pkg))
+ _wheels[pkg] = {}
+ for suffix in "-py2.py3-none-any.whl", "-py3-none-any.whl":
+ pattern = "{}*{}".format(prefix, suffix)
+ for path in glob.glob(pattern):
+ version_str = path[len(prefix):-len(suffix)]
+ _wheels[pkg][version_str] = os.path.basename(path)
+ return str(max(_wheels[pkg], key=distutils.version.LooseVersion))
+
+
+_SETUPTOOLS_VERSION = _get_most_recent_wheel_version("setuptools")
+
+_PIP_VERSION = _get_most_recent_wheel_version("pip")
_PROJECTS = [
("setuptools", _SETUPTOOLS_VERSION),
@@ -95,13 +110,10 @@ def _bootstrap(*, root=None, upgrade=False, user=False,
# additional paths that need added to sys.path
additional_paths = []
for project, version in _PROJECTS:
- wheel_name = "{}-{}-py2.py3-none-any.whl".format(project, version)
- whl = pkgutil.get_data(
- "ensurepip",
- "_bundled/{}".format(wheel_name),
- )
- with open(os.path.join(tmpdir, wheel_name), "wb") as fp:
- fp.write(whl)
+ wheel_name = _wheels[project][version]
+ with open(os.path.join(_WHEEL_DIR, wheel_name), "rb") as sfp:
+ with open(os.path.join(tmpdir, wheel_name), "wb") as fp:
+ fp.write(sfp.read())
additional_paths.append(os.path.join(tmpdir, wheel_name))
--
2.24.1


@ -0,0 +1,12 @@
diff -up Python-3.5.0/Makefile.pre.in.lib Python-3.5.0/Makefile.pre.in
--- Python-3.5.0/Makefile.pre.in.lib 2015-09-21 15:39:47.928286620 +0200
+++ Python-3.5.0/Makefile.pre.in 2015-09-21 15:42:58.004042762 +0200
@@ -1340,7 +1340,7 @@ inclinstall:
# Install the library and miscellaneous stuff needed for extending/embedding
# This goes into $(exec_prefix)
-LIBPL= @LIBPL@
+LIBPL= $(LIBDEST)/config-$(LDVERSION)-$(MULTIARCH)
# pkgconfig directory
LIBPC= $(LIBDIR)/pkgconfig


@ -1,26 +1,11 @@
From a1f0ea8fae6fb87cdc9d9c16bc0898e8f66fa907 Mon Sep 17 00:00:00 2001
From: Michal Cyprian <m.cyprian@gmail.com>
Date: Mon, 26 Jun 2017 16:32:56 +0200
Subject: [PATCH] 00251: Change user install location
Set values of prefix and exec_prefix in distutils install command
to /usr/local if executable is /usr/bin/python* and RPM build
is not detected to make pip and distutils install into separate location.
Fedora Change: https://fedoraproject.org/wiki/Changes/Making_sudo_pip_safe
---
Lib/distutils/command/install.py | 15 +++++++++++++--
Lib/site.py | 9 ++++++++-
2 files changed, 21 insertions(+), 3 deletions(-)
diff --git a/Lib/distutils/command/install.py b/Lib/distutils/command/install.py
index ae4f915669..0e4fd5b74a 100644
index 0258d3d..4ebf50a 100644
--- a/Lib/distutils/command/install.py
+++ b/Lib/distutils/command/install.py
@@ -418,8 +418,19 @@ class install(Command):
raise DistutilsOptionError(
"must not supply exec-prefix without prefix")
- self.prefix = os.path.normpath(sys.prefix)
- self.exec_prefix = os.path.normpath(sys.exec_prefix)
+ # self.prefix is set to sys.prefix + /local/
@ -36,16 +21,16 @@ index ae4f915669..0e4fd5b74a 100644
+
+ self.prefix = os.path.normpath(sys.prefix) + addition
+ self.exec_prefix = os.path.normpath(sys.exec_prefix) + addition
else:
if self.exec_prefix is None:
diff --git a/Lib/site.py b/Lib/site.py
index 22d53fa562..9513526109 100644
index 0fc9200..c95202e 100644
--- a/Lib/site.py
+++ b/Lib/site.py
@@ -348,7 +348,14 @@ def getsitepackages(prefixes=None):
@@ -322,7 +322,14 @@ def getsitepackages(prefixes=None):
return sitepackages
def addsitepackages(known_paths, prefixes=None):
- """Add site-packages to sys.path"""
+ """Add site-packages to sys.path
@ -59,6 +44,3 @@ index 22d53fa562..9513526109 100644
for sitedir in getsitepackages(prefixes):
if os.path.isdir(sitedir):
addsitedir(sitedir, known_paths)
--
2.24.1


@ -0,0 +1,940 @@
diff --git a/Doc/using/cmdline.rst b/Doc/using/cmdline.rst
index 9ffb714..3f7201a 100644
--- a/Doc/using/cmdline.rst
+++ b/Doc/using/cmdline.rst
@@ -711,6 +711,45 @@ conflict.
.. versionadded:: 3.6
+
+.. envvar:: PYTHONCOERCECLOCALE
+
+ If set to the value ``0``, causes the main Python command line application
+ to skip coercing the legacy ASCII-based C locale to a more capable UTF-8
+ based alternative. Note that this setting is checked even when the
+ :option:`-E` or :option:`-I` options are used, as it is handled prior to
+ the processing of command line options.
+
+ If this variable is *not* set, or is set to a value other than ``0``, and
+ the current locale reported for the ``LC_CTYPE`` category is the default
+ ``C`` locale, then the Python CLI will attempt to configure one of the
+ following locales for the given locale categories before loading the
+ interpreter runtime:
+
+ * ``C.UTF-8`` (``LC_ALL``)
+ * ``C.utf8`` (``LC_ALL``)
+ * ``UTF-8`` (``LC_CTYPE``)
+
+ If setting one of these locale categories succeeds, then the matching
+ environment variables will be set (both ``LC_ALL`` and ``LANG`` for the
+ ``LC_ALL`` category, and ``LC_CTYPE`` for the ``LC_CTYPE`` category) in
+ the current process environment before the Python runtime is initialized.
+
+ Configuring one of these locales (either explicitly or via the above
+ implicit locale coercion) will automatically set the error handler for
+ :data:`sys.stdin` and :data:`sys.stdout` to ``surrogateescape``. This
+ behavior can be overridden using :envvar:`PYTHONIOENCODING` as usual.
+
+ For debugging purposes, setting ``PYTHONCOERCECLOCALE=warn`` will cause
+ Python to emit warning messages on ``stderr`` if either the locale coercion
+ activates, or else if a locale that *would* have triggered coercion is
+ still active when the Python runtime is initialized.
+
+ Availability: \*nix
+
+ .. versionadded:: 3.7
+ See :pep:`538` for more details.
+
Debug-mode variables
~~~~~~~~~~~~~~~~~~~~
diff --git a/Lib/test/support/script_helper.py b/Lib/test/support/script_helper.py
index ca5f9c2..7aa460b 100644
--- a/Lib/test/support/script_helper.py
+++ b/Lib/test/support/script_helper.py
@@ -51,8 +51,35 @@ def interpreter_requires_environment():
return __cached_interp_requires_environment
-_PythonRunResult = collections.namedtuple("_PythonRunResult",
- ("rc", "out", "err"))
+class _PythonRunResult(collections.namedtuple("_PythonRunResult",
+ ("rc", "out", "err"))):
+ """Helper for reporting Python subprocess run results"""
+ def fail(self, cmd_line):
+ """Provide helpful details about failed subcommand runs"""
+ # Limit to 80 lines to ASCII characters
+ maxlen = 80 * 100
+ out, err = self.out, self.err
+ if len(out) > maxlen:
+ out = b'(... truncated stdout ...)' + out[-maxlen:]
+ if len(err) > maxlen:
+ err = b'(... truncated stderr ...)' + err[-maxlen:]
+ out = out.decode('ascii', 'replace').rstrip()
+ err = err.decode('ascii', 'replace').rstrip()
+ raise AssertionError("Process return code is %d\n"
+ "command line: %r\n"
+ "\n"
+ "stdout:\n"
+ "---\n"
+ "%s\n"
+ "---\n"
+ "\n"
+ "stderr:\n"
+ "---\n"
+ "%s\n"
+ "---"
+ % (self.rc, cmd_line,
+ out,
+ err))
# Executing the interpreter in a subprocess
@@ -110,30 +137,7 @@ def run_python_until_end(*args, **env_vars):
def _assert_python(expected_success, *args, **env_vars):
res, cmd_line = run_python_until_end(*args, **env_vars)
if (res.rc and expected_success) or (not res.rc and not expected_success):
- # Limit to 80 lines to ASCII characters
- maxlen = 80 * 100
- out, err = res.out, res.err
- if len(out) > maxlen:
- out = b'(... truncated stdout ...)' + out[-maxlen:]
- if len(err) > maxlen:
- err = b'(... truncated stderr ...)' + err[-maxlen:]
- out = out.decode('ascii', 'replace').rstrip()
- err = err.decode('ascii', 'replace').rstrip()
- raise AssertionError("Process return code is %d\n"
- "command line: %r\n"
- "\n"
- "stdout:\n"
- "---\n"
- "%s\n"
- "---\n"
- "\n"
- "stderr:\n"
- "---\n"
- "%s\n"
- "---"
- % (res.rc, cmd_line,
- out,
- err))
+ res.fail(cmd_line)
return res
def assert_python_ok(*args, **env_vars):
diff --git a/Lib/test/test_c_locale_coercion.py b/Lib/test/test_c_locale_coercion.py
new file mode 100644
index 0000000..635c98f
--- /dev/null
+++ b/Lib/test/test_c_locale_coercion.py
@@ -0,0 +1,371 @@
+# Tests the attempted automatic coercion of the C locale to a UTF-8 locale
+
+import unittest
+import locale
+import os
+import sys
+import sysconfig
+import shutil
+import subprocess
+from collections import namedtuple
+
+import test.support
+from test.support.script_helper import (
+ run_python_until_end,
+ interpreter_requires_environment,
+)
+
+# Set our expectation for the default encoding used in the C locale
+# for the filesystem encoding and the standard streams
+
+# AIX uses iso8859-1 in the C locale, other *nix platforms use ASCII
+if sys.platform.startswith("aix"):
+ C_LOCALE_STREAM_ENCODING = "iso8859-1"
+else:
+ C_LOCALE_STREAM_ENCODING = "ascii"
+
+# FS encoding is UTF-8 on macOS, other *nix platforms use the locale encoding
+if sys.platform == "darwin":
+ C_LOCALE_FS_ENCODING = "utf-8"
+else:
+ C_LOCALE_FS_ENCODING = C_LOCALE_STREAM_ENCODING
+
+# Note that the above is probably still wrong in some cases, such as:
+# * Windows when PYTHONLEGACYWINDOWSFSENCODING is set
+# * AIX and any other platforms that use latin-1 in the C locale
+#
+# Options for dealing with this:
+# * Don't set PYTHON_COERCE_C_LOCALE on such platforms (e.g. Windows doesn't)
+# * Fix the test expectations to match the actual platform behaviour
+
+# In order to get the warning messages to match up as expected, the candidate
+# order here must match the target locale order in Python/pylifecycle.c
+_C_UTF8_LOCALES = ("C.UTF-8", "C.utf8", "UTF-8")
+
+# There's no reliable cross-platform way of checking locale alias
+# lists, so the only way of knowing which of these locales will work
+# is to try them with locale.setlocale(). We do that in a subprocess
+# to avoid altering the locale of the test runner.
+#
+# If the relevant locale module attributes exist, and we're not on a platform
+# where we expect it to always succeed, we also check that
+# `locale.nl_langinfo(locale.CODESET)` works, as if it fails, the interpreter
+# will skip locale coercion for that particular target locale
+_check_nl_langinfo_CODESET = bool(
+ sys.platform not in ("darwin", "linux") and
+ hasattr(locale, "nl_langinfo") and
+ hasattr(locale, "CODESET")
+)
+
+def _set_locale_in_subprocess(locale_name):
+ cmd_fmt = "import locale; print(locale.setlocale(locale.LC_CTYPE, '{}'))"
+ if _check_nl_langinfo_CODESET:
+ # If there's no valid CODESET, we expect coercion to be skipped
+ cmd_fmt += "; import sys; sys.exit(not locale.nl_langinfo(locale.CODESET))"
+ cmd = cmd_fmt.format(locale_name)
+ result, py_cmd = run_python_until_end("-c", cmd, __isolated=True)
+ return result.rc == 0
+
+
+
+_fields = "fsencoding stdin_info stdout_info stderr_info lang lc_ctype lc_all"
+_EncodingDetails = namedtuple("EncodingDetails", _fields)
+
+class EncodingDetails(_EncodingDetails):
+ # XXX (ncoghlan): Using JSON for child state reporting may be less fragile
+ CHILD_PROCESS_SCRIPT = ";".join([
+ "import sys, os",
+ "print(sys.getfilesystemencoding())",
+ "print(sys.stdin.encoding + ':' + sys.stdin.errors)",
+ "print(sys.stdout.encoding + ':' + sys.stdout.errors)",
+ "print(sys.stderr.encoding + ':' + sys.stderr.errors)",
+ "print(os.environ.get('LANG', 'not set'))",
+ "print(os.environ.get('LC_CTYPE', 'not set'))",
+ "print(os.environ.get('LC_ALL', 'not set'))",
+ ])
+
+ @classmethod
+ def get_expected_details(cls, coercion_expected, fs_encoding, stream_encoding, env_vars):
+ """Returns expected child process details for a given encoding"""
+ _stream = stream_encoding + ":{}"
+ # stdin and stdout should use surrogateescape either because the
+ # coercion triggered, or because the C locale was detected
+ stream_info = 2*[_stream.format("surrogateescape")]
+ # stderr should always use backslashreplace
+ stream_info.append(_stream.format("backslashreplace"))
+ expected_lang = env_vars.get("LANG", "not set").lower()
+ if coercion_expected:
+ expected_lc_ctype = CLI_COERCION_TARGET.lower()
+ else:
+ expected_lc_ctype = env_vars.get("LC_CTYPE", "not set").lower()
+ expected_lc_all = env_vars.get("LC_ALL", "not set").lower()
+ env_info = expected_lang, expected_lc_ctype, expected_lc_all
+ return dict(cls(fs_encoding, *stream_info, *env_info)._asdict())
+
+ @staticmethod
+ def _handle_output_variations(data):
+ """Adjust the output to handle platform specific idiosyncrasies
+
+ * Some platforms report ASCII as ANSI_X3.4-1968
+ * Some platforms report ASCII as US-ASCII
+ * Some platforms report UTF-8 instead of utf-8
+ """
+ data = data.replace(b"ANSI_X3.4-1968", b"ascii")
+ data = data.replace(b"US-ASCII", b"ascii")
+ data = data.lower()
+ return data
+
+ @classmethod
+ def get_child_details(cls, env_vars):
+ """Retrieves fsencoding and standard stream details from a child process
+
+ Returns (encoding_details, stderr_lines):
+
+ - encoding_details: EncodingDetails for eager decoding
+ - stderr_lines: result of calling splitlines() on the stderr output
+
+ The child is run in isolated mode if the current interpreter supports
+ that.
+ """
+ result, py_cmd = run_python_until_end(
+ "-c", cls.CHILD_PROCESS_SCRIPT,
+ __isolated=True,
+ **env_vars
+ )
+ if not result.rc == 0:
+ result.fail(py_cmd)
+ # All subprocess outputs in this test case should be pure ASCII
+ adjusted_output = cls._handle_output_variations(result.out)
+ stdout_lines = adjusted_output.decode("ascii").splitlines()
+ child_encoding_details = dict(cls(*stdout_lines)._asdict())
+ stderr_lines = result.err.decode("ascii").rstrip().splitlines()
+ return child_encoding_details, stderr_lines
+
+
+# Details of the shared library warning emitted at runtime
+LEGACY_LOCALE_WARNING = (
+ "Python runtime initialized with LC_CTYPE=C (a locale with default ASCII "
+ "encoding), which may cause Unicode compatibility problems. Using C.UTF-8, "
+ "C.utf8, or UTF-8 (if available) as alternative Unicode-compatible "
+ "locales is recommended."
+)
+
+# Details of the CLI locale coercion warning emitted at runtime
+CLI_COERCION_WARNING_FMT = (
+ "Python detected LC_CTYPE=C: LC_CTYPE coerced to {} (set another locale "
+ "or PYTHONCOERCECLOCALE=0 to disable this locale coercion behavior)."
+)
+
+
+AVAILABLE_TARGETS = None
+CLI_COERCION_TARGET = None
+CLI_COERCION_WARNING = None
+
+def setUpModule():
+ global AVAILABLE_TARGETS
+ global CLI_COERCION_TARGET
+ global CLI_COERCION_WARNING
+
+ if AVAILABLE_TARGETS is not None:
+ # initialization already done
+ return
+ AVAILABLE_TARGETS = []
+
+ # Find the target locales available in the current system
+ for target_locale in _C_UTF8_LOCALES:
+ if _set_locale_in_subprocess(target_locale):
+ AVAILABLE_TARGETS.append(target_locale)
+
+ if AVAILABLE_TARGETS:
+ # Coercion is expected to use the first available target locale
+ CLI_COERCION_TARGET = AVAILABLE_TARGETS[0]
+ CLI_COERCION_WARNING = CLI_COERCION_WARNING_FMT.format(CLI_COERCION_TARGET)
+
+
+class _LocaleHandlingTestCase(unittest.TestCase):
+ # Base class to check expected locale handling behaviour
+
+ def _check_child_encoding_details(self,
+ env_vars,
+ expected_fs_encoding,
+ expected_stream_encoding,
+ expected_warnings,
+ coercion_expected):
+ """Check the C locale handling for the given process environment
+
+ Parameters:
+ expected_fs_encoding: expected sys.getfilesystemencoding() result
+ expected_stream_encoding: expected encoding for standard streams
+ expected_warnings: stderr output to expect (if any)
+ """
+ result = EncodingDetails.get_child_details(env_vars)
+ encoding_details, stderr_lines = result
+ expected_details = EncodingDetails.get_expected_details(
+ coercion_expected,
+ expected_fs_encoding,
+ expected_stream_encoding,
+ env_vars
+ )
+ self.assertEqual(encoding_details, expected_details)
+ if expected_warnings is None:
+ expected_warnings = []
+ self.assertEqual(stderr_lines, expected_warnings)
+
+
+class LocaleConfigurationTests(_LocaleHandlingTestCase):
+ # Test explicit external configuration via the process environment
+
+ def setUpClass():
+ # This relies on setupModule() having been run, so it can't be
+ # handled via the @unittest.skipUnless decorator
+ if not AVAILABLE_TARGETS:
+ raise unittest.SkipTest("No C-with-UTF-8 locale available")
+
+ def test_external_target_locale_configuration(self):
+
+ # Explicitly setting a target locale should give the same behaviour as
+ # is seen when implicitly coercing to that target locale
+ self.maxDiff = None
+
+ expected_fs_encoding = "utf-8"
+ expected_stream_encoding = "utf-8"
+
+ base_var_dict = {
+ "LANG": "",
+ "LC_CTYPE": "",
+ "LC_ALL": "",
+ }
+ for env_var in ("LANG", "LC_CTYPE"):
+ for locale_to_set in AVAILABLE_TARGETS:
+ # XXX (ncoghlan): LANG=UTF-8 doesn't appear to work as
+ # expected, so skip that combination for now
+ # See https://bugs.python.org/issue30672 for discussion
+ if env_var == "LANG" and locale_to_set == "UTF-8":
+ continue
+
+ with self.subTest(env_var=env_var,
+ configured_locale=locale_to_set):
+ var_dict = base_var_dict.copy()
+ var_dict[env_var] = locale_to_set
+ self._check_child_encoding_details(var_dict,
+ expected_fs_encoding,
+ expected_stream_encoding,
+ expected_warnings=None,
+ coercion_expected=False)
+
+
+
+@test.support.cpython_only
+@unittest.skipUnless(sysconfig.get_config_var("PY_COERCE_C_LOCALE"),
+ "C locale coercion disabled at build time")
+class LocaleCoercionTests(_LocaleHandlingTestCase):
+ # Test implicit reconfiguration of the environment during CLI startup
+
+ def _check_c_locale_coercion(self,
+ fs_encoding, stream_encoding,
+ coerce_c_locale,
+ expected_warnings=None,
+ coercion_expected=True,
+ **extra_vars):
+ """Check the C locale handling for various configurations
+
+ Parameters:
+ fs_encoding: expected sys.getfilesystemencoding() result
+ stream_encoding: expected encoding for standard streams
+ coerce_c_locale: setting to use for PYTHONCOERCECLOCALE
+ None: don't set the variable at all
+ str: the value set in the child's environment
+ expected_warnings: expected warning lines on stderr
+ extra_vars: additional environment variables to set in subprocess
+ """
+ self.maxDiff = None
+
+ if not AVAILABLE_TARGETS:
+ # Locale coercion is disabled when there aren't any target locales
+ fs_encoding = C_LOCALE_FS_ENCODING
+ stream_encoding = C_LOCALE_STREAM_ENCODING
+ coercion_expected = False
+ if expected_warnings:
+ expected_warnings = [LEGACY_LOCALE_WARNING]
+
+ base_var_dict = {
+ "LANG": "",
+ "LC_CTYPE": "",
+ "LC_ALL": "",
+ }
+ base_var_dict.update(extra_vars)
+ for env_var in ("LANG", "LC_CTYPE"):
+ for locale_to_set in ("", "C", "POSIX", "invalid.ascii"):
+ # XXX (ncoghlan): *BSD platforms don't behave as expected in the
+ # POSIX locale, so we skip that for now
+ # See https://bugs.python.org/issue30672 for discussion
+ if locale_to_set == "POSIX":
+ continue
+ with self.subTest(env_var=env_var,
+ nominal_locale=locale_to_set,
+ PYTHONCOERCECLOCALE=coerce_c_locale):
+ var_dict = base_var_dict.copy()
+ var_dict[env_var] = locale_to_set
+ if coerce_c_locale is not None:
+ var_dict["PYTHONCOERCECLOCALE"] = coerce_c_locale
+ # Check behaviour on successful coercion
+ self._check_child_encoding_details(var_dict,
+ fs_encoding,
+ stream_encoding,
+ expected_warnings,
+ coercion_expected)
+
+ def test_PYTHONCOERCECLOCALE_not_set(self):
+ # This should coerce to the first available target locale by default
+ self._check_c_locale_coercion("utf-8", "utf-8", coerce_c_locale=None)
+
+ def test_PYTHONCOERCECLOCALE_not_zero(self):
+ # *Any* string other than "0" is considered "set" for our purposes
+ # and hence should result in the locale coercion being enabled
+ for setting in ("", "1", "true", "false"):
+ self._check_c_locale_coercion("utf-8", "utf-8", coerce_c_locale=setting)
+
+ def test_PYTHONCOERCECLOCALE_set_to_warn(self):
+ # PYTHONCOERCECLOCALE=warn enables runtime warnings for legacy locales
+ self._check_c_locale_coercion("utf-8", "utf-8",
+ coerce_c_locale="warn",
+ expected_warnings=[CLI_COERCION_WARNING])
+
+
+ def test_PYTHONCOERCECLOCALE_set_to_zero(self):
+ # The setting "0" should result in the locale coercion being disabled
+ self._check_c_locale_coercion(C_LOCALE_FS_ENCODING,
+ C_LOCALE_STREAM_ENCODING,
+ coerce_c_locale="0",
+ coercion_expected=False)
+ # Setting LC_ALL=C shouldn't make any difference to the behaviour
+ self._check_c_locale_coercion(C_LOCALE_FS_ENCODING,
+ C_LOCALE_STREAM_ENCODING,
+ coerce_c_locale="0",
+ LC_ALL="C",
+ coercion_expected=False)
+
+ def test_LC_ALL_set_to_C(self):
+ # Setting LC_ALL should render the locale coercion ineffective
+ self._check_c_locale_coercion(C_LOCALE_FS_ENCODING,
+ C_LOCALE_STREAM_ENCODING,
+ coerce_c_locale=None,
+ LC_ALL="C",
+ coercion_expected=False)
+ # And result in a warning about a lack of locale compatibility
+ self._check_c_locale_coercion(C_LOCALE_FS_ENCODING,
+ C_LOCALE_STREAM_ENCODING,
+ coerce_c_locale="warn",
+ LC_ALL="C",
+ expected_warnings=[LEGACY_LOCALE_WARNING],
+ coercion_expected=False)
+
+def test_main():
+ test.support.run_unittest(
+ LocaleConfigurationTests,
+ LocaleCoercionTests
+ )
+ test.support.reap_children()
+
+if __name__ == "__main__":
+ test_main()
diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py
index 6e4286e..594dfa9 100644
--- a/Lib/test/test_capi.py
+++ b/Lib/test/test_capi.py
@@ -425,32 +425,21 @@ class EmbeddingTests(unittest.TestCase):
def test_repeated_init_and_subinterpreters(self):
# This is just a "don't crash" test
out, err = self.run_embedded_interpreter('repeated_init_and_subinterpreters')
- if support.verbose:
+ if support.verbose > 1:
print()
print(out)
print(err)
- @staticmethod
- def _get_default_pipe_encoding():
- rp, wp = os.pipe()
- try:
- with os.fdopen(wp, 'w') as w:
- default_pipe_encoding = w.encoding
- finally:
- os.close(rp)
- return default_pipe_encoding
-
def test_forced_io_encoding(self):
# Checks forced configuration of embedded interpreter IO streams
env = dict(os.environ, PYTHONIOENCODING="utf-8:surrogateescape")
out, err = self.run_embedded_interpreter("forced_io_encoding", env=env)
- if support.verbose:
+ if support.verbose > 1:
print()
print(out)
print(err)
expected_stream_encoding = "utf-8"
expected_errors = "surrogateescape"
- expected_pipe_encoding = self._get_default_pipe_encoding()
expected_output = '\n'.join([
"--- Use defaults ---",
"Expected encoding: default",
diff --git a/Lib/test/test_cmd_line.py b/Lib/test/test_cmd_line.py
index ae2bcd4..0a302ff 100644
--- a/Lib/test/test_cmd_line.py
+++ b/Lib/test/test_cmd_line.py
@@ -151,6 +152,7 @@ class CmdLineTest(unittest.TestCase):
env = os.environ.copy()
# Use C locale to get ascii for the locale encoding
env['LC_ALL'] = 'C'
+ env['PYTHONCOERCECLOCALE'] = '0'
code = (
b'import locale; '
b'print(ascii("' + undecodable + b'"), '
diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py
index 7866a5c..b41239a 100644
--- a/Lib/test/test_sys.py
+++ b/Lib/test/test_sys.py
@@ -680,6 +680,7 @@ class SysModuleTest(unittest.TestCase):
# Force the POSIX locale
env = os.environ.copy()
env["LC_ALL"] = "C"
+ env["PYTHONCOERCECLOCALE"] = "0"
code = '\n'.join((
'import sys',
'def dump(name):',
diff --git a/Modules/main.c b/Modules/main.c
index b0fb78f..0d8590a 100644
--- a/Modules/main.c
+++ b/Modules/main.c
@@ -105,7 +105,11 @@ static const char usage_6[] =
" predictable seed.\n"
"PYTHONMALLOC: set the Python memory allocators and/or install debug hooks\n"
" on Python memory allocators. Use PYTHONMALLOC=debug to install debug\n"
-" hooks.\n";
+" hooks.\n"
+
+"PYTHONCOERCECLOCALE: if this variable is set to 0, it disables the locale\n"
+" coercion behavior. Use PYTHONCOERCECLOCALE=warn to request display of\n"
+" locale coercion and locale compatibility warnings on stderr.\n";
static int
usage(int exitcode, const wchar_t* program)
diff --git a/Programs/_testembed.c b/Programs/_testembed.c
index b0f9087..da892bf 100644
--- a/Programs/_testembed.c
+++ b/Programs/_testembed.c
@@ -1,4 +1,5 @@
#include <Python.h>
+#include "pyconfig.h"
#include "pythread.h"
#include <stdio.h>
diff --git a/Programs/python.c b/Programs/python.c
index a7afbc7..03f8295 100644
--- a/Programs/python.c
+++ b/Programs/python.c
@@ -15,6 +15,21 @@ wmain(int argc, wchar_t **argv)
}
#else
+/* Access private pylifecycle helper API to better handle the legacy C locale
+ *
+ * The legacy C locale assumes ASCII as the default text encoding, which
+ * causes problems not only for the CPython runtime, but also other
+ * components like GNU readline.
+ *
+ * Accordingly, when the CLI detects it, it attempts to coerce it to a
+ * more capable UTF-8 based alternative.
+ *
+ * See the documentation of the PYTHONCOERCECLOCALE setting for more details.
+ *
+ */
+extern int _Py_LegacyLocaleDetected(void);
+extern void _Py_CoerceLegacyLocale(void);
+
int
main(int argc, char **argv)
{
@@ -25,7 +40,11 @@ main(int argc, char **argv)
char *oldloc;
/* Force malloc() allocator to bootstrap Python */
+#ifdef Py_DEBUG
+ (void)_PyMem_SetupAllocators("malloc_debug");
+# else
(void)_PyMem_SetupAllocators("malloc");
+# endif
argv_copy = (wchar_t **)PyMem_RawMalloc(sizeof(wchar_t*) * (argc+1));
argv_copy2 = (wchar_t **)PyMem_RawMalloc(sizeof(wchar_t*) * (argc+1));
@@ -49,7 +68,21 @@ main(int argc, char **argv)
return 1;
}
+#ifdef __ANDROID__
+ /* Passing "" to setlocale() on Android requests the C locale rather
+ * than checking environment variables, so request C.UTF-8 explicitly
+ */
+ setlocale(LC_ALL, "C.UTF-8");
+#else
+ /* Reconfigure the locale to the default for this process */
setlocale(LC_ALL, "");
+#endif
+
+ if (_Py_LegacyLocaleDetected()) {
+ _Py_CoerceLegacyLocale();
+ }
+
+ /* Convert from char to wchar_t based on the locale settings */
for (i = 0; i < argc; i++) {
argv_copy[i] = Py_DecodeLocale(argv[i], NULL);
if (!argv_copy[i]) {
@@ -70,7 +103,11 @@ main(int argc, char **argv)
/* Force again malloc() allocator to release memory blocks allocated
before Py_Main() */
+#ifdef Py_DEBUG
+ (void)_PyMem_SetupAllocators("malloc_debug");
+# else
(void)_PyMem_SetupAllocators("malloc");
+# endif
for (i = 0; i < argc; i++) {
PyMem_RawFree(argv_copy2[i]);
diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c
index 640271f..2a22b24 100644
--- a/Python/pylifecycle.c
+++ b/Python/pylifecycle.c
@@ -167,6 +167,7 @@ Py_SetStandardStreamEncoding(const char *encoding, const char *errors)
return 0;
}
+
/* Global initializations. Can be undone by Py_FinalizeEx(). Don't
call this twice without an intervening Py_FinalizeEx() call. When
initializations fail, a fatal error is issued and the function does
@@ -301,6 +302,183 @@ import_init(PyInterpreterState *interp, PyObject *sysmod)
}
+/* Helper functions to better handle the legacy C locale
+ *
+ * The legacy C locale assumes ASCII as the default text encoding, which
+ * causes problems not only for the CPython runtime, but also other
+ * components like GNU readline.
+ *
+ * Accordingly, when the CLI detects it, it attempts to coerce it to a
+ * more capable UTF-8 based alternative as follows:
+ *
+ * if (_Py_LegacyLocaleDetected()) {
+ * _Py_CoerceLegacyLocale();
+ * }
+ *
+ * See the documentation of the PYTHONCOERCECLOCALE setting for more details.
+ *
+ * Locale coercion also impacts the default error handler for the standard
+ * streams: while the usual default is "strict", the default for the legacy
+ * C locale and for any of the coercion target locales is "surrogateescape".
+ */
+
+int
+_Py_LegacyLocaleDetected(void)
+{
+#ifndef MS_WINDOWS
+ /* On non-Windows systems, the C locale is considered a legacy locale */
+ /* XXX (ncoghlan): some platforms (notably Mac OS X) don't appear to treat
+ * the POSIX locale as a simple alias for the C locale, so
+ * we may also want to check for that explicitly.
+ */
+ const char *ctype_loc = setlocale(LC_CTYPE, NULL);
+ return ctype_loc != NULL && strcmp(ctype_loc, "C") == 0;
+#else
+ /* Windows uses code pages instead of locales, so no locale is legacy */
+ return 0;
+#endif
+}
+
+
+static const char *_C_LOCALE_WARNING =
+ "Python runtime initialized with LC_CTYPE=C (a locale with default ASCII "
+ "encoding), which may cause Unicode compatibility problems. Using C.UTF-8, "
+ "C.utf8, or UTF-8 (if available) as alternative Unicode-compatible "
+ "locales is recommended.\n";
+
+static int
+_legacy_locale_warnings_enabled(void)
+{
+ const char *coerce_c_locale = getenv("PYTHONCOERCECLOCALE");
+ return (coerce_c_locale != NULL &&
+ strncmp(coerce_c_locale, "warn", 5) == 0);
+}
+
+static void
+_emit_stderr_warning_for_legacy_locale(void)
+{
+ if (_legacy_locale_warnings_enabled()) {
+ if (_Py_LegacyLocaleDetected()) {
+ fprintf(stderr, "%s", _C_LOCALE_WARNING);
+ }
+ }
+}
+
+typedef struct _CandidateLocale {
+ const char *locale_name; /* The locale to try as a coercion target */
+} _LocaleCoercionTarget;
+
+static _LocaleCoercionTarget _TARGET_LOCALES[] = {
+ {"C.UTF-8"},
+ {"C.utf8"},
+ {"UTF-8"},
+ {NULL}
+};
+
+static char *
+get_default_standard_stream_error_handler(void)
+{
+ const char *ctype_loc = setlocale(LC_CTYPE, NULL);
+ if (ctype_loc != NULL) {
+ /* "surrogateescape" is the default in the legacy C locale */
+ if (strcmp(ctype_loc, "C") == 0) {
+ return "surrogateescape";
+ }
+
+#ifdef PY_COERCE_C_LOCALE
+ /* "surrogateescape" is the default in locale coercion target locales */
+ const _LocaleCoercionTarget *target = NULL;
+ for (target = _TARGET_LOCALES; target->locale_name; target++) {
+ if (strcmp(ctype_loc, target->locale_name) == 0) {
+ return "surrogateescape";
+ }
+ }
+#endif
+ }
+
+ /* Otherwise return NULL to request the typical default error handler */
+ return NULL;
+}
+
+#ifdef PY_COERCE_C_LOCALE
+static const char *_C_LOCALE_COERCION_WARNING =
+ "Python detected LC_CTYPE=C: LC_CTYPE coerced to %.20s (set another locale "
+ "or PYTHONCOERCECLOCALE=0 to disable this locale coercion behavior).\n";
+
+static void
+_coerce_default_locale_settings(const _LocaleCoercionTarget *target)
+{
+
+ const char *newloc = target->locale_name;
+
+ /* Reset locale back to currently configured defaults */
+ setlocale(LC_ALL, "");
+
+ /* Set the relevant locale environment variable */
+ if (setenv("LC_CTYPE", newloc, 1)) {
+ fprintf(stderr,
+ "Error setting LC_CTYPE, skipping C locale coercion\n");
+ return;
+ }
+ if (_legacy_locale_warnings_enabled()) {
+ fprintf(stderr, _C_LOCALE_COERCION_WARNING, newloc);
+ }
+
+ /* Reconfigure with the overridden environment variables */
+ setlocale(LC_ALL, "");
+}
+#endif
+
+
+void
+_Py_CoerceLegacyLocale(void)
+{
+#ifdef PY_COERCE_C_LOCALE
+ /* We ignore the Python -E and -I flags here, as the CLI needs to sort out
+ * the locale settings *before* we try to do anything with the command
+ * line arguments. For cross-platform debugging purposes, we also need
+ * to give end users a way to force even scripts that are otherwise
+ * isolated from their environment to use the legacy ASCII-centric C
+ * locale.
+ *
+ * Ignoring -E and -I is safe from a security perspective, as we only use
+ * the setting to turn *off* the implicit locale coercion, and anyone with
+ * access to the process environment already has the ability to set
+ * `LC_ALL=C` to override the C level locale settings anyway.
+ */
+ const char *coerce_c_locale = getenv("PYTHONCOERCECLOCALE");
+ if (coerce_c_locale == NULL || strncmp(coerce_c_locale, "0", 2) != 0) {
+ /* PYTHONCOERCECLOCALE is not set, or is set to something other than "0" */
+ const char *locale_override = getenv("LC_ALL");
+ if (locale_override == NULL || *locale_override == '\0') {
+ /* LC_ALL is also not set (or is set to an empty string) */
+ const _LocaleCoercionTarget *target = NULL;
+ for (target = _TARGET_LOCALES; target->locale_name; target++) {
+ const char *new_locale = setlocale(LC_CTYPE,
+ target->locale_name);
+ if (new_locale != NULL) {
+#if !defined(__APPLE__) && defined(HAVE_LANGINFO_H) && defined(CODESET)
+ /* Also ensure that nl_langinfo works in this locale */
+ char *codeset = nl_langinfo(CODESET);
+ if (!codeset || *codeset == '\0') {
+ /* CODESET is not set or empty, so skip coercion */
+ new_locale = NULL;
+ setlocale(LC_CTYPE, "");
+ continue;
+ }
+#endif
+ /* Successfully configured locale, so make it the default */
+ _coerce_default_locale_settings(target);
+ return;
+ }
+ }
+ }
+ }
+ /* No C locale warning here, as Py_Initialize will emit one later */
+#endif
+}
+
+
void
_Py_InitializeEx_Private(int install_sigs, int install_importlib)
{
@@ -315,11 +493,19 @@ _Py_InitializeEx_Private(int install_sigs, int install_importlib)
initialized = 1;
_Py_Finalizing = NULL;
-#ifdef HAVE_SETLOCALE
+#ifdef __ANDROID__
+ /* Passing "" to setlocale() on Android requests the C locale rather
+ * than checking environment variables, so request C.UTF-8 explicitly
+ */
+ setlocale(LC_CTYPE, "C.UTF-8");
+#else
+#ifndef MS_WINDOWS
/* Set up the LC_CTYPE locale, so we can obtain
the locale's charset without having to switch
locales. */
setlocale(LC_CTYPE, "");
+ _emit_stderr_warning_for_legacy_locale();
+#endif
#endif
if ((p = Py_GETENV("PYTHONDEBUG")) && *p != '\0')
@@ -1251,12 +1437,8 @@ initstdio(void)
}
}
if (!errors && !(pythonioencoding && *pythonioencoding)) {
- /* When the LC_CTYPE locale is the POSIX locale ("C locale"),
- stdin and stdout use the surrogateescape error handler by
- default, instead of the strict error handler. */
- char *loc = setlocale(LC_CTYPE, NULL);
- if (loc != NULL && strcmp(loc, "C") == 0)
- errors = "surrogateescape";
+ /* Choose the default error handler based on the current locale */
+ errors = get_default_standard_stream_error_handler();
}
}
diff --git a/configure.ac b/configure.ac
index 601cc84..5cdc021 100644
--- a/configure.ac
+++ b/configure.ac
@@ -3310,6 +3310,40 @@ then
fi
AC_MSG_RESULT($with_pymalloc)
+# Check for --with-c-locale-coercion
+AC_MSG_CHECKING(for --with-c-locale-coercion)
+AC_ARG_WITH(c-locale-coercion,
+ AS_HELP_STRING([--with(out)-c-locale-coercion],
+ [disable/enable C locale coercion to a UTF-8 based locale]))
+
+if test -z "$with_c_locale_coercion"
+then
+ with_c_locale_coercion="yes"
+fi
+if test "$with_c_locale_coercion" != "no"
+then
+ AC_DEFINE(PY_COERCE_C_LOCALE, 1,
+ [Define if you want to coerce the C locale to a UTF-8 based locale])
+fi
+AC_MSG_RESULT($with_c_locale_coercion)
+
+# Check for --with-c-locale-warning
+AC_MSG_CHECKING(for --with-c-locale-warning)
+AC_ARG_WITH(c-locale-warning,
+ AS_HELP_STRING([--with(out)-c-locale-warning],
+ [disable/enable locale compatibility warning in the C locale]))
+
+if test -z "$with_c_locale_warning"
+then
+ with_c_locale_warning="yes"
+fi
+if test "$with_c_locale_warning" != "no"
+then
+ AC_DEFINE(PY_WARN_ON_C_LOCALE, 1,
+ [Define to emit a locale compatibility warning in the C locale])
+fi
+AC_MSG_RESULT($with_c_locale_warning)
+
# Check for Valgrind support
AC_MSG_CHECKING([for --with-valgrind])
AC_ARG_WITH([valgrind],
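
For anyone verifying the coercion behaviour on a built interpreter, a minimal spot check along these lines can be run outside the test suite (the environment values mirror the ones used by the tests above; the snippet is illustrative and not part of the patch):

import subprocess
import sys

# Start a child interpreter in the legacy C locale and see whether PEP 538
# coercion switched the filesystem encoding away from ASCII.
env = {"LC_ALL": "", "LC_CTYPE": "C", "LANG": ""}
child = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.getfilesystemencoding())"],
    env=env, stdout=subprocess.PIPE, universal_newlines=True,
)
print(child.stdout.strip())  # "utf-8" if coercion triggered, "ascii" otherwise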


@@ -1,38 +1,7 @@
From b60a8fee7e91e36b48a2ea27d1bb9f42642c3eb2 Mon Sep 17 00:00:00 2001
From: Petr Viktorin <pviktori@redhat.com>
Date: Mon, 28 Aug 2017 17:16:46 +0200
Subject: [PATCH] 00274: Upstream uses Debian-style architecture naming, change
to match Fedora
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Co-authored-by: Petr Viktorin <pviktori@redhat.com>
Co-authored-by: Miro Hrončok <miro@hroncok.cz>
Co-authored-by: Tomas Orsava <torsava@redhat.com>
---
config.sub | 2 +-
configure.ac | 16 ++++++++--------
2 files changed, 9 insertions(+), 9 deletions(-)
diff --git a/config.sub b/config.sub
index ba37cf99e2..52a9ec6662 100755
--- a/config.sub
+++ b/config.sub
@@ -1042,7 +1042,7 @@ case $basic_machine in
;;
ppc64) basic_machine=powerpc64-unknown
;;
- ppc64-*) basic_machine=powerpc64-`echo "$basic_machine" | sed 's/^[^-]*-//'`
+ ppc64-* | ppc64p7-*) basic_machine=powerpc64-`echo "$basic_machine" | sed 's/^[^-]*-//'`
;;
ppc64le | powerpc64little)
basic_machine=powerpc64le-unknown
diff --git a/configure.ac b/configure.ac
index c59cbc223f..a3e1c04e1b 100644
--- a/configure.ac
+++ b/configure.ac
@@ -747,9 +747,9 @@ cat >> conftest.c <<EOF
diff -up Python-3.5.0/configure.ac.than Python-3.5.0/configure.ac
--- Python-3.5.0/configure.ac.than 2015-11-13 11:51:32.039560172 -0500
+++ Python-3.5.0/configure.ac 2015-11-13 11:52:11.670168157 -0500
@@ -788,9 +788,9 @@ cat >> conftest.c <<EOF
alpha-linux-gnu
# elif defined(__ARM_EABI__) && defined(__ARM_PCS_VFP)
# if defined(__ARMEL__)
@@ -44,7 +13,7 @@ index c59cbc223f..a3e1c04e1b 100644
# endif
# elif defined(__ARM_EABI__) && !defined(__ARM_PCS_VFP)
# if defined(__ARMEL__)
@@ -789,7 +789,7 @@ cat >> conftest.c <<EOF
@@ -810,7 +810,7 @@ cat >> conftest.c <<EOF
# elif _MIPS_SIM == _ABIN32
mips64el-linux-gnuabin32
# elif _MIPS_SIM == _ABI64
@@ -53,7 +22,7 @@ index c59cbc223f..a3e1c04e1b 100644
# else
# error unknown platform triplet
# endif
@@ -799,22 +799,22 @@ cat >> conftest.c <<EOF
@@ -820,7 +820,7 @@ cat >> conftest.c <<EOF
# elif _MIPS_SIM == _ABIN32
mips64-linux-gnuabin32
# elif _MIPS_SIM == _ABI64
@@ -62,11 +31,8 @@ index c59cbc223f..a3e1c04e1b 100644
# else
# error unknown platform triplet
# endif
# elif defined(__or1k__)
or1k-linux-gnu
# elif defined(__powerpc__) && defined(__SPE__)
- powerpc-linux-gnuspe
+ ppc-linux-gnuspe
@@ -830,9 +830,9 @@ cat >> conftest.c <<EOF
powerpc-linux-gnuspe
# elif defined(__powerpc64__)
# if defined(__LITTLE_ENDIAN__)
- powerpc64le-linux-gnu
@@ -76,11 +42,17 @@ index c59cbc223f..a3e1c04e1b 100644
+ ppc64-linux-gnu
# endif
# elif defined(__powerpc__)
- powerpc-linux-gnu
+ ppc-linux-gnu
# elif defined(__s390x__)
s390x-linux-gnu
# elif defined(__s390__)
--
2.24.1
powerpc-linux-gnu
diff --git a/config.sub b/config.sub
index 40ea5df..932128b 100755
--- a/config.sub
+++ b/config.sub
@@ -1045,7 +1045,7 @@ case $basic_machine in
;;
ppc64) basic_machine=powerpc64-unknown
;;
- ppc64-*) basic_machine=powerpc64-`echo $basic_machine | sed 's/^[^-]*-//'`
+ ppc64-* | ppc64p7-*) basic_machine=powerpc64-`echo $basic_machine | sed 's/^[^-]*-//'`
;;
ppc64le | powerpc64little)
basic_machine=powerpc64le-unknown


@@ -0,0 +1,106 @@
diff --git a/Doc/whatsnew/3.6.rst b/Doc/whatsnew/3.6.rst
index 847b50140a6..570dc3ed6fe 100644
--- a/Doc/whatsnew/3.6.rst
+++ b/Doc/whatsnew/3.6.rst
@@ -1852,10 +1852,10 @@ Build and C API Changes
* The :c:func:`PyUnicode_FSConverter` and :c:func:`PyUnicode_FSDecoder`
functions will now accept :term:`path-like objects <path-like object>`.
-* The ``PyExc_RecursionErrorInst`` singleton that was part of the public API
- has been removed as its members being never cleared may cause a segfault
- during finalization of the interpreter. Contributed by Xavier de Gaye in
- :issue:`22898` and :issue:`30697`.
+* The ``PyExc_RecursionErrorInst`` singleton is not used anymore as its members
+ being never cleared may cause a segfault during finalization of the
+ interpreter. Contributed by Xavier de Gaye in :issue:`22898` and
+ :issue:`30697`.
Other Improvements
diff --git a/Include/pyerrors.h b/Include/pyerrors.h
index c28c1373f82..8c1dbc5047b 100644
--- a/Include/pyerrors.h
+++ b/Include/pyerrors.h
@@ -219,6 +219,8 @@ PyAPI_DATA(PyObject *) PyExc_IOError;
PyAPI_DATA(PyObject *) PyExc_WindowsError;
#endif
+PyAPI_DATA(PyObject *) PyExc_RecursionErrorInst;
+
/* Predefined warning categories */
PyAPI_DATA(PyObject *) PyExc_Warning;
PyAPI_DATA(PyObject *) PyExc_UserWarning;
diff --git a/Misc/NEWS.d/next/C API/2017-12-20-15-23-06.bpo-30697.v9FmgG.rst b/Misc/NEWS.d/next/C API/2017-12-20-15-23-06.bpo-30697.v9FmgG.rst
new file mode 100644
index 00000000000..28f74ad4f30
--- /dev/null
+++ b/Misc/NEWS.d/next/C API/2017-12-20-15-23-06.bpo-30697.v9FmgG.rst
@@ -0,0 +1 @@
+Restore PyExc_RecursionErrorInst in 3.6
diff --git a/Objects/exceptions.c b/Objects/exceptions.c
index df4899372a5..271e293e325 100644
--- a/Objects/exceptions.c
+++ b/Objects/exceptions.c
@@ -2430,6 +2430,12 @@ SimpleExtendsException(PyExc_Warning, ResourceWarning,
+/* Pre-computed RecursionError instance for when recursion depth is reached.
+ Meant to be used when normalizing the exception for exceeding the recursion
+ depth will cause its own infinite recursion.
+*/
+PyObject *PyExc_RecursionErrorInst = NULL;
+
#define PRE_INIT(TYPE) \
if (!(_PyExc_ ## TYPE.tp_flags & Py_TPFLAGS_READY)) { \
if (PyType_Ready(&_PyExc_ ## TYPE) < 0) \
@@ -2691,11 +2697,37 @@ _PyExc_Init(PyObject *bltinmod)
ADD_ERRNO(TimeoutError, ETIMEDOUT);
preallocate_memerrors();
+
+ if (!PyExc_RecursionErrorInst) {
+ PyExc_RecursionErrorInst = BaseException_new(&_PyExc_RecursionError, NULL, NULL);
+ if (!PyExc_RecursionErrorInst)
+ Py_FatalError("Cannot pre-allocate RecursionError instance for "
+ "recursion errors");
+ else {
+ PyBaseExceptionObject *err_inst =
+ (PyBaseExceptionObject *)PyExc_RecursionErrorInst;
+ PyObject *args_tuple;
+ PyObject *exc_message;
+ exc_message = PyUnicode_FromString("maximum recursion depth exceeded");
+ if (!exc_message)
+ Py_FatalError("cannot allocate argument for RecursionError "
+ "pre-allocation");
+ args_tuple = PyTuple_Pack(1, exc_message);
+ if (!args_tuple)
+ Py_FatalError("cannot allocate tuple for RecursionError "
+ "pre-allocation");
+ Py_DECREF(exc_message);
+ if (BaseException_init(err_inst, args_tuple, NULL))
+ Py_FatalError("init of pre-allocated RecursionError failed");
+ Py_DECREF(args_tuple);
+ }
+ }
}
void
_PyExc_Fini(void)
{
+ Py_CLEAR(PyExc_RecursionErrorInst);
free_preallocated_memerrors();
Py_CLEAR(errnomap);
}
diff --git a/PC/python3.def b/PC/python3.def
index 4fc4a6814ee..ff70718fc37 100644
--- a/PC/python3.def
+++ b/PC/python3.def
@@ -224,6 +224,7 @@ EXPORTS
PyExc_PermissionError=python36.PyExc_PermissionError DATA
PyExc_ProcessLookupError=python36.PyExc_ProcessLookupError DATA
PyExc_RecursionError=python36.PyExc_RecursionError DATA
+ PyExc_RecursionErrorInst=python36.PyExc_RecursionErrorInst DATA


@@ -0,0 +1,228 @@
diff --git a/Lib/ssl.py b/Lib/ssl.py
index 1f3a31a..b54a684 100644
--- a/Lib/ssl.py
+++ b/Lib/ssl.py
@@ -116,6 +116,7 @@ except ImportError:
from _ssl import HAS_SNI, HAS_ECDH, HAS_NPN, HAS_ALPN, HAS_TLSv1_3
+from _ssl import _DEFAULT_CIPHERS
from _ssl import _OPENSSL_API_VERSION
@@ -174,48 +175,7 @@ else:
CHANNEL_BINDING_TYPES = []
-# Disable weak or insecure ciphers by default
-# (OpenSSL's default setting is 'DEFAULT:!aNULL:!eNULL')
-# Enable a better set of ciphers by default
-# This list has been explicitly chosen to:
-# * TLS 1.3 ChaCha20 and AES-GCM cipher suites
-# * Prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE)
-# * Prefer ECDHE over DHE for better performance
-# * Prefer AEAD over CBC for better performance and security
-# * Prefer AES-GCM over ChaCha20 because most platforms have AES-NI
-# (ChaCha20 needs OpenSSL 1.1.0 or patched 1.0.2)
-# * Prefer any AES-GCM and ChaCha20 over any AES-CBC for better
-# performance and security
-# * Then Use HIGH cipher suites as a fallback
-# * Disable NULL authentication, NULL encryption, 3DES and MD5 MACs
-# for security reasons
-_DEFAULT_CIPHERS = (
- 'TLS13-AES-256-GCM-SHA384:TLS13-CHACHA20-POLY1305-SHA256:'
- 'TLS13-AES-128-GCM-SHA256:'
- 'ECDH+AESGCM:ECDH+CHACHA20:DH+AESGCM:DH+CHACHA20:ECDH+AES256:DH+AES256:'
- 'ECDH+AES128:DH+AES:ECDH+HIGH:DH+HIGH:RSA+AESGCM:RSA+AES:RSA+HIGH:'
- '!aNULL:!eNULL:!MD5:!3DES'
- )
-
-# Restricted and more secure ciphers for the server side
-# This list has been explicitly chosen to:
-# * TLS 1.3 ChaCha20 and AES-GCM cipher suites
-# * Prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE)
-# * Prefer ECDHE over DHE for better performance
-# * Prefer AEAD over CBC for better performance and security
-# * Prefer AES-GCM over ChaCha20 because most platforms have AES-NI
-# * Prefer any AES-GCM and ChaCha20 over any AES-CBC for better
-# performance and security
-# * Then Use HIGH cipher suites as a fallback
-# * Disable NULL authentication, NULL encryption, MD5 MACs, DSS, RC4, and
-# 3DES for security reasons
-_RESTRICTED_SERVER_CIPHERS = (
- 'TLS13-AES-256-GCM-SHA384:TLS13-CHACHA20-POLY1305-SHA256:'
- 'TLS13-AES-128-GCM-SHA256:'
- 'ECDH+AESGCM:ECDH+CHACHA20:DH+AESGCM:DH+CHACHA20:ECDH+AES256:DH+AES256:'
- 'ECDH+AES128:DH+AES:ECDH+HIGH:DH+HIGH:RSA+AESGCM:RSA+AES:RSA+HIGH:'
- '!aNULL:!eNULL:!MD5:!DSS:!RC4:!3DES'
-)
+_RESTRICTED_SERVER_CIPHERS = _DEFAULT_CIPHERS
class CertificateError(ValueError):
@@ -389,8 +349,6 @@ class SSLContext(_SSLContext):
def __new__(cls, protocol=PROTOCOL_TLS, *args, **kwargs):
self = _SSLContext.__new__(cls, protocol)
- if protocol != _SSLv2_IF_EXISTS:
- self.set_ciphers(_DEFAULT_CIPHERS)
return self
def __init__(self, protocol=PROTOCOL_TLS):
@@ -505,8 +463,6 @@ def create_default_context(purpose=Purpose.SERVER_AUTH, *, cafile=None,
# verify certs and host name in client mode
context.verify_mode = CERT_REQUIRED
context.check_hostname = True
- elif purpose == Purpose.CLIENT_AUTH:
- context.set_ciphers(_RESTRICTED_SERVER_CIPHERS)
if cafile or capath or cadata:
context.load_verify_locations(cafile, capath, cadata)
diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py
index 54644e1..799100c 100644
--- a/Lib/test/test_ssl.py
+++ b/Lib/test/test_ssl.py
@@ -18,6 +18,7 @@ import asyncore
import weakref
import platform
import functools
+import sysconfig
try:
import ctypes
except ImportError:
@@ -36,7 +37,7 @@ PROTOCOLS = sorted(ssl._PROTOCOL_NAMES)
HOST = support.HOST
IS_LIBRESSL = ssl.OPENSSL_VERSION.startswith('LibreSSL')
IS_OPENSSL_1_1 = not IS_LIBRESSL and ssl.OPENSSL_VERSION_INFO >= (1, 1, 0)
-
+PY_SSL_DEFAULT_CIPHERS = sysconfig.get_config_var('PY_SSL_DEFAULT_CIPHERS')
def data_file(*name):
return os.path.join(os.path.dirname(__file__), *name)
@@ -889,6 +890,19 @@ class ContextTests(unittest.TestCase):
with self.assertRaisesRegex(ssl.SSLError, "No cipher can be selected"):
ctx.set_ciphers("^$:,;?*'dorothyx")
+ @unittest.skipUnless(PY_SSL_DEFAULT_CIPHERS == 1,
+ "Test applies only to Python default ciphers")
+ def test_python_ciphers(self):
+ ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+ ciphers = ctx.get_ciphers()
+ for suite in ciphers:
+ name = suite['name']
+ self.assertNotIn("PSK", name)
+ self.assertNotIn("SRP", name)
+ self.assertNotIn("MD5", name)
+ self.assertNotIn("RC4", name)
+ self.assertNotIn("3DES", name)
+
@unittest.skipIf(ssl.OPENSSL_VERSION_INFO < (1, 0, 2, 0, 0), 'OpenSSL too old')
def test_get_ciphers(self):
ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1)
diff --git a/Modules/_ssl.c b/Modules/_ssl.c
index df8c6a7..e23a569 100644
--- a/Modules/_ssl.c
+++ b/Modules/_ssl.c
@@ -206,6 +206,31 @@ SSL_SESSION_get_ticket_lifetime_hint(const SSL_SESSION *s)
#endif /* OpenSSL < 1.1.0 or LibreSSL */
+/* Default cipher suites */
+#ifndef PY_SSL_DEFAULT_CIPHERS
+#define PY_SSL_DEFAULT_CIPHERS 1
+#endif
+
+#if PY_SSL_DEFAULT_CIPHERS == 0
+ #ifndef PY_SSL_DEFAULT_CIPHER_STRING
+ #error "Py_SSL_DEFAULT_CIPHERS 0 needs Py_SSL_DEFAULT_CIPHER_STRING"
+ #endif
+#elif PY_SSL_DEFAULT_CIPHERS == 1
+/* Python custom selection of sensible cipher suites
+ * DEFAULT: OpenSSL's default cipher list. Since 1.0.2 the list is in sensible order.
+ * !aNULL:!eNULL: really no NULL ciphers
+ * !MD5:!3DES:!DES:!RC4:!IDEA:!SEED: no weak or broken algorithms on old OpenSSL versions.
+ * !aDSS: no authentication with discrete logarithm DSA algorithm
+ * !SRP:!PSK: no secure remote password or pre-shared key authentication
+ */
+ #define PY_SSL_DEFAULT_CIPHER_STRING "DEFAULT:!aNULL:!eNULL:!MD5:!3DES:!DES:!RC4:!IDEA:!SEED:!aDSS:!SRP:!PSK"
+#elif PY_SSL_DEFAULT_CIPHERS == 2
+/* Ignored in SSLContext constructor, only used as _ssl.DEFAULT_CIPHER_STRING */
+ #define PY_SSL_DEFAULT_CIPHER_STRING SSL_DEFAULT_CIPHER_LIST
+#else
+ #error "Unsupported PY_SSL_DEFAULT_CIPHERS"
+#endif
+
enum py_ssl_error {
/* these mirror ssl.h */
@@ -2739,7 +2764,12 @@ _ssl__SSLContext_impl(PyTypeObject *type, int proto_version)
/* A bare minimum cipher list without completely broken cipher suites.
* It's far from perfect but gives users a better head start. */
if (proto_version != PY_SSL_VERSION_SSL2) {
- result = SSL_CTX_set_cipher_list(ctx, "HIGH:!aNULL:!eNULL:!MD5");
+#if PY_SSL_DEFAULT_CIPHERS == 2
+ /* stick to OpenSSL's default settings */
+ result = 1;
+#else
+ result = SSL_CTX_set_cipher_list(ctx, PY_SSL_DEFAULT_CIPHER_STRING);
+#endif
} else {
/* SSLv2 needs MD5 */
result = SSL_CTX_set_cipher_list(ctx, "HIGH:!aNULL:!eNULL");
@@ -5279,6 +5309,9 @@ PyInit__ssl(void)
(PyObject *)&PySSLSession_Type) != 0)
return NULL;
+ PyModule_AddStringConstant(m, "_DEFAULT_CIPHERS",
+ PY_SSL_DEFAULT_CIPHER_STRING);
+
PyModule_AddIntConstant(m, "SSL_ERROR_ZERO_RETURN",
PY_SSL_ERROR_ZERO_RETURN);
PyModule_AddIntConstant(m, "SSL_ERROR_WANT_READ",
diff --git a/configure.ac b/configure.ac
index 7ea62f8..4b42393 100644
--- a/configure.ac
+++ b/configure.ac
@@ -5555,6 +5555,42 @@ if test "$have_getrandom" = yes; then
[Define to 1 if the getrandom() function is available])
fi
+# ssl module default cipher suite string
+AH_TEMPLATE(PY_SSL_DEFAULT_CIPHERS,
+ [Default cipher suites list for ssl module.
+ 1: Python's preferred selection, 2: leave OpenSSL defaults untouched, 0: custom string])
+AH_TEMPLATE(PY_SSL_DEFAULT_CIPHER_STRING,
+ [Cipher suite string for PY_SSL_DEFAULT_CIPHERS=0]
+)
+AC_MSG_CHECKING(for --with-ssl-default-suites)
+AC_ARG_WITH(ssl-default-suites,
+ AS_HELP_STRING([--with-ssl-default-suites=@<:@python|openssl|STRING@:>@],
+ [Override default cipher suites string,
+ python: use Python's preferred selection (default),
+ openssl: leave OpenSSL's defaults untouched,
+ STRING: use a custom string,
+ PROTOCOL_SSLv2 ignores the setting]),
+[
+AC_MSG_RESULT($withval)
+case "$withval" in
+ python)
+ AC_DEFINE(PY_SSL_DEFAULT_CIPHERS, 1)
+ ;;
+ openssl)
+ AC_DEFINE(PY_SSL_DEFAULT_CIPHERS, 2)
+ ;;
+ *)
+ AC_DEFINE(PY_SSL_DEFAULT_CIPHERS, 0)
+ AC_DEFINE_UNQUOTED(PY_SSL_DEFAULT_CIPHER_STRING, "$withval")
+ ;;
+esac
+],
+[
+AC_MSG_RESULT(python)
+AC_DEFINE(PY_SSL_DEFAULT_CIPHERS, 1)
+])
+
+
# generate output files
AC_CONFIG_FILES(Makefile.pre Modules/Setup.config Misc/python.pc Misc/python-config.sh)
AC_CONFIG_FILES([Modules/ld_so_aix], [chmod +x Modules/ld_so_aix])
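
A quick way to see the effect of this patch on an installed interpreter is to read the _ssl._DEFAULT_CIPHERS constant it adds and check that the resulting contexts drop the weak suites the default string excludes (a rough sketch, assuming a build configured with the default --with-ssl-default-suites=python):

import ssl
from _ssl import _DEFAULT_CIPHERS  # added by this patch; absent on unpatched builds

print(_DEFAULT_CIPHERS)  # the compiled-in default cipher string

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS)
names = [suite["name"] for suite in ctx.get_ciphers()]
assert not any(bad in name for name in names
               for bad in ("MD5", "RC4", "3DES", "PSK", "SRP"))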


@@ -0,0 +1,104 @@
From 5affd5c29eb1493cb31ef3cfdde15538ac134689 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Miro=20Hron=C4=8Dok?= <miro@hroncok.cz>
Date: Tue, 13 Mar 2018 10:56:43 +0100
Subject: [PATCH] bpo-32885: Tools/scripts/pathfix.py: Add -n option for no
backup~ (#5772)
Creating backup files with ~ suffix can be undesirable in some environments,
such as when building RPM packages. Instead of requiring the user to remove
those files manually, option -n was added, that simply disables this feature.
-n was selected because 2to3 has the same option with this behavior.
---
Misc/ACKS | 1 +
.../2018-02-20-12-16-47.bpo-32885.dL5x7C.rst | 2 ++
Tools/scripts/pathfix.py | 28 +++++++++++++++-------
3 files changed, 23 insertions(+), 8 deletions(-)
create mode 100644 Misc/NEWS.d/next/Tools-Demos/2018-02-20-12-16-47.bpo-32885.dL5x7C.rst
diff --git a/Misc/ACKS b/Misc/ACKS
index d8179c8b03ab..d752d8a35434 100644
--- a/Misc/ACKS
+++ b/Misc/ACKS
@@ -687,6 +687,7 @@ Ken Howard
Brad Howes
Mike Hoy
Ben Hoyt
+Miro Hrončok
Chiu-Hsiang Hsu
Chih-Hao Huang
Christian Hudon
diff --git a/Misc/NEWS.d/next/Tools-Demos/2018-02-20-12-16-47.bpo-32885.dL5x7C.rst b/Misc/NEWS.d/next/Tools-Demos/2018-02-20-12-16-47.bpo-32885.dL5x7C.rst
new file mode 100644
index 000000000000..e003e1d84fd0
--- /dev/null
+++ b/Misc/NEWS.d/next/Tools-Demos/2018-02-20-12-16-47.bpo-32885.dL5x7C.rst
@@ -0,0 +1,2 @@
+Add an ``-n`` flag for ``Tools/scripts/pathfix.py`` to disable automatic
+backup creation (files with ``~`` suffix).
diff --git a/Tools/scripts/pathfix.py b/Tools/scripts/pathfix.py
index 562bbc737812..c5bf984306a3 100755
--- a/Tools/scripts/pathfix.py
+++ b/Tools/scripts/pathfix.py
@@ -7,8 +7,9 @@
# Directories are searched recursively for files whose name looks
# like a python module.
# Symbolic links are always ignored (except as explicit directory
-# arguments). Of course, the original file is kept as a back-up
-# (with a "~" attached to its name).
+# arguments).
+# The original file is kept as a back-up (with a "~" attached to its name),
+# -n flag can be used to disable this.
#
# Undoubtedly you can do this using find and sed or perl, but this is
# a nice example of Python code that recurses down a directory tree
@@ -31,14 +32,17 @@
new_interpreter = None
preserve_timestamps = False
+create_backup = True
+
def main():
global new_interpreter
global preserve_timestamps
- usage = ('usage: %s -i /interpreter -p file-or-directory ...\n' %
+ global create_backup
+ usage = ('usage: %s -i /interpreter -p -n file-or-directory ...\n' %
sys.argv[0])
try:
- opts, args = getopt.getopt(sys.argv[1:], 'i:p')
+ opts, args = getopt.getopt(sys.argv[1:], 'i:pn')
except getopt.error as msg:
err(str(msg) + '\n')
err(usage)
@@ -48,6 +52,8 @@ def main():
new_interpreter = a.encode()
if o == '-p':
preserve_timestamps = True
+ if o == '-n':
+ create_backup = False
if not new_interpreter or not new_interpreter.startswith(b'/') or \
not args:
err('-i option or file-or-directory missing\n')
@@ -134,10 +140,16 @@ def fix(filename):
except OSError as msg:
err('%s: warning: chmod failed (%r)\n' % (tempname, msg))
# Then make a backup of the original file as filename~
- try:
- os.rename(filename, filename + '~')
- except OSError as msg:
- err('%s: warning: backup failed (%r)\n' % (filename, msg))
+ if create_backup:
+ try:
+ os.rename(filename, filename + '~')
+ except OSError as msg:
+ err('%s: warning: backup failed (%r)\n' % (filename, msg))
+ else:
+ try:
+ os.remove(filename)
+ except OSError as msg:
+ err('%s: warning: removing failed (%r)\n' % (filename, msg))
# Now move the temp file to the original file
try:
os.rename(tempname, filename)
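
In a build script the new option is typically combined with -p and -i; a hedged usage sketch (paths are illustrative) driving the tool from Python:

import subprocess
import sys

# Rewrite the shebang of a script in place: -i sets the new interpreter,
# -p preserves timestamps, -n (added by this patch) skips the "~" backup file.
subprocess.check_call([
    sys.executable, "Tools/scripts/pathfix.py",
    "-i", "/usr/bin/python3",
    "-p", "-n",
    "scripts/example.py",  # illustrative target path
])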


@@ -1,55 +0,0 @@
From 0d41a311e805af08637e3f6dc0fb6fae32e508ab Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Miro=20Hron=C4=8Dok?= <miro@hroncok.cz>
Date: Thu, 11 Jul 2019 13:44:13 +0200
Subject: [PATCH] 00328: Restore pyc to TIMESTAMP invalidation mode as default
in rpmbuild
Since Fedora 31, the $SOURCE_DATE_EPOCH is set in rpmbuild to the latest
%changelog date. This makes Python default to the CHECKED_HASH pyc
invalidation mode, bringing more reproducible builds traded for an import
performance decrease. To avoid that, we don't default to CHECKED_HASH
when $RPM_BUILD_ROOT is set (i.e. when we are building RPM packages).
See https://src.fedoraproject.org/rpms/redhat-rpm-config/pull-request/57#comment-27426
---
Lib/py_compile.py | 3 ++-
Lib/test/test_py_compile.py | 2 ++
2 files changed, 4 insertions(+), 1 deletion(-)
diff --git a/Lib/py_compile.py b/Lib/py_compile.py
index 21736896af..310bed5620 100644
--- a/Lib/py_compile.py
+++ b/Lib/py_compile.py
@@ -70,7 +70,8 @@ class PycInvalidationMode(enum.Enum):
def _get_default_invalidation_mode():
- if os.environ.get('SOURCE_DATE_EPOCH'):
+ if (os.environ.get('SOURCE_DATE_EPOCH') and not
+ os.environ.get('RPM_BUILD_ROOT')):
return PycInvalidationMode.CHECKED_HASH
else:
return PycInvalidationMode.TIMESTAMP
diff --git a/Lib/test/test_py_compile.py b/Lib/test/test_py_compile.py
index d4a68c9320..ed09874023 100644
--- a/Lib/test/test_py_compile.py
+++ b/Lib/test/test_py_compile.py
@@ -17,6 +17,7 @@ def without_source_date_epoch(fxn):
def wrapper(*args, **kwargs):
with support.EnvironmentVarGuard() as env:
env.unset('SOURCE_DATE_EPOCH')
+ env.unset('RPM_BUILD_ROOT')
return fxn(*args, **kwargs)
return wrapper
@@ -27,6 +28,7 @@ def with_source_date_epoch(fxn):
def wrapper(*args, **kwargs):
with support.EnvironmentVarGuard() as env:
env['SOURCE_DATE_EPOCH'] = '123456789'
+ env.unset('RPM_BUILD_ROOT')
return fxn(*args, **kwargs)
return wrapper
--
2.24.1
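
For context, the two modes this removed patch chooses between can be exercised directly through the py_compile API on Python 3.7+ (the file name is illustrative; nothing below is part of the patch):

import py_compile

# TIMESTAMP pycs embed the source mtime; CHECKED_HASH pycs embed a hash of the
# source, which is reproducible but costs a hash check on every import.
py_compile.compile("example.py",
                   invalidation_mode=py_compile.PycInvalidationMode.TIMESTAMP)
py_compile.compile("example.py",
                   invalidation_mode=py_compile.PycInvalidationMode.CHECKED_HASH)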


@@ -0,0 +1,60 @@
"""Checks if all *.pyc and *.pyo files have later mtime than their *.py files."""
import imp
import os
import sys
# list of test and other files that we expect not to have bytecode
not_compiled = [
'/usr/bin/pathfix.py',
'test/bad_coding.py',
'test/bad_coding2.py',
'test/badsyntax_3131.py',
'test/badsyntax_future3.py',
'test/badsyntax_future4.py',
'test/badsyntax_future5.py',
'test/badsyntax_future6.py',
'test/badsyntax_future7.py',
'test/badsyntax_future8.py',
'test/badsyntax_future9.py',
'test/badsyntax_future10.py',
'test/badsyntax_async1.py',
'test/badsyntax_async2.py',
'test/badsyntax_async3.py',
'test/badsyntax_async4.py',
'test/badsyntax_async5.py',
'test/badsyntax_async6.py',
'test/badsyntax_async7.py',
'test/badsyntax_async8.py',
'test/badsyntax_async9.py',
'test/badsyntax_pep3120.py',
'lib2to3/tests/data/bom.py',
'lib2to3/tests/data/crlf.py',
'lib2to3/tests/data/different_encoding.py',
'lib2to3/tests/data/false_encoding.py',
'lib2to3/tests/data/py2_test_grammar.py',
'.debug-gdb.py',
]
failed = 0
def bytecode_expected(source):
for f in not_compiled:
if source.endswith(f):
return False
return True
compiled = filter(lambda f: bytecode_expected(f), sys.argv[1:])
for f in compiled:
# check both pyo and pyc
to_check = map(lambda b: imp.cache_from_source(f, b), (True, False))
f_mtime = os.path.getmtime(f)
for c in to_check:
c_mtime = os.path.getmtime(c)
if c_mtime < f_mtime:
sys.stderr.write('Failed bytecompilation timestamps check: ')
sys.stderr.write('Bytecode file {} is older than source file {}.\n'.format(c, f))
failed += 1
if failed:
sys.stderr.write('\n{} files failed bytecompilation timestamps check.\n'.format(failed))
sys.exit(1)


@@ -1,55 +0,0 @@
"""Checks if all *.pyc files have later mtime than their *.py files."""
import os
import sys
from importlib.util import cache_from_source
from pathlib import Path
RPM_BUILD_ROOT = os.environ.get('RPM_BUILD_ROOT', '')
# ...cpython-3X.pyc
# ...cpython-3X.opt-1.pyc
# ...cpython-3X.opt-2.pyc
LEVELS = (None, 1, 2)
# list of globs of test and other files that we expect not to have bytecode
not_compiled = [
'/usr/bin/*',
'*/test/bad_coding.py',
'*/test/bad_coding2.py',
'*/test/badsyntax_*.py',
'*/lib2to3/tests/data/bom.py',
'*/lib2to3/tests/data/crlf.py',
'*/lib2to3/tests/data/different_encoding.py',
'*/lib2to3/tests/data/false_encoding.py',
'*/lib2to3/tests/data/py2_test_grammar.py',
'*.debug-gdb.py',
]
def bytecode_expected(path):
path = Path(path[len(RPM_BUILD_ROOT):])
for glob in not_compiled:
if path.match(glob):
return False
return True
failed = 0
compiled = (path for path in sys.argv[1:] if bytecode_expected(path))
for path in compiled:
to_check = (cache_from_source(path, optimization=opt) for opt in LEVELS)
f_mtime = os.path.getmtime(path)
for pyc in to_check:
c_mtime = os.path.getmtime(pyc)
if c_mtime < f_mtime:
print('Failed bytecompilation timestamps check: '
f'Bytecode file {pyc} is older than source file {path}',
file=sys.stderr)
failed += 1
if failed:
print(f'\n{failed} files failed bytecompilation timestamps check.',
file=sys.stderr)
sys.exit(1)

macros.pybytecompile3.6 Normal file

@@ -0,0 +1,25 @@
# Note that the path could itself be a python file, or a directory
# Python's compileall module only works on directories, and requires a max
# recursion depth
# Note that the py_byte_compile macro should work for python2 as well
# Which unfortunately makes the definition more complicated than it should be
# The condition should be reversed once /usr/bin/python is python3!
%py_byte_compile()\
py2_byte_compile () {\
python_binary="%1"\
bytecode_compilation_path="%2"\
find $bytecode_compilation_path -type f -a -name "*.py" -print0 | xargs -0 $python_binary -c 'import py_compile, sys; [py_compile.compile(f, dfile=f.partition("$RPM_BUILD_ROOT")[2]) for f in sys.argv[1:]]' || :\
find $bytecode_compilation_path -type f -a -name "*.py" -print0 | xargs -0 $python_binary -O -c 'import py_compile, sys; [py_compile.compile(f, dfile=f.partition("$RPM_BUILD_ROOT")[2]) for f in sys.argv[1:]]' || :\
}\
\
py3_byte_compile () {\
python_binary="%1"\
bytecode_compilation_path="%2"\
find $bytecode_compilation_path -type f -a -name "*.py" -print0 | xargs -0 $python_binary -O -c 'import py_compile, sys; [py_compile.compile(f, dfile=f.partition("$RPM_BUILD_ROOT")[2], optimize=opt) for opt in range(2) for f in sys.argv[1:]]' || :\
}\
\
[[ "%1" == *python3* ]] || py2_byte_compile "%1" "%2" && py3_byte_compile "%1" "%2" \
%{nil}
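
Roughly, the py3_byte_compile helper runs the equivalent of the following for every file found under the buildroot (paths are illustrative stand-ins, not taken from the macro):

import py_compile

buildroot = "/builddir/build/BUILDROOT/example-1.0"  # stand-in for $RPM_BUILD_ROOT
source = buildroot + "/usr/lib/python3.6/site-packages/example.py"

# Compile at optimization levels 0 and 1, recording the path without the
# buildroot prefix so tracebacks point at the installed location.
for opt in range(2):
    py_compile.compile(source, dfile=source.partition(buildroot)[2], optimize=opt)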

macros.systempython Normal file

@@ -0,0 +1 @@
%system_python_abi %{expand: }

pubkeys.txt

File diff suppressed because it is too large


@@ -4,7 +4,7 @@ addFilter(r'crypto-policy-non-compliance-openssl')
# TESTS:
addFilter(r'(zero-length|pem-certificate|uncompressed-zip) /usr/lib(64)?/python3\.\d+/test')
addFilter(r'(zero-length|pem-certificate|uncompressed-zip) /usr/lib(64)?/python3.\d/test')
# OTHER DELIBERATES:
@@ -13,72 +13,52 @@ addFilter(r'missing-call-to-chdir-with-chroot')
# intentionally unversioned and selfobsoleted
addFilter(r'unversioned-explicit-obsoletes python')
addFilter(r'self-obsoletion python3\d+ obsoletes python3\d+')
addFilter(r'self-obsoletion python3\d obsoletes python3\d')
# intentionally hardcoded
addFilter(r'hardcoded-library-path in %{_prefix}/lib/(debug/%{_libdir}|python%{pybasever})')
# intentional for our pythonXY package
addFilter(r'python3\d+\.[^:]+: (E|W): devel-file-in-non-devel-package')
# we have non binary stuff, python files
addFilter(r'only-non-binary-in-usr-lib')
# some devel files that are deliberately needed
addFilter(r'devel-file-in-non-devel-package /usr/include/python3\.\d+m?/pyconfig-(32|64)\.h')
addFilter(r'devel-file-in-non-devel-package /usr/lib(64)?/python3\.\d+/distutils/tests/xxmodule\.c')
addFilter(r'devel-file-in-non-devel-package /usr/include/python3\.\dm/pyconfig-(32|64)\.h')
addFilter(r'devel-file-in-non-devel-package /usr/lib64/python3\.\d/distutils/tests/xxmodule\.c')
# SORRY, NOT SORRY:
# manual pages
addFilter(r'no-manual-page-for-binary (idle|pydoc|pyvenv|2to3|python3?-debug|pathfix|msgfmt|pygettext)')
addFilter(r'no-manual-page-for-binary python3?.*-config$')
addFilter(r'no-manual-page-for-binary python3\.\d+d?m?$')
addFilter(r'no-manual-page-for-binary (idle|pydoc|pyvenv|2to3|python3-debug|pathfix\.py)')
addFilter(r'no-manual-page-for-binary python3.*-config$')
addFilter(r'no-manual-page-for-binary python3.\dd?m$')
# missing documentation from subpackages
addFilter(r'^python3\d*-(debug|tkinter|test|idle)\.[^:]+: (E|W): no-documentation')
addFilter(r'^python3\d?-(debug|tkinter|test|idle)\.[^:]+: (E|W): no-documentation')
# platform python is obsoleted, but not provided
addFilter(r'obsolete-not-provided platform-python')
# RPMLINT IMPERFECTIONS
# https://github.com/rpm-software-management/rpmlint/issues/123
addFilter(r'python-bytecode-wrong-magic-value .* expected 33\d\d \(3\.7\), found 3393')
# https://github.com/rpm-software-management/rpmlint/pull/133
addFilter(r'python-bytecode-wrong-magic-value .* expected 33\d\d \(3\.7\), found 3394')
# https://bugzilla.redhat.com/show_bug.cgi?id=1550562
# https://github.com/rpm-software-management/rpmlint/issues/128
addFilter(r'python-bytecode-inconsistent-mtime .* 1970')
# RPMLINT IMPERFECTIONS:
# ifarch applied patches are OK
# https://fedoraproject.org/wiki/Packaging:Guidelines#Architecture_Support
addFilter(r'%ifarch-applied-patch')
# debugsource
addFilter(r'^python3\d*-debugsource\.[^:]+: (E|W): no-documentation')
addFilter(r'^python3\d?-debugsource\.[^:]+: (E|W): no-documentation')
# debuginfo
addFilter(r'^python3\d*-debuginfo\.[^:]+: (E|W): useless-provides debuginfo\(build-id\)')
# this is OK for F28+
addFilter(r'library-without-ldconfig-post')
addFilter(r'^python3\d?-debuginfo\.[^:]+: (E|W): useless-provides debuginfo\(build-id\)')
# debug package contains devel and non-devel files
addFilter(r'python3\d*-debug\.[^:]+: (E|W): (non-)?devel-file-in-(non-)?devel-package')
addFilter(r'python3\d?-debug.[^:]+: (E|W): (non-)?devel-file-in-(non-)?devel-package')
# this goes to other subpackage, hence not actually dangling, the read error is bogus
addFilter(r'dangling-relative-symlink /usr/lib(64)?/pkgconfig/python-3\.\d+dm?(-embed)?\.pc python-3\.\d+(-embed)?\.pc')
addFilter(r'read-error /usr/lib(64)?/pkgconfig/python-3\.\d+dm?(-embed)?\.pc \[Errno 2\]')
# the python-unversioned-command package contains dangling symlinks by design
addFilter(r'^python-unversioned-command\.[^:]+: (E|W): dangling-relative-symlink '
r'(/usr/bin/python \./python3|/usr/share/man/man1/python\.1\S* ./python3\.1\S*)$')
addFilter(r'dangling-relative-symlink /usr/lib(64)?/pkgconfig/python-3\.\ddm\.pc python-3\.\d\.pc')
addFilter(r'read-error /usr/lib(64)?/pkgconfig/python-3\.\ddm\.pc \[Errno 2\]')
# we need this macro to evaluate, even if the line starts with #
addFilter(r'macro-in-comment %\{_pyconfig(32|64)_h\}')
# Python modules don't need to be linked against libc
# Since 3.8 they are no longer linked against libpython3.8.so.1.0
addFilter(r'E: library-not-linked-against-libc /usr/lib(64)?/python3\.\d+/lib-dynload/')
addFilter(r'E: shared-lib-without-dependency-information /usr/lib(64)?/python3\.\d+/lib-dynload/')
# SPELLING ERRORS
addFilter(r'spelling-error .* en_US (bytecode|pyc|filename|tkinter|namespaces|pytest) ')

File diff suppressed because it is too large


@@ -1,2 +1 @@
SHA512 (Python-3.8.2.tar.xz) = ca37ad0e7c5845f5f228566aa8ff654a8f428c7d4a5aaabff29baebb0ca3219b31ba8bb2607f89e37cf3fc564f023b8407e53a4f2c47bd99122c1cc222613e37
SHA512 (Python-3.8.2.tar.xz.asc) = 765796ab5539576bbf1578e05cdb041dbc9a9ca0d6d2040a473a00a293b49f90be11ea6e33b47889da33b25f8e360fad4adeec292f0d43e5fae233d1f03bafd2
SHA512 (Python-3.6.5.tar.xz) = 6b26fcd296b9bd8e67861eff10d14db7507711ddba947288d16d6def53135c39326b7f969c04bb2b2993f924d9e7ad3f5c5282a3915760bc0885cf0a8ea5eb51


@@ -1 +0,0 @@
1


@@ -1,4 +0,0 @@
---
standard-inventory-qcow2:
qemu:
m: 3G # Amount of VM memory


@@ -1,34 +0,0 @@
---
- hosts: localhost
roles:
- role: standard-test-basic
tags:
- classic
repositories:
- repo: "https://src.fedoraproject.org/tests/python.git"
dest: "python"
tests:
- smoke:
dir: python/smoke
run: VERSION=3.8 ./venv.sh
- debugsmoke:
dir: python/smoke
run: PYTHON=python3-debug TOX=false VERSION=3.8 ./venv.sh
- selftest:
dir: python/selftest
run: VERSION=3.8 X="-x test_wsgiref" ./parallel.sh
- debugtest:
dir: python/selftest
run: VERSION=3.8 PYTHON=python3-debug X="-x test_wsgiref" ./parallel.sh
- debugflags:
dir: python/flags
run: python3-debug ./assertflags.py -Og
required_packages:
- gcc # for extension building in venv and selftest
- gdb # for test_gdb
- python3-debug # for leak testing
- python3-devel # for extension building in venv and selftest
- python3-tkinter # for selftest
- python3-test # for selftest
- python3-tox # for venv tests
- glibc-all-langpacks # for locale tests