
Packaging the pyspark demo als.py with PyInstaller fails, but it runs fine locally? 400 error


D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01>pyinstaller als.py
52 INFO: PyInstaller: 3.5
53 INFO: Python: 2.7.15
53 INFO: Platform: Windows-7-6.1.7601-SP1
55 INFO: wrote D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01\als.spec
59 INFO: UPX is not available.
62 INFO: Extending PYTHONPATH with paths
['D:\\ra\\spark-2.4.3-bin-hadoop2\\spark-2.4.3-bin-hadoop2.7\\examples\\src\\main\\python\\test01',
 'D:\\ra\\spark-2.4.3-bin-hadoop2\\spark-2.4.3-bin-hadoop2.7\\examples\\src\\main\\python\\test01']
62 INFO: checking Analysis
62 INFO: Building Analysis because Analysis-00.toc is non existent
62 INFO: Initializing module dependency graph...
65 INFO: Initializing module graph hooks...
158 INFO: running Analysis Analysis-00.toc
174 INFO: Adding Microsoft.VC90.CRT to dependent assemblies of final executable
  required by e:\programdata\anaconda22\python.exe
226 INFO: Found C:\Windows\WinSxS\Manifests\amd64_policy.9.0.microsoft.vc90.crt_1fc8b3b9a1e18e3b_9.0.30729.4940_none_acd19a1fe1da248a.manifest
274 INFO: Searching for assembly amd64_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.30729.4940_none ...
274 INFO: Found manifest C:\Windows\WinSxS\Manifests\amd64_microsoft.vc90.crt_1fc8b3b9a1e18e3b_9.0.30729.4940_none_08e4299fa83d7e3c.manifest
277 INFO: Searching for file msvcr90.dll
277 INFO: Found file C:\Windows\WinSxS\amd64_microsoft.vc90.crt_1fc8b3b9a1e18e3b_9.0.30729.4940_none_08e4299fa83d7e3c\msvcr90.dll
277 INFO: Searching for file msvcp90.dll
277 INFO: Found file C:\Windows\WinSxS\amd64_microsoft.vc90.crt_1fc8b3b9a1e18e3b_9.0.30729.4940_none_08e4299fa83d7e3c\msvcp90.dll
277 INFO: Searching for file msvcm90.dll
277 INFO: Found file C:\Windows\WinSxS\amd64_microsoft.vc90.crt_1fc8b3b9a1e18e3b_9.0.30729.4940_none_08e4299fa83d7e3c\msvcm90.dll
322 INFO: Found C:\Windows\WinSxS\Manifests\amd64_policy.9.0.microsoft.vc90.crt_1fc8b3b9a1e18e3b_9.0.30729.4940_none_acd19a1fe1da248a.manifest
323 INFO: Adding redirect Microsoft.VC90.CRT version (9, 0, 21022, 8) -> (9, 0, 30729, 4940)
356 INFO: Caching module hooks...
361 INFO: Analyzing D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01\als.py
5082 INFO: Processing pre-find module path hook   distutils
6496 INFO: Processing pre-safe import module hook   six.moves
6914 INFO: Processing pre-safe import module hook   _xmlplus
10487 INFO: Processing pre-safe import module hook   setuptools.extern.six.moves
11378 INFO: Processing pre-find module path hook   site
11378 INFO: site: retargeting to fake-dir 'e:\\programdata\\anaconda22\\lib\\site-packages\\PyInstaller\\fake-modules'
15004 INFO: Loading module hooks...
15004 INFO: Loading module hook "hook-distutils.py"...
15006 INFO: Loading module hook "hook-sysconfig.py"...
15006 INFO: Loading module hook "hook-xml.py"...
15112 INFO: Loading module hook "hook-pycparser.py"...
15292 INFO: Loading module hook "hook-scipy.py"...
15294 INFO: Loading module hook "hook-httplib.py"...
15296 INFO: Loading module hook "hook-pydoc.py"...
15297 INFO: Excluding import 'Tkinter'
15299 INFO:   Removing import of Tkinter from module pydoc
15299 INFO: Loading module hook "hook-encodings.py"...
15730 INFO: Loading module hook "hook-_tkinter.py"...
15848 INFO: checking Tree
15848 INFO: Building Tree because Tree-00.toc is non existent
15848 INFO: Building Tree Tree-00.toc
15996 INFO: checking Tree
15997 INFO: Building Tree because Tree-01.toc is non existent
15997 INFO: Building Tree Tree-01.toc
16020 INFO: Loading module hook "hook-pkg_resources.py"...
16595 INFO: Processing pre-safe import module hook   win32com
16940 INFO: Loading module hook "hook-numpy.py"...
16941 INFO: Loading module hook "hook-pywintypes.py"...
17333 INFO: Loading module hook "hook-setuptools.py"...
18047 INFO: Loading module hook "hook-pytest.py"...
18990 INFO: Loading module hook "hook-numpy.core.py"...
19047 INFO: MKL libraries found when importing numpy. Adding MKL to binaries
19049 INFO: Loading module hook "hook-win32com.py"...
19638 INFO: Loading module hook "hook-pythoncom.py"...
20032 INFO: Loading module hook "hook-xml.dom.domreg.py"...
20056 INFO: Looking for ctypes DLLs
20125 INFO: Analyzing run-time hooks ...
20131 INFO: Including run-time hook 'pyi_rth_pkgres.py'
20132 INFO: Including run-time hook 'pyi_rth_win32comgenpy.py'
20134 INFO: Including run-time hook 'pyi_rth_multiprocessing.py'
20148 INFO: Looking for dynamic libraries
20230 WARNING: lib not found: mpich2mpi.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_blacs_mpich2_lp64.dll
20316 WARNING: lib not found: mpich2mpi.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_blacs_mpich2_ilp64.dll
20384 WARNING: lib not found: impi.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_blacs_intelmpi_lp64.dll
20453 WARNING: lib not found: msmpi.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_blacs_msmpi_ilp64.dll
20529 WARNING: lib not found: impi.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_blacs_intelmpi_ilp64.dll
20654 WARNING: lib not found: msmpi.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_blacs_msmpi_lp64.dll
20793 WARNING: lib not found: tbb.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_tbb_thread.dll
20888 WARNING: lib not found: pgc14.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_pgi_thread.dll
20953 WARNING: lib not found: pgf90rtl.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_pgi_thread.dll
21022 WARNING: lib not found: pgf90.dll dependency of e:\programdata\anaconda22\Library\bin\mkl_pgi_thread.dll
21552 INFO: Searching for assembly amd64_Microsoft.VC90.MFC_1fc8b3b9a1e18e3b_9.0.21022.8_none ...
21552 INFO: Found manifest e:\programdata\anaconda22\lib\site-packages\Pythonwin\Microsoft.VC90.MFC.manifest
21556 INFO: Searching for file mfc90.dll
21556 INFO: Found file e:\programdata\anaconda22\lib\site-packages\Pythonwin\mfc90.dll
21559 INFO: Searching for file mfc90u.dll
21562 INFO: Found file e:\programdata\anaconda22\lib\site-packages\Pythonwin\mfc90u.dll
21565 INFO: Searching for file mfcm90.dll
21569 INFO: Found file e:\programdata\anaconda22\lib\site-packages\Pythonwin\mfcm90.dll
21572 INFO: Searching for file mfcm90u.dll
21575 INFO: Found file e:\programdata\anaconda22\lib\site-packages\Pythonwin\mfcm90u.dll
21632 INFO: Adding redirect Microsoft.VC90.MFC version (9, 0, 21022, 8) -> (9, 0, 21022, 8)
22109 INFO: Looking for eggs
22111 INFO: Using Python library e:\programdata\anaconda22\python27.dll
22111 INFO: Found binding redirects:
[BindingRedirect(name=u'Microsoft.VC90.MFC', language=None, arch=u'amd64', oldVersion=(9, 0, 21022, 8), newVersion=(9, 0, 21022, 8), publicKeyToken=u'1fc8b3b9a1e18e3b'), BindingRedirect(name=u'Microsoft.VC90.CRT', language=None, arch=u'amd64', oldVersion=(9, 0, 21022, 8), newVersion=(9, 0, 30729, 4940), publicKeyToken=u'1fc8b3b9a1e18e3b')]
22126 INFO: Warnings written to D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01\build\als\warn-als.txt
22218 INFO: Graph cross-reference written to D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01\build\als\xref-als.html
22343 INFO: checking PYZ
22345 INFO: Building PYZ because PYZ-00.toc is non existent
22345 INFO: Building PYZ (ZlibArchive) D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01\build\als\PYZ-00.pyz
23424 INFO: Building PYZ (ZlibArchive) D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01\build\als\PYZ-00.pyz completed successfully.
23532 INFO: checking PKG
23532 INFO: Building PKG because PKG-00.toc is non existent
23533 INFO: Building PKG (CArchive) PKG-00.pkg
23557 INFO: Building PKG (CArchive) PKG-00.pkg completed successfully.
23562 INFO: Bootloader e:\programdata\anaconda22\lib\site-packages\PyInstaller\bootloader\Windows-64bit\run.exe
23562 INFO: checking EXE
23563 INFO: Building EXE because EXE-00.toc is non existent
23566 INFO: Building EXE from EXE-00.toc
23572 INFO: Appending archive to EXE D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01\build\als\als.exe
23582 INFO: Building EXE from EXE-00.toc completed successfully.
23590 INFO: checking COLLECT
23590 INFO: Building COLLECT because COLLECT-00.toc is non existent
23592 INFO: Building COLLECT COLLECT-00.toc
23605 INFO: Redirecting Microsoft.VC90.CRT version (9, 0, 21022, 8) -> (9, 0, 30729, 4940)
Traceback (most recent call last):
  File "e:\programdata\anaconda22\lib\runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "e:\programdata\anaconda22\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "E:\ProgramData\Anaconda22\Scripts\pyinstaller.exe\__main__.py", line 7, in <module>
  File "e:\programdata\anaconda22\lib\site-packages\PyInstaller\__main__.py", line 111, in run
    run_build(pyi_config, spec_file, **vars(args))
  File "e:\programdata\anaconda22\lib\site-packages\PyInstaller\__main__.py", line 63, in run_build
    PyInstaller.building.build_main.main(pyi_config, spec_file, **kwargs)
  File "e:\programdata\anaconda22\lib\site-packages\PyInstaller\building\build_main.py", line 844, in main
    build(specfile, kw.get('distpath'), kw.get('workpath'), kw.get('clean_build'))
  File "e:\programdata\anaconda22\lib\site-packages\PyInstaller\building\build_main.py", line 791, in build
    exec(code, spec_namespace)
  File "D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01\als.spec", line 37, in <module>
    name='als')
  File "e:\programdata\anaconda22\lib\site-packages\PyInstaller\building\api.py", line 693, in __init__
    self.__postinit__()
  File "e:\programdata\anaconda22\lib\site-packages\PyInstaller\building\datastruct.py", line 158, in __postinit__
    self.assemble()
  File "e:\programdata\anaconda22\lib\site-packages\PyInstaller\building\api.py", line 725, in assemble
    dist_nm=inm)
  File "e:\programdata\anaconda22\lib\site-packages\PyInstaller\building\utils.py", line 281, in checkCache
    shutil.copy(fnm, cachedfile)
  File "e:\programdata\anaconda22\lib\shutil.py", line 133, in copy
    copyfile(src, dst)
  File "e:\programdata\anaconda22\lib\shutil.py", line 97, in copyfile
    with open(dst, 'wb') as fdst:
IOError: [Errno 13] Permission denied: 'C:\\Users\\Administrator\\AppData\\Roaming\\pyinstaller\\bincache00_py27_64bit\\api-ms-win-core-string-l1-1-0.dll'

D:\ra\spark-2.4.3-bin-hadoop2\spark-2.4.3-bin-hadoop2.7\examples\src\main\python\test01>

爱吃鱼的程序员 2020-06-05 14:53:59
1 Answer
The traceback shows the build failing while PyInstaller copies a DLL into its per-user binary cache:

IOError: [Errno 13] Permission denied: 'C:\\Users\\Administrator\\AppData\\Roaming\\pyinstaller\\bincache00_py27_64bit\\api-ms-win-core-string-l1-1-0.dll'
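The thread itself does not state a fix, so the following is only a hedged sketch of the usual remedies for this kind of Errno 13: the copy into `%APPDATA%\pyinstaller\bincache00_py27_64bit` fails, which typically means a previously cached DLL there is read-only or locked, or the console session lacks write permission. Common options are running the prompt as Administrator, passing `--clean` to `pyinstaller`, or deleting the cache directory so it is rebuilt. A minimal Python sketch of clearing the cache (the path is inferred from the traceback, not confirmed by the original poster):

```python
import os
import shutil

# PyInstaller keeps a per-user binary cache on Windows under %APPDATA%\pyinstaller
# (the traceback above shows bincache00_py27_64bit inside it). Removing the whole
# directory forces PyInstaller to rebuild the cache on the next run.
cache_dir = os.path.join(os.environ.get("APPDATA", ""), "pyinstaller")

if os.path.isdir(cache_dir):
    shutil.rmtree(cache_dir, ignore_errors=True)
    print("Removed PyInstaller cache:", cache_dir)
else:
    print("No PyInstaller cache found at:", cache_dir)
```

Equivalently, `pyinstaller --clean als.py` asks PyInstaller itself to remove its cache and temporary files before building.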
    
    2020-06-05 14:54:13