
FileNotFoundException for a Python file on Windows

  •  0
  • azaveri7  ·  asked 7 years ago

    I am using Spark version 2.3.

    I downloaded the zip file from Git; it contains a WordCount.py file.

    When I try to run the following command in cmd:

    spark-submit WordCount.py
    

    I get the error below.

    I am running this command from the directory into which I copied WordCount.py.

    18/10/14 15:24:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    18/10/14 15:24:43 ERROR SparkContext: Error initializing SparkContext.
    java.io.FileNotFoundException: File file:/E:/notes/Hadoop/spark/course%20projects/python-spark-tutorial-master/rdd/WordCount.py does not exist
            at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)
            at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
            at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
            at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)
            at org.apache.spark.SparkContext.addFile(SparkContext.scala:1528)
            at org.apache.spark.SparkContext.addFile(SparkContext.scala:1498)
            at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
            at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
            at scala.collection.immutable.List.foreach(List.scala:381)
            at org.apache.spark.SparkContext.<init>(SparkContext.scala:461)
            at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
            at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
            at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
            at py4j.Gateway.invoke(Gateway.java:238)
            at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
            at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
            at py4j.GatewayConnection.run(GatewayConnection.java:214)
            at java.lang.Thread.run(Thread.java:748)
    Traceback (most recent call last):
      File "E:/notes/Hadoop/spark/course projects/python-spark-tutorial-master/rdd/WordCount.py", line 5, in <module>
        sc = SparkContext(conf = conf)
      File "E:\notes\Hadoop\spark\spark_installation\python\lib\pyspark.zip\pyspark\context.py", line 118, in __init__
      File "E:\notes\Hadoop\spark\spark_installation\python\lib\pyspark.zip\pyspark\context.py", line 180, in _do_init
      File "E:\notes\Hadoop\spark\spark_installation\python\lib\pyspark.zip\pyspark\context.py", line 270, in _initialize_context
      File "E:\notes\Hadoop\spark\spark_installation\python\lib\py4j-0.10.6-src.zip\py4j\java_gateway.py", line 1428, in __call__
      File "E:\notes\Hadoop\spark\spark_installation\python\lib\py4j-0.10.6-src.zip\py4j\protocol.py", line 320, in get_return_value
    py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
    : java.io.FileNotFoundException: File file:/E:/notes/Hadoop/spark/course%20projects/python-spark-tutorial-master/rdd/WordCount.py does not exist
            at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)
            at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
            at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
            at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)
            at org.apache.spark.SparkContext.addFile(SparkContext.scala:1528)
            at org.apache.spark.SparkContext.addFile(SparkContext.scala:1498)
            at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
            at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
            at scala.collection.immutable.List.foreach(List.scala:381)
            at org.apache.spark.SparkContext.<init>(SparkContext.scala:461)
            at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
            at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
            at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
            at py4j.Gateway.invoke(Gateway.java:238)
            at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
            at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
            at py4j.GatewayConnection.run(GatewayConnection.java:214)
            at java.lang.Thread.run(Thread.java:748)
    
    1 Answer  |  last active 7 years ago
  •  2
  •   lev    7 years ago

    There is a space in the `course projects` directory name.
    Try moving the project to another directory whose path contains no spaces.
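
    The `%20` in the error message points at the root cause: Spark passes the script path through a URI layer, which percent-encodes the space, and the subsequent local-filesystem check then looks for the literal encoded name, which does not exist. A minimal sketch of that mismatch using only the standard library (the path below is illustrative, not the asker's exact layout):

    ```python
    from urllib.parse import quote, unquote

    # A Windows-style path containing a space, as in the question.
    original = "E:/notes/Hadoop/spark/course projects/rdd/WordCount.py"

    # URI encoding turns the space into %20 (keep '/' and ':' literal).
    encoded = quote(original, safe="/:")
    print(encoded)
    # E:/notes/Hadoop/spark/course%20projects/rdd/WordCount.py

    # The encoded string is a different filename than the real one,
    # so a raw filesystem lookup on it fails -- hence the
    # FileNotFoundException, even though the decoded path exists.
    print(encoded == original)        # False
    print(unquote(encoded) == original)  # True
    ```

    This is why relocating the project to a space-free path (or renaming `course projects` to, say, `course_projects`) makes the error go away: once nothing needs encoding, the URI form and the on-disk name coincide.
    
    
    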
