spark-shell fails to run

[root@liujunjie-1 conf]# spark-shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/flume-ng/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/06/17 13:53:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/17 13:53:17 INFO spark.SecurityManager: Changing view acls to: root
16/06/17 13:53:17 INFO spark.SecurityManager: Changing modify acls to: root
16/06/17 13:53:17 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/06/17 13:53:17 INFO spark.HttpServer: Starting HTTP Server
16/06/17 13:53:18 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/06/17 13:53:18 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:56812
16/06/17 13:53:18 INFO util.Utils: Successfully started service 'HTTP class server' on port 56812.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0
      /_/

Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_101)
Type in expressions to have them evaluated.
Type :help for more information.
16/06/17 13:53:36 INFO spark.SparkContext: Running Spark version 1.3.0
16/06/17 13:53:36 INFO spark.SecurityManager: Changing view acls to: root
16/06/17 13:53:36 INFO spark.SecurityManager: Changing modify acls to: root
16/06/17 13:53:36 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/06/17 13:53:39 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/06/17 13:53:39 INFO Remoting: Starting remoting
16/06/17 13:53:39 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@liujunjie-1:47675]
16/06/17 13:53:39 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriver@liujunjie-1:47675]
16/06/17 13:53:39 INFO util.Utils: Successfully started service 'sparkDriver' on port 47675.
16/06/17 13:53:40 INFO spark.SparkEnv: Registering MapOutputTracker
16/06/17 13:53:40 INFO spark.SparkEnv: Registering BlockManagerMaster
16/06/17 13:53:40 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-3b3f4837-cd18-40ab-9881-e6ff68a9902d/blockmgr-fe5fac8b-86af-4dcc-98fc-2e9d1e0805db
16/06/17 13:53:40 INFO storage.MemoryStore: MemoryStore started with capacity 267.3 MB
16/06/17 13:53:40 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-f5bbd827-9628-40b2-a7f8-20f06f35ef9a/httpd-78d20461-3dac-4e0d-b3e5-56e9f6e0ca27
16/06/17 13:53:40 INFO spark.HttpServer: Starting HTTP Server
16/06/17 13:53:40 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/06/17 13:53:40 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:56901
16/06/17 13:53:40 INFO util.Utils: Successfully started service 'HTTP file server' on port 56901.
16/06/17 13:53:40 INFO spark.SparkEnv: Registering OutputCommitCoordinator
/usr/lib/spark/bin/spark-shell: line 55: 32288 Killed                  "$FWDIR"/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"
[root@liujunjie-1 conf]#
Can anyone help me figure out what's causing this?

@CrazyChao - Life is more than the daily grind; there are also poetry and fields far away! ^.^

Upvoted by: 唯思可达

This error means the memory configured for Spark exceeds the machine's physical memory, so the process can't run; I've run into it before. Try setting an explicit value instead of relying on the default. I was on a virtual machine and fixed it by simply increasing the VM's memory. Also, the error output you pasted looks incomplete.
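
For illustration, one way to set an explicit value is to cap the memory on the command line when launching the shell (a minimal sketch; 512m is an assumed size for a small VM, not a recommendation):

spark-shell --driver-memory 512m --executor-memory 512m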

@CrazyChao - Life is more than the daily grind; there are also poetry and fields far away! ^.^

Upvoted by: 唯思可达

[root@HTY-1 conf]# cp spark-defaults.conf.template spark-defaults.conf
[root@HTY-1 conf]# vim spark-defaults.conf
# Example:
# spark.master                     spark://master:7077
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
# spark.serializer                 org.apache.spark.serializer.KryoSerializer
# spark.driver.memory              5g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"

Remove the leading # from the lines you need and set your own values.
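
For example, on a small single-node setup the uncommented entries might look like this (the values are assumptions; size them to fit within your machine's physical memory):

spark.driver.memory      512m
spark.executor.memory    512m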

唯思可达

It runs fine on a different machine. How much memory does spark-shell need, and where do I set this default value?

wangxiaolei

Enable swap (virtual memory).
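
A minimal sketch of adding a 2 GB swap file on Linux (the path /swapfile and the 2 GB size are assumptions; adapt them to your disk):

dd if=/dev/zero of=/swapfile bs=1M count=2048   # create a 2 GB file
chmod 600 /swapfile                             # restrict access to root
mkswap /swapfile                                # format it as swap space
swapon /swapfile                                # enable it immediately
free -m                                         # verify swap is now active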
