" ./bin/spark-shell " Not working with Pre-built version of Spark 1.6 with Hadoop 2.6+ on ubuntu 14.04 -


I freshly downloaded the pre-built version of Spark 1.6 for Hadoop 2.6+ onto my desktop, running Ubuntu 14.04.

I navigated into the Spark folder and launched the Spark shell as described in the Quick Start guide, using

./bin/spark-shell 

I am receiving the following errors. I saw a similar question asked for Mac OS X here.

ashwin@console:~/Desktop/spark-1.6.0-bin-hadoop2.6$ ./bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_91)
Type in expressions to have them evaluated.
Type :help for more information.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:463)
    at sun.nio.ch.Net.bind(Net.java:455)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:463)
    at sun.nio.ch.Net.bind(Net.java:455)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:15)
    at $iwC.<init>(<console>:24)
    at <init>(<console>:26)
    at .<init>(<console>:30)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:16: error: not found: value sqlContext
         import sqlContext.implicits._
                ^
<console>:16: error: not found: value sqlContext
         import sqlContext.sql
                ^

Any ideas?

I have seen a similar problem, where the master fails to start with this exception when bringing up a cluster.

To fix it, I altered a property setting in the $SPARK_HOME/conf/spark-env.sh file.

Previously I had set SPARK_MASTER_IP to the IP address of the master node. Changing it to the public DNS of the box seems to fix the issue; a sketch of the change is shown below.
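For reference, a minimal sketch of what that spark-env.sh edit might look like. The IP address and hostname below are placeholders, not values from the original setup; substitute the public DNS name (or a hostname that actually resolves on the machine) of your own box. For a single-machine spark-shell like the one in the question, the related SPARK_LOCAL_IP variable controls the address the driver tries to bind to.

# $SPARK_HOME/conf/spark-env.sh
# Before: SPARK_MASTER_IP pointed at an address the box could not bind to,
# which surfaces as "Cannot assign requested address".
# SPARK_MASTER_IP=10.0.0.5
# After: use the box's public DNS name (placeholder value shown).
SPARK_MASTER_IP=your-box.example.com
# Optional, for a local single-machine shell (also a placeholder choice):
# SPARK_LOCAL_IP=127.0.0.1

After saving the file, relaunching ./bin/spark-shell should let the 'sparkDriver' service bind without exhausting its 16 retries.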

