Failed to bind to: spark-master, using a remote cluster with two workers


I have a working setup with a local master and two remote workers. Now I want to connect to a remote master that has the same two remote workers. I have tried different combinations of settings in /etc/hosts, along with other recommendations found on the internet, but nothing has worked.
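For reference, the kind of /etc/hosts mapping I have been experimenting with on the driver machine looks roughly like this (the worker names and addresses are placeholders for illustration; only the spark-master address actually appears in the error further down):

192.168.0.191   spark-master      # address from the error below
192.168.0.192   spark-worker-1    # placeholder
192.168.0.193   spark-worker-2    # placeholder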

The main class is:

public static void main(String[] args) {
    ScalaInterface sInterface = new ScalaInterface(CHUNK_SIZE,
            "awsAccessKeyId",
            "awsSecretAccessKey");

    SparkConf conf = new SparkConf().setAppName("POC_JAVA_AND_SPARK")
            .setMaster("spark://spark-master:7077");

    org.apache.spark.SparkContext sc = new org.apache.spark.SparkContext(
            conf);

    sInterface.enableS3Connection(sc);
    org.apache.spark.rdd.RDD<Tuple2<Path, Text>> fileAndLine = (RDD<Tuple2<Path, Text>>) sInterface.getMappedRDD(sc, "s3n://somebucket/");

    org.apache.spark.rdd.RDD<String> pInfo = (RDD<String>) sInterface.mapPartitionsWithIndex(fileAndLine);

    JavaRDD<String> pInfoJ = pInfo.toJavaRDD();

    List<String> result = pInfoJ.collect();

    String miscInfo = sInterface.getMiscInfo(sc, pInfo);

    System.out.println(miscInfo);
}

It fails at:

List<String> result = pInfoJ.collect();

The error I am getting is:

1354 [sparkDriver-akka.actor.default-dispatcher-3] ERROR akka.remote.transport.netty.NettyTransport  - failed to bind to spark-master/192.168.0.191:0, shutting down Netty transport
1354 [main] WARN  org.apache.spark.util.Utils  - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
1355 [main] DEBUG org.apache.spark.util.AkkaUtils  - In createActorSystem, requireCookie is: off
1363 [sparkDriver-akka.actor.default-dispatcher-3] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Shutting down remote daemon.
1364 [sparkDriver-akka.actor.default-dispatcher-3] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Remote daemon shut down; proceeding with flushing remote transports.
1364 [sparkDriver-akka.actor.default-dispatcher-5] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Remoting shut down.
1367 [sparkDriver-akka.actor.default-dispatcher-4] INFO  akka.event.slf4j.Slf4jLogger  - Slf4jLogger started
1370 [sparkDriver-akka.actor.default-dispatcher-6] INFO  Remoting  - Starting remoting
1380 [sparkDriver-akka.actor.default-dispatcher-4] ERROR akka.remote.transport.netty.NettyTransport  - failed to bind to spark-master/192.168.0.191:0, shutting down Netty transport
Exception in thread "main" 1382 [sparkDriver-akka.actor.default-dispatcher-6] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Shutting down remote daemon.
1382 [sparkDriver-akka.actor.default-dispatcher-6] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Remote daemon shut down; proceeding with flushing remote transports.
java.net.BindException: Failed to bind to: spark-master/192.168.0.191:0: Service 'sparkDriver' failed after 16 retries!
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)
    at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
    at scala.util.Try$.apply(Try.scala:161)
    at scala.util.Success.map(Try.scala:206)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
1383 [sparkDriver-akka.actor.default-dispatcher-7] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Remoting shut down.
1385 [delete Spark temp dirs] DEBUG org.apache.spark.util.Utils  - Shutdown hook called
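As far as I can tell from the BindException, the driver is trying to bind its 'sparkDriver' service to whatever the local host name resolves to, which here is spark-master/192.168.0.191, an address my machine does not own, so every attempt (16 retries) fails. To check what address the JVM resolves locally, I used a small throwaway diagnostic like the one below (my own class, not part of the program above; in Spark 1.x the driver binds to this resolved address unless SPARK_LOCAL_IP overrides it):

import java.net.InetAddress;

public class ResolveCheck {
    public static void main(String[] args) throws Exception {
        // Print the host name and address this JVM considers local.
        InetAddress local = InetAddress.getLocalHost();
        System.out.println(local.getHostName() + " -> " + local.getHostAddress());
    }
}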

Thank you kindly for your help!

Setting the environment variable SPARK_LOCAL_IP=127.0.0.1 solved this for me.
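For completeness, here is a sketch of the same fix applied from the driver code rather than the environment, assuming a Spark 1.x setup where the spark.driver.host property controls the address the driver binds and advertises (verify the property behavior against your Spark version):

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;

public class LoopbackDriverExample {
    public static void main(String[] args) {
        // Equivalent in spirit to: export SPARK_LOCAL_IP=127.0.0.1
        // Tell Spark explicitly which local address the driver should use,
        // instead of letting it resolve the host name via /etc/hosts.
        SparkConf conf = new SparkConf()
                .setAppName("POC_JAVA_AND_SPARK")
                .setMaster("spark://spark-master:7077")
                .set("spark.driver.host", "127.0.0.1");

        SparkContext sc = new SparkContext(conf);
        // ... job code as in the question ...
        sc.stop();
    }
}

Note that binding the driver to the loopback address only makes sense when everything runs on one machine: remote workers cannot connect back to a driver advertised as 127.0.0.1, so on a real cluster you would use an address the workers can reach.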