Spark standalone configuration having multiple executors

WalkingDeadFan

I'm trying to set up a standalone Spark 2.0 server to run an analytics function in parallel. To do this, I want a single worker with multiple executors.

I'm using :

  • Standalone Spark 2.0
  • 8 cores
  • 24 GB RAM
  • Windows Server 2008
  • pyspark (although this appears unrelated)

This is purely a proof of concept, but I want to have 8 executors, one per core.

I've tried to follow other threads on this topic, e.g. Spark Standalone Number Executors/Cores Control, but for some reason it's not working for me.

My configuration is as follows:

conf\spark-defaults.conf

spark.cores.max = 8
spark.executor.cores = 1
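With an 8-core worker, those two settings should yield eight 1-core executors, since the standalone master carves each worker's cores into executor-sized chunks up to spark.cores.max. A back-of-the-envelope sketch of that arithmetic (not Spark's actual scheduler code; the function name is made up):

```python
def executors_per_worker(worker_cores, executor_cores, cores_max):
    # The master hands out executors of `executor_cores` cores each,
    # bounded by both the worker's total cores and the app's spark.cores.max.
    usable = min(worker_cores, cores_max)
    return usable // executor_cores

# 8-core worker, spark.executor.cores=1, spark.cores.max=8  ->  8 executors
print(executors_per_worker(8, 1, 8))  # 8
```

If spark.executor.cores is left unset, the standalone master instead gives one executor per worker all the cores it is allowed, which is exactly the single 8-core executor seen below.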

I have also tried changing my spark-env.sh file, to no avail. What happens instead is that my one worker gets only one executor. As you can see below, the UI still shows a single executor with all 8 cores assigned to it.

[Screenshot: Spark master web UI showing one worker with a single 8-core executor]

zero323

I believe you mixed up local and standalone modes:

  • Local mode is a development tool in which all processes run inside a single JVM. An application is started in local mode by setting the master to local, local[*] or local[n]. spark.executor.cores and spark.cores.max are not applicable in local mode because there is only one embedded executor.
  • Standalone mode requires a standalone Spark cluster: a master node (which can be started with the SPARK_HOME/sbin/start-master.sh script) and at least one worker node (which can be started with the SPARK_HOME/sbin/start-slave.sh script).

    SparkConf should use the master node's address (spark://host:port) when creating the context.


