Spark: How to kill a running process without exiting the shell?


How can I kill a running process in the Spark shell on my local OSX machine without exiting the shell?

For example, a simple .count() on an RDD can take a while, and I want to be able to kill it.

However, Ctrl-C kills the whole shell.

Is there a way to kill the process but not the shell?

You can use the master web interface to kill or visualize the job. You will also find other useful things there, such as the log files and charts of cluster activity.
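When running locally, the relevant UI is the application web UI at http://localhost:4040, whose Jobs page shows kill links for active jobs (enabled by spark.ui.killEnabled, which defaults to true). If you would rather stay at the shell prompt, SparkContext also exposes cancellation methods (setJobGroup, cancelJobGroup, cancelAllJobs). Below is a minimal sketch, not from the original answer: it assumes the stock spark-shell where sc is predefined, and the group id "long-count" is just an illustrative name.

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

val rdd = sc.parallelize(1 to 100000000)

// Run the slow action on a background thread so the shell prompt stays free.
// setJobGroup is per-thread, so it must be called inside that thread.
Future {
  sc.setJobGroup("long-count", "a count we may want to cancel", interruptOnCancel = true)
  rdd.count()
}

// Later, from the still-responsive shell prompt, kill just that job group:
sc.cancelJobGroup("long-count")

// Or cancel every running job without touching the shell:
sc.cancelAllJobs()

Cancelling the group throws an exception into the background computation but leaves the shell, the SparkContext, and any cached data intact.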

