Spark: How to kill a running process without exiting the shell?


How can I kill a running process in the Spark shell on a local OS X machine without exiting the shell?

For example, a simple `.count()` on an RDD can take a while, and I want to kill it.

However, Ctrl+C kills the whole shell.

Is there a way to kill the process but not the shell?

You can use the master web interface to kill or visualize a job. You will also find other things there, such as log files and charts of cluster activity.
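Besides the web UI, Spark's job-group API lets you cancel a job from the shell itself. Below is a sketch to run in an open `spark-shell` (it assumes the shell's live `sc` SparkContext; the group id `"slow-count"` and the sample RDD are illustrative, not from the original answer):

```scala
// Run the long action in a background thread so the REPL prompt stays free.
val worker = new Thread {
  override def run(): Unit = {
    // Tag jobs started from this thread with a group id so they can be
    // cancelled together; interruptOnCancel = true asks Spark to interrupt
    // the running task threads.
    sc.setJobGroup("slow-count", "long count over a large RDD",
      interruptOnCancel = true)
    try {
      val n = sc.parallelize(1 to 100000000).map(_ * 2).count()
      println(s"count finished: $n")
    } catch {
      case e: Exception => println(s"job cancelled: ${e.getMessage}")
    }
  }
}
worker.start()

// Later, from the still-responsive shell prompt, cancel just that group
// instead of hitting Ctrl+C:
sc.cancelJobGroup("slow-count")
```

A blunter alternative is `sc.cancelAllJobs()`, which cancels every active job on the context. For the web-UI route described above, the running application's UI (by default on port 4040 for a local shell) lists active jobs with a "kill" link.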


