Spark: How to kill a running process without exiting the shell?
How can I kill a running process in the Spark shell on my local OS X machine without exiting the shell?

For example, a simple .count() on an RDD can take a while, and I'd like to kill it. However, pressing Ctrl+C kills the whole shell.

Is there a way to kill the job but not the shell?
You can use the Spark web UI to kill or inspect a job (for a local shell the driver UI is at http://localhost:4040 by default). You'll also find other things there, such as logs and charts of the cluster's activity.
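If you'd rather cancel from the shell itself instead of the web UI, SparkContext exposes job-group cancellation (sc.setJobGroup / sc.cancelJobGroup, or sc.cancelAllJobs() to cancel everything). Here's a minimal sketch, assuming the standard spark-shell where sc is the pre-bound SparkContext; the group name "long-count" and the example RDD are made up for illustration:

    import scala.concurrent.Future
    import scala.concurrent.ExecutionContext.Implicits.global

    // A large example RDD whose count takes a while.
    val rdd = sc.parallelize(1 to 100000000)

    // Job groups are per-thread, so set the group inside the thread
    // that submits the job. interruptOnCancel = true asks executors
    // to interrupt the running tasks when the group is cancelled.
    Future {
      sc.setJobGroup("long-count", "cancellable count", true)
      rdd.count()
    }

    // Later, from the shell prompt: this cancels the job but keeps
    // the shell (and the SparkContext) alive.
    sc.cancelJobGroup("long-count")

Running the action inside a Future keeps the shell prompt free, so you can issue the cancel call while the count is still in flight.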