pyspark - Is it possible to get the SparkContext of an already running Spark application?
I am running Spark on an Amazon EMR cluster with the YARN cluster manager. I am trying to write a Python app that starts up and caches data in memory. How can I allow other Python programs to access that cached data? That is:

I start an app, pcache, which caches data and keeps running. Users can then access the same cached data by running a different program instance.

My understanding is that it should be possible to get a handle on the running SparkContext and access its data. Is that possible? Or do I need to set up an API on top of the Spark app to access the data, or perhaps use something like the Spark Job Server or Livy?
It is not possible to share a SparkContext between multiple processes. Your options are indeed to build an API yourself, with one server process holding the SparkContext and clients telling it what to do, or to use the Spark Job Server, which is a generic implementation of the same idea.
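As a concrete illustration of the first option, here is a minimal sketch of such a server, assuming Flask is installed alongside PySpark; the S3 path, the app name pcache, and the /count and /filter endpoints are all hypothetical, not part of any standard API:

from flask import Flask, jsonify, request
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pcache").getOrCreate()

# Load and cache the data once; it stays in executor memory for as long
# as this process (and therefore its SparkContext) is alive.
df = spark.read.parquet("s3://my-bucket/data/")  # hypothetical path
df.cache()
df.count()  # force materialization of the cache

app = Flask(__name__)

@app.route("/count")
def count():
    # Served from the already-cached DataFrame.
    return jsonify(rows=df.count())

@app.route("/filter")
def filter_rows():
    # e.g. GET /filter?col=country&value=US -- illustrative only
    col, value = request.args["col"], request.args["value"]
    rows = df.filter(df[col] == value).limit(100).collect()
    return jsonify([row.asDict() for row in rows])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)

Client programs then talk to this server over plain HTTP and never need a SparkContext of their own, which is exactly why this pattern works where sharing the context does not; the Spark Job Server and Livy provide hosted, generic versions of the same request/response pattern.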