ZEPPELIN-1411. UDF with pyspark not working - object has no attribute 'parseDataType' #1404
```diff
@@ -154,7 +154,7 @@ protected static void startUp() throws Exception {
     // set spark master and other properties
     sparkIntpSetting.getProperties().setProperty("master", "spark://" + getHostname() + ":7071");
     sparkIntpSetting.getProperties().setProperty("spark.cores.max", "2");
     sparkIntpSetting.getProperties().setProperty("zeppelin.spark.useHiveContext", "false");
     // set spark home for pyspark
     sparkIntpSetting.getProperties().setProperty("spark.home", getSparkHome());
     pySpark = true;
```
```diff
@@ -171,10 +171,16 @@ protected static void startUp() throws Exception {
     String sparkHome = getSparkHome();
     if (sparkHome != null) {
-      sparkIntpSetting.getProperties().setProperty("master", "spark://" + getHostname() + ":7071");
+      if (System.getenv("SPARK_MASTER") != null) {
+        sparkIntpSetting.getProperties().setProperty("master", System.getenv("SPARK_MASTER"));
+      } else {
+        sparkIntpSetting.getProperties()
+            .setProperty("master", "spark://" + getHostname() + ":7071");
+      }
```
**Contributor (author):** Allow the user to specify SPARK_MASTER, so that the tests can run in other modes (like yarn-client).

**Member:** This is testing code only, but it doesn't seem like we are using this in tests?

**Contributor (author):** It is for local system testing, when a user wants to run it in other modes (e.g. yarn-client).
```diff
       sparkIntpSetting.getProperties().setProperty("spark.cores.max", "2");
       // set spark home for pyspark
       sparkIntpSetting.getProperties().setProperty("spark.home", sparkHome);
+      sparkIntpSetting.getProperties().setProperty("zeppelin.spark.useHiveContext", "false");
       pySpark = true;
```
**Contributor (author):** Disable HiveContext, otherwise we will hit the issue of multiple Derby instances.
```diff
       sparkR = true;
     }
```
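With this change, pointing the test harness at a different master is just an environment variable. A placeholder walk-through (yarn-client is the mode the author mentions; it is used here purely as an illustration):

```shell
# With this PR, SPARK_MASTER (when set) replaces the default
# spark://<hostname>:7071 master in the test setup.
export SPARK_MASTER=yarn-client
echo "tests would use master=$SPARK_MASTER"
# prints: tests would use master=yarn-client
```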
```diff
@@ -194,7 +200,11 @@ private static String getHostname() {
   }

   private static String getSparkHome() {
-    String sparkHome = getSparkHomeRecursively(new File(System.getProperty(ZeppelinConfiguration.ConfVars.ZEPPELIN_HOME.getVarName())));
+    String sparkHome = System.getenv("SPARK_HOME");
+    if (sparkHome != null) {
+      return sparkHome;
+    }
+    sparkHome = getSparkHomeRecursively(new File(System.getProperty(ZeppelinConfiguration.ConfVars.ZEPPELIN_HOME.getVarName())));
     System.out.println("SPARK HOME detected " + sparkHome);
```
**Contributor (author):** Allow the user to specify SPARK_HOME, so that an existing Spark cluster can be used.
```diff
     return sparkHome;
   }
```
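Both the SPARK_MASTER and SPARK_HOME changes follow the same pattern: consult an environment variable first, and fall back to the computed default only when it is unset. A minimal standalone sketch of that pattern (the class, helper name, and values are illustrative, not Zeppelin code):

```java
import java.util.Map;
import java.util.function.Supplier;

public class EnvOrDefault {
    // Return the value from an environment-style map when the key is set,
    // otherwise the lazily computed fallback -- the same lookup order this
    // PR uses for SPARK_MASTER and SPARK_HOME.
    static String resolve(Map<String, String> env, String key, Supplier<String> fallback) {
        String value = env.get(key);
        return value != null ? value : fallback.get();
    }

    public static void main(String[] args) {
        Map<String, String> env = Map.of("SPARK_MASTER", "yarn-client");
        // Override present: the environment wins.
        System.out.println(resolve(env, "SPARK_MASTER", () -> "spark://host:7071"));
        // Override absent: fall back to the detected value.
        System.out.println(resolve(env, "SPARK_HOME", () -> "/detected/spark"));
    }
}
```

Using a `Supplier` keeps the potentially expensive fallback (here, the recursive filesystem scan in `getSparkHomeRecursively`) from running when the override is present.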
**Comment:** Set `sparkSession` to null, so that it will be created again if the interpreter is scoped.

**Comment:** `stop` should be called on `sparkSession` before `sc.stop()` (see http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.SparkSession). As of now this is OK, since `sparkSession.stop()` simply calls `sc.stop()`, but this could change.

**Reply:** Good catch. When `sparkSession` is not null (Spark 2.0), `sparkSession.stop()` should be called first.
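The ordering the reviewer asks for can be sketched with stand-in classes (`FakeSession` and `FakeContext` are hypothetical, not Spark's API): stop the session when one exists, and fall back to stopping the context directly only when it does not.

```java
public class ShutdownOrder {
    static class FakeContext {
        boolean stopped = false;
        void stop() { stopped = true; }
    }

    static class FakeSession {
        final FakeContext sc;
        FakeSession(FakeContext sc) { this.sc = sc; }
        // Mirrors the current behavior the reviewer notes: the session's
        // stop() simply delegates to the context's stop(). That delegation
        // is an implementation detail and could change, which is why the
        // session must be stopped first when it exists.
        void stop() { sc.stop(); }
    }

    // Preferred shutdown order: session first (Spark 2.0 path), context
    // directly only when no session was ever created (Spark 1.x path).
    static void close(FakeSession session, FakeContext sc) {
        if (session != null) {
            session.stop();
        } else {
            sc.stop();
        }
    }

    public static void main(String[] args) {
        FakeContext sc = new FakeContext();
        close(new FakeSession(sc), sc);
        System.out.println(sc.stopped); // prints: true
    }
}
```

After `close`, the caller would also null out the session reference so a scoped interpreter can recreate it later, as the first comment above describes.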