Conversation

@andrewor14 (Contributor) commented Apr 22, 2016

## What changes were proposed in this pull request?

```
Spark context available as 'sc' (master = local[*], app id = local-1461283768192).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sql("SHOW TABLES").collect()
16/04/21 17:09:39 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/04/21 17:09:39 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
res0: Array[org.apache.spark.sql.Row] = Array([src,false])

scala> sql("SHOW TABLES").collect()
res1: Array[org.apache.spark.sql.Row] = Array([src,false])

scala> spark.createDataFrame(Seq((1, 1), (2, 2), (3, 3)))
res2: org.apache.spark.sql.DataFrame = [_1: int, _2: int]
```

Hive things are loaded lazily.
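The lazy loading is visible in the transcript: the metastore warnings appear only on the first query. The idea can be sketched with a plain `lazy val`; this is an illustrative stand-in, not the actual Spark code:

```scala
// Illustrative sketch: a `lazy val` defers expensive setup (here, a
// stand-in for the Hive metastore client) until first use, which is why
// only the first SHOW TABLES above pays the metastore initialization.
class LazyHiveState {
  var initCount = 0
  lazy val client: String = {
    initCount += 1            // runs once, on first access
    "metastore-client"
  }
  def sql(query: String): String = s"$client -> $query"
}
```

A second call to `sql` reuses the already-initialized `client`, mirroring how the second `SHOW TABLES` above produces no warnings.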

## How was this patch tested?

Manual.

@rxin (Contributor) commented Apr 22, 2016

I think there was a suggestion from @marmbrus to just name this `spark`. Would it cause any problems because it conflicts with the package name?

@andrewor14 (Contributor, Author)

OK, I like `spark` better too. Let's try it out.

@marmbrus (Contributor)

Yeah, if there are problems we don't have to do that, but `val df = spark.read.json(...)` is pretty nice.

@andrewor14 (Contributor, Author)

I don't see any problems off the top of my head. The only collision is when people do `import org.apache.spark`, but who does that?
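The collision in question is ordinary Scala name shadowing: a local binding named `spark` hides an imported name `spark`, while fully qualified paths are unaffected. A minimal sketch, with `pkg.spark` as a hypothetical stand-in for the real package:

```scala
// `pkg.spark` stands in for org.apache.spark; nothing here is Spark code.
object pkg {
  object spark { val origin = "package-like object" }
}

object CollisionDemo {
  def whoIsSpark: String = {
    import pkg.spark          // brings the bare name `spark` into scope
    val spark = "REPL session binding"
    // The local val shadows the imported name, so the bare `spark`
    // now refers to the String; pkg.spark.origin still resolves via
    // the fully qualified path.
    spark
  }
}
```

This is why the conflict only bites users who import the package and then refer to it by the bare name `spark`.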

@SparkQA commented Apr 22, 2016

Test build #56614 has finished for PR 12589 at commit e7dee4f.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Apr 22, 2016

Test build #56616 has finished for PR 12589 at commit 45783de.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@rxin (Contributor) commented Apr 22, 2016

LGTM - let's see if we run into issues in the future with this.

Review thread on the diff:

```scala
}
sqlContext
}
def createSparkSession(): SparkSession = Main.createSparkSession()
```
@yhuai (Contributor) commented Apr 22, 2016

Does `repl/scala-2.10/src/main/scala/org/apache/spark/repl/Main.scala` have `createSparkSession`?

@andrewor14 (Contributor, Author)

hm, is there a way to do this without duplicating code?

@yhuai (Contributor)

Not very sure. How about we still duplicate the code for now?
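The duplication being debated could in principle be avoided by delegating from each version-specific `Main` to one shared helper. A minimal sketch of that pattern (names hypothetical; per the thread, the actual change duplicated the code across the scala-2.10 and scala-2.11 trees for now):

```scala
// Hypothetical delegation pattern: shared logic lives once, and each
// version-specific entry point forwards to it.
object SharedReplInit {
  def sessionBanner(master: String, appId: String): String =
    s"Spark context available as 'sc' (master = $master, app id = $appId)."
}

// Stand-ins for the scala-2.10 and scala-2.11 Main objects:
object Main210 { def banner: String = SharedReplInit.sessionBanner("local[*]", "local-1") }
object Main211 { def banner: String = SharedReplInit.sessionBanner("local[*]", "local-1") }
```

In practice the two REPL source trees compile against different Scala interpreter APIs, which is what makes a shared helper awkward and duplication the pragmatic choice here.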

@SparkQA commented Apr 22, 2016

Test build #56720 has finished for PR 12589 at commit 8642cd7.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Apr 25, 2016

Test build #56905 has finished for PR 12589 at commit e69d7cf.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

asfgit pushed a commit that referenced this pull request Apr 25, 2016
## What changes were proposed in this pull request?

This removes the class `HiveContext` itself along with all code usages associated with it. The bulk of the work was already done in #12485. This is mainly just code cleanup and actually removing the class.

Note: A couple of things will break after this patch. These will be fixed separately.
- the python HiveContext
- all the documentation / comments referencing HiveContext
- there will be no more HiveContext in the REPL (fixed by #12589)

## How was this patch tested?

No change in functionality.

Author: Andrew Or <[email protected]>

Closes #12585 from andrewor14/delete-hive-context.

@SparkQA commented Apr 25, 2016

Test build #2872 has finished for PR 12589 at commit 601df7c.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class GaussianMixtureModel(JavaModel, JavaMLWritable, JavaMLReadable):
    • class GaussianMixture(JavaEstimator, HasFeaturesCol, HasPredictionCol, HasMaxIter, HasTol, HasSeed,

@SparkQA commented Apr 25, 2016

Test build #2873 has finished for PR 12589 at commit 601df7c.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class GaussianMixtureModel(JavaModel, JavaMLWritable, JavaMLReadable):
    • class GaussianMixture(JavaEstimator, HasFeaturesCol, HasPredictionCol, HasMaxIter, HasTol, HasSeed,

@SparkQA commented Apr 25, 2016

Test build #2874 has finished for PR 12589 at commit 601df7c.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class GaussianMixtureModel(JavaModel, JavaMLWritable, JavaMLReadable):
    • class GaussianMixture(JavaEstimator, HasFeaturesCol, HasPredictionCol, HasMaxIter, HasTol, HasSeed,

@SparkQA commented Apr 25, 2016

Test build #56919 has finished for PR 12589 at commit 601df7c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class GaussianMixtureModel(JavaModel, JavaMLWritable, JavaMLReadable):
    • class GaussianMixture(JavaEstimator, HasFeaturesCol, HasPredictionCol, HasMaxIter, HasTol, HasSeed,

@rxin (Contributor) commented Apr 25, 2016

Thanks - merging in master.

@asfgit asfgit closed this in 34336b6 Apr 25, 2016
@andrewor14 andrewor14 deleted the spark-session-repl branch April 25, 2016 22:32
@felixcheung (Member)

Should this go to the pyspark and sparkR shells as well? Or rather, should SparkSession be in Python and R first?

@andrewor14 (Contributor, Author)

It'll be there later.
