Conversation

@srowen
Member

@srowen srowen commented Apr 7, 2019

What changes were proposed in this pull request?

Fix build warnings -- see some details below.

But mostly, remove use of postfix syntax where it causes warnings without the scala.language.postfixOps import. This is mostly in expressions like "120000 milliseconds", which I'd like to simplify to things like "2.minutes" anyway.
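A minimal sketch of the two styles described above (standard scala.concurrent.duration, not Spark code): both build the same FiniteDuration, but the postfix form compiles warning-free only with the postfixOps import, while the dotted form needs no language import.

```scala
import scala.concurrent.duration._
import scala.language.postfixOps

// Postfix syntax: warns without the scala.language.postfixOps import above
val before: FiniteDuration = 120000 milliseconds
// Dotted syntax: no language import needed, and arguably clearer
val after: FiniteDuration = 2.minutes

assert(before == after)
assert(after.toMillis == 120000L)
```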

How was this patch tested?

Existing tests.

@srowen srowen self-assigned this Apr 7, 2019
ManagedBuffer block0 = new NioManagedBuffer(ByteBuffer.wrap(new byte[13]));
ManagedBuffer block1 = new NioManagedBuffer(ByteBuffer.wrap(new byte[7]));
ManagedBuffer block2 = new NioManagedBuffer(ByteBuffer.wrap(new byte[19]));
private final ManagedBuffer block0 = new NioManagedBuffer(ByteBuffer.wrap(new byte[13]));
Member Author

OK, this is stuff I just cleaned up while touching this file.


assertNotNull(stub);
stub.when(fetchStarter).createAndStart(any(), anyObject());
stub.when(fetchStarter).createAndStart(any(), any());
Member Author

This was an avoidable deprecated method

// Set a fixed timeout for RPC here, so users shall get a SparkException thrown by
// BarrierCoordinator on timeout, instead of RPCTimeoutException from the RPC framework.
timeout = new RpcTimeout(31536000 /* = 3600 * 24 * 365 */ seconds, "barrierTimeout"))
timeout = new RpcTimeout(365.days, "barrierTimeout"))
Member Author

A good example of simplifying expressions while also removing postfixOps
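The equivalence behind this simplification can be checked directly: 365 days is exactly 3600 * 24 * 365 = 31536000 seconds, so the two RpcTimeout arguments denote the same duration.

```scala
import scala.concurrent.duration._

// The commented literal and the dotted form are the same duration;
// the dotted form just states the intent directly.
assert(365.days == 31536000.seconds)
assert(365.days.toSeconds == 3600L * 24 * 365)
```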

eventually(timeout(10.seconds), interval(1.second)) {
events.foreach {
case e: LoadInstanceStart[PipelineStage]
case e: LoadInstanceStart[_]
Member Author

Can't match generic types - causes a warning

}
}
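A minimal sketch of why that match warns, using hypothetical Event/Start types (not Spark's): the type argument is erased at runtime, so a pattern like `case _: Start[String]` is unchecked, while the wildcard `Start[_]` asserts nothing about the parameter and so warns about nothing.

```scala
// Hypothetical types standing in for the real events
sealed trait Event
case class Start[T](payload: T) extends Event

val e: Event = Start(42)
val matched = e match {
  case _: Start[_] => true // erasure-safe: no unchecked warning
  case _           => false
}
assert(matched)
```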

abstract class JavaSimpleScanBuilder implements ScanBuilder, Scan, Batch {
Member Author

These were moved to top-level classes. Multiple top-level classes in Java are mostly a no-no, and actually generate a warning if you try to use them in another file.

@SparkQA

SparkQA commented Apr 7, 2019

Test build #104361 has finished for PR 24314 at commit 6f67c4d.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 7, 2019

Test build #104363 has finished for PR 24314 at commit 54aac39.

  • This patch fails to generate documentation.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

@felixcheung felixcheung left a comment


LGTM. Very long diff.

@srowen
Member Author

srowen commented Apr 7, 2019

Wait until you see what has to happen to get rid of (mis)use of scala.language.existentials, which has been hiding some wrong usage of generics.

@SparkQA

SparkQA commented Apr 8, 2019

Test build #104364 has finished for PR 24314 at commit 7e5988f.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

val rpcEndpointRef = remoteEnv.setupEndpointRef(localEnv.address, "send-authentication")
rpcEndpointRef.send("hello")
eventually(timeout(5 seconds), interval(10 millis)) {
eventually(timeout(5.seconds), interval(10.millis)) {
Contributor

10.millis => 10.milliseconds

There are a few other places where this pattern is repeated:

(pr/24314) $ git diff 017919b636  | grep "^\+.*millis[^e]"
+    eventually(timeout(1.minute), interval(10.millis)) {
+        rpcEndpointRef.askSync[String]("hello", new RpcTimeout(1.millis, shortProp))
+      eventually(timeout(5.seconds), interval(10.millis)) {
+    eventually(timeout(1.minute), interval(200.millis)) {
+      Await.ready(f, 50.millis)
+      eventually(timeout(20.seconds), interval(batchDuration.milliseconds.millis)) {
+      eventually(timeout(10.seconds), interval(10.millis)) {
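For context on the suggestion above: `millis` and `milliseconds` are both aliases on the same DurationInt implicit, so the two spellings build identical values and the choice is purely stylistic.

```scala
import scala.concurrent.duration._

// Interchangeable aliases from scala.concurrent.duration
assert(10.millis == 10.milliseconds)
assert(200.millis.toMillis == 200L)
```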

Contributor

Of course this is just a nit, as both are correct (but from your previous changes I have the feeling milliseconds is preferred).

Contributor

@attilapiros attilapiros left a comment

I have run a completeness grep (to find second/milli/minute without a dot prefix outside of string literals and comments) and this PR seems complete to me.

Regarding removing the unnecessary case after map and foreach expressions, there are still cases like:

mapOutput1.foreach { case mapStatus =>

But as I understand it, that is outside the scope of this change.

So LGTM.
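A minimal sketch of the pattern mentioned above: when there is no destructuring, `case s =>` merely wraps the body in a pattern-matching anonymous function, and a plain lambda does the same job without the extra `case`.

```scala
val mapStatuses = Seq(1, 2, 3)

var withCase = 0
mapStatuses.foreach { case s => withCase += s } // unnecessary `case`

var withoutCase = 0
mapStatuses.foreach { s => withoutCase += s }   // plain lambda, same effect

assert(withCase == withoutCase && withCase == 6)
```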

@srowen
Member Author

srowen commented Apr 8, 2019

Thanks @attilapiros, yeah, I fixed up more of those instances. I have also taken the liberty of 'fixing' some minor things like unnecessary case statements in nearby code, but tried to keep it mostly to build warnings and postfixOps here.

@SparkQA

SparkQA commented Apr 8, 2019

Test build #104394 has finished for PR 24314 at commit ac7bec7.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 9, 2019

Test build #104409 has finished for PR 24314 at commit bebd215.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 9, 2019

Test build #104434 has started for PR 24314 at commit aa61047.

@SparkQA

SparkQA commented Apr 9, 2019

Test build #4700 has started for PR 24314 at commit aa61047.

@SparkQA

SparkQA commented Apr 10, 2019

Test build #4701 has started for PR 24314 at commit aa61047.

@SparkQA

SparkQA commented Apr 10, 2019

Test build #4702 has finished for PR 24314 at commit aa61047.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 11, 2019

Test build #4703 has started for PR 24314 at commit aa61047.

@dongjoon-hyun
Member

Retest this please.

@SparkQA

SparkQA commented Apr 11, 2019

Test build #104509 has finished for PR 24314 at commit aa61047.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member Author

srowen commented Apr 11, 2019

Merged to master

@srowen srowen closed this in 4ec7f63 Apr 11, 2019
@srowen srowen deleted the SPARK-27404 branch April 16, 2019 14:23