forked from apache/spark
Streamed fetch chunk #8
Closed
49 commits
900bc1f  [SPARK-24371][SQL] Added isInCollection in DataFrame API for Scala an… (dbtsai)
f489388  [SPARK-24365][SQL] Add Data Source write benchmark (gengliangwang)
a4be981  [SPARK-24331][SPARKR][SQL] Adding arrays_overlap, array_repeat, map_e… (mn-mikke)
0ebb0c0  [SPARK-23754][PYTHON] Re-raising StopIteration in client code (e-dorigatti)
9e7bad0  [SPARK-24419][BUILD] Upgrade SBT to 0.13.17 with Scala 2.10.7 for JDK9+ (dbtsai)
1e46f92  [SPARK-24369][SQL] Correct handling for multiple distinct aggregation… (maropu)
b142157  [SPARK-24384][PYTHON][SPARK SUBMIT] Add .py files correctly into Pyth… (HyukjinKwon)
ec6f971  [SPARK-23161][PYSPARK][ML] Add missing APIs to Python GBTClassifier (huaxingao)
1b36f14  [SPARK-23901][SQL] Add masking functions (mgaido91)
24ef7fb  [SPARK-24276][SQL] Order of literals in IN should not affect semantic… (mgaido91)
0053e15  [SPARK-24337][CORE] Improve error messages for Spark conf values (PenguinToast)
90ae98d  [SPARK-24146][PYSPARK][ML] spark.ml parity for sequential pattern min… (WeichenXu123)
698b9a0  [WEBUI] Avoid possibility of script in query param keys (srowen)
7a82e93  [SPARK-24414][UI] Calculate the correct number of tasks for a stage.
223df5d  [SPARK-24397][PYSPARK] Added TaskContext.getLocalProperty(key) in Python (tdas)
cc976f6  [SPARK-23900][SQL] format_number support user specifed format as argu… (wangyum)
21e1fc7  [SPARK-24232][K8S] Add support for secret env vars
2c9c862  [MINOR][YARN] Add YARN-specific credential providers in debug logging… (HyukjinKwon)
cbaa729  [SPARK-24330][SQL] Refactor ExecuteWriteTask and Use `while` in writi… (gengliangwang)
b2d0226  [SPARK-24444][DOCS][PYTHON] Improve Pandas UDF docs to explain column… (BryanCutler)
22df953  [SPARK-24326][MESOS] add support for local:// scheme for the app jar
98909c3  [SPARK-23920][SQL] add array_remove to remove all elements that equal… (huaxingao)
6039b13  [SPARK-24351][SS] offsetLog/commitLog purge thresholdBatchId should b… (ivoson)
d2c3de7  Revert "[SPARK-24369][SQL] Correct handling for multiple distinct agg… (gatorsmile)
09e78c1  [INFRA] Close stale PRs.
8ef167a  [SPARK-24340][CORE] Clean up non-shuffle disk block manager files fol… (jiangxb1987)
a36c1a6  [SPARK-23668][K8S] Added missing config property in running-on-kubern… (liyinan926)
de4feae  [SPARK-24356][CORE] Duplicate strings in File.path managed by FileSeg… (misha-cloudera)
a2166ec  [SPARK-24455][CORE] fix typo in TaskSchedulerImpl comment
416cd1f  [SPARK-24369][SQL] Correct handling for multiple distinct aggregation… (cloud-fan)
1d9338b  [SPARK-23786][SQL] Checking column names of csv headers (MaxGekk)
0be5aa2  [SPARK-23903][SQL] Add support for date extract (wangyum)
7297ae0  [SPARK-21896][SQL] Fix StackOverflow caused by window functions insid…
b24d3db  [SPARK-24290][ML] add support for Array input for instrumentation.log… (lu-wang-dl)
ff0501b  [SPARK-24300][ML] change the way to set seed in ml.cluster.LDASuite.g… (lu-wang-dl)
dbb4d83  [SPARK-24215][PYSPARK] Implement _repr_html_ for dataframes in PySpark (xuanyuanking)
b3417b7  [SPARK-16451][REPL] Fail shell if SparkSession fails to start.
e8c1a0c  [SPARK-15784] Add Power Iteration Clustering to spark.ml (WeichenXu123)
2c2a86b  [SPARK-24453][SS] Fix error recovering from the failure in a no-data … (tdas)
93df3cd  [SPARK-22384][SQL] Refine partition pruning when attribute is wrapped…
e9efb62  [SPARK-24187][R][SQL] Add array_join function to SparkR (huaxingao)
1706fde  Initial version (attilapiros)
d2753a6  introduce factory (attilapiros)
616d601  Extend ProtocolSuite (attilapiros)
acc1e20  add test for fetch to disk (attilapiros)
797f558  tiny fix (attilapiros)
76f23cb  Add SASL support for FrameDecoder (attilapiros)
365e673  fix (attilapiros)
5899663  fix (attilapiros)
Conversations
there is a pretty simple mapping between the StreamChunkId used by ChunkFetchSuccess and the String id used by StreamResponse -- I think if you just used that you could eliminate a lot of these changes: https://github.com/apache/spark/blob/master/common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/OneForOneBlockFetcher.java#L129
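As a rough illustration of the mapping mentioned above: a chunk is identified by a (streamId, chunkIndex) pair, and that pair can be rendered either as a StreamChunkId or as a single String stream id. The sketch below is hypothetical; the helper names and the "<streamId>_<chunkIndex>" encoding are assumptions, and the actual composition is the one at the linked OneForOneBlockFetcher line.

```java
// Hypothetical sketch of a StreamChunkId <-> String stream id mapping.
// The encoding "<streamId>_<chunkIndex>" and the class/method names are
// assumptions for illustration, not Spark's actual API.
public final class StreamChunkIdMapping {

  /** Encode a (streamId, chunkIndex) pair as a single String stream id. */
  public static String toStreamId(long streamId, int chunkIndex) {
    return streamId + "_" + chunkIndex;
  }

  /** Decode the String stream id back into its (streamId, chunkIndex) parts. */
  public static long[] fromStreamId(String streamChunkId) {
    String[] parts = streamChunkId.split("_");
    if (parts.length != 2) {
      throw new IllegalArgumentException("Unexpected stream chunk id: " + streamChunkId);
    }
    return new long[] { Long.parseLong(parts[0]), Long.parseLong(parts[1]) };
  }
}
```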
Before the request is sent, it is already decided whether to use stream or chunk fetch. See https://github.com/apache/spark/blob/master/common/network-common/src/main/java/org/apache/spark/network/client/TransportResponseHandler.java#L75
and
https://github.com/apache/spark/blob/master/common/network-common/src/main/java/org/apache/spark/network/client/TransportResponseHandler.java#L93
If we just decode a StreamResponse instead of a ChunkFetchSuccess, it won't work out of the box.
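A simplified, hypothetical sketch of the client-side bookkeeping this reply refers to (the real logic is in TransportResponseHandler at the linked lines; the class name and callback types below are placeholders): the caller registers either a chunk callback or a stream callback before the request goes out, so the handler needs the matching response frame type to find that callback again.

```java
import java.util.Map;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

// Hypothetical, simplified sketch; not the actual TransportResponseHandler code.
final class ResponseBookkeepingSketch {

  interface ChunkReceivedCallback { void onSuccess(int chunkIndex); }
  interface StreamCallback { void onData(String streamId); }

  // Chunk fetches are registered under their chunk id before the request is sent...
  private final Map<String, ChunkReceivedCallback> outstandingFetches = new ConcurrentHashMap<>();
  // ...while stream requests enqueue a callback matched to StreamResponse frames in order.
  private final Queue<StreamCallback> streamCallbacks = new ConcurrentLinkedQueue<>();

  void addFetchRequest(String streamChunkId, ChunkReceivedCallback cb) {
    outstandingFetches.put(streamChunkId, cb);
  }

  void addStreamCallback(StreamCallback cb) {
    streamCallbacks.add(cb);
  }

  // A ChunkFetchSuccess is resolved via outstandingFetches, a StreamResponse via
  // streamCallbacks. Decoding one frame type as the other therefore finds no
  // registered callback, which is why it does not work out of the box.
}
```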