Merged
Changes from all commits
35 commits
e73b157
[SPARK-40886][BUILD] Bump Jackson Databind 2.13.4.2
pan3793 Oct 23, 2022
96b5d50
[SPARK-40821][SQL][CORE][PYTHON][SS] Introduce window_time function t…
alex-balikov Oct 23, 2022
02a2242
[SPARK-40884][BUILD] Upgrade fabric8io - `kubernetes-client` to 6.2.0
bjornjorgensen Oct 24, 2022
5d3b1e6
[SPARK-40877][SQL] Reimplement `crosstab` with dataframe operations
zhengruifeng Oct 24, 2022
6a0713a
[SPARK-40880][SQL] Reimplement `summary` with dataframe operations
zhengruifeng Oct 24, 2022
79aae64
[SPARK-40849][SS] Async log purge
jerrypeng Oct 24, 2022
f7eee09
[SPARK-40880][SQL][FOLLOW-UP] Remove unused imports
zhengruifeng Oct 24, 2022
74c8264
[SPARK-40812][CONNECT][PYTHON][FOLLOW-UP] Improve Deduplicate in Pyth…
amaliujia Oct 24, 2022
4d33ee0
[SPARK-36114][SQL] Support subqueries with correlated non-equality pr…
allisonwang-db Oct 24, 2022
58490da
[SPARK-40800][SQL] Always inline expressions in OptimizeOneRowRelatio…
allisonwang-db Oct 24, 2022
c721c72
[SPARK-40881][INFRA] Upgrade actions/cache to v3 and actions/upload-a…
Yikun Oct 24, 2022
825f219
[SPARK-40882][INFRA] Upgrade actions/setup-java to v3 with distributi…
Yikun Oct 24, 2022
9140795
[SPARK-40798][SQL] Alter partition should verify value follow storeAs…
ulysses-you Oct 24, 2022
b7a88cd
[SPARK-40821][SQL][SS][FOLLOWUP] Fix available version for new functi…
HeartSaVioR Oct 24, 2022
e966c38
[SPARK-34265][PYTHON][SQL] Instrument Python UDFs using SQL metrics
LucaCanali Oct 24, 2022
6edcafc
[SPARK-40891][SQL][TESTS] Check error classes in TableIdentifierParse…
panbingkun Oct 24, 2022
e2e449e
[SPARK-40897][DOCS] Add some PySpark APIs to References
zhengruifeng Oct 24, 2022
363b853
[SPARK-39977][BUILD] Remove unnecessary guava exclusion from jackson-…
pan3793 Oct 24, 2022
880d9bb
[SPARK-40739][SPARK-40738] Fixes for cygwin/msys2/mingw sbt build and…
philwalk Oct 24, 2022
05ad102
[SPARK-40391][SQL][TESTS][FOLLOWUP] Change to use `mockito-inline` in…
LuciferYang Oct 24, 2022
4ba7ce2
[SPARK-40857][CONNECT] Enable configurable GRPC Interceptors
grundprinzip Oct 24, 2022
9d2757c
[SPARK-40750][SQL] Migrate type check failures of math expressions on…
panbingkun Oct 24, 2022
60b1056
[SPARK-40902][MESOS][TESTS] Fix issue with mesos tests failing due to…
Oct 24, 2022
6953c01
[SPARK-40904][K8S] Support `zsh` in K8s `entrypoint.sh`
dongjoon-hyun Oct 25, 2022
e64afb6
[SPARK-40899][CONNECT] Make UserContext extensible
grundprinzip Oct 25, 2022
ae79704
[SPARK-40906][SQL] `Mode` should copy keys before inserting into Map
zhengruifeng Oct 25, 2022
139006f
[SPARK-40811][SQL][TESTS] Use `checkError()` to intercept `ParseExcep…
MaxGekk Oct 25, 2022
0b3d954
[SPARK-40836][CONNECT] AnalyzeResult should use struct for schema
amaliujia Oct 25, 2022
a27b459
[SPARK-40898][SQL] Quote function names in datatype mismatch errors
MaxGekk Oct 25, 2022
25a9dfc
[SPARK-40913][INFRA] Pin `pytest==7.1.3`
zhengruifeng Oct 25, 2022
e4d0412
[SPARK-40907][PS][SQL] `PandasMode` should copy keys before inserting …
zhengruifeng Oct 25, 2022
a27ccd7
[SPARK-40879][CONNECT] Support Join UsingColumns in proto
amaliujia Oct 25, 2022
bd7ffb2
[SPARK-40905][BUILD] Upgrade rocksdbjni to 7.7.3
LuciferYang Oct 25, 2022
f5d692d
[SPARK-40888][SQL][TESTS] Check error classes in HiveQuerySuite
panbingkun Oct 25, 2022
f571f2e
[SPARK-40900][SQL] Reimplement `frequentItems` with dataframe operations
zhengruifeng Oct 26, 2022
20 changes: 11 additions & 9 deletions .github/workflows/benchmark.yml
@@ -70,7 +70,7 @@ jobs:
       with:
         fetch-depth: 0
     - name: Cache Scala, SBT and Maven
-      uses: actions/cache@v2
+      uses: actions/cache@v3
       with:
         path: |
           build/apache-maven-*
@@ -81,15 +81,15 @@ jobs:
         restore-keys: |
           build-
     - name: Cache Coursier local repository
-      uses: actions/cache@v2
+      uses: actions/cache@v3
       with:
         path: ~/.cache/coursier
         key: benchmark-coursier-${{ github.event.inputs.jdk }}-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
         restore-keys: |
           benchmark-coursier-${{ github.event.inputs.jdk }}
     - name: Cache TPC-DS generated data
       id: cache-tpcds-sf-1
-      uses: actions/cache@v2
+      uses: actions/cache@v3
       with:
         path: ./tpcds-sf-1
         key: tpcds-${{ hashFiles('.github/workflows/benchmark.yml', 'sql/core/src/test/scala/org/apache/spark/sql/TPCDSSchema.scala') }}
@@ -105,8 +105,9 @@ jobs:
       run: cd tpcds-kit/tools && make OS=LINUX
     - name: Install Java ${{ github.event.inputs.jdk }}
       if: steps.cache-tpcds-sf-1.outputs.cache-hit != 'true'
-      uses: actions/setup-java@v1
+      uses: actions/setup-java@v3
       with:
+        distribution: temurin
         java-version: ${{ github.event.inputs.jdk }}
     - name: Generate TPC-DS (SF=1) table data
       if: steps.cache-tpcds-sf-1.outputs.cache-hit != 'true'
@@ -138,7 +139,7 @@ jobs:
       with:
         fetch-depth: 0
     - name: Cache Scala, SBT and Maven
-      uses: actions/cache@v2
+      uses: actions/cache@v3
       with:
         path: |
           build/apache-maven-*
@@ -149,20 +150,21 @@ jobs:
         restore-keys: |
           build-
     - name: Cache Coursier local repository
-      uses: actions/cache@v2
+      uses: actions/cache@v3
       with:
         path: ~/.cache/coursier
         key: benchmark-coursier-${{ github.event.inputs.jdk }}-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
         restore-keys: |
           benchmark-coursier-${{ github.event.inputs.jdk }}
     - name: Install Java ${{ github.event.inputs.jdk }}
-      uses: actions/setup-java@v1
+      uses: actions/setup-java@v3
       with:
+        distribution: temurin
         java-version: ${{ github.event.inputs.jdk }}
     - name: Cache TPC-DS generated data
       if: contains(github.event.inputs.class, 'TPCDSQueryBenchmark') || contains(github.event.inputs.class, '*')
       id: cache-tpcds-sf-1
-      uses: actions/cache@v2
+      uses: actions/cache@v3
       with:
         path: ./tpcds-sf-1
         key: tpcds-${{ hashFiles('.github/workflows/benchmark.yml', 'sql/core/src/test/scala/org/apache/spark/sql/TPCDSSchema.scala') }}
@@ -186,7 +188,7 @@ jobs:
       echo "Preparing the benchmark results:"
       tar -cvf benchmark-results-${{ github.event.inputs.jdk }}-${{ github.event.inputs.scala }}.tar `git diff --name-only` `git ls-files --others --exclude=tpcds-sf-1 --exclude-standard`
     - name: Upload benchmark results
-      uses: actions/upload-artifact@v2
+      uses: actions/upload-artifact@v3
      with:
         name: benchmark-results-${{ github.event.inputs.jdk }}-${{ github.event.inputs.scala }}-${{ matrix.split }}
         path: benchmark-results-${{ github.event.inputs.jdk }}-${{ github.event.inputs.scala }}.tar