Conversation
README.md (Outdated)
This part is covered in the "Quick Start Guide" section; we don't need it here anymore.
README.md (Outdated)
Adding a subtitle to separate the advanced build/run options from the basic ones.
I think maybe we should separate out the Spark SQL command. Currently it reads like this:
Build Stuff
- `./gradlew build` - To build and run tests. Make sure Docker is running, as the integration tests depend on it.
- `./gradlew assemble` - To skip tests.
- `./gradlew test` - To run unit tests and integration tests.
Start Service Locally
- `./gradlew runApp` - To run the Polaris server locally on localhost:8181.
Quick Demo // Unrelated?
- `./regtests/run_spark_sql.sh` - To connect from Spark SQL. Here are some example commands to run in the Spark SQL shell:
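For example, something like this (a minimal sketch; the namespace and table names are illustrative, not necessarily what `run_spark_sql.sh` configures):

```
spark-sql> CREATE NAMESPACE IF NOT EXISTS demo;
spark-sql> CREATE TABLE demo.quickstart (id BIGINT, data STRING);
spark-sql> INSERT INTO demo.quickstart VALUES (1, 'hello'), (2, 'polaris');
spark-sql> SELECT * FROM demo.quickstart;
```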
I've actually thought about this a lot; I'd like it to be more logically organized. But I didn't do it, since it makes the page much longer, and I still prefer a compact front page.
README.md (Outdated)
I would probably skip the "drop" command, just in case someone blindly copies and pastes. It might be nicer if readers are left with something to check out, for example ending the walkthrough as in the sketch below.
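Something like this (a sketch; the table name is illustrative, matching the demo above):

```
spark-sql> -- end on a query readers can play with, instead of a DROP TABLE
spark-sql> SELECT * FROM demo.quickstart;
```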
Force-pushed c44cf93 to 0422c19
Force-pushed 0422c19 to 43e712a
Resolved the conflicts.
* Simplify CatalogPrefixParser API (apache#3622). This component's methods do not need to have a `RealmContext` parameter: either the implementation is application-scoped, in which case the realm is irrelevant; or the implementation is request-scoped, in which case it can have the `RealmContext` injected. (FWIW, the default implementation is application-scoped.) This refactor will further simplify the remote request signing implementation.
* Releasey: update check for required-checks (apache#3667). After apache#3625, the `jq` select that looks for required checks can be simplified to select only the `Required Checks` check/job.
* Wire external catalog properties into REST client config (apache#3480):
  * Pass `ExternalCatalog.properties` through the federation factories (Iceberg REST, Hive, Hadoop).
  * Merge catalog properties with the connection config, with the connection config taking precedence.
  * Document proxy/timeout settings for Iceberg REST federation.
  * Add tests that exercise the production merge logic.
* Correct build instruction: project properties require the `org.gradle.project` prefix (apache#3680).
* Last merged commit 103c6ed.

Co-authored-by: Alexandre Dutra <adutra@apache.org>
Co-authored-by: Yong-Jin Lee <33987062+yj-lee0503@users.noreply.github.com>
Co-authored-by: Nándor Kollár <nandorKollar@users.noreply.github.com>
Co-authored-by: Nandor Kollar <nkollar@cloudera.com>
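As a sketch of that last build-instruction fix (the property name here is hypothetical; the `-P` flag, the `org.gradle.project.` system-property prefix, and the `ORG_GRADLE_PROJECT_` environment variable are all standard Gradle mechanisms for setting project properties):

```shell
./gradlew build -PsomeProperty=value                     # -P sets a project property directly
./gradlew build -Dorg.gradle.project.someProperty=value  # a JVM system property needs the prefix
ORG_GRADLE_PROJECT_someProperty=value ./gradlew build    # environment-variable form
```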
Description
`run_spark_sql.sh` should be the fastest way for any first-timer to try Polaris with a familiar tool like Spark SQL. Any beginner can get started with Polaris + Spark SQL in 5 minutes. We should promote it on the front page.

Type of change
How Has This Been Tested?
Tested `run_spark_sql.sh` locally; it passed.
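For reference, the local flow is roughly (a sketch based on the commands mentioned above; `runApp` serves on localhost:8181):

```shell
./gradlew runApp               # terminal 1: start the Polaris server on localhost:8181
./regtests/run_spark_sql.sh    # terminal 2: open a Spark SQL shell connected to it
```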
Checklist:
Here is the screenshot:
