Commit

Update README
joker1007 committed Mar 10, 2017
1 parent e213b49 commit e4facd5
Showing 1 changed file with 35 additions and 36 deletions.
71 changes: 35 additions & 36 deletions README.md
@@ -28,40 +28,41 @@ Because an embedded gem dependency sometimes restricts the Ruby environment.

### Options

| name | type | required? | default | description |
| :------------------------------------- | :------------ | :----------- | :------------------------- | :----------------------- |
| method | string | no | insert | `insert` (Streaming Insert) or `load` (load job) |
| auth_method | enum | yes | private_key | `private_key` or `json_key` or `compute_engine` or `application_default` |
| email | string | yes (private_key) | nil | GCP Service Account Email |
| private_key_path | string | yes (private_key) | nil | GCP Private Key file path |
| private_key_passphrase | string | yes (private_key) | nil | GCP Private Key Passphrase |
| json_key | string | yes (json_key) | nil | GCP JSON Key file path or JSON Key string |
| project | string | yes | nil | |
| table | string | yes (either `tables`) | nil | |
| tables                                 | string        | yes (either `table`) | nil                        | multiple table names can be set, split by `,` |
| template_suffix | string | no | nil | can use `%{time_slice}` placeholder replaced by `time_slice_format` |
| auto_create_table | bool | no | false | If true, creates table automatically |
| skip_invalid_rows | bool | no | false | Only `insert` method. |
| max_bad_records | integer | no | 0 | Only `load` method. If the number of bad records exceeds this value, an invalid error is returned in the job result. |
| ignore_unknown_values | bool | no | false | Accept rows that contain values that do not match the schema. The unknown values are ignored. |
| schema                                 | array         | yes (either `fetch_schema` or `schema_path`) | nil | Schema definition, formatted as JSON. |
| schema_path                            | string        | yes (either `fetch_schema`) | nil              | Schema definition file path, formatted as JSON. |
| fetch_schema                           | bool          | yes (either `schema_path`) | false             | If true, fetches the table schema definition from the BigQuery table automatically. |
| fetch_schema_table                     | string        | no           | nil                        | If set, fetch the table schema definition from this table. If `fetch_schema` is false, this param is ignored. |
| schema_cache_expire                    | integer       | no           | 600                        | Value is in seconds. If the current time is past the expiration interval, the table schema definition is re-fetched. |
| field_string | string | no | nil | see examples. |
| field_integer | string | no | nil | see examples. |
| field_float | string | no | nil | see examples. |
| field_boolean | string | no | nil | see examples. |
| field_timestamp | string | no | nil | see examples. |
| replace_record_key | bool | no | false | see examples. |
| replace_record_key_regexp{1-10} | string | no | nil | see examples. |
| convert_hash_to_json | bool | no | false | If true, converts Hash value of record to JSON String. |
| insert_id_field | string | no | nil | Use key as `insert_id` of Streaming Insert API parameter. |
| request_timeout_sec | integer | no | nil | Bigquery API response timeout |
| request_open_timeout_sec               | integer       | no           | 60                         | BigQuery API connection and request timeout. If you send large amounts of data to BigQuery, set a larger value. |
| time_partitioning_type                 | enum          | no (only `day`) | nil                     | Type of the BigQuery time partitioning feature (experimental feature on BigQuery). |
| time_partitioning_expiration           | time          | no           | nil                        | Expiration milliseconds for BigQuery time partitioning (experimental feature on BigQuery). |
| name | type | required? | placeholder? | default | description |
| :------------------------------------- | :------------ | :----------- | :---------- | :------------------------- | :----------------------- |
| method | string | no | no | insert | `insert` (Streaming Insert) or `load` (load job) |
| auth_method | enum | yes | no | private_key | `private_key` or `json_key` or `compute_engine` or `application_default` |
| email | string | yes (private_key) | no | nil | GCP Service Account Email |
| private_key_path | string | yes (private_key) | no | nil | GCP Private Key file path |
| private_key_passphrase | string | yes (private_key) | no | nil | GCP Private Key Passphrase |
| json_key | string | yes (json_key) | no | nil | GCP JSON Key file path or JSON Key string |
| project | string | yes | yes | nil | |
| dataset | string | yes | yes | nil | |
| table | string | yes (either `tables`) | yes | nil | |
| tables                                 | array(string) | yes (either `table`) | yes | nil                | multiple table names can be set, split by `,` |
| template_suffix | string | no | yes | nil | can use `%{time_slice}` placeholder replaced by `time_slice_format` |
| auto_create_table | bool | no | no | false | If true, creates table automatically |
| skip_invalid_rows | bool | no | no | false | Only `insert` method. |
| max_bad_records | integer | no | no | 0 | Only `load` method. If the number of bad records exceeds this value, an invalid error is returned in the job result. |
| ignore_unknown_values | bool | no | no | false | Accept rows that contain values that do not match the schema. The unknown values are ignored. |
| schema                                 | array         | yes (either `fetch_schema` or `schema_path`) | no | nil | Schema definition, formatted as JSON. |
| schema_path                            | string        | yes (either `fetch_schema`) | no | nil         | Schema definition file path, formatted as JSON. |
| fetch_schema                           | bool          | yes (either `schema_path`) | no | false        | If true, fetches the table schema definition from the BigQuery table automatically. |
| fetch_schema_table                     | string        | no           | yes        | nil               | If set, fetch the table schema definition from this table. If `fetch_schema` is false, this param is ignored. |
| schema_cache_expire                    | integer       | no           | no         | 600               | Value is in seconds. If the current time is past the expiration interval, the table schema definition is re-fetched. |
| field_string | string | no | no | nil | see examples. |
| field_integer | string | no | no | nil | see examples. |
| field_float | string | no | no | nil | see examples. |
| field_boolean | string | no | no | nil | see examples. |
| field_timestamp | string | no | no | nil | see examples. |
| replace_record_key | bool | no | no | false | see examples. |
| replace_record_key_regexp{1-10} | string | no | no | nil | see examples. |
| convert_hash_to_json | bool | no | no | false | If true, converts Hash value of record to JSON String. |
| insert_id_field | string | no | no | nil | Use key as `insert_id` of Streaming Insert API parameter. |
| request_timeout_sec | integer | no | no | nil | Bigquery API response timeout |
| request_open_timeout_sec               | integer       | no           | no         | 60                | BigQuery API connection and request timeout. If you send large amounts of data to BigQuery, set a larger value. |
| time_partitioning_type                 | enum          | no (only `day`) | no      | nil               | Type of the BigQuery time partitioning feature (experimental feature on BigQuery). |
| time_partitioning_expiration           | time          | no           | no         | nil               | Expiration milliseconds for BigQuery time partitioning (experimental feature on BigQuery). |
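As a rough sketch, a minimal `insert`-mode configuration combining several of the options above might look like the following (the key path, project, dataset, and table names are hypothetical placeholders, not values from this README):

```
<match dummy>
  @type bigquery
  method insert

  auth_method json_key
  json_key /path/to/service_account.json  # hypothetical path

  project yourproject_id
  dataset yourdataset_id
  table accesslog

  auto_create_table true
  ignore_unknown_values true

  schema [
    {"name": "time",   "type": "TIMESTAMP"},
    {"name": "status", "type": "INTEGER"},
    {"name": "path",   "type": "STRING"}
  ]
</match>
```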

### Buffer section

@@ -524,8 +525,6 @@ The third method is to set `fetch_schema` to `true` to fetch the schema using the BigQuery API
</match>
```

__CAUTION: `fetch_schema` target cannot parse and replace placeholder.__

If you specify multiple tables in the configuration file, the plugin gets all schema data from BigQuery and merges it.
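A sketch of a `fetch_schema` configuration with multiple tables might look like this (the key path and table names are hypothetical):

```
<match dummy>
  @type bigquery
  method insert

  auth_method json_key
  json_key /path/to/service_account.json  # hypothetical path

  project yourproject_id
  dataset yourdataset_id
  tables accesslog1,accesslog2

  fetch_schema true
  schema_cache_expire 600
</match>
```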

NOTE: Since JSON does not define how to encode data of TIMESTAMP type,
