114 commits
963d1ef
Adds 'superset export_dashboards --dashboard-id/-i --dashboard-title/…
rjurney May 3, 2019
590b6d1
Shortened line to pass flake8 test
rjurney May 7, 2019
915e6dc
Jazzed up superset export_dashboards -i/-t, now supports multiple das…
rjurney May 7, 2019
f960c4f
Work from ticket #7444 and start of this work
rjurney May 7, 2019
a606a0a
Import fix
rjurney May 7, 2019
a89a0fb
upstream merge
rjurney May 7, 2019
948254b
pass flake8 complaints
rjurney May 8, 2019
c94be1b
Added --export-data/-x flag and --export-data-dir/-d directory/locati…
rjurney May 8, 2019
2beb594
Added "includes_data" flag to dashboard export to facilitate import.
rjurney May 8, 2019
4a49f78
Created SQLALCHEMY_IMPORT_URI config string for where to import dashb…
rjurney May 8, 2019
7ad501e
Reworked Dashboard.export_dashboards to print better records on expor…
rjurney May 8, 2019
e65b976
Created get_or_create_import_db_engine() to fetch a DB engine for imp…
rjurney May 8, 2019
ec1ff26
Now importing data tables to import db if a Dashboard file we're load…
rjurney May 8, 2019
72da320
Added Dashboards.uuid column to SQLAlchemy model with Alembic migration
rjurney May 8, 2019
95636a8
Added uuid to dashboards/slices/databases/datasources via ImportMixin…
rjurney May 9, 2019
7087d6a
Merge branch 'master' of github.com:apache/incubator-superset into rj…
rjurney May 18, 2019
605e0ff
Added python-git to requirements
rjurney May 18, 2019
634555e
Now migrating all tables that inherit directly or indirectly from Imp…
rjurney May 18, 2019
35f028d
Use sa.Column
rjurney May 19, 2019
76edb93
Added TableColumn/table_columns table
rjurney May 19, 2019
d5ea88a
Initial CLI tests
rjurney May 19, 2019
a06804e
Moved JSON serialization from Dashboard.export_dashboards to superset…
rjurney May 20, 2019
5a97aea
Fixed test title
rjurney May 20, 2019
ee7b245
Remove git dev requirement
rjurney May 20, 2019
f6b0b97
Working prototype create/list examples
rjurney May 21, 2019
ea67d98
Prints table for examples
rjurney May 21, 2019
404819b
Lots of new options added to CLI
rjurney May 21, 2019
c5402fc
Moved from SQLALCHEMY_IMPORT_URI to SQLALCHEMY_EXAMPLES_URI
rjurney May 21, 2019
ded42a5
Added EXAMPLE_REPOS_TAGS and GITHUB_AUTH_TOKEN items
rjurney May 25, 2019
f73ca66
get_examples_uris expands repo name/tag to content/blog uris, list_ex…
rjurney May 25, 2019
272cde8
Removed types field from Dashboard files export, they were always typ…
rjurney May 25, 2019
6c14f16
flake8 fixes
rjurney May 25, 2019
18ba41a
Removed refernces to slices from examples command UI, added --example…
rjurney May 25, 2019
ba90363
Merge branch 'master' of github.com:apache/incubator-superset into rj…
rjurney May 25, 2019
b5a9c01
flake8 cleanup
rjurney May 25, 2019
2c72e7f
flake8 cleanup
rjurney May 25, 2019
ed2a9b8
flake8 cleanup
rjurney May 25, 2019
6cf4851
Completed test for `superset examples export` which tests tarball siz…
rjurney May 25, 2019
2132b5d
squash me
rjurney May 25, 2019
b4048c5
In progress paranoia commit, listing works, creating works, working o…
rjurney May 29, 2019
4edca2d
Doc string for get_example_data
rjurney May 29, 2019
8bb7350
Implemented CLI for example import
rjurney May 29, 2019
f9ca920
Moved from reusing dashboard_import_export.import_dashboards for exam…
rjurney May 29, 2019
3ee163f
Added get_examples_database and get_or_create_example_db at the expen…
rjurney May 29, 2019
8b8eeea
Added get_uuid method to ensure a string uuid is default value for uu…
rjurney May 29, 2019
004d917
Now serializing uuid fields in superset.utils.core.DashboardEncoder a…
rjurney May 29, 2019
ce09dc8
Now using tag v0.0.4 of rjurney examples-data
rjurney May 29, 2019
d675427
Print more than just minutes of the example created timestamp
rjurney May 29, 2019
ad74c3c
In progress commit
rjurney Jun 4, 2019
2031cd4
migration now works for mysql
rjurney Jun 4, 2019
7568de3
resolve config conflicts
rjurney Jun 5, 2019
41789b5
Use Flask-AppBuilder master
rjurney Jun 5, 2019
c713de8
Debug printing for json serialization
rjurney Jun 5, 2019
232a2f8
Reduced import of superset.utils.core in superset.utils.dashboard_imp…
rjurney Jun 5, 2019
83ca4ec
Error in example import tempdir name fixed
rjurney Jun 5, 2019
3e270a2
Set default example export title to dashboard title
rjurney Jun 5, 2019
ed5b308
Debug DashboardEncoder, cleanup
rjurney Jun 6, 2019
e2f9a5e
Fixes model serialization by using ImportMixin.export_to_dict on the …
rjurney Jun 6, 2019
0a9c607
Revert "Debug DashboardEncoder, cleanup"
rjurney Jun 6, 2019
fb6dbba
Remove debug from DashboardEncoder
rjurney Jun 7, 2019
38fbac1
Changed name of superset.model.helpers.ImportMixin to ImportExportMix…
rjurney Jun 7, 2019
ebb77dd
Nearly working json encoding
rjurney Jun 8, 2019
b340fcb
Removed JSONEncoder from DashboardEncoder completely
rjurney Jun 8, 2019
c31bba5
Added --url option to examples export and --full-fields to examples list
rjurney Jun 8, 2019
1ba0bf6
Changed config from examples db to main/superset
rjurney Jun 9, 2019
a7b1e92
Substitute example db utilities added
rjurney Jun 9, 2019
44d79e4
SqlaTable.import_obj() accepts a substitute_db_name
rjurney Jun 9, 2019
e8ba197
Substitute/debug "main" for superset/examples DB
rjurney Jun 9, 2019
578806b
Rolled back datasource substitution from main->examples db
rjurney Jun 9, 2019
c50dbe1
Flask-AppBuilder>=2.1.4
rjurney Jun 9, 2019
e4197dc
Using main db instead of examples db
rjurney Jun 10, 2019
f86f302
Added "superset export remove" and associated unit test. Also unit te…
rjurney Jun 11, 2019
fb4e032
Merge master
rjurney Jun 11, 2019
eeec06b
From PTable --> tabulate for "superset examples list"
rjurney Jun 11, 2019
23ea4aa
Now using a SQLAlchemy Model class to remove data table, assuming an …
rjurney Jun 11, 2019
ff5d44d
Fixed head for examples database migration
rjurney Jun 11, 2019
2f0a260
Made migration work for postgresql with its builtin UUID type
rjurney Jun 11, 2019
9d46442
typo fix
rjurney Jun 11, 2019
d4adbce
flake8 fixes
rjurney Jun 11, 2019
4f134dc
flake8 fixes
rjurney Jun 11, 2019
7af3337
Changed ImportExportMixin.export_to_json to export_to_json_serializable
rjurney Jun 11, 2019
19960dc
Moved repo from rjurney/examples-data to apache-superset/examples-data
rjurney Jun 11, 2019
33cd594
Removes exception handling in ImportExportMixin.DashboardEncoder
rjurney Jun 11, 2019
d0a7b24
Removed duplicate code in dashoard_import_export.py between example i…
rjurney Jun 11, 2019
e7a6509
Replace click.echo(click.style()) with click.secho()
rjurney Jun 11, 2019
d3e6b52
From single to double quotes when quotes appear
rjurney Jun 11, 2019
eef4675
removed unneeded print from unit test for cli
rjurney Jun 11, 2019
86dc7e4
Removed temporary file
rjurney Jun 12, 2019
306dacb
Changed name of DashboardEncoder to SQLAJsonEncoder as it serializes …
rjurney Jun 12, 2019
e0cf1d4
Using data.get in conditional checks for files in import data
rjurney Jun 12, 2019
c5c1353
Check for table datasources and throw exception on druid datasources …
rjurney Jun 12, 2019
7b1fa12
Added error handling to pd.read_csv/to_sql in example import
rjurney Jun 12, 2019
f93fa1a
Changed method of db initialization in superset.utils.dashboard_impor…
rjurney Jun 12, 2019
008a146
Set SQLALCHEMY_EXAMPLES_URI to default to SQLALCHEMY_DATABASE_URI
rjurney Jun 12, 2019
afc7ade
removed substitute_db_name code for now
rjurney Jun 12, 2019
e4c4efc
Removed uuid from export_fields in core model and made SQLAJsonEncode…
rjurney Jun 12, 2019
091b399
Refactored get_or_create_main_db/get_or_create_examples_db to get_or_…
rjurney Jun 13, 2019
c93d81d
ImportExportMixin(object) - the object was not needed in py3
rjurney Jun 13, 2019
5f84f11
Add default empty list to data.get("files") in dashboard_import_expor…
rjurney Jun 13, 2019
af52e8f
Removed druid tests in dict_import_export_tests as it is deprecated.
rjurney Jun 13, 2019
a1e8601
Changed debug outputs to info
rjurney Jun 13, 2019
b19059f
Removed Druid datasource import
rjurney Jun 14, 2019
73b2c99
Merge branch 'master' of github.com:apache/incubator-superset into rj…
rjurney Jun 17, 2019
8f5b685
Moved uuid column to helpers functions and away from core models
rjurney Jun 17, 2019
47b36bd
Merge branch 'master' of github.com:apache/incubator-superset into rj…
rjurney Jun 18, 2019
1b80bf2
Defined and used ImportExportMixin.export_fields_with_uuid
rjurney Jun 20, 2019
4b1ae69
Removed model class uuid exportfield and finished implementation of u…
rjurney Jun 20, 2019
2630293
Made dashboard_import_export.import_dashboards() do data table import…
rjurney Jun 20, 2019
9996781
Cleanup logging. Using superset.utils.core.get_or_create_db_by_name()…
rjurney Jun 20, 2019
67e8043
Removed comment
rjurney Jun 21, 2019
ab24559
Removed abstract superset example classes
rjurney Jun 21, 2019
10e36d3
flake8 fixes
rjurney Jun 21, 2019
f4ef469
Merge branch 'master' of github.com:apache/incubator-superset into rj…
rjurney Jun 21, 2019
5c5dfea
Removed schedulers from uuid field migrations
rjurney Jun 22, 2019
1 change: 1 addition & 0 deletions requirements.txt
@@ -76,6 +76,7 @@ six==1.11.0         # via bleach, cryptography, flask-jwt-extended, flask-
sqlalchemy-utils==0.33.11
sqlalchemy==1.3.5
sqlparse==0.2.4
tabulate==0.8.3
urllib3==1.24.3 # via requests, selenium
vine==1.1.4 # via amqp
webencodings==0.5.1 # via bleach
1 change: 1 addition & 0 deletions setup.py
@@ -105,6 +105,7 @@ def get_git_sha():
'sqlalchemy>=1.3.5,<2.0',
'sqlalchemy-utils>=0.33.2',
'sqlparse',
'tabulate>=0.8.3',
'wtforms-json',
],
extras_require={
298 changes: 288 additions & 10 deletions superset/cli.py
@@ -17,20 +17,32 @@
# under the License.
# pylint: disable=C,R,W
from datetime import datetime
import json
import logging
from subprocess import Popen
from sys import stdout
from sys import exit, stdout
import tarfile
import tempfile

import click
from colorama import Fore, Style
from pathlib2 import Path
import requests
import yaml

from superset import (
app, appbuilder, data, db, security_manager,
)
from superset.data.helpers import (
download_url_to_blob_url, get_examples_file_list, get_examples_uris,
list_examples_table,
)
from superset.exceptions import DashboardNotFoundException, ExampleNotFoundException
from superset.utils import (
core as utils, dashboard_import_export, dict_import_export)
core as utils, dashboard_import_export, dict_import_export,
)

logging.getLogger('urllib3').setLevel(logging.WARNING)

config = app.config
celery_app = utils.get_celery_app(config)
@@ -48,7 +60,7 @@ def make_shell_context():
@app.cli.command()
def init():
"""Inits the Superset application"""
utils.get_or_create_main_db()
utils.get_or_create_db_by_name(db_name='main')
appbuilder.add_permissions(update_perms=True)
security_manager.sync_role_definitions()

@@ -124,10 +136,254 @@ def load_examples_run(load_test_data):
@app.cli.command()
@click.option('--load-test-data', '-t', is_flag=True, help='Load additional test data')
def load_examples(load_test_data):
"""Loads a set of Slices and Dashboards and a supporting dataset """
"""Loads a set of charts and dashboards and a supporting dataset"""
load_examples_run(load_test_data)


def exclusive(ctx_params, exclusive_params, error_message):
"""Provide exclusive option grouping"""
if sum([1 if ctx_params[p] else 0 for p in exclusive_params]) > 1:
raise click.UsageError(error_message)


@app.cli.group()
def examples():
"""Manages example dashboards/datasets"""
pass


@examples.command('export')
@click.option(
'--dashboard-id', '-i', default=None, type=int,
help='Specify a single dashboard id to export')
@click.option(
'--dashboard-title', '-t', default=None,
help='Specify a single dashboard title to export')
@click.option(
'--description', '-d', help='Description of new example', required=True)
@click.option(
'--example-title', '-e', help='Title for new example', required=False)
@click.option(
'--file-name', '-f', default='dashboard.tar.gz',
help='Specify export file name. Defaults to dashboard.tar.gz')
@click.option(
'--license', '-l', '_license', default='Apache 2.0',
help='License of the example dashboard')
@click.option(
'--url', '-u', default=None, help='URL of dataset home page')
def export_example(dashboard_id, dashboard_title, description, example_title,
file_name, _license, url):
"""Export example dashboard/datasets tarball"""
if not (dashboard_id or dashboard_title):
raise click.UsageError('must supply --dashboard-id/-i or --dashboard-title/-t')
exclusive(
click.get_current_context().params,
['dashboard_id', 'dashboard_title'],
'options --dashboard-id/-i and --dashboard-title/-t mutually exclusive')

# Export into a temporary directory and then tarball that directory
with tempfile.TemporaryDirectory() as tmp_dir_name:

try:
data = dashboard_import_export.export_dashboards(
db.session,
dashboard_ids=[dashboard_id],
dashboard_titles=[dashboard_title],
export_data=True,
export_data_dir=tmp_dir_name,
description=description,
export_title=example_title or dashboard_title,
_license=_license,
url=url,
strip_database=True)

dashboard_slug = dashboard_import_export.get_slug(
db.session,
dashboard_id=dashboard_id,
dashboard_title=dashboard_title)

out_path = f'{tmp_dir_name}/dashboard.json'

with open(out_path, 'w') as data_stream:
data_stream.write(data)

with tarfile.open(file_name, 'w:gz') as tar:
tar.add(tmp_dir_name, arcname=f'{dashboard_slug}')

click.secho(str(f'Exported example to {file_name}'), fg='blue')

except DashboardNotFoundException as e:
click.secho(str(e), fg='red')
exit(1)


@examples.command('list')
@click.option(
'--examples-repo', '-r',
help='Full name of Github repository containing examples, ex: ' +
"'apache-superset/examples-data'",
default=None)
@click.option(
'--examples-tag', '-r',
help="Tag or branch of Github repository containing examples. Defaults to 'master'",
default='master')
@click.option(
'--full-fields', '-ff', is_flag=True, default=False, help='Print full length fields')
def _list_examples(examples_repo, examples_tag, full_fields):
"""List example dashboards/datasets"""
click.echo(
list_examples_table(examples_repo, examples_tag=examples_tag,
full_fields=full_fields))


@examples.command('import')
@click.option(
'--database-uri', '-d', help='Database URI to import example to',
default=config.get('SQLALCHEMY_EXAMPLES_URI'))
@click.option(
'--examples-repo', '-r',
help='Full name of Github repository containing examples, ex: ' +
"'apache-superset/examples-data'",
default=None)
@click.option(
'--examples-tag', '-r',
help="Tag or branch of Github repository containing examples. Defaults to 'master'",
default='master')
@click.option(
'--example-title', '-e', help='Title of example to import', required=True)
def import_example(example_title, examples_repo, examples_tag, database_uri):
"""Import an example dashboard/dataset"""

# First fetch the example information from Github
examples_repos = [(examples_repo, examples_tag)] \
if examples_repo else config.get('EXAMPLE_REPOS_TAGS')
examples_repos_uris = [(r[0], r[1]) + get_examples_uris(r[0], r[1])
for r in examples_repos]
examples_files = get_examples_file_list(examples_repos_uris)

# Github authentication via a Personal Access Token for rate limit problems
headers = None
token = config.get('GITHUB_AUTH_TOKEN')
if token:
headers = {'Authorization': 'token %s' % token}

import_example_json = None
import_data_info = None
for example_file in examples_files:

metadata_download_url = example_file['metadata_file']['download_url']
example_metadata_json = requests.get(metadata_download_url,
headers=headers).content
# Cheaply load json without generating objects
example_metadata = json.loads(example_metadata_json)
if example_metadata['description']['title'] == example_title:
import_example_json = example_metadata_json
import_data_info = example_file['data_files']
logging.info(
f"Will import example '{example_title}' from {metadata_download_url}")
break

if not import_example_json:
e = ExampleNotFoundException(f'Example {example_title} not found!')
click.secho(str(e), fg='red')
exit(1)

# Parse data to get file download_urls -> blob_urls
example_metadata = json.loads(import_example_json,
object_hook=utils.decode_dashboards)

# The given download url won't work for data files, need a blob url
data_blob_urls = {}
for ex_file in example_metadata['files']:
github_info = [t for t in import_data_info
if t['name'] == ex_file['file_name']][0]
blob_url = download_url_to_blob_url(github_info['download_url'])
data_blob_urls[github_info['name']] = blob_url

try:
dashboard_import_export.import_dashboards(
db.session,
import_example_json,
is_example=True,
data_blob_urls=data_blob_urls,
database_uri=database_uri)
except Exception as e:
logging.error(f"Error importing example dashboard '{example_title}'!")
logging.exception(e)


@examples.command('remove')
@click.option(
'--example-title', '-e', help='Title of example to remove', required=True)
@click.option(
'--database-uri', '-d', help='Database URI to remove example from',
default=config.get('SQLALCHEMY_EXAMPLES_URI'))
@click.option(
'--examples-repo', '-r',
help='Full name of Github repository containing examples, ex: ' +
"'apache-superset/examples-data'",
default=None)
@click.option(
'--examples-tag', '-r',
help="Tag or branch of Github repository containing examples. Defaults to 'master'",
default='master')
def remove_example(example_title, database_uri, examples_repo, examples_tag):
"""Remove an example dashboard/dataset"""

# First fetch the example information from Github
examples_repos = [(examples_repo, examples_tag)] \
if examples_repo else config.get('EXAMPLE_REPOS_TAGS')
examples_repos_uris = [(r[0], r[1]) + get_examples_uris(r[0], r[1])
for r in examples_repos]
examples_files = get_examples_file_list(examples_repos_uris)

# Github authentication via a Personal Access Token for rate limit problems
headers = None
token = config.get('GITHUB_AUTH_TOKEN')
if token:
headers = {'Authorization': 'token %s' % token}

# temporary - substitute url provided
db_name = 'superset'

import_example_data = None
for example_file in examples_files:

metadata_download_url = example_file['metadata_file']['download_url']
example_metadata_json = requests.get(metadata_download_url,
headers=headers).content
# Cheaply load json without generating objects
example_metadata = json.loads(example_metadata_json)
if example_metadata['description']['title'] == example_title:
import_example_data = json.loads(example_metadata_json)
logging.info(
f"Will remove example '{example_title}' from '{db_name}'")
break

logging.info(import_example_data['files'])

# Get the dashboard and associated records
dashboard_title = \
import_example_data['dashboards'][0]['__Dashboard__']['dashboard_title']
logging.info(f'Got dashboard title {dashboard_title} for removal...')

utils.get_or_create_db_by_name(db_name='main')
session = db.session()

try:
dashboard_import_export.remove_dashboard(
session,
import_example_data,
dashboard_title,
database_uri=database_uri,
)
except DashboardNotFoundException as e:
logging.exception(e)
click.secho(
f'Example {example_title} associated dashboard {dashboard_title} not found!',
fg='red')


@app.cli.command()
@click.option('--datasource', '-d', help='Specify which datasource name to load, if '
'omitted, all datasources will be refreshed')
@@ -156,7 +412,7 @@ def refresh_druid(datasource, merge):

@app.cli.command()
@click.option(
'--path', '-p',
'--path', '-p', required=True,
help='Path to a single JSON file or path containing multiple JSON files'
'files to import (*.json)')
@click.option(
@@ -177,7 +433,7 @@ def import_dashboards(path, recursive):
try:
with f.open() as data_stream:
dashboard_import_export.import_dashboards(
db.session, data_stream)
db.session, data_stream.read())
except Exception as e:
logging.error('Error when importing dashboard from file %s', f)
logging.error(e)
@@ -190,9 +446,31 @@ def import_dashboards(path, recursive):
@click.option(
'--print_stdout', '-p', is_flag=True, default=False,
help='Print JSON to stdout')
def export_dashboards(print_stdout, dashboard_file):
"""Export dashboards to JSON"""
data = dashboard_import_export.export_dashboards(db.session)
@click.option(
'--dashboard-ids', '-i', default=None, type=int, multiple=True,
help='Specify dashboard id to export')
@click.option(
'--dashboard-titles', '-t', default=None, multiple=True,
help='Specify dashboard title to export')
@click.option(
'--export-data', '-x', default=None, is_flag=True,
help="Export the dashboard's data tables as CSV files.")
@click.option(
'--export-data-dir', '-d', default=config.get('DASHBOARD_EXPORT_DIR'),
help="Specify export directory path. Defaults to '/tmp'")
def export_dashboards(print_stdout, dashboard_file, dashboard_ids,
dashboard_titles, export_data, export_data_dir):
"""Export dashboards to JSON and optionally tables to CSV"""
try:
data = dashboard_import_export.export_dashboards(
db.session,
dashboard_ids=dashboard_ids,
dashboard_titles=dashboard_titles,
export_data=export_data,
export_data_dir=export_data_dir)
except DashboardNotFoundException as e:
click.secho(str(e), fg='red')
exit(1)
if print_stdout or not dashboard_file:
print(data)
if dashboard_file:
@@ -369,7 +647,7 @@ def load_test_users_run():
gamma_sqllab_role = security_manager.add_role('gamma_sqllab')
for perm in security_manager.find_role('Gamma').permissions:
security_manager.add_permission_role(gamma_sqllab_role, perm)
utils.get_or_create_main_db()
utils.get_or_create_db_by_name(db_name='main')
db_perm = utils.get_main_database(security_manager.get_session).perm
security_manager.add_permission_view_menu('database_access', db_perm)
db_pvm = security_manager.find_permission_view_menu(
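The diff to superset/cli.py above adds an `exclusive()` helper that rejects command invocations supplying more than one of a set of mutually exclusive options (e.g. `--dashboard-id`/`-i` vs `--dashboard-title`/`-t` in `export_example`). A minimal standalone sketch of that logic, with `ValueError` standing in for `click.UsageError` so it runs without click installed:

```python
def exclusive(ctx_params, exclusive_params, error_message):
    """Raise if more than one of a set of mutually exclusive options was set."""
    # cli.py raises click.UsageError; ValueError stands in for this sketch
    if sum(1 for p in exclusive_params if ctx_params[p]) > 1:
        raise ValueError(error_message)

# Only one of the two options is set, so this passes silently,
# mirroring the --dashboard-id / --dashboard-title guard in export_example
exclusive(
    {"dashboard_id": 1, "dashboard_title": None},
    ["dashboard_id", "dashboard_title"],
    "options --dashboard-id/-i and --dashboard-title/-t mutually exclusive",
)
```

Note that click's `ctx.params` maps every declared option to a value (`None` or a default when unset), which is why a simple truthiness count over the parameter names is sufficient here.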