Hi, I want to save a Pandas DataFrame to a Postgres table as fast as possible.
I found the copy_expert function in PostgresHook, but it takes only sql and filename as input, and I'd rather not save a CSV on the local machine.
Do I have any option to load it without storing a CSV?
I only found an example for psycopg2 + SQLAlchemy:
```python
import csv
import time
from io import StringIO

import pandas as pd
from sqlalchemy import create_engine


def psql_insert_copy(table, conn, keys, data_iter):
    """Insert method for DataFrame.to_sql using PostgreSQL COPY FROM STDIN."""
    # gets a DBAPI connection that can provide a cursor
    dbapi_conn = conn.connection
    with dbapi_conn.cursor() as cur:
        s_buf = StringIO()
        writer = csv.writer(s_buf)
        writer.writerows(data_iter)
        s_buf.seek(0)

        columns = ', '.join('"{}"'.format(k) for k in keys)
        if table.schema:
            table_name = '{}.{}'.format(table.schema, table.name)
        else:
            table_name = table.name

        sql = 'COPY {} ({}) FROM STDIN WITH CSV'.format(table_name, columns)
        cur.copy_expert(sql=sql, file=s_buf)


df = pd.read_csv("example.csv")  # read csv file from your local machine

# Example: 'postgresql://username:password@localhost:5432/your_database'
engine = create_engine('postgresql://postgres:postgresql@localhost:5432/postgres')

start_time = time.time()  # get start time before insert
df.to_sql(
    name="test",
    con=engine,
    if_exists="append",
    index=False,
    method=psql_insert_copy,
)
```
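For reference, the same no-temp-file idea can work through Airflow as well: psycopg2's `cursor.copy_expert` accepts any file-like object, not just a file on disk, so the rows can be staged in a `StringIO` buffer. Below is a minimal sketch of the buffer step; the commented Airflow usage is an assumption (it presumes a `PostgresHook` whose `get_conn()` returns a psycopg2 connection, a hypothetical connection id `postgres_default`, and an existing table `test`).

```python
import csv
from io import StringIO


def rows_to_csv_buffer(rows):
    """Serialize an iterable of row tuples into an in-memory CSV buffer
    suitable for COPY ... FROM STDIN -- no temporary file needed."""
    buf = StringIO()
    writer = csv.writer(buf)
    writer.writerows(rows)
    buf.seek(0)  # rewind so COPY reads from the start
    return buf


# Hypothetical Airflow usage (assumes connection id "postgres_default"
# and target table "test" already exist):
#
#   from airflow.providers.postgres.hooks.postgres import PostgresHook
#
#   hook = PostgresHook(postgres_conn_id="postgres_default")
#   conn = hook.get_conn()  # underlying psycopg2 connection
#   with conn.cursor() as cur:
#       cur.copy_expert(
#           sql='COPY test (a, b) FROM STDIN WITH CSV',
#           file=rows_to_csv_buffer(df.itertuples(index=False, name=None)),
#       )
#   conn.commit()
```

This skips PostgresHook's own `copy_expert(sql, filename)` wrapper entirely and talks to the psycopg2 cursor directly, which is what removes the need for an on-disk CSV.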