varchar mysql errors #30
Comments
Hi @anthonymobile, we can choose one of the following options. What do you think? BTW, cool project!

[1] http://docs.sqlalchemy.org/en/latest/core/type_basics.html#sqlalchemy.types.Unicode
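For context on why this only bites MySQL: SQLAlchemy's Unicode/String type ([1]) renders as VARCHAR, and MySQL, unlike SQLite, refuses a VARCHAR with no declared length. A minimal sketch with plain SQLAlchemy (the table and column names are illustrative, not pygtfs' actual models):

```python
from sqlalchemy import Column, Integer, MetaData, Table, Unicode
from sqlalchemy.dialects import mysql, sqlite
from sqlalchemy.schema import CreateTable

metadata = MetaData()
stops = Table(
    "stops", metadata,                   # illustrative table, not the pygtfs model
    Column("id", Integer, primary_key=True),
    Column("stop_name", Unicode),        # no length: fine on SQLite, fails on MySQL
    Column("stop_desc", Unicode(255)),   # explicit length: renders as VARCHAR(255) everywhere
)

# SQLite accepts a length-less VARCHAR...
print(CreateTable(stops).compile(dialect=sqlite.dialect()))
# ...while compiling the same DDL for MySQL raises
# CompileError: VARCHAR requires a length on dialect mysql
print(CreateTable(stops).compile(dialect=mysql.dialect()))
```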
Thanks... I think for me at least this is not needed. I was looking at MySQL because of performance issues on a Raspberry Pi, and I've refactored my code so that's moot now. However, I was thinking about this a bit: it seems kind of crazy to *have* to load the entire GTFS database into memory. Is there a way to select a subset of the full database through a different SQLAlchemy query, without substantially modifying the pygtfs code?
I am not sure what you mean by that. You can store your database on disk with SQLite or PostgreSQL, and then the amount of memory used depends on your database settings. If you mean you want to put only a subset of the CSV files into your database, that is much tougher to implement in pygtfs; you can, however, filter the files yourself before putting them in the database (see the sketch below).
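A rough sketch of that pre-filtering approach for a single GTFS file, using only the standard library (column names come from the GTFS spec; keeping the feed referentially consistent across files, e.g. trips referencing routes, needs more care than shown here):

```python
import csv

# Keep only bus routes (route_type 3 in the GTFS spec) from routes.txt;
# the file names and the filter condition are illustrative.
with open("routes.txt", newline="", encoding="utf-8-sig") as src, \
     open("routes.filtered.txt", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if row.get("route_type") == "3":
            writer.writerow(row)
```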
To clarify my latest comment: you can use the included gtfs2db script to load your feed into an SQLite database on disk first. Then you can use that SQLite database in pygtfs:
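A sketch of that workflow from Python (file names are placeholders; the bundled gtfs2db command-line script performs the same append step, see `gtfs2db --help` for its exact syntax):

```python
import pygtfs

# One-time import: build an on-disk SQLite database from the feed.
sched = pygtfs.Schedule("gtfs.sqlite")
pygtfs.append_feed(sched, "sample-feed.zip")

# Later runs: just reopen the existing database; nothing is re-imported,
# and rows are only read from disk as you query them.
sched = pygtfs.Schedule("gtfs.sqlite")
print(sched.agencies)
```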
The nice thing here is that SQLite is quite smart about loading only the relevant data that you need.
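For the subset question above, a filtered query only pulls the matching rows from disk. A minimal sketch, assuming Schedule exposes its SQLAlchemy session as `.session` and the ORM classes live in `pygtfs.gtfs_entities` (check your pygtfs version for the exact names):

```python
import pygtfs
# ORM class names are assumed to live in pygtfs.gtfs_entities; adjust for your version.
from pygtfs.gtfs_entities import Stop

sched = pygtfs.Schedule("gtfs.sqlite")

# Only the rows matching the filter are read from disk; the rest of the
# feed stays untouched.
main_st_stops = (
    sched.session.query(Stop)
    .filter(Stop.stop_name.like("%Main St%"))
    .all()
)
print(len(main_st_stops))
```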
Trying to import a GTFS file into MySQL using the following and getting a ton of VARCHAR length unspecified errors from SQLAlchemy. What am I doing wrong?
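A minimal sketch of the kind of call that hits this, assuming pygtfs' Schedule/append_feed API; the connection URL, driver, and feed file name are placeholders, not the original snippet:

```python
import pygtfs

# Placeholder MySQL URL (driver and credentials are illustrative).
# Somewhere in this sequence, as soon as pygtfs issues its CREATE TABLE
# statements against MySQL, SQLAlchemy refuses the length-less string
# columns: "VARCHAR requires a length on dialect mysql".
sched = pygtfs.Schedule("mysql+pymysql://user:password@localhost/gtfs")
pygtfs.append_feed(sched, "google_transit.zip")
```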
Here's the output: