Just wanted to share a simple Python script I built to automate uploading multiple CSV files to a Postgres database.
The script automatically cleans the file name and column headers, generates the SQL to create the table, and uploads the file to the database.
I used this script almost every day at my past job to upload data from other teams and departments.
I recorded a tutorial on YouTube for anyone who wants to build it out themselves (links in the README). Hope this is helpful, either as an easy way to upload files to a db or as practice at building Python scripts to automate your work.
Thanks for at least taking a look and replying =) I do know a bit of Django and will likely extend it to other dbs if there's a need. I originally connected to both HIVE and Greenplum databases because that's what I had at work. As a hobby, I just create Postgres dbs on AWS, so those are mainly what I connect to these days. Thanks for the tip.