I remember a time when pure tech and traditional companies alike were hiring to expand their data engineering teams.
This was during the Hadoop and Spark era, when people would implement data transformation and business logic in code.
Has low-code/third-party SaaS since taken over this space, or have most companies realized they aren't getting much value out of data science and abandoned these big data initiatives entirely?
My previous company did a lot of work to move to BigQuery, which works quite well for data we needed to access regularly; anything rarer we'd just store in GCS.
We used Apache Beam/Dataflow for the imports/exports, plus the occasional custom script for data munging.
At one point we needed hundreds of nodes for a data transformation from on-prem to cloud, but on average we only ran a handful of nodes on much smaller jobs.
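For what it's worth, the "occasional custom script" step in a setup like this is often just a small stdlib pass over raw files before they're loaded. A minimal sketch of that kind of munging (field names and cleanup rules here are hypothetical, not from the post above):

```python
import csv
import io

def munge_rows(raw_csv, required_field="user_id"):
    """One-off cleanup of the kind you might run before a warehouse load:
    normalize header names and drop rows missing a key field.
    All names are illustrative assumptions, not the original scripts."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    cleaned = []
    for row in reader:
        # Lower-case and strip headers/values so downstream schemas match.
        row = {(k or "").strip().lower(): (v or "").strip() for k, v in row.items()}
        if row.get(required_field):  # skip rows with no key
            cleaned.append(row)
    return cleaned

raw = "User_ID , Amount\n123, 9.99\n, 4.50\n"
print(munge_rows(raw))  # → [{'user_id': '123', 'amount': '9.99'}]
```

In practice anything heavier than this went through the Beam/Dataflow pipelines; the scripts were only for small, one-off fixes.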