That's probably a good idea to get to a usable product. You may want to investigate a proper data warehouse if your workload primarily consists of large scans and aggregations, for example if you offer a user-facing dashboard that can generate arbitrary queries.
Does your data have a fixed structure, or can customers send essentially whatever they want and you have to deal with it by e.g. storing a JSON blob in each event?
BigQuery and Snowflake are the two managed services I'd recommend today if you want good performance and cost-effective storage. They both separate compute from storage, so your cold data isn't sitting on the kind of expensive SSDs your Postgres instances are probably using.
They're both also significantly faster than Postgres at large scans and aggregations.
Snowflake is the more interesting of the two to me because it offers a semi-structured data type called VARIANT, which efficiently encodes semi-structured data in a column-wise format while losing only a little performance compared to a fixed schema. That would let your customers send semi-structured or variable-size data (like arrays, or maps with arbitrary keys) while still keeping your dashboards fast.
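To make the column-wise idea concrete, here's a rough Python sketch (my own toy illustration, not Snowflake's actual implementation) of how VARIANT-style storage can shred JSON events with varying keys into per-key columnar arrays, so an aggregation scans one compact column instead of parsing every blob:

```python
def shred(events):
    """Turn a list of dicts with varying keys into column-wise storage."""
    keys = sorted({k for e in events for k in e})          # union of all keys seen
    return {k: [e.get(k) for e in events] for k in keys}   # None marks an absent key

# Customers can send whatever shape they like:
events = [
    {"user": "a", "clicks": 3},
    {"user": "b", "tags": ["x", "y"]},   # arbitrary extra key
]
columns = shred(events)

# An aggregation now touches only the "clicks" column:
total_clicks = sum(c for c in columns["clicks"] if c is not None)
```

The real engines do much more (typed sub-columns, statistics, compression), but this is the basic reason a VARIANT scan can approach fixed-schema speed.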
If you'd like to chat more, I just requested to connect with you on LinkedIn.