Show HN: Census – Export for data warehouses
34 points by borisjabes 10 months ago | 7 comments
Hello Hacker News! We're a team of YC founders (Meldium W13, Draft S11, TapEngage S11) launching something new (https://www.getcensus.com).

How many times has your business team asked you to generate yet another CSV file, write a "quick report" in SQL, or send some custom data to a terrible API (looking at you, Marketo)? We've built a product that connects directly to your data warehouse and syncs data into apps like Salesforce, Customer.io, and even Google Sheets. Your business teams won't even need to rely on engineering to manage all these pipelines.

The tech stack for analyzing customer data in 2020 looks pretty great. You can load almost any data into an auto-scaling data warehouse (Snowflake, BigQuery) with easy point-and-click tools like Fivetran. You can build SQL models with dbt and create visual reports in Metabase. But you can't easily push insights back into your marketing/sales/support apps. You can't solve this with direct app integrations or "event routers" like Zapier, and you definitely shouldn't over-engineer a solution with Spark/Kafka/Airflow.

We designed Census to make your data warehouse the single source of truth for modeling and transformation before publishing data back into your SaaS tools quickly and reliably. We're proud of what we've built so far, and there's a lot more work to do to deliver on our dream of saving us all from generating and uploading yet another CSV file, so we can spend more time actually building our products (or reading HN).

You can check it out at https://www.getcensus.com. Since this is HN, we'd love to hear everyone's war stories about building internal ETL solutions!
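[Editor's illustration] A toy sketch of the "sync back" idea described above, not Census's actual implementation: treat a warehouse query as the source of truth and upsert only rows that are new or changed in the destination. The table, column names, and the dict standing in for a SaaS API are all hypothetical.

```python
import sqlite3

def sync(warehouse, destination):
    """Push rows from a warehouse query into a destination, upserting
    only records that are new or stale (a toy 'reverse ETL' pass)."""
    rows = warehouse.execute(
        "SELECT email, plan, mrr FROM customers"
    ).fetchall()
    changed = []
    for email, plan, mrr in rows:
        record = {"plan": plan, "mrr": mrr}
        if destination.get(email) != record:  # new or out of date downstream
            destination[email] = record       # stand-in for a SaaS API upsert
            changed.append(email)
    return changed

# A tiny in-memory "warehouse" for demonstration
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE customers (email TEXT, plan TEXT, mrr INTEGER)")
wh.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    ("a@example.com", "pro", 99),
    ("b@example.com", "free", 0),
])

dest = {"a@example.com": {"plan": "free", "mrr": 0}}  # stale record downstream
updated = sync(wh, dest)
```

The diff step is the point: re-pushing everything on every run hammers rate-limited SaaS APIs, so only the delta goes out.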
On its face, a Census workflow is a simple "program" that our users author in a point-and-click manner – read some data from System A and broadcast it to Systems B, C, and D. But to "compile" that program, Census has to determine:
That's just compilation – then we need to execute that compiled plan and move massive amounts of data with low latency and high throughput, all while handling byzantine failures in source and destination systems and automatically rolling back, recovering, or helping users "debug" their workflows when things go wrong. There's a lot of depth to this (and we haven't "solved" it by any means) – happy to answer questions here or at brad@getcensus.com if you have them!
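[Editor's illustration] The compile-then-execute split described above can be sketched roughly like this – a toy model, not Census's actual planner: a declarative spec "compiles" into an ordered list of read/write steps, and the executor retries each destination write so one flaky system doesn't sink the whole sync. All names here (compile_plan, execute, the spec shape) are invented for illustration.

```python
def compile_plan(spec):
    """'Compile' a declarative sync spec into an ordered list of steps:
    one read from the source, then one write per destination with its
    own field mapping applied."""
    steps = [("read", spec["source"])]
    for dest, mapping in spec["destinations"].items():
        steps.append(("write", dest, mapping))
    return steps

def execute(plan, read_fn, write_fns, retries=3):
    """Run the plan. Each destination write is retried a few times so a
    transient failure in one system doesn't fail the whole sync."""
    rows, results = [], {}
    for step in plan:
        if step[0] == "read":
            rows = read_fn(step[1])
            continue
        _, dest, mapping = step
        mapped = [{mapping[k]: v for k, v in row.items() if k in mapping}
                  for row in rows]
        for _ in range(retries):
            try:
                write_fns[dest](mapped)
                results[dest] = "ok"
                break
            except IOError:
                results[dest] = "failed"   # will retry if attempts remain
    return results

# Demo: one source query fanned out to two destinations, one of them flaky.
spec = {
    "source": "SELECT email, plan FROM users",
    "destinations": {
        "crm": {"email": "Email", "plan": "Tier"},
        "sheet": {"email": "email"},
    },
}
calls = {"crm": 0}
received = {}

def read_fn(query):
    return [{"email": "a@example.com", "plan": "pro"}]

def flaky_crm(rows):
    calls["crm"] += 1
    if calls["crm"] == 1:
        raise IOError("transient CRM outage")
    received["crm"] = rows

def sheet(rows):
    received["sheet"] = rows

status = execute(compile_plan(spec), read_fn, {"crm": flaky_crm, "sheet": sheet})
```

Separating the plan from its execution is what makes the retry/rollback/debugging story tractable: the failed step is a concrete object you can re-run or show to the user.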