
I'm part of the Neo4j team and I'm puzzled about the import problem. I don't know your exact size requirements, but you mention 2.5M nodes and 60M edges, and we run production systems with a LOT more data (in the billions range). So an import at that scale definitely shouldn't blow up. Maybe you ran into a bug in an older release, or something else was wrong.

It's also important to note that Neo4j's normal API is optimized for the most common use cases: reading data and making transactional updates. Those operations run constantly during normal operation, whereas an import is typically done once at system bootstrap and then never again. A sketch of that transactional write path is shown below.
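For illustration only, here is roughly what writes through the embedded transactional Java API look like (the store path and property names are just example assumptions, not from the original post). Every write goes through the transaction machinery, which is exactly what you want for online updates but adds overhead when you're loading tens of millions of edges in one go:

    import org.neo4j.graphdb.DynamicRelationshipType;
    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Node;
    import org.neo4j.graphdb.Transaction;
    import org.neo4j.kernel.EmbeddedGraphDatabase;

    public class TransactionalImport {
        public static void main(String[] args) {
            GraphDatabaseService db = new EmbeddedGraphDatabase("target/example-db");
            Transaction tx = db.beginTx();
            try {
                // Each createNode / createRelationshipTo goes through
                // the normal transactional write path.
                Node alice = db.createNode();
                alice.setProperty("name", "Alice");
                Node bob = db.createNode();
                bob.setProperty("name", "Bob");
                alice.createRelationshipTo(bob,
                        DynamicRelationshipType.withName("KNOWS"));
                tx.success();
            } finally {
                tx.finish();
            }
            db.shutdown();
        }
    }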

To ease migration, as part of our 1.0 release (June time frame) we will expose a new "batch injection" API that is faster for one-time imports of data sets. This is currently being developed. If you have feedback on how an API like that should behave, feel free to join the discussions on the list:

   http://neo4j.org
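To give a feel for the difference, here is a minimal sketch of what a batch-injection style import could look like. The exact API was still under discussion at the time of this comment, so the class and method names below are an assumption modeled on the BatchInserter interface that later shipped with Neo4j; treat it as illustrative, not as the final API:

    import java.util.HashMap;
    import java.util.Map;

    import org.neo4j.graphdb.DynamicRelationshipType;
    import org.neo4j.kernel.impl.batchinsert.BatchInserter;
    import org.neo4j.kernel.impl.batchinsert.BatchInserterImpl;

    public class BatchImport {
        public static void main(String[] args) {
            // Writes go more or less straight to the store files,
            // skipping the transactional machinery; shutdown() flushes.
            BatchInserter inserter = new BatchInserterImpl("target/example-db");
            try {
                Map<String, Object> props = new HashMap<String, Object>();
                props.put("name", "Alice");
                long alice = inserter.createNode(props);
                props.put("name", "Bob");
                long bob = inserter.createNode(props);
                // null = no properties on the relationship
                inserter.createRelationship(alice, bob,
                        DynamicRelationshipType.withName("KNOWS"), null);
            } finally {
                inserter.shutdown();
            }
        }
    }

The trade-off is that this path gives up transactional guarantees during the load, which is acceptable for a one-time bootstrap import but not for normal operation.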
Cheers,

-EE