
I don't understand how Event Sourcing can be performant. To read the latest value, do I need to replay all the changes from the start? Every time?



You can pre-compute the latest values by consuming the event stream and building up some sort of persistent, disposable read model that holds the values you want. It's similar to the Redis read-cache strategy that has been popular in the Ruby and PHP communities, for example.
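A minimal sketch of such a projection, assuming events are plain dicts with made-up type names (`AccountOpened`, `MoneyDeposited`, etc. are illustrative, not from any particular library):

```python
def project(events):
    """Fold an event stream into a read model: a dict of current balances."""
    balances = {}
    for event in events:
        if event["type"] == "AccountOpened":
            balances[event["account"]] = 0
        elif event["type"] == "MoneyDeposited":
            balances[event["account"]] += event["amount"]
        elif event["type"] == "MoneyWithdrawn":
            balances[event["account"]] -= event["amount"]
    return balances

events = [
    {"type": "AccountOpened", "account": "a1"},
    {"type": "MoneyDeposited", "account": "a1", "amount": 100},
    {"type": "MoneyWithdrawn", "account": "a1", "amount": 30},
]
print(project(events))  # {'a1': 70}
```

The projection runs once (or continuously, as events arrive); queries then hit the resulting dict (or table, or cache) directly instead of replaying the log.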

Bonus: when your read model differs from the model you use to determine and enforce business-logic constraints, the strategy is known as CQRS (Command Query Responsibility Segregation). It lets you trade consistency and speed off against each other on the read and write sides of your model independently.

For example, you can serve fast, possibly inconsistent reads from a cache to power your user interfaces. When it comes to processing user commands, you separately and intentionally read the event stream and build up a consistent model to decide whether to accept the incoming command. This gives you consistent-yet-slower writes, minimising the chance of the system state becoming inconsistent with your programmed business logic, while minimising the processing (and thus time) needed to display a user interface, at the cost of consistency there. That's not to say your user interface will be wrong, but be aware that under certain conditions (e.g. your read-model cache can't absorb changes fast enough) the read model, and thus the UI, will lag behind the state chronicled in the event log.
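A hypothetical sketch of that split, with an in-memory list standing in for the event log (all names are illustrative): queries return a possibly stale cached value, while commands rebuild a consistent balance from the log before deciding.

```python
class Account:
    def __init__(self, event_log):
        self.event_log = event_log   # append-only source of truth
        self.cached_balance = 0      # fast, possibly stale read model

    # --- query side: fast, may lag behind the log ---
    def balance_for_ui(self):
        return self.cached_balance

    # --- command side: slower, replays the log for a consistent view ---
    def withdraw(self, amount):
        consistent_balance = sum(e["amount"] for e in self.event_log)
        if consistent_balance < amount:
            raise ValueError("insufficient funds")
        self.event_log.append({"type": "MoneyWithdrawn", "amount": -amount})

    def refresh_cache(self):
        # the projector that updates the read model, typically run async
        self.cached_balance = sum(e["amount"] for e in self.event_log)

acct = Account([{"type": "MoneyDeposited", "amount": 100}])
acct.withdraw(30)
acct.refresh_cache()
print(acct.balance_for_ui())  # 70
```

The point of the split: `withdraw` never trusts `cached_balance`, so a lagging cache can make the UI stale but can never let a business rule be violated.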


Almost. Think of it like a bank account. You don't sum up every transaction across all time. Instead, to calculate your balance you start from the last statement balance (the last memento, in Greg Young's terms) and apply only the newer events. So it's VERY fast. You can even compute and save a new balance (memento) every few events if you'd like.
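The memento idea in a few lines, assuming signed integer amounts (the snapshot shape here is invented for illustration): start from the saved balance and fold in only the events recorded after it.

```python
def balance(snapshot, events_since_snapshot):
    """Replay only the tail of the log, starting from the last memento."""
    total = snapshot["balance"]
    for amount in events_since_snapshot:
        total += amount
    return total

snapshot = {"version": 1000, "balance": 250}  # saved every N events
recent = [+40, -10, +5]                       # events after the snapshot
print(balance(snapshot, recent))  # 285
```

However long the full history is, the cost per read is bounded by the snapshot interval, not by the age of the account.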


immutability and snapshotting - like git


You can cache that result and update it with each new event.
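That incremental update is O(1) per event, versus replaying the whole log on every read. A tiny sketch (the event shape is made up for illustration):

```python
cache = {"balance": 0}

def on_event(event):
    # apply each new event to the cached result instead of recomputing
    cache["balance"] += event["amount"]

for e in [{"amount": 100}, {"amount": -30}]:
    on_event(e)
print(cache["balance"])  # 70
```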


That's where Aggregates come in: in the Event Sourcing model you write to the Event Log and read from Aggregates.



