WebCamp 2014

Aggregating data consistently and accountably

Ensuring data consistency and mutation accountability when aggregating differing data streams, while still making that data your own.

Imagine you're dealing with social media networks. You fetch a status from Facebook and build an entity of your own out of it. After a while, a user of your system wants to edit that entity, for example to assign some metadata to it. Meanwhile, you decide that a tweet coming in from Twitter represents essentially the same entity as that Facebook post, and you want to update your entity with this new information. With all these disparate, concurrent data streams, how do you know who updated what, and what is the real state of the data?

This talk presents an abstraction layer built to solve this problem: an append-only architecture. It stores each new data atom separately and incrementally builds entities as new atoms come in. As a consequence, it enforces consistency and provides accountability: every data mutation can be replayed to verify that the data really is in the state it claims to be.
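The abstract doesn't describe the implementation, but the idea maps closely onto an event-sourced log. A minimal sketch, assuming hypothetical names (`Atom`, `AppendOnlyStore`) and a simple last-write-wins merge rather than whatever the presented system actually uses, might look like this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# One immutable "atom": a single mutation from some source (Facebook,
# Twitter, a user edit, ...). Atoms are only ever appended, never changed.
@dataclass(frozen=True)
class Atom:
    entity_id: str   # which aggregated entity this touches
    source: str      # "facebook", "twitter", "user", ...
    fields: dict     # the partial data this atom contributes
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class AppendOnlyStore:
    """Append-only log of atoms plus a derived, rebuildable entity view."""

    def __init__(self):
        self._log: list[Atom] = []            # the single source of truth
        self._entities: dict[str, dict] = {}  # derived current state

    def append(self, atom: Atom) -> None:
        self._log.append(atom)
        self._apply(self._entities, atom)

    @staticmethod
    def _apply(entities: dict, atom: Atom) -> None:
        # Later atoms overwrite earlier values, field by field.
        entities.setdefault(atom.entity_id, {}).update(atom.fields)

    def replay(self) -> dict:
        # Rebuild every entity from scratch by replaying the full log.
        rebuilt: dict[str, dict] = {}
        for atom in self._log:
            self._apply(rebuilt, atom)
        return rebuilt

    def verify(self) -> bool:
        # Accountability check: the cached state must equal the replayed state.
        return self.replay() == self._entities

    def history(self, entity_id: str) -> list[Atom]:
        # Who changed what, and when, for a single entity.
        return [a for a in self._log if a.entity_id == entity_id]


# Usage: a Facebook post, a user edit, then a matching tweet.
store = AppendOnlyStore()
store.append(Atom("post-1", "facebook", {"text": "Hello", "likes": 10}))
store.append(Atom("post-1", "user", {"tags": ["launch"]}))
store.append(Atom("post-1", "twitter", {"retweets": 4}))

assert store.verify()                      # replay matches current state
print(store.replay()["post-1"])            # merged entity
print([a.source for a in store.history("post-1")])  # audit trail
```

The key property is that the log, not the merged entity, is authoritative: the current view can always be thrown away and rebuilt, and every field of an entity can be traced back to the atom, source, and timestamp that produced it.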


Nikica Jokić

Software developer, working for @bitovi. I dig programming languages, all kinds.
