Dev Update, July & August 2020

Although summer is usually a time to take things slow, this year we decided to lean in: ship, publish, and, as always, make steady progress towards our longer-term vision of full decentralization. Here are some of the highlights:

  • Network whitepaper published. Real-world experiments show the Network is fast and scalable. This post highlights some key findings, and you can also check out the full whitepaper.
  • Launched the website update, with a refreshed top page, a new Data Unions page, and a new Papers page.
  • Data Unions 2.0 smart contracts are now ready and undergoing a third-party security audit. Remaining work consists of tying up loose ends, such as the SDKs and the Core application, as well as creating an upgrade path for existing DUs.
  • Data Unions public beta running smoothly. Incremental improvements were made in preparation for the official launch.
  • Started work on the Network Explorer, which shows the real-time structure and stats of the Streamr Network.
  • Started work on human-readable, hierarchical, globally unique stream IDs with namespaces based on ENS, for example streamr.eth/demos/tramdata.
  • Storage rewrite complete; we are now setting up the new storage cluster in production. This will fix resend problems and prepare for opening up and decentralizing the storage market.
  • Token economics research with BlockScience continues in Phase 2, working on simple cadCAD models.
  • End-to-end encryption key exchange ready in Java SDK, while JS SDK is still WIP.
  • Buyer whitelisting feature added to the Marketplace.

Network findings

Releasing the Network whitepaper marks the completion of the academic research phase of the current Network milestone. This research is especially important to the Streamr project’s enterprise adoption track. It focused on the latency and scalability of the Network, battle-tested by propagating messages through real-world data centres around the world. The key findings were:

  • The upper limit of message latency is roughly 150-350 ms globally, depending on network size.
  • Message latency is predictable.
  • The relationship between network size and latency is logarithmic.

These findings are impressive! Not only do they show that the Network is already on par with centralized message brokers in terms of speed, but they also give us great confidence that the fully decentralized network can scale without introducing significant message propagation latency. We invite you to read the full paper to learn more.
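
To build intuition for the logarithmic relationship, here is a minimal, illustrative sketch (not taken from the whitepaper): if each node relays messages to a fixed number of peers, the number of hops needed to cover the network grows with the logarithm of its size, and end-to-end latency is roughly the hop count multiplied by a per-hop delay. The fan-out and per-hop delay below are hypothetical parameters; only the shape of the curve is the point.

    // Illustrative only: a back-of-the-envelope model of logarithmic latency growth.
    // Assumes each node relays messages to a fixed number of peers (fanOut) and that
    // every hop adds a roughly constant delay. The parameter values are made up and
    // are not taken from the whitepaper.
    function estimatedLatencyMs(networkSize: number, fanOut = 4, perHopDelayMs = 50): number {
      // With fanOut-way propagation, reaching n nodes takes about log_fanOut(n) hops
      const hops = Math.ceil(Math.log(networkSize) / Math.log(fanOut))
      return hops * perHopDelayMs
    }

    for (const n of [100, 1_000, 10_000, 100_000]) {
      console.log(`${n} nodes -> ~${estimatedLatencyMs(n)} ms`)
    }

Doubling the network size adds at most a hop or two, which is why latency grows so slowly with scale.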

Network Developments

While the Network whitepaper has been a long-running side project for the Network team, development of the Network itself continues to accelerate. Because real-time message delivery is the primary function of the Network, we haven’t focused much so far on decentralizing the storage of historical messages. However, as the whole Network is heading towards decentralization, so is the storage functionality. The long-term goal is that anyone will be able to join in and run a storage node. Stream owners will then be able to hire one or more of these independent storage nodes to store the historical data in their streams. The completion of the storage rewrite is another big step towards full decentralization.
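
Applications read this stored history through resends. As a refresher, here is a minimal sketch of requesting stored messages with the JS SDK, assuming the subscribe-with-resend options documented for streamr-client at the time; the import style, stream ID, and private key are placeholders:

    // Sketch: subscribe to a stream and request the last 10 stored messages first.
    // Assumes the streamr-client JS SDK; the stream ID and private key are placeholders.
    import StreamrClient from 'streamr-client'

    const client = new StreamrClient({
      auth: { privateKey: '0x...' }, // placeholder Ethereum private key
    })

    client.subscribe(
      {
        stream: 'MY_STREAM_ID',   // placeholder stream ID
        resend: { last: 10 },     // deliver the 10 latest stored messages before going real-time
      },
      (message) => {
        // Called for both resent (historical) and real-time messages
        console.log(message)
      },
    )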

Token economics research

The token economics research track with BlockScience has proceeded to Phase 2. In Phase 1, mathematical formulations of the actors, actions, and agreements in the Network were created. In the current Phase 2, simulation code is being written for the first time. The simulations leverage the open source cadCAD framework developed by BlockScience. The models developed in Phase 2 are simple toy models, the purpose of which is to play around with the primitives defined in Phase 1 and verify that they are implemented correctly. In Phase 3, the first realistic models of the Streamr Network economy will be implemented.
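
For readers unfamiliar with this style of modelling: a cadCAD simulation advances a system state step by step through policy and state-update functions. cadCAD itself is a Python framework; the sketch below merely illustrates that loop structure in TypeScript with hypothetical state variables, and is not one of the Phase 2 models.

    // Illustrative toy model only: a generic policy / state-update loop in the
    // spirit of cadCAD simulations. The state variables and policy values are
    // hypothetical and not taken from the Phase 2 models.
    interface State {
      timestep: number
      stakedTokens: number
      messagesRelayed: number
    }

    // "Policy": decide actions based on the current state (here, constant inflows)
    function policy(state: State) {
      return { newStake: 100, newMessages: 1_000 }
    }

    // "State update": apply the actions to produce the next state
    function update(state: State, actions: ReturnType<typeof policy>): State {
      return {
        timestep: state.timestep + 1,
        stakedTokens: state.stakedTokens + actions.newStake,
        messagesRelayed: state.messagesRelayed + actions.newMessages,
      }
    }

    let state: State = { timestep: 0, stakedTokens: 0, messagesRelayed: 0 }
    for (let i = 0; i < 5; i++) {
      state = update(state, policy(state))
    }
    console.log(state) // { timestep: 5, stakedTokens: 500, messagesRelayed: 5000 }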

Data Unions upgrade

On the Data Unions front, development of the 2.0 architecture is progressing well and the smart contracts are currently undergoing a security audit. Robustness and security have been the key drivers for this upgrade; while the 1.0 architecture is running smoothly, we need to be forward-thinking and prepare for the kind of scale and growth we expect to see in the future. Data Unions 2.0 will be the first big upgrade since the launch of the current architecture. Data Unions created with the current architecture will be upgradable to the 2.0 architecture once it is available. We look forward to describing the upgrade in detail in a future blog post.

More control over your data

We released a heavily requested feature on the Marketplace – buyer whitelisting. It allows data product owners and Data Union admins to control who can purchase and gain access to a product’s data. This is useful for growing enterprise adoption of the Marketplace, because B2B sales often require the transacting parties to identify each other and perhaps sign traditional agreements.

Deprecations and breaking changes

A number of API endpoints need to be retired and replaced to be compatible with our vision of decentralization. This section summarises deprecated features and upcoming breaking changes. Items marked ‘Date TBD’ will be happening in the medium term, but a date has not yet been set.

  • The API endpoints for explicitly deleting data will be removed in the next update, because they are rarely used and are not compatible with decentralized storage. Going forward, storage nodes will expire old data based on the data retention period set on the stream.

    /api/v1/streams/${id}/deleteDataUpTo
    /api/v1/streams/${id}/deleteDataRange
    /api/v1/streams/${id}/deleteAllData

  • The API endpoints for uploading CSV files to streams will be removed in the next update, because the feature is rarely used and the centralized backend is unable to sign the data on behalf of the user. Historical messages can instead be stored by publishing them to streams normally (see the sketch after this list).

    /api/v1/streams/${id}/uploadCsvFile
    /api/v1/streams/${id}/confirmCsvFileUpload

  • (Date TBD): Support for email/password authentication will be dropped. Users will need to connect an Ethereum wallet to their Streamr account, unless they’ve already done so. As part of our progress towards decentralization, we will end support for authentication based on centralized secrets such as passwords. Going forward, authenticating with cryptographic keys/wallets will be the only supported method of authentication.
  • (Date TBD): Support for API keys will be dropped. Applications integrating with the API should authenticate using the Ethereum key-based challenge-response protocol instead of API keys, as sketched after this list. Like passwords, API keys are centralized secrets, and ending support for them is part of the same progress towards decentralization.
  • (Date TBD): Support for unsigned data will be dropped. Unsigned data on the Network is not compatible with the goal of decentralization, because malicious nodes could tamper with data that is not signed. As the Streamr Network will be ready to start decentralizing at the next major milestone (Brubeck), support for unsigned data will cease as part of the progress towards that milestone. Users should upgrade old client library versions to newer versions that support data signing, and use Ethereum key-based authentication (see above).
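
To illustrate the recommended path after these deprecations, here is a minimal sketch using the JS SDK, assuming streamr-client’s Ethereum private key authentication and publish call; the private key, stream ID, and row data below are placeholders:

    // Sketch: authenticate with an Ethereum private key (instead of passwords or
    // API keys) and publish historical rows directly to a stream (instead of the
    // deprecated CSV upload endpoints). All identifiers below are placeholders.
    import StreamrClient from 'streamr-client'

    const client = new StreamrClient({
      auth: { privateKey: '0x...' }, // key-based auth replaces passwords and API keys
    })

    // Hypothetical pre-parsed CSV rows; in practice, read and parse your file first
    const rows = [
      { timestamp: 1596240000000, temperature: 21.3 },
      { timestamp: 1596240060000, temperature: 21.5 },
    ]

    async function publishRows() {
      for (const row of rows) {
        // Messages published by the client are signed with the wallet's key,
        // which the centralized CSV upload endpoint could not do on the user's behalf
        await client.publish('MY_STREAM_ID', row)
      }
    }

    publishRows().catch(console.error)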

Thanks for reading! If you’re a developer interested in contributing to the Streamr ecosystem, consider applying to the Community Fund for financial backing to fast track your plans.
