Security update: Riot/Web 0.13.5 released – fixing XSS vulnerability

Hi all,

Heads up that we made an emergency release of Riot/Web 0.13.5 a few hours ago to fix an XSS vulnerability found and reported by walle303 – many thanks for disclosing it responsibly.

Please upgrade to Riot/Web 0.13.5 asap. If you’re using riot.im/app or riot.im/develop this simply means hitting Refresh; otherwise please upgrade your Riot deployment as soon as possible. Alpine, Debian and Fedora/RPM packages are already updated – huge thanks to the maintainers for the fast turnaround.

The issue lies in the relatively obscure external_url feature, which lets bridges specify a URL for bridged events, letting Riot/Web users link through to the ‘original’ event (e.g. a Twitter URL on a bridged tweet).  The option is hidden in a context menu and labelled “Source URL”, and is only visible on events which have the external_url field set.  Unfortunately Riot/Web didn’t sanitise the URL correctly, allowing a malicious URL to be injected – and this has been the case since the feature landed in Riot 0.9.0 (Nov 2016).
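
For the curious, the fix boils down to allow-listing the URL scheme before ever rendering it as a link. Here’s a minimal sketch of that kind of check (an illustration only, not the actual Riot/Web patch – the function name and structure are made up):

```typescript
// A minimal sketch, not the actual Riot/Web patch: allow-list the URL scheme
// before offering a "Source URL" link, so javascript: (and friends) never
// make it into an href.
function sanitiseExternalUrl(externalUrl: string): string | null {
  let parsed: URL;
  try {
    parsed = new URL(externalUrl);
  } catch (e) {
    return null; // not a parseable URL at all
  }
  if (parsed.protocol === 'http:' || parsed.protocol === 'https:') {
    return parsed.href;
  }
  return null; // refuse to link through to anything else
}
```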

If you’re not able to upgrade to Riot/Web 0.13.5 for some reason, then please do not use the ‘Source URL’ option on the event context menu.

Apologies for the inconvenience,

thanks,

Matthew

3D Video Calling with Matrix, WebRTC and WebVR at FOSDEM 2018!

TL;DR: We built a proof-of-concept for FOSDEM of the world’s first(?) 3D video calling using Matrix and the iPhone X… and it looks like this!!


Last year we spent a few weeks putting together a proof of concept of using Matrix as an open, interoperable communication layer for VR/AR – showing how you can use it as an open signalling protocol to connect users within (and between) virtual worlds, with full-mesh E2E encrypted video conferencing in VR, WebRTC calls overlaid on 360-degree video, and other fun stuff. The reasons for building the demo were quite eclectic:

  1. Try to highlight that Matrix is about much more than just instant messaging or team chat
  2. Try to encourage the community to jump in and build out more interesting use cases
  3. Learn where the state of the art in WebVR + WebGL is
  4. Kick off the process of encouraging folks to think about storing world geometry and physics in Matrix
  5. Have a fun visual demo we could show to excite potential investors in New Vector (which comically backfired when the investment community spontaneously decided that VR is still too early).

In the end it succeeded on some points (highlighting exotic uses of Matrix; learning all about WebVR) and failed on others (we didn’t see a surge of Matrix-for-VR development) – although we did have a lot of fun showing it off at the ETHLDN meetup back in October. (Eagle-eyed viewers may be amused to spot team Status & Matrix sitting together in the audience ;)

However, we still believe that Matrix is the missing link for decentralised communication within VR/AR, and we were lucky enough to get a talk about Matrix+WebRTC+WebVR accepted to the Real-Time Communications Devroom at FOSDEM 2018! So, given a new chance to show the world how cool Matrix-powered comms could be in VR/AR, Dave Baker and I went on a (very) quick detour to update the demo a little…

One of the issues with the original demo was that the video calling bits just put plain old video planes into the scene – floating television screens of 2D video content, if you will. This is better than nothing, but it’s sort of missing the whole point of VR/AR: surely you want to see who you’re talking to in 3D? Ideally they should have the same presence as if they were physically in your virtual space. This could also be a big step towards fixing one of the oldest problems of video calling: gaze correction. We’ve been obsessed with gaze correction since our early days (pre-Matrix) building mobile video calling stacks: gaze correction tries to fix the fact that the break in eye contact caused by staring at the screen (rather than the camera) has a terrible impact on the emotional connection of a video call. *But* if the person you are talking to is 3D, you can always rotate them in 3D space (or reposition yourself) to correct their line of sight – to re-align their gaze so they’re actually looking (in VR) at the thing they’re looking at in real life!

Back in early 2017 it would have been wildly ambitious to build an off-the-shelf 3D video calling app – but this changed overnight in late 2017 with the introduction of the iPhone X and its TrueDepth infrared-dot-projector based depth camera; effectively a mini-Kinect. Suddenly we have a mainstream high quality depth+video camera perfectly optimised for 3D video calling, with excellent API support from Apple. So we decided to see if we could be first in the world (as far as we know) to do 3D video calling using the iPhone X, using Matrix to signal the WebRTC media and using our WebVR demo as the viewing environment!

Step 1: Hack on WebRTC to add support for the iPhone X depth camera as a capture device. This is pretty easy, at least if you’re just swapping WebRTC’s AVFoundationVideoCapturer to request the depth camera instead of the video camera: https://github.com/matrix-org/webrtc/commit/c3044670d87c305d8f8ee72751939e281bf5223f is the starting point.

Step 2: Build a custom Riot/iOS with the right WebRTC SDK.  This is relatively easy thanks to Riot/iOS using CocoaPods and Google shipping a pod for WebRTC these days – it was a matter of tweaking Google’s pod so it could be referred to directly as a local project by Riot/iOS (and so that it provided debug symbols in the form CocoaPods expects). Brief notes are at https://github.com/matrix-org/webrtc/blob/matthew/depth/matrix/build_instructions.txt – many thanks to Manu for helping on this :)

Step 3: Decide how to encode the depth buffer. Now, the official WebRTC working group quite correctly insists that depth data should be treated as a first-class citizen which is modelled and compressed in its own right. However, it looks like nobody has added first-class depth support to official WebRTC yet – and if we want to be able to easily display 3D calls on generic browsers capable of running WebVR+WebRTC+Matrix, we have no choice but to do the ugly thing and encode the depth into a video signal which can be compressed with VP8/VP9/H.264 etc.

A quick search showed that some folks had already proposed a method for encoding depth data into a video signal, back in the days of the Kinect: https://reality.cs.ucl.ac.uk/projects/depth-streaming/depth-streaming.pdf. The paper outlines a fairly simple approach: you encode the 16-bit depth data into the three 8-bit colour channels; putting the coarse depth data into Blue, and then finer-grained depth data into Red and Green, encoding it as a periodic triangle wave:

In practice this means that as an object gets closer to you, it gets gradually more blue – and meanwhile it pulses through a sequence of red and green so you can refine the precise depth more easily. So we went and implemented this, building a 16-bit lookup table to encode the half-precision floating point 16-bit depth measurements the camera yields into video: https://github.com/matrix-org/webrtc/compare/c3044670d87c305d8f8ee72751939e281bf5223f...0258a4ef14c11a0161f078c970c64574629761c2.
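
To give a feel for what that encoding looks like in code, here’s a rough sketch of the scheme as described above (the wave period, scaling and exact wave shapes here are illustrative rather than the values used in the paper or in our WebRTC branch):

```typescript
// A rough sketch of the depth-to-RGB encoding described above. The coarse
// depth goes into the blue channel as a plain ramp, while red and green carry
// phase-shifted triangle waves of the same depth value so the decoder can
// recover fine detail. The period and scaling here are illustrative.

const DEPTH_MAX = 65535;   // 16-bit depth range
const PERIOD = 2048;       // period of the fine-detail triangle wave (assumption)

// Triangle wave in [0, 1] with the given period and phase offset.
function triangle(value: number, period: number, phase: number): number {
  const t = ((value + phase) % period) / period;   // position within the period
  return t < 0.5 ? t * 2 : (1 - t) * 2;            // ramp up, then back down
}

// Encode one 16-bit depth sample into an [r, g, b] triple of bytes.
function encodeDepth(depth16: number): [number, number, number] {
  const b = Math.round((depth16 / DEPTH_MAX) * 255);                  // coarse ramp
  const r = Math.round(triangle(depth16, PERIOD, 0) * 255);           // fine detail
  const g = Math.round(triangle(depth16, PERIOD, PERIOD / 2) * 255);  // phase-shifted copy
  return [r, g, b];
}

// Precompute a 65536-entry lookup table once, indexed by the raw depth value.
const lut: Array<[number, number, number]> = [];
for (let d = 0; d <= DEPTH_MAX; d++) {
  lut.push(encodeDepth(d));
}
```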

Placing a video call through to another Matrix client then coughed up a video stream that looks like this:

As you can see, closer things (my head) are bluer than further things (the wall), and everything’s covered with trippy red & green stripes to refine the fine detail.  For the record, the iPhone TrueDepth camera emits 640×480 depth frames at around 24Hz.

Step 4: extend matrix-vr-demo to view a dot cloud, displaced using a WebGL vertex shader based on the encoded depth info.  Dave kindly did the honours: https://github.com/matrix-org/matrix-vr-demo/commit/b14cdda605d3807080049e84181b46706cec553e
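
For a flavour of what the viewing side involves, here’s a stripped-down three.js sketch of a depth-displaced dot cloud (the actual demo is built with A-Frame on top of three.js; the element IDs, geometry sizes and depthScale value here are placeholders, and class names may vary between three.js versions):

```typescript
// A stripped-down three.js sketch of the dot cloud. A grid of points (one per
// depth sample) is displaced along Z in the vertex shader according to the
// blue (coarse depth) channel of the depth video texture.
import * as THREE from 'three';

// 'video#depth' is a placeholder <video> element playing the depth stream.
const depthVideo = document.querySelector('video#depth') as HTMLVideoElement;
const depthTexture = new THREE.VideoTexture(depthVideo);

// 640×480 vertices laid out on a plane (the plane's world size is arbitrary).
const geometry = new THREE.PlaneBufferGeometry(4, 3, 639, 479);

const material = new THREE.ShaderMaterial({
  uniforms: {
    depthMap: { value: depthTexture },
    depthScale: { value: 2.0 },   // how far a fully-near sample is pushed towards the viewer
  },
  vertexShader: `
    uniform sampler2D depthMap;
    uniform float depthScale;
    void main() {
      float d = texture2D(depthMap, uv).b;   // coarse depth from the blue channel
      vec3 displaced = position + vec3(0.0, 0.0, d * depthScale);
      gl_PointSize = 2.0;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
    }
  `,
  fragmentShader: `
    void main() { gl_FragColor = vec4(1.0); }   // plain white dots
  `,
});

const dotCloud = new THREE.Points(geometry, material);
// scene.add(dotCloud) in whatever scene/render loop you already have.
```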

Unfortunately, it showed that the depth encoding really wasn’t working very well… you can just about make out my head, but there are dots flying around all over the place, and when you view it in profile the 3D effect is almost entirely missing.

The main problems seem to be:

  • Whenever there’s a big jump in depth, the stripes get incredibly noisy and don’t compress at all well, generating completely corrupt data at edges of objects (e.g. the sides of my head)
  • The complexity of the pattern as a whole isn’t particularly compression-friendly
  • The contrast of the red/green stripes tends to dominate, causing the arguably more important blue to get overpowered during compression.
  • Converting from 4:4:4 RGB to 4:2:0 YUV (NV12) as required by WebRTC and then back to RGB inevitably entangles the colours – meaning that the extreme contrast of the red/green stripes is very visible on the blue channel after round-tripping due to sampling artefacts.
  • I probably made a mistake by bitwise casting the 16-bit half-precision floating point depth values directly onto the 16-bit unsigned int lookup index, rather than interpreting the float as a number and building a new index into the lookup table based on its numeric value.  As a result, depth values being encoded ended up having a much lower range than they should.
  • There are probably other bugs too.

Step 5: Give up on the fancy depth encoding (for now): https://github.com/matrix-org/webrtc/commit/2f5d29352ce5d80727639991b1480f610cbdd54c.  In practice, simply picking a range of the 16-bit half-precision floats to fit in the integer range [0,255] turns out to be good enough for a quick demo (i.e. 8-bit depth buffer, but over a small subset of the 16-bit depth space) – the dot cloud suddenly looked a lot more 3D and recognisable:
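
In code terms the simplified scheme is just a clamp-and-scale; something along these lines (the near/far values here are purely illustrative, not the ones used in the branch):

```typescript
// A sketch of the simplified scheme: clamp a hand-picked slice of the metric
// depth range onto 0–255. NEAR and FAR here are illustrative values, not the
// ones used in the branch.
const NEAR = 0.2;   // metres
const FAR = 2.2;    // metres

function encodeDepth8(depthMetres: number): number {
  const clamped = Math.min(Math.max(depthMetres, NEAR), FAR);
  return Math.round(((clamped - NEAR) / (FAR - NEAR)) * 255);
}
```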

Step 6: Clearly this needs colour as well as depth.  This means asking WebRTC to add VideoTracks for both video and depth to your call’s MediaStream.  Firstly, we added a simple ‘matrixDepth’ constraint to WebRTC to tell a video source whether to capture depth or not.  (Yes, I know there’s a specced way to do this, but given nothing else here is on spec, we went for the simplest approach).  However, it turns out that only one of WebRTC’s AVFoundationVideoCapturers can run at a time, because it manages its own AVCaptureSession and you can only have one of those at a time in a given app.  As a result, the two capturers (one per video track) collided, with the second session killing the first.  As a quick fix, we modified RTCAVFoundationVideoSource to accept an existing AVCaptureSession (and AVCaptureDeviceInput) so that the application itself can handle the capture session and select the device, which can then be shared between multiple capturers: https://github.com/matrix-org/webrtc/commit/9c58465ada08018b1238fb8c5d784b5570f9246b.  Finally, it just took a few lines in matrix-ios-sdk to set the constraint and send the depth as well as video… https://github.com/matrix-org/matrix-ios-sdk/compare/fa9a24a6914b207389bacdd9ad08d5386fd0644a...5947d634ae8d722133ecdbde94cccf60bb88f11d, and adding playback of both channels to the vrdemo (https://github.com/matrix-org/matrix-vr-demo/commit/4059ab671d13bb4d4eb19dd2f534d9a387e47b81 and https://github.com/matrix-org/matrix-js-sdk/commit/f3f1524fcd46d2e772fd5cd022364018c8889364) …and it worked!
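
On the viewing side, both tracks simply show up on the call’s remote MediaStream, so splitting them back out into separate video elements (to use as the colour and depth textures) looks roughly like this. (Treating the second video track as the depth track is a simplification for this sketch; the real demo has to identify the tracks somehow.)

```typescript
// Splitting the remote MediaStream back into colour and depth video elements.
// Treating "second video track = depth" is an assumption for this sketch.
function splitColourAndDepth(remote: MediaStream): void {
  const [colourTrack, depthTrack] = remote.getVideoTracks();

  const colourVideo = document.querySelector('video#colour') as HTMLVideoElement;
  const depthVideo = document.querySelector('video#depth') as HTMLVideoElement;

  colourVideo.srcObject = new MediaStream([colourTrack]);
  depthVideo.srcObject = new MediaStream([depthTrack]);
  colourVideo.play();
  depthVideo.play();
}
```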

However, the dot cloud obviously has some limitations – especially when you zoom in like this.

Step 7: Replace the dot cloud with a displacement-mapped mesh so that it’s solid.  So as a final tweak for the demo, Dave switched out the dot cloud for a simple A-Frame plane with 640×480 vertices, keeping the same vertex shader.  Ironically this is where we hit some nasty problems, as for some reason the video texture started being used in place of the depth texture (albeit flickering a bit) – eventually we realised that the flickering was the vertex shader inexplicably flapping between using the depth and the video texture for the displacement map.  At this point we compared it between laptops, and it turns out that for some reason the integrated Intel graphics on Dave’s MacBook Pro was choking on the two video textures, whereas an AMD Radeon R9 M370X got it right.  It’s unclear if this was actually a GPU bug or an A-Frame or Three.js or WebGL or Chrome bug.  Either way, on switching to a laptop with discrete graphics it started working perfectly!  Finally, we tweaked the shader to try to reduce smearing, by discarding vertices where there are big discontinuities in depth (through looking at the partial derivatives of the depth texture).  This isn’t perfect yet but it’s better than nothing.  https://github.com/matrix-org/matrix-vr-demo/compare/bbd460e81ff1336cd63468e707d858d47261ea42...06abe34957732ba8c728b99f198d987fe48d0420
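
The discontinuity test itself is only a few lines of shader. This isn’t the exact shader from the commit above and the threshold is made up, but it illustrates the idea: fwidth() measures how quickly the sampled depth is changing between neighbouring fragments, and anything sitting on a big depth step gets discarded rather than smeared. (In WebGL 1 this needs the standard derivatives extension, which three.js can enable via ShaderMaterial’s extensions option.)

```typescript
// An illustrative fragment shader (as a string for a three.js ShaderMaterial):
// fragments sitting on a large depth step are discarded instead of being
// stretched between foreground and background. The 0.05 threshold is made up.
const discardingFragmentShader = `
  uniform sampler2D depthMap;
  uniform sampler2D colourMap;
  varying vec2 vUv;
  void main() {
    float d = texture2D(depthMap, vUv).b;
    if (fwidth(d) > 0.05) discard;               // big depth discontinuity: drop it
    gl_FragColor = texture2D(colourMap, vUv);    // otherwise paint the colour video
  }
`;
```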

And here’s the end result! (complete with trancey soundtrack as the audio we recorded at FOSDEM was unusable)

Conclusion:

Hopefully this gives a bit of a taste of what proper 3D video calling could be like in VR, and how (relatively) easy it was at the Matrix level to add it in.  If anyone wants to follow along at home, the various hacky branches are:

If you’d like to get involved with hacking on Matrix in VR, please come hang out at #vr:matrix.org.

Also, New Vector (where most of the core team work) is hiring VoIP/VR specialists right now, so if you’d like to work on this sort of thing full-time, please contact us at jobs@matrix.org asap!

Matthew

Update: Slides from the FOSDEM talk (adapted from this blog post by Amandine) are available at https://matrix.org/~matthew/2018-02-04%20FOSDEM%20-%20VR.pdf

Update 2: The full FOSDEM talk recording is already up in the RTC devroom at https://video.fosdem.org/2018/H.1309/!

Status partners up with New Vector, fueling decentralised comms and the Matrix ecosystem!

Hi all,

We’re delighted to announce that our friends at Status have made a major strategic investment ($5M) in New Vector: the company which currently employs most of the Matrix.org core team.  This means that we now have the financial backing to let us focus entirely on improving the Matrix ecosystem and getting the protocol out of beta… and beyond!!

First up – massive, massive thanks to everyone who has supported us over the last 6 months since our funding situation changed: as of the end of 2017 we had enough Patreon / Liberapay / IBAN / BTC / ETH donations and sponsorship (for Matrix.org) and enough paid consulting work (for New Vector) that we’ve been able to keep almost the whole core team working on Matrix as their day job.  Simply: the core Matrix team could not have continued in its current form without the support of the community – so we will be forever indebted to everyone who has supported us: especially all our donating supporters on Patreon/Liberapay/etc, our customers at New Vector, and our big $ sponsors, including UpCloud.com (who provide *incredible* hosting for Matrix.org), PrivateInternetAccess.com, INBlockchain.com, OmiseGO and Tendermint.

The investment from Status that we’re announcing today is a massive step change as it gives us the resources to grow the team and to focus fully on Matrix’s key problem areas without distractions (whilst still supporting paid New Vector work). Please note, however, that donations are still very much appreciated: we are in the process of setting up the Matrix.org Foundation (at last!) as the non-profit target for all future donations, such that Matrix itself has a financial means to support pure Matrix work independently of any other companies (including New Vector).

Many folks will be familiar with Status already as one of the leading projects in the Ethereum ecosystem: building a beautiful usability-focused browser for decentralised apps (DApps) which run on the Ethereum Virtual Machine – as well as providing cryptocurrency payments and chat functionality (via the Whisper protocol).  It effectively lets users access Ethereum as a usable meaningful operating system – a bit like how Riot attempts to be a flagship ‘browser’ for the Matrix ecosystem.  The reason Status is investing in Matrix is primarily to accelerate decentralisation technology and open protocols in general – and also because there are some pretty obvious advantages to the collaboration, potentially including:

  • Bridging between Matrix and Whisper (Ethereum’s own real-time communication protocol) – exposing all of the Matrix ecosystem into Ethereum and vice versa
  • Bundling up Status DApps as Matrix Widgets
  • Exposing Matrix Widgets into Status
  • Supporting Olm/Megolm such that it could be used for E2E encryption in Status
  • Collaboration on the decentralised reputation systems needed to combat abuse in both Matrix & Ethereum
  • Utilising the Status Network token within Riot.im by enabling crypto assets
  • …and more!

We’ve spent a lot of time working with Status over the last few months whilst arranging this partnership, and we’ve been really impressed by Jarrad and Carl and the team (they even have their own golang Double Ratchet implementation!).  It’s fair to say that Status are very much aligned with Matrix’s vision, and the projects can help each other a lot.

It’s also worth noting that Status and Matrix are really quite complementary: Whisper (as used by Status) is entirely P2P, focuses on protecting metadata, and is tightly coupled to Ethereum, whereas Matrix is standalone and more feature-rich but currently lacks metadata protection.  We both have fledgling app ecosystems: Matrix through Widgets and Status through Ethereum DApps. That said, Matrix and Status are going to continue on their own paths, and Matrix will of course remain controlled by Matrix.org – but we are looking forward to learning more about each other’s tech and driving decentralisation forward in general!

Meanwhile, on the core Matrix side, the investment lets us focus immediately on the following priorities:

  • Improving Riot’s usability. As of today we are urgently hiring a Lead Designer to join the team full-time to revamp Riot and address its usability issues, as this is one of the biggest things getting in the way of Matrix uptake today.  Hit up jobs@riot.im if you’re interested!

    At the same time, we’re excited to ramp up our investment in Riot’s performance and overall polish (as well as achieving feature parity with Slack/Discord and friends) – that means we’re looking for React, Android & iOS folks to join the core team full-time asap to take the apps to the next level.  Again, jobs@riot.im if this sounds like you!
  • Getting End-to-end Encryption out of beta. We know what we need to do to push E2E out of beta (incremental key backup; cross-signing devices; improved device verification) – Status’ investment means we can build the team to get it done! Decentralised end-to-end encryption is not for the faint-hearted, but if you’re up for the challenge please get in touch at jobs@matrix.org.
  • Finishing Dendrite. Dendrite (our next-gen golang homeserver implementation) is a hugely ambitious project and right now the only folks working on it are Rich and Erik… who also happen to be supporting Synapse.  The good news is that the community has been helping considerably with Dendrite, but it would be even better if we had more people supported to work on it full time.  If you love Go, and you love massively scalable decentralised systems, please hit up jobs@matrix.org!
  • Supporting Synapse. There is massive scope for performance improvements to Synapse, and there are thousands of deployments out there today, so we really want to improve our support for it.  If you love Python and Twisted, and enjoy interesting performance, profiling and efficiency work, please hit up jobs@matrix.org too!
  • Maintaining the Spec. If Matrix is anything it is the spec, and maintenance of the spec is key to the project’s success. In 2018 we intend to invest heavily in its maintenance: addressing outstanding API proposals, documenting APIs, and updating the general technical documentation (guides, FAQ etc) on Matrix.org.  If you are a developer who loves spec work, we need you over at jobs@matrix.org immediately! :)

Beyond these immediate priorities, we have a long feature roadmap lined up too (highest priority first): Reactions, Message Editing, improved Widgets (e.g. Sticker Packs), Threading, Decentralised Accounts, Decentralised Identity, Decentralised Reputation, Peer-to-peer Matrix and more.  However, right now our focus has to be on improving the quality and stability of what we have today and getting it out of beta before we open yet more battlefronts.  In other words: we’re not adding more features (modulo emergencies) until the current features are polished!

So: exciting times ahead!  Never before has Matrix had the resources to fully realise its potential, and we’d like to say enormous thanks to Carl, Jarrad, Yessin and Nabil at Status for their patience and support while sorting out the investment.  We’d also like to say thanks to everyone else who offered us investment: in the end we had several viable offers on the table – and we owe sincere thanks to those who invested the time and faith to make an offer which we’ve ended up turning down.

For now, however, it’s back to work: making Riot slicker than Slack; making Synapse go faster and use less RAM; making Dendrite federate; making E2E encryption transparent and indestructible; making sure that it’s possible to implement Matrix purely by referring to the Spec.

2018 is going to be an interesting year indeed :)  Thank you all for supporting Matrix – and thanks, once again, to Status for helping to take us to the next level.

Matthew, Amandine & the whole team.

Update 1: VentureBeat is covering the news over at https://venturebeat.com/2018/01/29/status-invests-5-million-in-matrix-to-create-a-blockchain-messaging-superpower/

Update 2: IBTimes is also covering it at http://www.ibtimes.co.uk/matrix-status-ico-gains-support-non-blockchain-decentralisation-technology-1656183!

…and you can see Status’s side of the story over at https://blog.status.im/status-invests-5m-in-riot-im-4e3026a8bd50!

Synapse 0.26 released!

Hi folks,

Synapse 0.26 is here (with no changes since RC1, which we released just before Christmas).  It’s a general maintenance release: there are a few new features, but mainly it’s lots of bugfixes and general refinements.  Enjoy!

As always, you can get it from https://github.com/matrix-org/synapse/releases/tag/v0.26.0.

Changes in synapse v0.26.0 (2018-01-05)

No changes since v0.26.0-rc1

Changes in synapse v0.26.0-rc1 (2017-12-13)

Features:

  • Add ability for ASes to publicise groups for their users (PR #2686)
  • Add all local users to the user_directory and optionally search them (PR
    #2723)
  • Add support for custom login types for validating users (PR #2729)

Changes:

  • Update example Prometheus config to new format (PR #2648) Thanks to
    @krombel!
  • Rename redact_content option to include_content in Push API (PR #2650)
  • Declare support for r0.3.0 (PR #2677)
  • Improve upserts (PRs #2684, #2688, #2689, #2713)
  • Improve documentation of workers (PR #2700)
  • Improve tracebacks on exceptions (PR #2705)
  • Allow guest access to group APIs for reading (PR #2715)
  • Support for posting content in federation_client script (PR #2716)
  • Delete devices and pushers on logouts etc (PR #2722)

Bug fixes:

  • Fix database port script (PR #2673)
  • Fix internal server error on login with ldap_auth_provider (PR #2678) Thanks
    to @jkolo!
  • Fix error on sqlite 3.7 (PR #2697)
  • Fix OPTIONS on preview_url (PR #2707)
  • Fix error handling on dns lookup (PR #2711)
  • Fix wrong avatars when inviting multiple users when creating room (PR #2717)
  • Fix 500 when joining matrix-dev (PR #2719)

The Matrix Holiday Mini-Special (2017 edition)

Hi folks,

Since we began Matrix it’s been a sort of tradition to do a huge update on Christmas Eve to reflect on the past year and tease the future – you can check out the 2016 edition or the 2015 edition and a sort of proto-update for 2014 too if you’re feeling nostalgic.  This year I’m going to try to keep it short though, as I’m hoping to write a Very Big Update related to long-term-funding progress in the relatively near future.

2017 has been a weird year for us: progress in the core team has been relatively badly impacted by the mission to secure long-term funding, with myself (Matthew) & Amandine spending the vast majority of our time handling the meta-problem of keeping the core team secure rather than actually working on the project itself.  Meanwhile we’ve lost a few of the original team during the disruption, which has particularly impacted Spec, E2E and Dendrite progress (such are the risks of running a very lean team in the first place!).  However, against the odds, we have (hopefully) prevailed – and this is almost entirely due to the massive support we’ve seen through donations via Patreon, Liberapay, Ethereum, Bitcoin and PayPal, and some much-appreciated paid consulting work.

Simply put, without the donation support we would not have been able to pay the core team over the last 3 months, and we would not be able to pay for the legal costs of setting up the team as an independent company, and we would be completely screwed for securing large-scale long-term funding if we couldn’t point to the community’s support as evidence that Matrix is worthy of funding.  So: we sincerely owe our thanks to those who heeded the call to arms and are supporting us.  We’ve also been pretty lucky in benefiting from the skyrocketing value of Ethereum and Bitcoin donations.  And even if/when long-term funding is secured for New Vector (the company we formed in July to hire the core team), donations will continue to be vital to support the Matrix.org Foundation itself as an independent non-profit entity – as it’s obviously not in Matrix.org’s interests to be entirely financially dependent on New Vector.  Hopefully this whole episode will end up being a bit like a Save Star Trek scenario – where something fun and amazing almost gets wiped out when it’s only a few years old due to corporate factors… only for the community to band together to save it, and then for it to go from strength to strength for the next 50 years or more! :D

That said, we’ve made some major progress this year anyway: the addition of Widgets to Matrix; the addition of Communities (aka Groups) and Flair; major improvements to E2E encryption (even though it’s not out of beta yet); lots of progress on Dendrite (the minimum-viable phase 1 is now about 75% complete); switching everything over to Jitsi for group video conferencing; rewriting onboarding for Riot/Web; Antiscam/spam support for cryptocommunities; the whole VR proof-of-concept of Matrix+WebVR+WebRTC video and VoIP calling; Version 0.3 of the Matrix spec; and a whole lot more which I’m probably forgetting right now.  And meanwhile the community has been more active than ever, with major new clients like Nheko hitting the scene with a large and loyal community of open source contributors (over the last few weeks I’ve literally seen more Nheko PRs fly past than Riot ones!) – and we’ve also been *incredibly* glad of community contributions towards Dendrite.  Dendrite is already way ahead of Synapse in terms of % of community-contributed code – we have hope that it will end up being a model FOSS project :)

So what lies ahead?  It’s hard to predict the level of progress we’re going to make in the core team, as it really depends on long-term funding.  Whatever happens, one of our top priorities is to improve our governance so that everyone can better contribute in places that have historically been more blocked on the core team (i.e. the spec; Synapse)… whilst still maintaining coherency across the project.  Ideally, however, we’ll end up with more folks pushing Matrix forwards from both the wider world and the core team – and right now the main priorities are:

  • Phase 2 of Communities: letting users filter their current view of Matrix to rooms associated with a given subset of communities (if desired), for Slack/Discord-style semantics
  • Fixing the remaining end-to-end encryption failures (although the majority of them have now been solved)
  • Finalising proper UI/UX for end-to-end encryption (at last), including the option to transparently back up your room keys if desired.
  • Dendrite Phase 1
  • Performance in Riot (on all platforms)
  • Editable messages
  • Reactions
  • Making widgets much more useful
  • Paid integrations and hosting options to help avoid further funding nightmares.

Looking at the bigger picture, what we’d really love for 2018 would be to finally get to a 1.0 release of the Matrix Spec (i.e. catching up on our massive backlog of merging unstable spec drafts & proposals into the spec) – and for Dendrite to start to replace Synapse as the reference homeserver from Matrix.org and become really ubiquitous, and for E2E encryption to be turned on by default in private rooms.  Beyond the above list, we don’t really have any other features urgently planned (threading, for instance, is on hold until we have the rest of the above sorted) – but we believe that if we stabilise everything we have today (plus that list), then there is no reason for Matrix not to fulfil its full potential as a true global open decentralised communications standard.  And then it’s on to threading, P2P Matrix, decentralised reputation and all that good stuff!

It’s going to be a crazy year ahead, either way: so thank you, once again, for supporting Matrix – whether that’s financially, or by contributing code, or running a server, or just using the protocol as a user.  We literally wouldn’t be here without you!! :)

Matthew, Amandine & the whole core team.