* feat(bindings/crypto-nodejs): Add `#[napi(strict)]` to force type checking from JavaScript.
* chore(bindings/crypto-nodejs): Use our own fork of `napi-rs` for the moment.
* feat(crypto-nodejs): Download lib binary in postinstall
* build(crypto-nodejs): Workflow to prebuild napi bindings
* ci(crypto-nodejs): Disable broken target, install without download
* ci(apple-ffi): Don't run for drafts
* ci(coverage): Don't run for draft PRs
* fix(crypto-nodejs): bind to current version for download
* fix(crypto-nodejs): Ignore libs and package
* ci(crypto-nodejs): Build and upload NPM package
* fix(crypto-nodejs): Set proper target list
* ci(crypto-nodejs): Remove FreeBSD from build pipeline
* ci(crypto-nodejs): Linkers for linux cross compile
* ci(crypto-nodejs): Add arm64 build for windows
* ci(crypto-nodejs): Proper linkers for arm and musl
* ci(crypto-nodejs): Correct apt command for musl
* fix(crypto-nodejs): Drop arm64 linux musl support
* ci(crypto-nodejs): Manual Workflow trigger process
* chore(crypto-nodejs): Get GitHub to pick up our action
* ci(crypto-nodejs): Add i686 Linux build
* ci(crypto-nodejs): Configure cliff for nodejs changelogs
* ci(crypto-nodejs): Proper gcc for i686 targets
* docs(crypto-nodejs): Add supported targets for npm install
* ci(crypto-nodejs): Limit building of binaries to tags
* style: consol.log -> console.info; Improve docs
Co-authored-by: Ivan Enderlin <ivan@mnt.io>
* activate for testing
* fix broken merge
* 0.1.0
* fix(js): put in the proper package name
* activate for PR for testing
* fix(nodejs): getting ready for publishing
* ci(crypto-nodejs): Adding docs and fixing naming for workflows
* typo: missed one
* fixing package name
Co-authored-by: Ivan Enderlin <ivan@mnt.io>
This patch adds customized event types, currently only for the
`m.room_key` and `m.secret.send` to-device events.
This allows us to:
a) deserialize the `session_key` field into a vodozemac type,
b) better control when we zeroize secrets.
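To illustrate point (b), a custom type can control when its secret disappears by wiping it explicitly or, at the latest, on drop. The following is a minimal, stdlib-only sketch with invented names; the real code would use the `zeroize` crate rather than this hand-rolled wipe.

```rust
/// Hypothetical wrapper around a secret session key; the name and the
/// wiping strategy are illustrative, real code would use `zeroize`.
struct SessionKey(String);

impl SessionKey {
    /// Overwrite the secret bytes with zeros (idempotent).
    fn wipe(&mut self) {
        // SAFETY: writing `0` bytes keeps the `String` valid UTF-8.
        unsafe { self.0.as_mut_vec() }.iter_mut().for_each(|byte| *byte = 0);
    }
}

impl Drop for SessionKey {
    // Controlling *when* we zeroize: at the latest on drop, or earlier
    // by calling `wipe` explicitly.
    fn drop(&mut self) {
        self.wipe();
    }
}

fn main() {
    let mut key = SessionKey("very-secret".to_owned());
    key.wipe();
    assert_eq!(key.0, "\0".repeat("very-secret".len()));
}
```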
We don't want to clone a struct that contains a secret.
However, on the Node.js side, we can only receive arguments by
reference. The problem is that we cannot transfer ownership of
`MediaEncryptionInfo` to `AttachmentDecryptor` because we don't own
it. To simulate this ownership transfer, we use `Option::take`.
A new method then appears,
`EncryptedAttachment.hasMediaEncryptionInfoBeenConsumed`, which tells
whether the media encryption info has already been consumed by
`Attachment.decrypt`. That way, we can decrypt only once. A
JSON-encoded backup of the media encryption info is still possible by
calling `EncryptedAttachment.mediaEncryptionInfo` though.
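The `Option::take` trick described above can be sketched as follows. Names and shapes are illustrative, not the actual binding code: `take` swaps the field for `None` and hands us the owned value, so a second decryption attempt finds nothing to consume.

```rust
/// Stand-in for the real struct holding secret key material.
struct MediaEncryptionInfo {
    key: String,
}

/// Stand-in for the binding-side attachment wrapper.
struct EncryptedAttachment {
    media_encryption_info: Option<MediaEncryptionInfo>,
}

impl EncryptedAttachment {
    /// Has the media encryption info already been consumed?
    fn has_media_encryption_info_been_consumed(&self) -> bool {
        self.media_encryption_info.is_none()
    }

    /// Simulate transferring ownership out of a `&mut self` receiver:
    /// `take` replaces the field with `None` and returns the value.
    fn consume_media_encryption_info(&mut self) -> Option<MediaEncryptionInfo> {
        self.media_encryption_info.take()
    }
}

fn main() {
    let mut attachment = EncryptedAttachment {
        media_encryption_info: Some(MediaEncryptionInfo { key: "secret".to_owned() }),
    };

    assert!(!attachment.has_media_encryption_info_been_consumed());

    // The first decryption consumes the info…
    let info = attachment.consume_media_encryption_info();
    assert!(info.is_some());

    // …so a second attempt gets `None`: we can decrypt only once.
    assert!(attachment.consume_media_encryption_info().is_none());
    assert!(attachment.has_media_encryption_info_been_consumed());
}
```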
First, `u128` has a bug in `serde`,
cf. https://github.com/serde-rs/json/issues/625.
Second, we don't need a `u128` to represent the timeout; it's clearly
too large. This patch converts it to a `u64`. The conversion should
never fail, but we propagate the error anyway.
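The narrowing described above can be sketched like this. The helper name and error type are hypothetical; the point is that `u64::try_from` propagates a `Result` instead of panicking, even though `u64::MAX` milliseconds is roughly 584 million years and the failure path should never trigger in practice.

```rust
use std::time::Duration;

/// Hypothetical helper: convert a `Duration`'s milliseconds (a `u128`)
/// into a `u64`, propagating an error instead of panicking.
fn timeout_as_millis_u64(timeout: Duration) -> Result<u64, String> {
    u64::try_from(timeout.as_millis())
        .map_err(|_| "timeout is too large to fit in a u64".to_owned())
}

fn main() {
    // 30 seconds easily fits in a u64 of milliseconds.
    assert_eq!(timeout_as_millis_u64(Duration::from_secs(30)), Ok(30_000));

    // The failure path exists in theory, so we propagate it anyway.
    assert!(timeout_as_millis_u64(Duration::MAX).is_err());
}
```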
This patch provides a new API to encrypt and decrypt attachments,
i.e. big buffers of type `Uint8Array`. It's based on
`matrix_sdk_crypto::AttachmentEncryptor` and `AttachmentDecryptor`.
Vodozemac used to accept and return strings when encrypting and
decrypting. This is quite unusual for a pure cryptographic library, so
we switched to the usual setup where we encrypt/decrypt raw bytes.
Since we do encrypt/decrypt JSON strings in Matrix land, we do the
string conversions over here.
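The boundary described above can be sketched as follows. `encrypt_bytes`/`decrypt_bytes` stand in for the byte-oriented crypto API and the XOR "cipher" is a placeholder, not real cryptography; the point is that the `String` ↔ bytes conversion lives in our wrappers, not in the crypto library.

```rust
// Placeholder byte-oriented "cipher" standing in for vodozemac's API.
// XOR is NOT cryptography; it only makes the example runnable.
fn encrypt_bytes(plaintext: &[u8]) -> Vec<u8> {
    plaintext.iter().map(|byte| byte ^ 0x42).collect()
}

fn decrypt_bytes(ciphertext: &[u8]) -> Vec<u8> {
    ciphertext.iter().map(|byte| byte ^ 0x42).collect()
}

// Matrix-side wrappers: since we encrypt/decrypt JSON strings, the
// string conversions happen here, at the boundary.
fn encrypt_json(json: &str) -> Vec<u8> {
    encrypt_bytes(json.as_bytes())
}

fn decrypt_json(ciphertext: &[u8]) -> Result<String, std::string::FromUtf8Error> {
    String::from_utf8(decrypt_bytes(ciphertext))
}

fn main() {
    let event = r#"{"type":"m.room.message"}"#;
    let ciphertext = encrypt_json(event);
    assert_ne!(ciphertext, event.as_bytes());
    assert_eq!(decrypt_json(&ciphertext).unwrap(), event);
}
```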
In async functions, the Node.js GC may or may not (the behavior is
non-deterministic) collect the arguments passed to the function as
soon as it returns. The function body may not have been executed yet,
since it's async. This leads to memory corruption: when the function
later tries to read the value of an argument, it crashes at best.
To avoid this bug, there is no other choice than to clone the values
before the function returns, in its “sync path” (i.e. before any
`.await` point, where the rest of the body becomes an “async block”).
The performance impact is not massive; I'm not sure it would even be
easily noticeable, since the cloned values are most of the time
identifiers (e.g. `UserId`), which are cheap to clone. A balance has
to be found here, and cloning offers the best trade-off from my point
of view.
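The "clone in the sync path" pattern can be sketched as below. `UserId`, `Client`, and `get_user_devices` are illustrative stand-ins, not the actual binding code, and the tiny executor exists only so the example runs without an external async runtime.

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

/// Hypothetical identifier type standing in for the real `UserId`.
#[derive(Clone, Debug, PartialEq)]
struct UserId(String);

struct Client;

impl Client {
    /// The `&UserId` borrow must not cross into the async block: on the
    /// Node.js side the GC may collect the argument as soon as the sync
    /// part returns. So we clone it in the “sync path”.
    fn get_user_devices(&self, user_id: &UserId) -> impl Future<Output = UserId> {
        let user_id = user_id.clone(); // cheap: identifiers are small

        async move {
            // …the real code would `.await` network/storage calls here…
            user_id
        }
    }
}

/// Minimal single-future executor (the future above never suspends).
fn block_on<F: Future>(mut future: F) -> F::Output {
    fn raw_waker() -> RawWaker {
        fn clone(_: *const ()) -> RawWaker { raw_waker() }
        fn noop(_: *const ()) {}
        static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    let waker = unsafe { Waker::from_raw(raw_waker()) };
    let mut context = Context::from_waker(&waker);
    // SAFETY: `future` is not moved again after being pinned.
    let mut future = unsafe { Pin::new_unchecked(&mut future) };
    loop {
        if let Poll::Ready(output) = future.as_mut().poll(&mut context) {
            return output;
        }
    }
}

fn main() {
    let client = Client;
    let user_id = UserId("@alice:example.org".to_owned());
    let future = client.get_user_devices(&user_id);
    // Even if `user_id` were freed at this point (which is what the JS
    // GC can do), the future owns its own clone, so running it is safe.
    assert_eq!(block_on(future), UserId("@alice:example.org".to_owned()));
}
```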