Compare commits


137 Commits

Author SHA1 Message Date
7a7229d70e fmt: group_imports = "StdExternalCrate" 2026-03-08 17:41:06 +08:00
e4004ccb6d feat: bytecode 2026-03-08 17:41:06 +08:00
843ae6cfb4 refactor: use GhostCell to provide interior mutability in Ir 2026-03-08 16:57:34 +08:00
c24d6a8bb3 chore: merge codegen::compile and codegen::compile_scoped 2026-02-25 21:14:35 +08:00
d7351e907b feat: thunk caching (WIP) 2026-02-25 21:14:27 +08:00
550223a1d7 refactor: tidy 2026-02-21 22:30:13 +08:00
53dbee3514 refactor(downgrade): use bumpalo 2026-02-21 21:54:40 +08:00
e1517c338e chore: tidy 2026-02-20 16:12:54 +08:00
45096f5254 fix: fetchGit 2026-02-19 22:34:41 +08:00
b57fea3104 optimize: short-circuit update (//) 2026-02-19 21:54:55 +08:00
4380fa85c4 optimize: compact 2026-02-19 21:14:02 +08:00
99045aa76c chore: fmt 2026-02-19 20:14:06 +08:00
7eb3acf26f optimize: use v8::Script::compile to run script directly in op_import 2026-02-19 19:58:02 +08:00
b424f60f9f optimize: avoid using #[serde] in ops 2026-02-19 19:11:56 +08:00
42031edac1 optimize: generate shorter code 2026-02-19 17:16:35 +08:00
04dcadfd61 optimize 2026-02-18 20:45:50 +08:00
c3c39bda0c feat: v8 profiling 2026-02-18 20:45:50 +08:00
782092b91e optimize: type check 2026-02-18 20:45:50 +08:00
ae5febd5dd feat(cli): compile 2026-02-18 20:45:42 +08:00
3cc7c7be75 chore: update deps; restructure tests; use Map over Record 2026-02-17 12:02:54 +08:00
f49634ccc0 optimize: use Map to represent NixAttrs 2026-02-17 10:35:37 +08:00
4a885c18b8 chore: add eslint 2026-02-17 10:35:37 +08:00
37e395c0e3 optimize: builtins.intersectAttrs 2026-02-16 23:07:52 +08:00
16a8480d29 feat(cli): support eval file 2026-02-16 21:52:08 +08:00
f0a0593d4c feat: implement builtins.toXML 2026-02-16 19:52:33 +08:00
ce64a82da3 feat: inspector 2026-02-16 19:52:33 +08:00
5c48e5cfdd feat: implement hash related primops 2026-02-15 19:55:29 +08:00
7836f8c869 refactor: handle derivation generation on Rust side 2026-02-15 19:38:11 +08:00
e357678d70 feat: implement realisePath 2026-02-15 18:26:24 +08:00
2f2c690023 chore(runtime-ts): fix linter errors 2026-02-15 12:20:31 +08:00
cf4dd6c379 fix: preserve string context in builtins.{path,fetch*} 2026-02-14 20:06:38 +08:00
31c7a62311 chore(typos): ignore lang tests 2026-02-14 19:02:26 +08:00
ad5d047c01 chore: eliminate Result::unwrap 2026-02-14 19:01:06 +08:00
795742e3d8 deps: upgrade dependencies 2026-02-14 19:01:00 +08:00
60cd61d771 feat: implement fromTOML; fix fromJSON implementation 2026-02-14 13:30:31 +08:00
d95a6e509c feat: tidy fetcher (partial) ("shouldn't have used LLM to implement this...") 2026-02-13 21:57:51 +08:00
48a43bed55 fix: fetchTarball 2026-02-13 21:19:34 +08:00
6d8f1e79e6 fix: handle NixPath in nixValueToJson 2026-02-13 21:19:34 +08:00
df4edaf5bb fix: preserve string context in + operator 2026-02-13 20:14:46 +08:00
3aee3c67b9 optimization: with scope lookup 2026-02-13 18:50:42 +08:00
a8f1c81b60 fix: derivation (WIP) 2026-02-12 00:18:12 +08:00
249eaf3c11 refactor 2026-02-12 00:18:12 +08:00
a79e20c417 fix: drvDeep 2026-02-12 00:18:12 +08:00
bd9eb638af fix: deepSeq 2026-02-08 15:35:59 +08:00
d09b84676c refactor: with_thunk_scope 2026-02-08 12:48:01 +08:00
1346ae5405 refactor 2026-02-08 12:21:48 +08:00
6b46e466c2 feat: implement scopedImport & REPL scoping 2026-02-08 03:53:53 +08:00
f154010120 chore: update flake.lock 2026-02-08 02:50:48 +08:00
f5364ded1e feat: handle regex operations on rust side 2026-02-08 01:20:38 +08:00
26e7b74585 refactor: remove unused span_utils.rs 2026-02-08 01:20:38 +08:00
e8a28a6d2f feat: use ere (compile-time regex compilation) 2026-02-07 18:42:16 +08:00
216930027d refactor: move ops to runtime/ops.rs 2026-02-07 16:28:19 +08:00
4d6fd6d614 feat: eval_shallow & eval_deep 2026-01-31 22:09:12 +08:00
b7f4ece472 fix: force the first argument of builtins.trace 2026-01-31 20:26:53 +08:00
a68681b4f5 fix: derivation references 2026-01-31 20:19:31 +08:00
ba3e2ae3de feat: set v8 stack size to 8 MiB 2026-01-31 19:29:09 +08:00
c5aee21514 fix: remove incorrect dynamic attr check 2026-01-31 18:59:11 +08:00
8f01ce2eb4 fix: handle __functor in forceFunction 2026-01-31 18:58:44 +08:00
a08f0e78a3 feat: builtins.placeholder 2026-01-31 18:54:13 +08:00
547f8f3828 fix: disable TCO test 2026-01-31 17:51:01 +08:00
aa368cb12e fix: toJSON test 2026-01-31 17:29:57 +08:00
0360bbe4aa fix: canonicalize paths 2026-01-31 17:23:28 +08:00
db64763d77 fix: escape "${" 2026-01-31 17:23:07 +08:00
1aba28d97b fix: symlink resolution 2026-01-31 16:55:44 +08:00
6838f9a0cf fix: unsafeGetAttrPos call on functionArgs 2026-01-31 16:46:16 +08:00
cb539c52c3 chore: fmt 2026-01-31 16:23:23 +08:00
b8f8b5764d fix: print test 2026-01-31 15:36:03 +08:00
1cfa8223c6 fix: path related tests 2026-01-31 15:03:30 +08:00
13874ca6ca fix: structuredAttrs 2026-01-31 13:53:02 +08:00
5703329850 fix: copy path to store in concatStringsWithContext 2026-01-31 12:47:32 +08:00
f0812c9063 refactor: recursive attrset; fix attrset merging 2026-01-31 12:07:13 +08:00
97854afafa fix: getenv test 2026-01-30 22:52:03 +08:00
9545b0fcae fix: recursive attrs 2026-01-30 22:51:42 +08:00
aee46b0b49 feat: use CppNix's test suite 2026-01-30 22:47:47 +08:00
1f835e7b06 fix: derivation semantic 2026-01-30 22:45:37 +08:00
9ee2dd5c08 fix: inherit in recursive attribute sets 2026-01-29 18:40:13 +08:00
084968c08a fix: implication operator (->) 2026-01-29 17:35:19 +08:00
058ef44259 refactor(codegen): less allocation 2026-01-29 11:42:40 +08:00
86953dd9d3 refactor: thunk scope 2026-01-27 10:45:40 +08:00
d1f87260a6 fix: infinite recursion on perl (WIP) 2026-01-27 10:45:40 +08:00
3186cfe6e4 feat(error): stack trace 2026-01-25 03:06:13 +08:00
4d68fb26d9 fix: derivation & derivationStrict 2026-01-25 03:06:13 +08:00
51f7f4079b feat: builtins.parseDrvName 2026-01-25 03:06:13 +08:00
7136f57c12 feat: builtins.unsafeGetAttrPos & __curPos 2026-01-25 03:06:13 +08:00
05b66070a3 feat: builtins.readType & builtins.readDir 2026-01-25 03:06:13 +08:00
13a7d761f4 feat: lookup path (builtins.findFile) 2026-01-25 03:06:13 +08:00
3d315cd050 feat: debug thunk location 2026-01-25 03:06:13 +08:00
62ec37f3ad feat: mkAttrs 2026-01-25 03:06:13 +08:00
a6aded7bea fix: return defaultVal when lhs is not attrset in selectWithDefault 2026-01-25 03:06:13 +08:00
296c0398a4 fix(builtins.zipAttrsWith): laziness 2026-01-25 03:06:13 +08:00
10430e2006 refactor: RuntimeContext 2026-01-25 03:06:13 +08:00
f46ee9d48f chore: tidy 2026-01-24 11:20:12 +08:00
e58ebbe408 fix: maybe_thunk 2026-01-24 02:20:33 +08:00
2f5f84c6c1 fix: removeAttrs 2026-01-24 01:58:08 +08:00
33775092ee fix: preserve string context 2026-01-24 00:57:42 +08:00
ef5d8c3b29 feat: builtins.functionArgs 2026-01-24 00:57:42 +08:00
56a8ba9475 feat: builtins.genericClosure; refactor type check 2026-01-24 00:57:42 +08:00
58c3e67409 fix: force the attrset before checking its type in Nix.hasAttr 2026-01-24 00:54:30 +08:00
7679a3b67f fix: list related primops 2026-01-24 00:54:30 +08:00
95faa7b35f fix: make attrValues' order consistent 2026-01-24 00:54:30 +08:00
43b8959842 fix: use forceBool in codegen 2026-01-24 00:54:30 +08:00
041d7b7dd2 feat: builtins.getEnv 2026-01-24 00:54:30 +08:00
15c4159dcc refactor: error handling 2026-01-24 00:54:30 +08:00
2cb85529c9 feat: NamedSource 2026-01-20 20:12:57 +08:00
e310133421 feat: better error handling 2026-01-20 08:55:08 +08:00
208b996627 fix: error message of Nix.select 2026-01-18 16:53:12 +08:00
9aee36a0e2 fix: relative path resolution ("comma operator...") 2026-01-18 16:42:54 +08:00
dcb853ea0a feat: logging; copy to store; fix daemon: write_slice -> write_all 2026-01-18 16:42:54 +08:00
2441e10607 feat: use snix nix-compat; implement metadata cache 2026-01-18 16:42:21 +08:00
611255d42c feat: nix_nar 2026-01-18 01:10:49 +08:00
2ad662c765 feat: initial nix-daemon implementation 2026-01-18 01:04:25 +08:00
52bf46407a feat: string context 2026-01-18 00:52:31 +08:00
513b43965c feat: do not use Object.defineProperty? 2026-01-18 00:52:11 +08:00
09bfbca64a refactor: tidy; fix runtime path resolution 2026-01-17 16:42:10 +08:00
f2fc12026f feat: initial path implementation 2026-01-17 12:20:18 +08:00
97e5e7b995 feat: regex related builtins 2026-01-16 21:50:32 +08:00
e620f39a4a fix: use coerceToString 2026-01-16 21:30:56 +08:00
5341ad6c27 feat: builtins.compareVersions 2026-01-16 21:07:15 +08:00
4f8edab795 fix: fetchTree & fetchTarball 2026-01-16 21:05:44 +08:00
e676d2f9f4 fix: unwrap non-recursive let bindings 2026-01-16 21:01:46 +08:00
b6a6630a93 feat: always resolve path at runtime 2026-01-16 21:01:14 +08:00
62abfff439 fix: make update operator lazy 2026-01-16 20:38:41 +08:00
55825788b8 chore: directly map nix boolean/null to javascript boolean/null 2026-01-16 20:38:15 +08:00
b4e0b53cde fix: select 2026-01-14 17:38:15 +08:00
6cd87aa653 chore: tidy 2026-01-14 17:38:12 +08:00
a8683e720b fix(codegen): string escape 2026-01-12 17:44:19 +08:00
3b6804dde6 feat: toJSON 2026-01-11 18:57:52 +08:00
4c505edef5 fix: let 2026-01-11 18:57:52 +08:00
75cb3bfaf1 fix: SCC interscope reference 2026-01-11 18:57:52 +08:00
7d04d8262f fix: duplicate definition check in let-in 2026-01-11 18:57:52 +08:00
c8e617fe24 fix: escape attr keys 2026-01-11 18:57:52 +08:00
158784cbe8 fix: lazy select_with_default 2026-01-11 18:57:52 +08:00
5b1750b1ba feat: thunk loop debugging 2026-01-11 18:57:52 +08:00
160b59b8bf feat: __functor 2026-01-11 18:57:52 +08:00
0538463bf0 fix: Path::canonicalize -> normalize_path ("Nix doesn't require path to exist") 2026-01-11 18:57:52 +08:00
621d4ea5c0 fix: lazy select_with_default 2026-01-11 18:57:52 +08:00
3f7fd02263 feat: initial fetcher implementation 2026-01-11 18:57:14 +08:00
528 changed files with 26057 additions and 5156 deletions

.gitignore (vendored): 9 lines changed

@@ -1,3 +1,12 @@
target/
/.direnv/
# Profiling
flamegraph*.svg
perf.data*
profile.json.gz
prof.json
*.cpuprofile
*.cpuprofile.gz
*v8.log*


@@ -1,7 +1,27 @@
vim.lsp.config("biome", {
root_dir = function (bufnr, on_dir)
root_dir = function (_bufnr, on_dir)
on_dir(vim.fn.getcwd())
end
})
vim.lsp.config("eslint", {
settings = {
eslint = {
options = {
configFile = "./nix-js/runtime-ts/eslint.config.mts"
}
}
}
})
vim.lsp.config("rust_analyzer", {
settings = {
["rust-analyzer"] = {
cargo = {
features = {
"inspector"
}
}
}
}
})
return {}

Cargo.lock (generated): 2434 lines changed; diff suppressed because it is too large.


@@ -4,3 +4,7 @@ members = [
"nix-js",
"nix-js-macros"
]
[profile.profiling]
inherits = "release"
debug = true

Justfile (new file): 31 lines

@@ -0,0 +1,31 @@
[no-exit-message]
@repl:
cargo run -- repl
[no-exit-message]
@eval expr:
cargo run -- eval --expr '{{expr}}'
[no-exit-message]
@replr:
cargo run --release -- repl
[no-exit-message]
@evalr expr:
cargo run --release -- eval --expr '{{expr}}'
[no-exit-message]
@repli:
cargo run --release --features inspector -- --inspect-brk 127.0.0.1:9229 repl
[no-exit-message]
@evali expr:
cargo run --release --features inspector -- --inspect-brk 127.0.0.1:9229 eval --expr '{{expr}}'
[no-exit-message]
@replp:
cargo run --release --features prof -- repl
[no-exit-message]
@evalp expr:
cargo run --release --features prof -- eval --expr '{{expr}}'


@@ -1,5 +1,5 @@
{
"$schema": "https://biomejs.dev/schemas/2.3.9/schema.json",
"$schema": "https://biomejs.dev/schemas/2.3.14/schema.json",
"vcs": {
"enabled": true,
"clientKind": "git",
@@ -20,10 +20,37 @@
"linter": {
"rules": {
"style": {
"useNamingConvention": "warn"
"useNamingConvention": {
"level": "warn",
"options": {
"strictCase": false,
"conventions": [
{
"selector": { "kind": "objectLiteralProperty" },
"formats": ["camelCase", "PascalCase", "CONSTANT_CASE"]
},
{
"selector": { "kind": "typeProperty" },
"formats": ["camelCase", "snake_case"]
}
]
}
}
}
}
},
"overrides": [
{
"includes": ["**/global.d.ts"],
"linter": {
"rules": {
"style": {
"useNamingConvention": "off"
}
}
}
}
],
"javascript": {
"formatter": {
"arrowParentheses": "always",

default.nix (new file): 16 lines

@@ -0,0 +1,16 @@
let
lockFile = builtins.fromJSON (builtins.readFile ./flake.lock);
flake-compat-node = lockFile.nodes.${lockFile.nodes.root.inputs.flake-compat};
flake-compat = builtins.fetchTarball {
inherit (flake-compat-node.locked) url;
sha256 = flake-compat-node.locked.narHash;
};
flake = (
import flake-compat {
src = ./.;
copySourceTreeToStore = false;
}
);
in
flake.defaultNix

flake.lock (generated): 33 lines changed

@@ -8,11 +8,11 @@
"rust-analyzer-src": "rust-analyzer-src"
},
"locked": {
"lastModified": 1767250179,
"narHash": "sha256-PnQdWvPZqHp+7yaHWDFX3NYSKaOy0fjkwpR+rIQC7AY=",
"lastModified": 1770966612,
"narHash": "sha256-S6k14z/JsDwX6zZyLucDBTOe/9RsvxH9GTUxHn2o4vc=",
"owner": "nix-community",
"repo": "fenix",
"rev": "a3eaf682db8800962943a77ab77c0aae966f9825",
"rev": "e90d48dcfaebac7ea7a5687888a2d0733be26343",
"type": "github"
},
"original": {
@@ -21,13 +21,27 @@
"type": "github"
}
},
"flake-compat": {
"flake": false,
"locked": {
"lastModified": 1751685974,
"narHash": "sha256-NKw96t+BgHIYzHUjkTK95FqYRVKB8DHpVhefWSz/kTw=",
"rev": "549f2762aebeff29a2e5ece7a7dc0f955281a1d1",
"type": "tarball",
"url": "https://git.lix.systems/api/v1/repos/lix-project/flake-compat/archive/549f2762aebeff29a2e5ece7a7dc0f955281a1d1.tar.gz"
},
"original": {
"type": "tarball",
"url": "https://git.lix.systems/lix-project/flake-compat/archive/main.tar.gz"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1767116409,
"narHash": "sha256-5vKw92l1GyTnjoLzEagJy5V5mDFck72LiQWZSOnSicw=",
"lastModified": 1770841267,
"narHash": "sha256-9xejG0KoqsoKEGp2kVbXRlEYtFFcDTHjidiuX8hGO44=",
"owner": "nixos",
"repo": "nixpkgs",
"rev": "cad22e7d996aea55ecab064e84834289143e44a0",
"rev": "ec7c70d12ce2fc37cb92aff673dcdca89d187bae",
"type": "github"
},
"original": {
@@ -40,17 +54,18 @@
"root": {
"inputs": {
"fenix": "fenix",
"flake-compat": "flake-compat",
"nixpkgs": "nixpkgs"
}
},
"rust-analyzer-src": {
"flake": false,
"locked": {
"lastModified": 1767191410,
"narHash": "sha256-cCZGjubgDWmstvFkS6eAw2qk2ihgWkycw55u2dtLd70=",
"lastModified": 1770934477,
"narHash": "sha256-GX0cINHhhzUbQHyDYN2Mc+ovb6Sx/4yrF95VVou9aW4=",
"owner": "rust-lang",
"repo": "rust-analyzer",
"rev": "a9026e6d5068172bf5a0d52a260bb290961d1cb4",
"rev": "931cd553be123b11db1435ac7ea5657e62e5e601",
"type": "github"
},
"original": {


@@ -3,6 +3,10 @@
nixpkgs.url = "github:nixos/nixpkgs/nixos-unstable";
fenix.url = "github:nix-community/fenix";
fenix.inputs.nixpkgs.follows = "nixpkgs";
flake-compat = {
url = "https://git.lix.systems/lix-project/flake-compat/archive/main.tar.gz";
flake = false;
};
};
outputs = { nixpkgs, fenix, ... }:
let
@@ -14,7 +18,7 @@
{
default = pkgs.mkShell {
packages = with pkgs; [
(fenix.packages.${system}.stable.withComponents [
(fenix.packages.${system}.latest.withComponents [
"cargo"
"clippy"
"rust-src"
@@ -23,15 +27,22 @@
"rust-analyzer"
])
cargo-outdated
cargo-machete
lldb
valgrind
hyperfine
just
samply
jq
tokei
nodejs
nodePackages.npm
biome
claude-code
codex
opencode
];
};
}


@@ -7,7 +7,7 @@ edition = "2024"
proc-macro = true
[dependencies]
convert_case = "0.8"
quote = "1.0"
convert_case = "0.11"
proc-macro2 = "1.0"
quote = "1.0"
syn = { version = "2.0", features = ["full"] }


@@ -4,22 +4,22 @@
//! an Intermediate Representation (IR) that follows a specific pattern. It generates:
//! 1. An enum representing the different kinds of IR nodes.
//! 2. Structs for each of the variants that have fields.
//! 3. `Ref` and `Mut` versions of the main enum for ergonomic pattern matching on references.
//! 4. `From` implementations to easily convert from a struct variant (e.g., `BinOp`) to the main enum (`Ir::BinOp`).
//! 5. A `To[IrName]` trait to provide a convenient `.to_ir()` method on the variant structs.
//! 3. `From` implementations to easily convert from a struct variant (e.g., `BinOp`) to the main enum (`Ir::BinOp`).
//! 4. A `To[IrName]` trait to provide a convenient `.to_ir()` method on the variant structs.
use convert_case::{Case, Casing};
use proc_macro::TokenStream;
use quote::{format_ident, quote};
use syn::{
FieldsNamed, Ident, Token, Type, parenthesized,
Expr, ExprPath, FieldsNamed, GenericArgument, GenericParam, Generics, Ident, Path, PathSegment,
Token, Type, TypePath, parenthesized,
parse::{Parse, ParseStream, Result},
punctuated::Punctuated,
token,
};
/// Represents one of the variants passed to the `ir!` macro.
pub enum VariantInput {
enum VariantInput {
/// A unit-like variant, e.g., `Arg`.
Unit(Ident),
/// A tuple-like variant with one unnamed field, e.g., `ExprRef(ExprId)`.
@@ -29,11 +29,12 @@ pub enum VariantInput {
}
/// The top-level input for the `ir!` macro.
pub struct MacroInput {
struct MacroInput {
/// The name of the main IR enum to be generated (e.g., `Ir`).
pub base_name: Ident,
base_name: Ident,
generics: Generics,
/// The list of variants for the enum.
pub variants: Punctuated<VariantInput, Token![,]>,
variants: Punctuated<VariantInput, Token![,]>,
}
impl Parse for VariantInput {
@@ -64,13 +65,14 @@ impl Parse for VariantInput {
impl Parse for MacroInput {
fn parse(input: ParseStream) -> Result<Self> {
// The macro input is expected to be: `IrName, Variant1, Variant2, ...`
let base_name = input.parse()?;
input.parse::<Token![,]>()?;
let generics = Generics::parse(input)?;
input.parse::<Token![;]>()?;
let variants = Punctuated::parse_terminated(input)?;
Ok(MacroInput {
base_name,
generics,
variants,
})
}
@@ -81,17 +83,40 @@ pub fn ir_impl(input: TokenStream) -> TokenStream {
let parsed_input = syn::parse_macro_input!(input as MacroInput);
let base_name = &parsed_input.base_name;
let ref_name = format_ident!("{}Ref", base_name);
let mut_name = format_ident!("{}Mut", base_name);
let generic_params = &parsed_input.generics.params;
let mk_ident_path = |ident| Path {
leading_colon: None,
segments: Punctuated::from_iter(std::iter::once(PathSegment {
ident,
arguments: Default::default(),
})),
};
let generic_args = {
generic_params
.iter()
.map(|arg| match arg {
GenericParam::Lifetime(lifetime) => {
GenericArgument::Lifetime(lifetime.lifetime.clone())
}
GenericParam::Const(cnst) => GenericArgument::Const(Expr::Path(ExprPath {
path: mk_ident_path(cnst.ident.clone()),
attrs: Vec::new(),
qself: None,
})),
GenericParam::Type(ty) => GenericArgument::Type(Type::Path(TypePath {
path: mk_ident_path(ty.ident.clone()),
qself: None,
})),
})
.collect::<Punctuated<_, Token![,]>>()
};
let where_clause = &parsed_input.generics.where_clause;
let to_trait_name = format_ident!("To{}", base_name);
let to_trait_fn_name = format_ident!("to_{}", base_name.to_string().to_case(Case::Snake));
let mut enum_variants = Vec::new();
let mut struct_defs = Vec::new();
let mut ref_variants = Vec::new();
let mut mut_variants = Vec::new();
let mut as_ref_arms = Vec::new();
let mut as_mut_arms = Vec::new();
let mut span_arms = Vec::new();
let mut from_impls = Vec::new();
let mut to_trait_impls = Vec::new();
@@ -99,48 +124,81 @@ pub fn ir_impl(input: TokenStream) -> TokenStream {
match variant {
VariantInput::Unit(name) => {
let inner_type = name.clone();
struct_defs.push(quote! {
#[derive(Debug)]
pub struct #name {
pub span: rnix::TextRange,
}
});
enum_variants.push(quote! { #name(#inner_type) });
ref_variants.push(quote! { #name(&'a #inner_type) });
mut_variants.push(quote! { #name(&'a mut #inner_type) });
as_ref_arms.push(quote! { Self::#name(inner) => #ref_name::#name(inner) });
as_mut_arms.push(quote! { Self::#name(inner) => #mut_name::#name(inner) });
span_arms.push(quote! { Self::#name(inner) => inner.span });
from_impls.push(quote! {
impl From<#inner_type> for #base_name {
impl <#generic_params> From<#inner_type> for #base_name <#generic_args> #where_clause {
fn from(val: #inner_type) -> Self { #base_name::#name(val) }
}
});
to_trait_impls.push(quote! {
impl #to_trait_name for #name {
fn #to_trait_fn_name(self) -> #base_name { #base_name::from(self) }
impl <#generic_params> #to_trait_name <#generic_args> for #name #where_clause {
fn #to_trait_fn_name(self) -> #base_name <#generic_args> { #base_name::from(self) }
}
});
}
VariantInput::Tuple(name, ty) => {
enum_variants.push(quote! { #name(#ty) });
ref_variants.push(quote! { #name(&'a #ty) });
mut_variants.push(quote! { #name(&'a mut #ty) });
as_ref_arms.push(quote! { Self::#name(inner) => #ref_name::#name(inner) });
as_mut_arms.push(quote! { Self::#name(inner) => #mut_name::#name(inner) });
}
VariantInput::Struct(name, fields) => {
let inner_type = name.clone();
let field_name = format_ident!("inner");
struct_defs.push(quote! {
#[derive(Debug)]
pub struct #name #fields
pub struct #name {
pub #field_name: #ty,
pub span: rnix::TextRange,
}
});
let inner_type = name.clone();
enum_variants.push(quote! { #name(#inner_type) });
ref_variants.push(quote! { #name(&'a #inner_type) });
mut_variants.push(quote! { #name(&'a mut #inner_type) });
as_ref_arms.push(quote! { Self::#name(inner) => #ref_name::#name(inner) });
as_mut_arms.push(quote! { Self::#name(inner) => #mut_name::#name(inner) });
span_arms.push(quote! { Self::#name(inner) => inner.span });
from_impls.push(quote! {
impl From<#inner_type> for #base_name {
impl <#generic_params> From<#inner_type> for #base_name <#generic_args> #where_clause {
fn from(val: #inner_type) -> Self { #base_name::#name(val) }
}
});
to_trait_impls.push(quote! {
impl #to_trait_name for #name {
fn #to_trait_fn_name(self) -> #base_name { #base_name::from(self) }
impl <#generic_params> #to_trait_name <#generic_args> for #name #where_clause {
fn #to_trait_fn_name(self) -> #base_name <#generic_args> { #base_name::from(self) }
}
});
}
VariantInput::Struct(name, mut fields) => {
let inner_type = name.clone();
fields.named.iter_mut().for_each(|field| {
field.vis = syn::Visibility::Public(syn::token::Pub::default());
});
fields.named.push(syn::Field {
attrs: vec![],
vis: syn::Visibility::Public(syn::token::Pub::default()),
mutability: syn::FieldMutability::None,
ident: Some(format_ident!("span")),
colon_token: Some(syn::token::Colon::default()),
ty: syn::parse_quote!(rnix::TextRange),
});
struct_defs.push(quote! {
#[derive(Debug)]
pub struct #name <#generic_params> #where_clause #fields
});
enum_variants.push(quote! { #name(#inner_type <#generic_args>) });
span_arms.push(quote! { Self::#name(inner) => inner.span });
from_impls.push(quote! {
impl <#generic_params> From<#inner_type <#generic_args>> for #base_name <#generic_args> #where_clause {
fn from(val: #inner_type <#generic_args>) -> Self { #base_name::#name(val) }
}
});
to_trait_impls.push(quote! {
impl <#generic_params> #to_trait_name <#generic_args> for #name <#generic_args> #where_clause {
fn #to_trait_fn_name(self) -> #base_name <#generic_args> { #base_name::from(self) }
}
});
}
@@ -150,38 +208,18 @@ pub fn ir_impl(input: TokenStream) -> TokenStream {
// Assemble the final generated code.
let expanded = quote! {
/// The main IR enum, generated by the `ir!` macro.
#[derive(Debug, IsVariant, Unwrap, TryUnwrap)]
pub enum #base_name {
#[derive(Debug)]
pub enum #base_name <#generic_params> #where_clause {
#( #enum_variants ),*
}
// The struct definitions for the enum variants.
#( #struct_defs )*
/// An immutable reference version of the IR enum.
#[derive(Debug, IsVariant, Unwrap, TryUnwrap)]
pub enum #ref_name<'a> {
#( #ref_variants ),*
}
/// A mutable reference version of the IR enum.
#[derive(Debug, IsVariant, Unwrap, TryUnwrap)]
pub enum #mut_name<'a> {
#( #mut_variants ),*
}
impl #base_name {
/// Converts a `&Ir` into a `IrRef`.
pub fn as_ref(&self) -> #ref_name<'_> {
impl <#generic_params> #base_name <#generic_args> #where_clause {
pub fn span(&self) -> rnix::TextRange {
match self {
#( #as_ref_arms ),*
}
}
/// Converts a `&mut Ir` into a `IrMut`.
pub fn as_mut(&mut self) -> #mut_name<'_> {
match self {
#( #as_mut_arms ),*
#( #span_arms ),*
}
}
}
@@ -190,9 +228,9 @@ pub fn ir_impl(input: TokenStream) -> TokenStream {
#( #from_impls )*
/// A trait for converting a variant struct into the main IR enum.
pub trait #to_trait_name {
pub trait #to_trait_name <#generic_params> #where_clause {
/// Performs the conversion.
fn #to_trait_fn_name(self) -> #base_name;
fn #to_trait_fn_name(self) -> #base_name <#generic_args>;
}
// Implement the `ToIr` trait for each variant struct.
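Per its doc comments, the `ir!` macro generates one struct per variant (with an appended `span` field), a tagged `Ir` enum over them, `From`/`ToIr` conversions, and, after this diff, a `span()` accessor in place of the removed `Ref`/`Mut` mirrors. As a rough mental model, a hypothetical TypeScript analogue of the generated shape (names, fields, and `TextRange` layout are illustrative, not the crate's actual API):

```typescript
// Hypothetical analogue of the `ir!` macro's output: one "struct" per variant,
// a tagged union over them, and span()/toIr() counterparts of the generated items.
type TextRange = { start: number; end: number };

// Variant structs; the macro appends a `span` field to each.
interface BinOp { kind: "BinOp"; lhs: number; rhs: number; span: TextRange }
interface Arg { kind: "Arg"; span: TextRange }

// The main IR union (the macro's generated enum).
type Ir = BinOp | Arg;

// Counterpart of the generated span() method: every variant carries its span.
const span = (ir: Ir): TextRange => ir.span;

// Counterpart of the From/ToIr conversions: a variant value is a valid Ir.
const toIr = (v: BinOp | Arg): Ir => v;

const node = toIr({ kind: "BinOp", lhs: 1, rhs: 2, span: { start: 0, end: 5 } });
console.log(span(node)); // { start: 0, end: 5 }
```

In Rust the same dispatch needs a generated `match` over variants, which is exactly what the macro's `span_arms` accumulate.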


@@ -7,34 +7,83 @@ build = "build.rs"
[dependencies]
mimalloc = "0.1"
tokio = { version = "1.41", features = ["rt-multi-thread", "sync", "net", "io-util"] }
nix-compat = { git = "https://git.snix.dev/snix/snix.git", version = "0.1.0", features = ["wire", "async"] }
# REPL
anyhow = "1.0"
rustyline = "14.0"
rustyline = "17.0"
# CLI
clap = { version = "4", features = ["derive"] }
# Logging
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
derive_more = { version = "2", features = ["full"] }
thiserror = "2"
miette = { version = "7.4", features = ["fancy"] }
hashbrown = "0.16"
petgraph = "0.8"
string-interner = "0.19"
bumpalo = { version = "3.20", features = ["allocator-api2", "boxed", "collections"] }
rust-embed="8.11"
itertools = "0.14"
regex = "1.11"
deno_core = "0.376"
deno_core = "0.385"
deno_error = "0.7"
nix-nar = "0.3"
sha2 = "0.10"
sha1 = "0.10"
md5 = "0.8"
hex = "0.4"
rnix = "0.12"
base64 = "0.22"
reqwest = { version = "0.13", features = ["blocking", "rustls"], default-features = false }
tar = "0.4"
flate2 = "1.0"
xz2 = "0.1"
bzip2 = "0.6"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
# spec 1.0.0
toml = "0.9.9"
dirs = "6.0"
tempfile = "3.24"
rusqlite = { version = "0.38", features = ["bundled"] }
rnix = "0.14"
rowan = "0.16"
nix-js-macros = { path = "../nix-js-macros" }
ere = "0.2.4"
num_enum = "0.7.5"
tap = "1.0.1"
# Inspector (optional)
fastwebsockets = { version = "0.10", features = ["upgrade"], optional = true }
hyper = { version = "1", features = ["http1", "server"], optional = true }
hyper-util = { version = "0.1", features = ["tokio"], optional = true }
http-body-util = { version = "0.1", optional = true }
http = { version = "1", optional = true }
uuid = { version = "1", features = ["v4"], optional = true }
ghost-cell = "0.2.6"
colored = "3.1.1"
[features]
inspector = ["dep:fastwebsockets", "dep:hyper", "dep:hyper-util", "dep:http-body-util", "dep:http", "dep:uuid"]
prof = []
[dev-dependencies]
tempfile = "3.24"
criterion = { version = "0.5", features = ["html_reports"] }
criterion = { version = "0.8", features = ["html_reports"] }
test-log = { version = "0.2", features = ["trace"] }
[[bench]]
name = "basic_ops"
@@ -45,7 +94,7 @@ name = "builtins"
harness = false
[[bench]]
name = "scc_optimization"
name = "thunk_scope"
harness = false
[[bench]]


@@ -1,6 +1,8 @@
mod utils;
use criterion::{Criterion, black_box, criterion_group, criterion_main};
use std::hint::black_box;
use criterion::{Criterion, criterion_group, criterion_main};
use utils::eval;
fn bench_arithmetic(c: &mut Criterion) {


@@ -1,6 +1,8 @@
mod utils;
use criterion::{Criterion, black_box, criterion_group, criterion_main};
use std::hint::black_box;
use criterion::{Criterion, criterion_group, criterion_main};
use utils::eval;
fn bench_builtin_math(c: &mut Criterion) {


@@ -1,6 +1,8 @@
mod utils;
use criterion::{Criterion, black_box, criterion_group, criterion_main};
use std::hint::black_box;
use criterion::{Criterion, criterion_group, criterion_main};
use nix_js::context::Context;
use utils::compile;


@@ -1,6 +1,8 @@
mod utils;
use criterion::{Criterion, black_box, criterion_group, criterion_main};
use std::hint::black_box;
use criterion::{Criterion, criterion_group, criterion_main};
use utils::eval;
fn bench_non_recursive(c: &mut Criterion) {


@@ -1,16 +1,25 @@
#![allow(dead_code)]
use nix_js::context::Context;
use nix_js::error::{Result, Source};
use nix_js::value::Value;
pub fn eval(expr: &str) -> Value {
Context::new().unwrap().eval_code(expr).unwrap()
Context::new()
.unwrap()
.eval(Source::new_eval(expr.into()).unwrap())
.unwrap()
}
pub fn eval_result(expr: &str) -> Result<Value, nix_js::error::Error> {
Context::new().unwrap().eval_code(expr)
pub fn eval_result(expr: &str) -> Result<Value> {
Context::new()
.unwrap()
.eval(Source::new_eval(expr.into()).unwrap())
}
pub fn compile(expr: &str) -> String {
Context::new().unwrap().compile_code(expr).unwrap()
Context::new()
.unwrap()
.compile(Source::new_eval(expr.into()).unwrap())
.unwrap()
}


@@ -1,4 +1,3 @@
use std::env;
use std::path::Path;
use std::process::Command;
@@ -14,6 +13,7 @@ fn main() {
println!("cargo::rerun-if-changed=runtime-ts/src");
println!("cargo::rerun-if-changed=runtime-ts/package.json");
println!("cargo::rerun-if-changed=runtime-ts/tsconfig.json");
println!("cargo::rerun-if-changed=runtime-ts/build.mjs");
if !runtime_ts_dir.join("node_modules").exists() {
println!("Installing npm dependencies...");
@@ -67,9 +67,4 @@ fn main() {
} else {
panic!("dist/runtime.js not found after build");
}
// Print build info
if env::var("CARGO_CFG_DEBUG_ASSERTIONS").is_ok() {
println!("Built runtime.js in DEBUG mode");
}
}


@@ -4,5 +4,5 @@ await esbuild.build({
entryPoints: ["src/index.ts"],
outfile: "dist/runtime.js",
bundle: true,
minify: true,
// minify: true,
});


@@ -0,0 +1,20 @@
import js from "@eslint/js";
import { defineConfig } from "eslint/config";
import globals from "globals";
import tseslint from "typescript-eslint";
export default defineConfig([
js.configs.recommended,
...tseslint.configs.recommended,
{
files: ["**/*.{js,mjs,cjs,ts,mts,cts}"],
languageOptions: { globals: globals.es2022 },
rules: {
"no-unused-vars": "off",
"@typescript-eslint/no-unused-vars": ["error", { varsIgnorePattern: "^_", argsIgnorePattern: "^_" }],
},
},
{
ignores: ["dist/**/*"],
},
]);

File diff suppressed because it is too large.


@@ -3,12 +3,20 @@
"version": "0.1.0",
"private": true,
"scripts": {
"check": "tsc --noEmit && npx eslint && biome check",
"typecheck": "tsc --noEmit",
"build": "node build.mjs",
"dev": "npm run typecheck && npm run build"
},
"devDependencies": {
"esbuild": "^0.24.2",
"typescript": "^5.7.2"
"eslint": "^9.39.2",
"typescript": "^5.7.2",
"typescript-eslint": "^8.55.0",
"jiti": "^2.6.1"
},
"dependencies": {
"globals": "^17.3.0",
"js-sdsl": "^4.4.2"
}
}


@@ -1,29 +1,26 @@
/**
* Arithmetic builtin functions
*/
import { op } from "../operators";
import { coerceNumeric, forceInt, forceNumeric } from "../type-assert";
import type { NixBool, NixInt, NixNumber, NixValue } from "../types";
import { forceNumeric, coerceNumeric, forceInt } from "../type-assert";
export const add =
(a: NixValue) =>
(b: NixValue): bigint | number => {
const [av, bv] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (av as any) + (bv as any);
return (av as never) + (bv as never);
};
export const sub =
(a: NixValue) =>
(b: NixValue): bigint | number => {
const [av, bv] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (av as any) - (bv as any);
return (av as never) - (bv as never);
};
export const mul =
(a: NixValue) =>
(b: NixValue): bigint | number => {
const [av, bv] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (av as any) * (bv as any);
return (av as never) * (bv as never);
};
export const div =
@@ -35,10 +32,9 @@ export const div =
throw new RangeError("Division by zero");
}
return (av as any) / (bv as any);
return (av as never) / (bv as never);
};
// Bitwise operations - only for integers
export const bitAnd =
(a: NixValue) =>
(b: NixValue): NixInt => {
@@ -66,4 +62,4 @@ export const bitXor =
export const lessThan =
(a: NixValue) =>
(b: NixValue): NixBool =>
forceNumeric(a) < forceNumeric(b);
op.lt(a, b);
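The arithmetic builtins above rely on `coerceNumeric` to reconcile Nix integers (`bigint`) with floats (`number`), and the diff swaps `as any` for `as never` casts to satisfy the type checker once both operands are known to share a type. A self-contained sketch of that coercion; the real `coerceNumeric`/`forceNumeric` live in `type-assert.ts`, and this re-implementation is illustrative only:

```typescript
// Illustrative re-implementation of the bigint/number coercion the builtins use.
type NixNumber = bigint | number;

function coerceNumeric(a: NixNumber, b: NixNumber): [NixNumber, NixNumber] {
  // Nix promotes to float when either operand is a float; pure-int stays bigint.
  if (typeof a === "bigint" && typeof b === "bigint") return [a, b];
  return [Number(a), Number(b)];
}

const add = (a: NixNumber, b: NixNumber): NixNumber => {
  const [av, bv] = coerceNumeric(a, b);
  // `as never` (as in the diff) silences TS about mixing bigint and number;
  // after coercion both sides are guaranteed to be the same runtime type.
  return (av as never) + (bv as never);
};

console.log(add(1n, 2n)); // 3n
console.log(add(1n, 2.5)); // 3.5
```

`as never` is preferable to `as any` here because `never` is a subtype of every type, so the cast stays within strict-mode rules without disabling checking elsewhere.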


@@ -1,57 +1,65 @@
/**
* Attribute set operation builtin functions
*/
import { mkPos, select } from "../helpers";
import { createThunk } from "../thunk";
import { forceAttrs, forceFunction, forceList, forceStringValue } from "../type-assert";
import { ATTR_POSITIONS, type NixAttrs, type NixList, type NixValue } from "../types";
import type { NixValue, NixAttrs, NixList } from "../types";
import { forceAttrs, forceString, forceFunction, forceList } from "../type-assert";
export const attrNames = (set: NixValue): string[] => Array.from(forceAttrs(set).keys()).sort();
export const attrValues = (set: NixValue): NixValue[] =>
Array.from(forceAttrs(set).entries())
.sort(([a], [b]) => {
if (a < b) {
return -1;
} else if (a === b) {
return 0;
} else {
return 1;
}
})
.map(([_, val]) => val);
export const getAttr =
(s: NixValue) =>
(set: NixValue): NixValue =>
select(forceAttrs(set), [s]);
export const hasAttr =
(s: NixValue) =>
(set: NixValue): boolean =>
forceAttrs(set).has(forceStringValue(s));
export const mapAttrs =
(f: NixValue) =>
(attrs: NixValue): NixAttrs => {
const forcedAttrs = forceAttrs(attrs);
const forcedF = forceFunction(f);
const newAttrs: NixAttrs = new Map();
for (const [key, val] of forcedAttrs) {
newAttrs.set(
key,
createThunk(() => forceFunction(forcedF(key))(val), "created by mapAttrs"),
);
}
return newAttrs;
};
export const removeAttrs =
(attrs: NixValue) =>
(list: NixValue): NixAttrs => {
const newAttrs: NixAttrs = new Map(forceAttrs(attrs));
const forcedList = forceList(list);
for (const item of forcedList) {
newAttrs.delete(forceStringValue(item));
}
return newAttrs;
};
export const listToAttrs = (e: NixValue): NixAttrs => {
const attrs: NixAttrs = new Map();
const forcedE = [...forceList(e)].reverse();
for (const obj of forcedE) {
const item = forceAttrs(obj);
attrs.set(forceStringValue(select(item, ["name"])), select(item, ["value"]));
}
return attrs;
};
@@ -61,10 +69,18 @@ export const intersectAttrs =
(e2: NixValue): NixAttrs => {
const f1 = forceAttrs(e1);
const f2 = forceAttrs(e2);
const attrs: NixAttrs = new Map();
if (f1.size < f2.size) {
for (const [key] of f1) {
if (f2.has(key)) {
attrs.set(key, f2.get(key) as NixValue);
}
}
} else {
for (const [key] of f2) {
if (f1.has(key)) {
attrs.set(key, f2.get(key) as NixValue);
}
}
}
return attrs;
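The size comparison above is the whole optimization: iterating the smaller map while probing the larger one bounds the loop at min(|f1|, |f2|) iterations. A standalone sketch of the same idea (a generic helper for illustration, not the builtin itself):

```typescript
// Intersect two maps by walking the smaller side; values always come from the
// second map, matching Nix's intersectAttrs semantics.
const intersect = <V>(a: Map<string, V>, b: Map<string, V>): Map<string, V> => {
  const small = a.size < b.size ? a : b;
  const large = small === a ? b : a;
  const out = new Map<string, V>();
  for (const key of small.keys()) {
    // Probe the larger map once per key of the smaller map.
    if (large.has(key)) out.set(key, b.get(key) as V);
  }
  return out;
};

const left = new Map([["a", 1], ["b", 2]]);
const right = new Map([["b", 20], ["c", 30]]);
console.log([...intersect(left, right)]); // [["b", 20]]
```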
@@ -73,22 +89,71 @@ export const intersectAttrs =
export const catAttrs =
(attr: NixValue) =>
(list: NixValue): NixList => {
const key = forceStringValue(attr);
return forceList(list)
.map((set) => forceAttrs(set).get(key))
.filter((val) => val !== undefined) as NixList;
};
export const groupBy =
(f: NixValue) =>
(list: NixValue): NixAttrs => {
const attrs: NixAttrs = new Map();
const forcedF = forceFunction(f);
const forcedList = forceList(list);
for (const elem of forcedList) {
const key = forceStringValue(forcedF(elem));
if (!attrs.has(key)) attrs.set(key, []);
(attrs.get(key) as NixList).push(elem);
}
return attrs;
};
export const zipAttrsWith =
(f: NixValue) =>
(list: NixValue): NixValue => {
const listForced = forceList(list);
const attrMap = new Map<string, NixValue[]>();
for (const item of listForced) {
const attrs = forceAttrs(item);
for (const [key, value] of attrs) {
if (!attrMap.has(key)) {
attrMap.set(key, []);
}
(attrMap.get(key) as NixValue[]).push(value);
}
}
const result: NixAttrs = new Map();
for (const [name, values] of attrMap.entries()) {
result.set(
name,
createThunk(() => forceFunction(forceFunction(f)(name))(values)),
);
}
return result;
};
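`zipAttrsWith` above works in two phases: first bucket every value under its key across all input sets, then apply the combining function once per key. A standalone sketch of that shape (a generic helper for illustration, without the thunking the builtin adds):

```typescript
// Phase 1: collect values per key across all maps.
// Phase 2: combine each bucket with f(name, values).
const zipWith = <V, R>(
  f: (name: string, values: V[]) => R,
  sets: Array<Map<string, V>>,
): Map<string, R> => {
  const buckets = new Map<string, V[]>();
  for (const set of sets) {
    for (const [key, value] of set) {
      if (!buckets.has(key)) buckets.set(key, []);
      (buckets.get(key) as V[]).push(value);
    }
  }
  const result = new Map<string, R>();
  for (const [name, values] of buckets) result.set(name, f(name, values));
  return result;
};

const counts = zipWith((_, vs) => vs.length, [
  new Map([["a", 1]]),
  new Map([["a", 2], ["b", 3]]),
]);
console.log([...counts]); // [["a", 2], ["b", 1]]
```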
export const unsafeGetAttrPos =
(attrName: NixValue) =>
(attrSet: NixValue): NixValue => {
const name = forceStringValue(attrName);
const attrs = forceAttrs(attrSet);
if (!attrs.has(name)) {
return null;
}
const positions = attrs[ATTR_POSITIONS];
if (!positions || !positions.has(name)) {
return null;
}
const span = positions.get(name) as number;
return mkPos(span);
};


@@ -1,28 +1,45 @@
import {
decodeContextElem,
getStringContext,
getStringValue,
mkStringWithContext,
type NixStringContext,
parseContextToInfoMap,
} from "../string-context";
import { force } from "../thunk";
import { forceAttrs, forceList, forceString, forceStringValue } from "../type-assert";
import type { NixAttrs, NixString, NixValue } from "../types";
import { isStringWithContext } from "../types";
/**
* builtins.hasContext - Check if string has context
*
* Returns true if the string has any store path references.
*/
export const hasContext = (value: NixValue): boolean => {
const s = forceString(value);
return isStringWithContext(s) && s.context.size > 0;
};
/**
* builtins.unsafeDiscardStringContext - Remove all context from string
*
* IMPORTANT: This discards string context, returning only the string value.
* Use with caution as it removes derivation dependencies.
*/
export const unsafeDiscardStringContext = (value: NixValue): string => {
const s = forceString(value);
return getStringValue(s);
};
/**
* builtins.unsafeDiscardOutputDependency - Convert DrvDeep to Opaque context
*
* IMPORTANT: Transforms "all outputs" references (=) to plain path references.
* Preserves other context types unchanged.
*/
export const unsafeDiscardOutputDependency = (value: NixValue): NixString => {
const s = forceString(value);
const strValue = getStringValue(s);
const context = getStringContext(s);
@@ -47,15 +64,19 @@ export const unsafeDiscardOutputDependency = (value: NixValue): NixString => {
return mkStringWithContext(strValue, newContext);
};
/**
* builtins.addDrvOutputDependencies - Convert Opaque to DrvDeep context
*
* IMPORTANT: Transforms plain derivation path references to "all outputs" references (=).
* The string must have exactly one context element which must be a .drv path.
*/
export const addDrvOutputDependencies = (value: NixValue): NixString => {
const s = forceString(value);
const strValue = getStringValue(s);
const context = getStringContext(s);
if (context.size !== 1) {
throw new Error(`context of string '${strValue}' must have exactly one element, but has ${context.size}`);
}
const [encoded] = context;
@@ -79,56 +100,76 @@ export const addDrvOutputDependencies = (value: NixValue): NixString => {
return mkStringWithContext(strValue, newContext);
};
/**
* builtins.getContext - Extract context as structured attribute set
*
* Returns an attribute set mapping store paths to their context info:
* - path: true if it's a plain store path reference (opaque)
* - allOutputs: true if it references all derivation outputs (drvDeep, encoded as =path)
* - outputs: list of specific output names (built, encoded as !output!path)
*/
export const getContext = (value: NixValue): NixAttrs => {
const s = forceString(value);
const context = getStringContext(s);
const infoMap = parseContextToInfoMap(context);
const result: NixAttrs = new Map();
for (const [path, info] of infoMap) {
const attrs: NixAttrs = new Map();
if (info.path) {
attrs.set("path", true);
}
if (info.allOutputs) {
attrs.set("allOutputs", true);
}
if (info.outputs.length > 0) {
attrs.set("outputs", info.outputs);
}
result.set(path, attrs);
}
return result;
};
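The doc comment above describes three context-element encodings. The real decoder lives in `../string-context` (`decodeContextElem`); this standalone sketch only illustrates the encoding scheme as documented: `=path` marks an all-outputs (DrvDeep) reference, `!output!path` a specific built output, and a bare path an opaque store-path reference.

```typescript
type ContextKind = "path" | "allOutputs" | "outputs";

interface DecodedElem {
  storePath: string;
  kind: ContextKind;
  output?: string;
}

// Illustrative decoder for the three documented encodings.
const decodeElem = (elem: string): DecodedElem => {
  if (elem.startsWith("=")) return { storePath: elem.slice(1), kind: "allOutputs" };
  if (elem.startsWith("!")) {
    // "!out!/nix/store/..." splits into ["", "out", "/nix/store/..."].
    const [, output, storePath] = elem.split("!");
    return { storePath, kind: "outputs", output };
  }
  return { storePath: elem, kind: "path" };
};

console.log(decodeElem("!out!/nix/store/abc-foo.drv"));
// { storePath: "/nix/store/abc-foo.drv", kind: "outputs", output: "out" }
```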
/**
* builtins.appendContext - Add context to a string
*
* IMPORTANT: Merges the provided context attribute set with any existing context
* from the input string. Used to manually construct strings with specific
* derivation dependencies.
*
* Context format matches getContext output:
* - path: boolean - add as opaque reference
* - allOutputs: boolean - add as drvDeep reference (=)
* - outputs: [string] - add as built references (!output!)
*/
export const appendContext =
(strValue: NixValue) =>
(ctxValue: NixValue): NixString => {
const s = forceString(strValue);
const strVal = getStringValue(s);
const existingContext = getStringContext(s);
const ctxAttrs = forceAttrs(ctxValue);
const newContext: NixStringContext = new Set(existingContext);
for (const [path, infoVal] of ctxAttrs) {
if (!path.startsWith("/nix/store/")) {
throw new Error(`context key '${path}' is not a store path`);
}
const info = forceAttrs(infoVal as NixValue);
if (info.has("path")) {
const pathVal = force(info.get("path") as NixValue);
if (pathVal === true) {
newContext.add(path);
}
}
if (info.has("allOutputs")) {
const allOutputs = force(info.get("allOutputs") as NixValue);
if (allOutputs === true) {
if (!path.endsWith(".drv")) {
throw new Error(
@@ -139,15 +180,15 @@ export const appendContext =
}
}
if (info.has("outputs")) {
const outputs = forceList(info.get("outputs") as NixValue);
if (outputs.length > 0 && !path.endsWith(".drv")) {
throw new Error(
`tried to add derivation output context of ${path}, which is not a derivation, to a string`,
);
}
for (const output of outputs) {
const outputName = forceStringValue(output);
newContext.add(`!${outputName}!${path}`);
}
}


@@ -2,30 +2,47 @@
* Conversion and serialization builtin functions
*/
import {
addBuiltContext,
mkStringWithContext,
type NixStringContext,
StringWithContext,
} from "../string-context";
import { force, isThunk } from "../thunk";
import { forceFunction, forceStringNoCtx } from "../type-assert";
import type { NixString, NixValue } from "../types";
import { isNixPath, isStringWithContext, NixPath } from "../types";
import { isAttrs, isPath, typeOf } from "./type-check";
export const fromJSON = (e: NixValue): NixValue => {
const str = force(e);
if (typeof str !== "string" && !isStringWithContext(str)) {
throw new TypeError(`builtins.fromJSON: expected a string, got ${typeOf(str)}`);
}
const jsonStr = isStringWithContext(str) ? str.value : str;
return Deno.core.ops.op_from_json(jsonStr) as NixValue;
};
export const fromTOML = (e: NixValue): NixValue => {
const toml = forceStringNoCtx(e);
return Deno.core.ops.op_from_toml(toml) as NixValue;
};
export const toJSON = (e: NixValue): NixString => {
const context: Set<string> = new Set();
const string = JSON.stringify(nixValueToJson(e, true, context, true));
if (context.size === 0) {
return string;
}
return mkStringWithContext(string, context);
};
export const toXML = (e: NixValue): NixString => {
const [xml, context] = Deno.core.ops.op_to_xml(force(e));
if (context.length === 0) {
return xml;
}
return mkStringWithContext(xml, new Set(context));
};
/**
@@ -41,25 +58,6 @@ export enum StringCoercionMode {
ToString = 2,
}
export interface CoerceResult {
value: string;
context: NixStringContext;
@@ -69,6 +67,12 @@ export interface CoerceResult {
* Coerce a Nix value to a string according to the specified mode.
* This implements the same behavior as Lix's EvalState::coerceToString.
*
* IMPORTANT: String context preservation rules:
* - StringWithContext: Context is collected in outContext parameter
* - Derivations (with outPath): Built context is added for the drvPath/outputName
* - Lists (ToString mode): Context from all elements is merged
* - All other coercions: No context added
*
* @param value - The value to coerce
* @param mode - The coercion mode (controls which types are allowed)
* @param copyToStore - If true, paths should be copied to the Nix store (not implemented yet)
@@ -89,9 +93,9 @@ export interface CoerceResult {
*/
export const coerceToString = (
value: NixValue,
mode: StringCoercionMode,
copyToStore: boolean = false,
outContext: NixStringContext,
): string => {
const v = force(value);
@@ -101,42 +105,50 @@ export const coerceToString = (
}
if (isStringWithContext(v)) {
for (const elem of v.context) {
outContext.add(elem);
}
return v.value;
}
// Paths coerce to their string value
if (isNixPath(v)) {
if (copyToStore) {
const pathStr = v.value;
const storePath = Deno.core.ops.op_copy_path_to_store(pathStr);
outContext.add(storePath);
return storePath;
}
return v.value;
}
if (v instanceof Map) {
if (v.has("__toString")) {
const toStringMethod = forceFunction(v.get("__toString") as NixValue);
const result = force(toStringMethod(v));
return coerceToString(result, mode, copyToStore, outContext);
}
if (v.has("outPath")) {
const outPath = coerceToString(v.get("outPath") as NixValue, mode, copyToStore, outContext);
if (v.has("type") && v.get("type") === "derivation" && v.has("drvPath") && outContext) {
const drvPathValue = force(v.get("drvPath") as NixValue);
const drvPathStr = isStringWithContext(drvPathValue)
? drvPathValue.value
: typeof drvPathValue === "string"
? drvPathValue
: null;
if (drvPathStr) {
const outputName = v.has("outputName") ? String(force(v.get("outputName") as NixValue)) : "out";
addBuiltContext(outContext, drvPathStr, outputName);
}
}
return outPath;
}
// Attribute sets without __toString or outPath cannot be coerced
throw new TypeError(`cannot coerce ${typeOf(v)} to a string`);
}
// Integer coercion is allowed in Interpolation and ToString modes
@@ -204,7 +216,7 @@ export const coerceToString = (
}
}
throw new TypeError(`cannot coerce ${typeOf(v)} to a string`);
};
/**
@@ -224,6 +236,46 @@ export const coerceToStringWithContext = (
return mkStringWithContext(str, context);
};
/**
* Coerce a Nix value to an absolute path string.
* This implements the same behavior as Lix's EvalState::coerceToPath.
*
* @param value - The value to coerce
* @param outContext - Context set to collect string contexts
* @returns The absolute path string
* @throws TypeError if the value cannot be coerced to a string
* @throws Error if the result is not an absolute path
*
* Semantics:
* - Coerces to string using Strict mode (same as coerceToString with Base mode)
* - Validates the result is non-empty and starts with '/'
* - Returns the path string (not a NixPath object)
* - Preserves string context if present
*/
export const coerceToPath = (value: NixValue, outContext: NixStringContext): string => {
const forced = force(value);
if (isPath(forced)) {
return forced.value;
}
if (isAttrs(forced) && forced.has("__toString")) {
const toStringFunc = forceFunction(forced.get("__toString") as NixValue);
return coerceToPath(toStringFunc(forced), outContext);
}
const pathStr = coerceToString(value, StringCoercionMode.Base, false, outContext);
if (pathStr === "") {
throw new Error("string doesn't represent an absolute path: empty string");
}
if (pathStr[0] !== "/") {
throw new Error(`string '${pathStr}' doesn't represent an absolute path`);
}
return pathStr;
};
/**
* builtins.toString - Convert a value to a string
*
@@ -236,3 +288,84 @@ export const coerceToStringWithContext = (
export const toStringFunc = (value: NixValue): NixString => {
return coerceToStringWithContext(value, StringCoercionMode.ToString, false);
};
export type JsonValue = number | boolean | string | null | { [key: string]: JsonValue } | Array<JsonValue>;
export const nixValueToJson = (
value: NixValue,
strict: boolean,
outContext: NixStringContext,
copyToStore: boolean,
seen: Set<NixValue> = new Set(),
): JsonValue => {
const v = strict ? force(value) : value;
if (isThunk(v) || typeof v === "function")
throw new Error(`cannot convert ${isThunk(v) ? "thunk" : "lambda"} to JSON`);
if (v === null) return null;
if (typeof v === "bigint") {
const num = Number(v);
if (v > Number.MAX_SAFE_INTEGER || v < Number.MIN_SAFE_INTEGER) {
console.warn(`integer ${v} exceeds safe range, precision may be lost`);
}
return num;
}
if (typeof v === "number") return v;
if (typeof v === "boolean") return v;
if (typeof v === "string") return v;
if (v instanceof StringWithContext) {
for (const elem of v.context) {
outContext.add(elem);
}
return v.value;
}
if (v instanceof NixPath) {
if (copyToStore) {
const storePath = Deno.core.ops.op_copy_path_to_store(v.value);
outContext.add(storePath);
return storePath;
} else {
return v.value;
}
}
// FIXME: is this check necessary?
// if (seen.has(v)) {
// throw new Error("cycle detected in toJSON");
// } else {
// seen.add(v)
// }
if (Array.isArray(v)) {
return v.map((item) => nixValueToJson(item, strict, outContext, copyToStore, seen));
}
if (v instanceof Map) {
if (v.has("__toString") && typeof force(v.get("__toString") as NixValue) === "function") {
const toStringMethod = force(v.get("__toString") as NixValue) as (self: typeof v) => NixValue;
const result = force(toStringMethod(v));
if (typeof result === "string") {
return result;
}
if (isStringWithContext(result)) {
for (const elem of result.context) {
outContext.add(elem);
}
return result.value;
}
return nixValueToJson(result, strict, outContext, copyToStore, seen);
}
if (v.has("outPath")) {
return nixValueToJson(v.get("outPath") as NixValue, strict, outContext, copyToStore, seen);
}
const result: { [key: string]: JsonValue } = {};
const keys = Array.from(v.keys()).sort();
for (const key of keys) {
result[key] = nixValueToJson(v.get(key) as NixValue, strict, outContext, copyToStore, seen);
}
return result;
}
throw new Error(`cannot convert ${typeof v} to JSON`);
};
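The bigint branch of `nixValueToJson` above relies on JavaScript allowing relational comparison between `bigint` and `number` without conversion. A standalone sketch of just that guard (helper name is illustrative):

```typescript
// JSON numbers are IEEE-754 doubles, so bigints beyond 2^53 - 1 lose
// precision when converted with Number().
const bigintToJsonNumber = (v: bigint): number => {
  // Mixed bigint/number relational comparison is well-defined in JS,
  // so the bounds need no explicit conversion.
  if (v > Number.MAX_SAFE_INTEGER || v < Number.MIN_SAFE_INTEGER) {
    console.warn(`integer ${v} exceeds safe range, precision may be lost`);
  }
  return Number(v);
};

console.log(bigintToJsonNumber(42n)); // 42
console.log(bigintToJsonNumber(9007199254740993n)); // 9007199254740992 (2^53 + 1 rounds down)
```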


@@ -1,28 +1,125 @@
import {
addBuiltContext,
addDrvDeepContext,
extractInputDrvsAndSrcs,
HAS_CONTEXT,
isStringWithContext,
mkStringWithContext,
type NixStringContext,
} from "../string-context";
import { force } from "../thunk";
import { forceAttrs, forceList, forceStringNoCtx, forceStringValue } from "../type-assert";
import type { NixAttrs, NixValue } from "../types";
import { coerceToString, type JsonValue, nixValueToJson, StringCoercionMode } from "./conversion";
export interface OutputInfo {
path: string;
hashAlgo: string;
hash: string;
}
export interface DerivationData {
name: string;
outputs: Map<string, OutputInfo>;
inputDrvs: Map<string, Set<string>>;
inputSrcs: Set<string>;
platform: string;
builder: string;
args: string[];
env: Map<string, string>;
}
export const escapeString = (s: string): string => {
let result = "";
for (const char of s) {
switch (char) {
case '"':
result += '\\"';
break;
case "\\":
result += "\\\\";
break;
case "\n":
result += "\\n";
break;
case "\r":
result += "\\r";
break;
case "\t":
result += "\\t";
break;
default:
result += char;
}
}
return `"${result}"`;
};
const quoteString = (s: string): string => `"${s}"`;
const cmpByKey = <T>(a: [string, T], b: [string, T]): number => (a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0);
export const generateAterm = (drv: DerivationData): string => {
const outputEntries: string[] = [];
const sortedOutputs = Array.from(drv.outputs.entries()).sort(cmpByKey);
for (const [name, info] of sortedOutputs) {
outputEntries.push(
`(${quoteString(name)},${quoteString(info.path)},${quoteString(info.hashAlgo)},${quoteString(info.hash)})`,
);
}
const outputs = outputEntries.join(",");
const inputDrvEntries: string[] = [];
const sortedInputDrvs = Array.from(drv.inputDrvs.entries()).sort(cmpByKey);
for (const [drvPath, outputs] of sortedInputDrvs) {
const sortedOuts = Array.from(outputs).sort();
const outList = `[${sortedOuts.map(quoteString).join(",")}]`;
inputDrvEntries.push(`(${quoteString(drvPath)},${outList})`);
}
const inputDrvs = inputDrvEntries.join(",");
const sortedInputSrcs = Array.from(drv.inputSrcs).sort();
const inputSrcs = sortedInputSrcs.map(quoteString).join(",");
const args = drv.args.map(escapeString).join(",");
const envs = Array.from(drv.env.entries())
.sort(cmpByKey)
.map(([k, v]) => `(${escapeString(k)},${escapeString(v)})`);
return `Derive([${outputs}],[${inputDrvs}],[${inputSrcs}],${quoteString(drv.platform)},${escapeString(drv.builder)},[${args}],[${envs}])`;
};
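`generateAterm` above serializes a derivation into the textual `Derive(...)` ATerm layout: sorted output tuples, sorted input derivations, sorted input sources, then platform, builder, args, and env pairs. A toy version covering only the outputs field, to make the shape concrete (not the repo's actual function):

```typescript
const quote = (s: string): string => `"${s}"`;

// Miniature ATerm emitter: outputs only, all other fields left empty.
const miniAterm = (outputs: Map<string, string>): string => {
  const entries = Array.from(outputs.entries())
    .sort(([a], [b]) => (a < b ? -1 : a > b ? 1 : 0))
    .map(([name, path]) => `(${quote(name)},${quote(path)},"","")`);
  return `Derive([${entries.join(",")}],[],[],"","",[],[])`;
};

console.log(miniAterm(new Map([["out", "/nix/store/x"]])));
// Derive([("out","/nix/store/x","","")],[],[],"","",[],[])
```

Sorting every collection before serializing is what makes the ATerm, and hence the derivation hash, deterministic.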
export const generateAtermModulo = (drv: DerivationData, inputDrvHashes: Map<string, string>): string => {
const outputEntries: string[] = [];
const sortedOutputs = Array.from(drv.outputs.entries()).sort(cmpByKey);
for (const [name, info] of sortedOutputs) {
outputEntries.push(
`(${quoteString(name)},${quoteString(info.path)},${quoteString(info.hashAlgo)},${quoteString(info.hash)})`,
);
}
const outputs = outputEntries.join(",");
const inputDrvEntries: string[] = [];
const sortedInputDrvHashes = Array.from(inputDrvHashes.entries()).sort(cmpByKey);
for (const [drvHash, outputs] of sortedInputDrvHashes) {
const sortedOuts = outputs.split(",").sort();
const outList = `[${sortedOuts.map(quoteString).join(",")}]`;
inputDrvEntries.push(`(${quoteString(drvHash)},${outList})`);
}
const inputDrvs = inputDrvEntries.join(",");
const sortedInputSrcs = Array.from(drv.inputSrcs).sort();
const inputSrcs = sortedInputSrcs.map(quoteString).join(",");
const args = drv.args.map(escapeString).join(",");
const envs = Array.from(drv.env.entries())
.sort(cmpByKey)
.map(([k, v]) => `(${escapeString(k)},${escapeString(v)})`);
return `Derive([${outputs}],[${inputDrvs}],[${inputSrcs}],${quoteString(drv.platform)},${escapeString(drv.builder)},[${args}],[${envs}])`;
};
const validateName = (attrs: NixAttrs): string => {
if (!attrs.has("name")) {
throw new Error("derivation: missing required attribute 'name'");
}
const name = forceStringValue(attrs.get("name") as NixValue);
if (!name) {
throw new Error("derivation: 'name' cannot be empty");
}
@@ -33,26 +130,20 @@ const validateName = (attrs: NixAttrs): string => {
};
const validateBuilder = (attrs: NixAttrs, outContext: NixStringContext): string => {
if (!attrs.has("builder")) {
throw new Error("derivation: missing required attribute 'builder'");
}
return coerceToString(attrs.get("builder") as NixValue, StringCoercionMode.ToString, true, outContext);
};
const validateSystem = (attrs: NixAttrs): string => {
if (!attrs.has("system")) {
throw new Error("derivation: missing required attribute 'system'");
}
return forceStringValue(attrs.get("system") as NixValue);
};
const validateOutputs = (outputs: string[]): void => {
if (outputs.length === 0) {
throw new Error("derivation: outputs list cannot be empty");
}
@@ -68,71 +159,59 @@ const extractOutputs = (attrs: NixAttrs): string[] => {
}
seen.add(output);
}
};
const extractOutputs = (attrs: NixAttrs, structuredAttrs: boolean): string[] => {
if (!attrs.has("outputs")) {
return ["out"];
}
let outputs: string[];
if (structuredAttrs) {
const outputsList = forceList(attrs.get("outputs") as NixValue);
outputs = outputsList.map((o) => forceStringValue(o));
} else {
const outputsStr = coerceToString(
attrs.get("outputs") as NixValue,
StringCoercionMode.ToString,
false,
new Set(),
);
outputs = outputsStr
.trim()
.split(/\s+/)
.filter((s) => s.length > 0);
}
validateOutputs(outputs);
return outputs;
};
const extractArgs = (attrs: NixAttrs, outContext: NixStringContext): string[] => {
if (!attrs.has("args")) {
return [];
}
const argsList = forceList(attrs.get("args") as NixValue);
return argsList.map((a) => coerceToString(a, StringCoercionMode.ToString, true, outContext));
};
const structuredAttrsExcludedKeys = new Set([
"__structuredAttrs",
"__ignoreNulls",
"__contentAddressed",
"__impure",
"args",
]);
const specialAttrs = new Set(["args", "__ignoreNulls", "__contentAddressed", "__impure"]);
const sortedJsonStringify = (obj: Record<string, JsonValue>): string => {
const sortedKeys = Object.keys(obj).sort();
const sortedObj: Record<string, JsonValue> = {};
for (const key of sortedKeys) {
sortedObj[key] = obj[key];
}
return JSON.stringify(sortedObj);
};
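`sortedJsonStringify` above exists because `JSON.stringify` follows property insertion order; re-inserting keys in sorted order makes the `__json` env var byte-for-byte reproducible regardless of attribute evaluation order. A self-contained sketch of the same trick:

```typescript
// Re-insert keys in sorted order so JSON.stringify emits a canonical string.
const stableStringify = (obj: Record<string, unknown>): string => {
  const sorted: Record<string, unknown> = {};
  for (const key of Object.keys(obj).sort()) sorted[key] = obj[key];
  return JSON.stringify(sorted);
};

console.log(stableStringify({ b: 2, a: 1 })); // {"a":1,"b":2}
```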
const extractEnv = (
@@ -140,41 +219,73 @@ const extractEnv = (
structuredAttrs: boolean,
ignoreNulls: boolean,
outContext: NixStringContext,
drvName: string,
): Map<string, string> => {
const env = new Map<string, string>();
if (structuredAttrs) {
const jsonAttrs: Record<string, JsonValue> = {};
for (const [key, value] of attrs) {
if (!structuredAttrsExcludedKeys.has(key)) {
const forcedValue = force(value);
if (ignoreNulls && forcedValue === null) {
continue;
}
jsonAttrs[key] = nixValueToJson(value, true, outContext, true);
}
if (key === "allowedReferences") {
console.warn(
`In a derivation named '${drvName}', 'structuredAttrs' disables the effect of ` +
`the derivation attribute 'allowedReferences'; use ` +
`'outputChecks.<output>.allowedReferences' instead`,
);
}
if (key === "allowedRequisites") {
console.warn(
`In a derivation named '${drvName}', 'structuredAttrs' disables the effect of ` +
`the derivation attribute 'allowedRequisites'; use ` +
`'outputChecks.<output>.allowedRequisites' instead`,
);
}
if (key === "disallowedReferences") {
console.warn(
`In a derivation named '${drvName}', 'structuredAttrs' disables the effect of ` +
`the derivation attribute 'disallowedReferences'; use ` +
`'outputChecks.<output>.disallowedReferences' instead`,
);
}
if (key === "disallowedRequisites") {
console.warn(
`In a derivation named '${drvName}', 'structuredAttrs' disables the effect of ` +
`the derivation attribute 'disallowedRequisites'; use ` +
`'outputChecks.<output>.disallowedRequisites' instead`,
);
}
if (key === "maxSize") {
console.warn(
`In a derivation named '${drvName}', 'structuredAttrs' disables the effect of ` +
`the derivation attribute 'maxSize'; use ` +
`'outputChecks.<output>.maxSize' instead`,
);
}
if (key === "maxClosureSize") {
console.warn(
`In a derivation named '${drvName}', 'structuredAttrs' disables the effect of ` +
`the derivation attribute 'maxClosureSize'; use ` +
`'outputChecks.<output>.maxClosureSize' instead`,
);
}
}
env.set("__json", sortedJsonStringify(jsonAttrs));
} else {
for (const [key, value] of attrs) {
if (!specialAttrs.has(key)) {
const forcedValue = force(value as NixValue);
if (ignoreNulls && forcedValue === null) {
continue;
}
env.set(key, coerceToString(value as NixValue, StringCoercionMode.ToString, true, outContext));
}
}
}
interface FixedOutputInfo {
hashMode: string;
}
const extractFixedOutputInfo = (attrs: NixAttrs, ignoreNulls: boolean): FixedOutputInfo | null => {
if (!attrs.has("outputHash")) {
return null;
}
const hashValue = force(attrs.get("outputHash") as NixValue);
if (ignoreNulls && hashValue === null) {
return null;
}
const hashRaw = forceStringNoCtx(hashValue);
let hashAlgo = null;
if (attrs.has("outputHashAlgo")) {
const algoValue = force(attrs.get("outputHashAlgo") as NixValue);
if (!(ignoreNulls && algoValue === null)) {
hashAlgo = forceStringNoCtx(algoValue);
}
}
let hashMode = "flat";
if (attrs.has("outputHashMode")) {
const modeValue = force(attrs.get("outputHashMode") as NixValue);
if (!(ignoreNulls && modeValue === null)) {
hashMode = forceStringValue(modeValue);
}
}
if (hashMode !== "flat" && hashMode !== "recursive") {
throw new Error(`derivation: invalid outputHashMode '${hashMode}' (must be 'flat' or 'recursive')`);
}
const parsed = Deno.core.ops.op_parse_hash(hashRaw, hashAlgo);
return { hash: parsed.hex, hashAlgo: parsed.algo, hashMode };
};
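The `outputHashMode` handling above (default to `flat`, allow an explicit override, reject anything else) can be isolated into a small sketch; `resolveHashMode` is a hypothetical name, not part of this module:

```typescript
// Hypothetical stand-alone version of the outputHashMode handling:
// default, optional override, strict validation.
const resolveHashMode = (given?: string): "flat" | "recursive" => {
  const mode = given ?? "flat";
  if (mode !== "flat" && mode !== "recursive") {
    throw new Error(
      `derivation: invalid outputHashMode '${mode}' (must be 'flat' or 'recursive')`,
    );
  }
  return mode;
};
```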
const validateFixedOutputConstraints = (fixedOutput: FixedOutputInfo | null, outputs: string[]) => {
export const derivationStrict = (args: NixValue): NixAttrs => {
const builder = validateBuilder(attrs, collectedContext);
const platform = validateSystem(attrs);
const structuredAttrs = attrs.has("__structuredAttrs")
? force(attrs.get("__structuredAttrs") as NixValue) === true
: false;
const ignoreNulls = attrs.has("__ignoreNulls")
? force(attrs.get("__ignoreNulls") as NixValue) === true
: false;
const outputs = extractOutputs(attrs, structuredAttrs);
const fixedOutputInfo = extractFixedOutputInfo(attrs, ignoreNulls);
validateFixedOutputConstraints(fixedOutputInfo, outputs);
if (attrs.has("__contentAddressed") && force(attrs.get("__contentAddressed") as NixValue) === true) {
throw new Error("ca derivations are not supported");
}
if (attrs.has("__impure") && force(attrs.get("__impure") as NixValue) === true) {
throw new Error("impure derivations are not supported");
}
const drvArgs = extractArgs(attrs, collectedContext);
const env = extractEnv(attrs, structuredAttrs, ignoreNulls, collectedContext, drvName);
env.set("name", drvName);
env.set("builder", builder);
env.set("system", platform);
if (outputs.length > 1 || outputs[0] !== "out") {
env.set("outputs", outputs.join(" "));
}
const envEntries: [string, string][] = Array.from(env.entries());
const contextArray: string[] = Array.from(collectedContext);
const rustResult: {
drvPath: string;
outputs: [string, string][];
} = Deno.core.ops.op_finalize_derivation(
drvName,
builder,
platform,
outputs,
drvArgs,
envEntries,
contextArray,
fixedOutputInfo,
);
const result: NixAttrs = new Map();
const drvPathContext = new Set<string>();
addDrvDeepContext(drvPathContext, rustResult.drvPath);
result.set("drvPath", mkStringWithContext(rustResult.drvPath, drvPathContext));
for (const [outputName, outputPath] of rustResult.outputs) {
const outputContext = new Set<string>();
addBuiltContext(outputContext, rustResult.drvPath, outputName);
result.set(outputName, mkStringWithContext(outputPath, outputContext));
}
return result;
};
export const derivationStub = (_: NixValue): NixAttrs => {
throw new Error("unreachable: stub derivation implementation called");
};

import type { NixValue } from "../types";
export const getFlake = (_attrs: NixValue): never => {
throw new Error("Not implemented: getFlake");
};
export const parseFlakeName = (_s: NixValue): never => {
throw new Error("Not implemented: parseFlakeName");
};
export const parseFlakeRef = (_s: NixValue): never => {
throw new Error("Not implemented: parseFlakeRef");
};
export const flakeRefToString = (_attrs: NixValue): never => {
throw new Error("Not implemented: flakeRefToString");
};

/**
* Functional programming builtin functions
*/
import { printValue } from "../print";
import { force } from "../thunk";
import { CatchableError, type NixValue } from "../types";
import { coerceToString, StringCoercionMode } from "./conversion";
import { isAttrs } from "./type-check";
export const seq =
(e1: NixValue) =>
export const deepSeq =
(e1: NixValue) =>
(e2: NixValue): NixValue => {
const seen: Set<NixValue> = new Set();
const recurse = (e: NixValue) => {
if (!seen.has(e)) {
seen.add(e);
} else {
return;
}
const forced = force(e);
if (Array.isArray(forced)) {
for (const val of forced) {
recurse(val);
}
} else if (isAttrs(forced)) {
for (const [_, val] of forced.entries()) {
recurse(val);
}
}
};
recurse(e1);
return e2;
};
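The `seen` set above is what keeps `deepSeq` terminating on self-referential values. A minimal self-contained sketch of the same cycle-safe traversal over plain JS values (names here are illustrative, not the evaluator's types):

```typescript
// Plain stand-in for the evaluator's value type.
type Value = number | string | null | Value[] | { [k: string]: Value };

// Walk a value graph, visiting each container at most once even if cyclic.
// Returns how many containers were visited (for demonstration).
const deepForce = (root: Value): number => {
  const seen = new Set<Value>();
  const recurse = (v: Value): void => {
    if (v === null || typeof v !== "object") return;
    if (seen.has(v)) return; // already visited: this is what breaks cycles
    seen.add(v);
    const children = Array.isArray(v) ? v : Object.values(v);
    for (const child of children) recurse(child);
  };
  recurse(root);
  return seen.size;
};

// A self-referential list must not cause infinite recursion.
const cyclic: Value[] = [1, "two"];
cyclic.push(cyclic);
```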
export const abort = (s: NixValue): never => {
};
export const throwFunc = (s: NixValue): never => {
throw new CatchableError(coerceToString(s, StringCoercionMode.Base, false, new Set()));
};
export const trace =
(e1: NixValue) =>
(e2: NixValue): NixValue => {
console.error(`trace: ${printValue(force(e1))}`);
return e2;
};
export const warn =
(e1: NixValue) =>

import { select } from "../helpers";
import { forceAttrs, forceStringNoCtx, forceStringValue } from "../type-assert";
import type { NixValue } from "../types";
import { realisePath } from "./io";
export const hashFile =
(type: NixValue) =>
(p: NixValue): string => {
const algo = forceStringNoCtx(type);
const pathStr = realisePath(p);
return Deno.core.ops.op_hash_file(algo, pathStr);
};
export const hashString =
(type: NixValue) =>
(s: NixValue): string => {
const algo = forceStringNoCtx(type);
const data = forceStringValue(s);
return Deno.core.ops.op_hash_string(algo, data);
};
export const convertHash = (args: NixValue): string => {
const attrs = forceAttrs(args);
const hash = forceStringNoCtx(select(attrs, ["hash"]));
let hashAlgo: string | null = null;
if (attrs.has("hashAlgo")) {
hashAlgo = forceStringNoCtx(select(attrs, ["hashAlgo"]));
}
const toHashFormat = forceStringNoCtx(select(attrs, ["toHashFormat"]));
return Deno.core.ops.op_convert_hash(hash, hashAlgo, toHashFormat);
};

/**
* Main builtins export
* Combines all builtin function categories into the global `builtins` object
*/
import { createThunk, force } from "../thunk";
import type { NixAttrs, NixFunction, NixValue } from "../types";
import * as arithmetic from "./arithmetic";
import * as attrs from "./attrs";
import * as conversion from "./conversion";
import * as derivation from "./derivation";
import * as flake from "./flake";
import * as functional from "./functional";
import * as hash from "./hash";
import * as io from "./io";
import * as list from "./list";
import * as math from "./math";
import * as misc from "./misc";
import * as pathOps from "./path";
import * as string from "./string";
import * as typeCheck from "./type-check";
/**
* Symbol used to mark functions as primops (primitive operations)
* This is similar to IS_THUNK but for builtin functions
*/
export const PRIMOP_METADATA = Symbol("primop_metadata");
/**
* Metadata interface for primop functions
*/
export interface PrimopMetadata {
/** The name of the primop (e.g., "add", "map") */
name: string;
/** Total arity of the function (number of arguments it expects) */
arity: number;
/** Number of arguments already applied (for partial applications) */
applied: number;
}
/**
* Mark a function as a primop with metadata
* For curried functions, this recursively marks each layer
*
* @param func - The function to mark
* @param name - Name of the primop
* @param arity - Total number of arguments expected
* @param applied - Number of arguments already applied (default: 0)
* @returns The marked function
*/
export const mkPrimop = (
func: NixFunction,
name: string,
arity: number,
applied: number = 0,
): ((...args: NixValue[]) => NixValue) => {
func[PRIMOP_METADATA] = {
name,
arity,
applied,
} satisfies PrimopMetadata;
// If this is a curried function and not fully applied,
// wrap it to mark the next layer too
if (applied < arity - 1) {
const wrappedFunc: NixFunction = ((arg: NixValue) => {
const result = func(arg);
if (typeof result === "function") {
return mkPrimop(result, name, arity, applied + 1);
}
return result;
});
// Copy the primop metadata to the wrapper
wrappedFunc[PRIMOP_METADATA] = {
name,
arity,
applied,
};
return wrappedFunc;
}
return func;
};
/**
* Type guard to check if a value is a primop
* @param value - Value to check
* @returns true if value is marked as a primop
*/
export const isPrimop = (
value: NixValue,
): value is NixFunction & { [PRIMOP_METADATA]: PrimopMetadata } => {
return (
typeof value === "function" &&
PRIMOP_METADATA in value &&
);
};
/**
* Get primop metadata from a function
* @param func - Function to get metadata from
* @returns Metadata if function is a primop, undefined otherwise
*/
export const getPrimopMetadata = (func: NixValue): PrimopMetadata | undefined => {
if (isPrimop(func)) {
return func[PRIMOP_METADATA];
}
return undefined;
};
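A self-contained sketch of the currying-plus-symbol-metadata pattern that `mkPrimop` implements; the names (`tag`, `META`, `Fn`) are illustrative stand-ins, not this module's exports:

```typescript
const META = Symbol("primop_metadata");

interface Meta {
  name: string;
  arity: number;
  applied: number;
}

type Fn = ((arg: unknown) => unknown) & { [META]?: Meta };

// Tag each curried layer with its name/arity and how many args are applied,
// re-tagging the returned function after every partial application.
const tag = (fn: Fn, name: string, arity: number, applied = 0): Fn => {
  fn[META] = { name, arity, applied };
  if (applied < arity - 1) {
    const wrapped: Fn = (arg: unknown) => {
      const result = fn(arg);
      return typeof result === "function"
        ? tag(result as Fn, name, arity, applied + 1)
        : result;
    };
    wrapped[META] = { name, arity, applied };
    return wrapped;
  }
  return fn;
};

// A curried two-argument primop.
const add = tag(
  ((a: unknown) => (b: unknown) => (a as number) + (b as number)) as Fn,
  "add",
  2,
);
const add1 = add(1) as Fn; // partial application keeps metadata
```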
export const builtins: NixAttrs = new Map<string, NixValue>(
Object.entries({
add: mkPrimop(arithmetic.add, "add", 2),
sub: mkPrimop(arithmetic.sub, "sub", 2),
mul: mkPrimop(arithmetic.mul, "mul", 2),
div: mkPrimop(arithmetic.div, "div", 2),
bitAnd: mkPrimop(arithmetic.bitAnd, "bitAnd", 2),
bitOr: mkPrimop(arithmetic.bitOr, "bitOr", 2),
bitXor: mkPrimop(arithmetic.bitXor, "bitXor", 2),
lessThan: mkPrimop(arithmetic.lessThan, "lessThan", 2),
ceil: mkPrimop(math.ceil, "ceil", 1),
floor: mkPrimop(math.floor, "floor", 1),
isAttrs: mkPrimop((e: NixValue) => typeCheck.isAttrs(force(e)), "isAttrs", 1),
isBool: mkPrimop((e: NixValue) => typeCheck.isBool(force(e)), "isBool", 1),
isFloat: mkPrimop((e: NixValue) => typeCheck.isFloat(force(e)), "isFloat", 1),
isFunction: mkPrimop((e: NixValue) => typeCheck.isFunction(force(e)), "isFunction", 1),
isInt: mkPrimop((e: NixValue) => typeCheck.isInt(force(e)), "isInt", 1),
isList: mkPrimop((e: NixValue) => typeCheck.isList(force(e)), "isList", 1),
isNull: mkPrimop((e: NixValue) => typeCheck.isNull(force(e)), "isNull", 1),
isPath: mkPrimop((e: NixValue) => typeCheck.isPath(force(e)), "isPath", 1),
isString: mkPrimop((e: NixValue) => typeCheck.isString(force(e)), "isString", 1),
typeOf: mkPrimop((e: NixValue) => typeCheck.typeOf(force(e)), "typeOf", 1),
map: mkPrimop(list.map, "map", 2),
filter: mkPrimop(list.filter, "filter", 2),
length: mkPrimop(list.length, "length", 1),
head: mkPrimop(list.head, "head", 1),
tail: mkPrimop(list.tail, "tail", 1),
elem: mkPrimop(list.elem, "elem", 2),
elemAt: mkPrimop(list.elemAt, "elemAt", 2),
concatLists: mkPrimop(list.concatLists, "concatLists", 1),
concatMap: mkPrimop(list.concatMap, "concatMap", 2),
"foldl'": mkPrimop(list.foldlPrime, "foldl'", 3),
sort: mkPrimop(list.sort, "sort", 2),
partition: mkPrimop(list.partition, "partition", 2),
genList: mkPrimop(list.genList, "genList", 2),
all: mkPrimop(list.all, "all", 2),
any: mkPrimop(list.any, "any", 2),
attrNames: mkPrimop(attrs.attrNames, "attrNames", 1),
attrValues: mkPrimop(attrs.attrValues, "attrValues", 1),
getAttr: mkPrimop(attrs.getAttr, "getAttr", 2),
hasAttr: mkPrimop(attrs.hasAttr, "hasAttr", 2),
mapAttrs: mkPrimop(attrs.mapAttrs, "mapAttrs", 2),
removeAttrs: mkPrimop(attrs.removeAttrs, "removeAttrs", 2),
listToAttrs: mkPrimop(attrs.listToAttrs, "listToAttrs", 1),
intersectAttrs: mkPrimop(attrs.intersectAttrs, "intersectAttrs", 2),
catAttrs: mkPrimop(attrs.catAttrs, "catAttrs", 2),
groupBy: mkPrimop(attrs.groupBy, "groupBy", 2),
zipAttrsWith: mkPrimop(attrs.zipAttrsWith, "zipAttrsWith", 2),
unsafeGetAttrPos: mkPrimop(attrs.unsafeGetAttrPos, "unsafeGetAttrPos", 2),
stringLength: mkPrimop(string.stringLength, "stringLength", 1),
substring: mkPrimop(string.substring, "substring", 3),
concatStringsSep: mkPrimop(string.concatStringsSep, "concatStringsSep", 2),
baseNameOf: mkPrimop(pathOps.baseNameOf, "baseNameOf", 1),
dirOf: mkPrimop(pathOps.dirOf, "dirOf", 1),
toPath: mkPrimop(pathOps.toPath, "toPath", 1),
match: mkPrimop(string.match, "match", 2),
split: mkPrimop(string.split, "split", 2),
seq: mkPrimop(functional.seq, "seq", 2),
deepSeq: mkPrimop(functional.deepSeq, "deepSeq", 2),
abort: mkPrimop(functional.abort, "abort", 1),
throw: mkPrimop(functional.throwFunc, "throw", 1),
trace: mkPrimop(functional.trace, "trace", 2),
warn: mkPrimop(functional.warn, "warn", 2),
break: mkPrimop(functional.breakFunc, "break", 1),
derivation: mkPrimop(derivation.derivationStub, "derivation", 1),
derivationStrict: mkPrimop(derivation.derivationStrict, "derivationStrict", 1),
import: mkPrimop(io.importFunc, "import", 1),
scopedImport: mkPrimop(io.scopedImport, "scopedImport", 2),
storePath: mkPrimop(io.storePath, "storePath", 1),
fetchClosure: mkPrimop(io.fetchClosure, "fetchClosure", 1),
fetchMercurial: mkPrimop(io.fetchMercurial, "fetchMercurial", 1),
fetchGit: mkPrimop(io.fetchGit, "fetchGit", 1),
fetchTarball: mkPrimop(io.fetchTarball, "fetchTarball", 1),
fetchTree: mkPrimop(io.fetchTree, "fetchTree", 1),
fetchurl: mkPrimop(io.fetchurl, "fetchurl", 1),
readDir: mkPrimop(io.readDir, "readDir", 1),
readFile: mkPrimop(io.readFile, "readFile", 1),
readFileType: mkPrimop(io.readFileType, "readFileType", 1),
pathExists: mkPrimop(io.pathExists, "pathExists", 1),
path: mkPrimop(io.path, "path", 1),
toFile: mkPrimop(io.toFile, "toFile", 2),
filterSource: mkPrimop(io.filterSource, "filterSource", 2),
findFile: mkPrimop(io.findFile, "findFile", 2),
getEnv: mkPrimop(io.getEnv, "getEnv", 1),
fromJSON: mkPrimop(conversion.fromJSON, "fromJSON", 1),
fromTOML: mkPrimop(conversion.fromTOML, "fromTOML", 1),
toJSON: mkPrimop(conversion.toJSON, "toJSON", 1),
toXML: mkPrimop(conversion.toXML, "toXML", 1),
toString: mkPrimop(conversion.toStringFunc, "toString", 1),
hashFile: mkPrimop(hash.hashFile, "hashFile", 2),
hashString: mkPrimop(hash.hashString, "hashString", 2),
convertHash: mkPrimop(hash.convertHash, "convertHash", 2),
flakeRefToString: mkPrimop(flake.flakeRefToString, "flakeRefToString", 1),
getFlake: mkPrimop(flake.getFlake, "getFlake", 1),
parseFlakeName: mkPrimop(flake.parseFlakeName, "parseFlakeName", 1),
parseFlakeRef: mkPrimop(flake.parseFlakeRef, "parseFlakeRef", 1),
addErrorContext: mkPrimop(misc.addErrorContext, "addErrorContext", 1),
appendContext: mkPrimop(misc.appendContext, "appendContext", 1),
getContext: mkPrimop(misc.getContext, "getContext", 1),
hasContext: mkPrimop(misc.hasContext, "hasContext", 1),
unsafeDiscardOutputDependency: mkPrimop(
misc.unsafeDiscardOutputDependency,
"unsafeDiscardOutputDependency",
1,
),
unsafeDiscardStringContext: mkPrimop(misc.unsafeDiscardStringContext, "unsafeDiscardStringContext", 1),
addDrvOutputDependencies: mkPrimop(misc.addDrvOutputDependencies, "addDrvOutputDependencies", 2),
compareVersions: mkPrimop(misc.compareVersions, "compareVersions", 2),
functionArgs: mkPrimop(misc.functionArgs, "functionArgs", 1),
genericClosure: mkPrimop(misc.genericClosure, "genericClosure", 1),
outputOf: mkPrimop(misc.outputOf, "outputOf", 2),
parseDrvName: mkPrimop(misc.parseDrvName, "parseDrvName", 1),
placeholder: mkPrimop(misc.placeholder, "placeholder", 1),
replaceStrings: mkPrimop(misc.replaceStrings, "replaceStrings", 3),
splitVersion: mkPrimop(misc.splitVersion, "splitVersion", 1),
traceVerbose: mkPrimop(misc.traceVerbose, "traceVerbose", 2),
tryEval: mkPrimop(misc.tryEval, "tryEval", 1),
builtins: createThunk(() => builtins, "builtins"),
currentSystem: createThunk(() => {
return "x86_64-linux";
}, "currentSystem"),
currentTime: createThunk(() => Date.now(), "currentTime"),
false: false,
true: true,
null: null,
langVersion: 6,
nixPath: [],
nixVersion: "2.31.2",
storeDir: createThunk(() => {
throw new Error("stub storeDir evaluated");
}),
}),
);

/**
* I/O and filesystem builtin functions
* Implemented via Rust ops exposed through deno_core
*/
import { select } from "../helpers";
import { getPathValue } from "../path";
import type { NixStringContext, StringWithContext } from "../string-context";
import { addOpaqueContext, decodeContextElem, mkStringWithContext } from "../string-context";
import { force } from "../thunk";
import {
forceAttrs,
forceBool,
forceFunction,
forceList,
forceStringNoCtx,
forceStringValue,
} from "../type-assert";
import type { NixAttrs, NixString, NixValue } from "../types";
import { CatchableError, isNixPath, NixPath } from "../types";
import { coerceToPath, coerceToString, StringCoercionMode } from "./conversion";
import { baseNameOf } from "./path";
import { isAttrs, isPath, isString } from "./type-check";
import { execBytecode, execBytecodeScoped } from "../vm";
const importCache = new Map<string, NixValue>();
const realiseContext = (context: NixStringContext): void => {
for (const encoded of context) {
const elem = decodeContextElem(encoded);
if (elem.type === "built") {
throw new Error(
`cannot build derivation '${elem.drvPath}' during evaluation because import-from-derivation is not supported`,
);
}
}
};
export const realisePath = (value: NixValue): string => {
const context: NixStringContext = new Set();
const pathStr = coerceToPath(value, context);
if (context.size > 0) {
realiseContext(context);
}
return pathStr;
};
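`realisePath` coerces its argument to a path and then rejects any string context that would require building a derivation first (import-from-derivation). The guard itself is just a scan over decoded context elements; a hypothetical stand-alone version:

```typescript
// Hypothetical decoded context element: "opaque" entries are plain store
// paths, "built" entries are outputs of a derivation that is not built yet.
type ContextElem =
  | { type: "opaque"; path: string }
  | { type: "built"; drvPath: string; output: string };

// Reject any context that would force a build during evaluation (IFD).
const assertRealisable = (context: ContextElem[]): void => {
  for (const elem of context) {
    if (elem.type === "built") {
      throw new Error(
        `cannot build derivation '${elem.drvPath}' during evaluation because import-from-derivation is not supported`,
      );
    }
  }
};
```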
export const importFunc = (path: NixValue): NixValue => {
const pathStr = realisePath(path);
const cached = importCache.get(pathStr);
if (cached !== undefined) {
return cached;
}
const [code, currentDir] = Deno.core.ops.op_import(pathStr);
const result = execBytecode(code, currentDir);
importCache.set(pathStr, result);
return result;
};
export const scopedImport =
(scope: NixValue) =>
(path: NixValue): NixValue => {
const scopeAttrs = forceAttrs(scope);
const scopeKeys = Array.from(scopeAttrs.keys());
const pathStr = realisePath(path);
const [code, currentDir] = Deno.core.ops.op_scoped_import(pathStr, scopeKeys);
return execBytecodeScoped(code, currentDir, scopeAttrs);
};
export const storePath = (pathArg: NixValue): StringWithContext => {
const context: NixStringContext = new Set();
const pathStr = coerceToPath(pathArg, context);
const validatedPath: string = Deno.core.ops.op_store_path(pathStr);
context.add(validatedPath);
return mkStringWithContext(validatedPath, context);
};
export const fetchClosure = (_args: NixValue): never => {
throw new Error("Not implemented: fetchClosure");
};
export interface FetchUrlResult {
storePath: string;
hash: string;
}
export interface FetchTarballResult {
storePath: string;
narHash: string;
}
export interface FetchGitResult {
outPath: string;
rev: string;
shortRev: string;
revCount: number;
lastModified: number;
lastModifiedDate: string;
submodules: boolean;
narHash: string | null;
}
const normalizeUrlInput = (
args: NixValue,
): { url: string; hash?: string; name?: string; executable?: boolean } => {
const forced = force(args);
if (typeof forced === "string") {
return { url: forced };
}
const attrs = forceAttrs(args);
const url = forceStringValue(select(attrs, ["url"]));
const hash = attrs.has("sha256")
? forceStringValue(attrs.get("sha256") as NixValue)
: attrs.has("hash")
? forceStringValue(attrs.get("hash") as NixValue)
: undefined;
const name = attrs.has("name") ? forceStringValue(attrs.get("name") as NixValue) : undefined;
const executable = attrs.has("executable") ? forceBool(attrs.get("executable") as NixValue) : false;
return { url, hash, name, executable };
};
const normalizeTarballInput = (args: NixValue): { url: string; sha256?: string; name?: string } => {
const forced = force(args);
if (isAttrs(forced)) {
const url = resolvePseudoUrl(forceStringNoCtx(select(forced, ["url"])));
const sha256 = forced.has("sha256") ? forceStringNoCtx(forced.get("sha256") as NixValue) : undefined;
const nameRaw = forced.has("name") ? forceStringNoCtx(forced.get("name") as NixValue) : undefined;
const name = nameRaw === "" ? (baseNameOf(nameRaw) as string) : nameRaw;
return { url, sha256, name };
} else {
return { url: forceStringNoCtx(forced) };
}
};
const resolvePseudoUrl = (url: string) => {
if (url.startsWith("channel:")) {
return `https://channels.nixos.org/${url.substring(8)}/nixexprs.tar.xz`;
} else {
return url;
}
};
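The `channel:` rewrite above is pure string manipulation and can be exercised in isolation. A standalone sketch (hypothetical name `expandChannelUrl`, no evaluator types) behaves like this:

```typescript
// Standalone sketch of the channel: pseudo-URL expansion above.
// `expandChannelUrl` is a hypothetical name used only for illustration.
const expandChannelUrl = (url: string): string =>
  url.startsWith("channel:")
    ? `https://channels.nixos.org/${url.substring(8)}/nixexprs.tar.xz`
    : url;

console.log(expandChannelUrl("channel:nixos-unstable"));
// → https://channels.nixos.org/nixos-unstable/nixexprs.tar.xz
```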
export const fetchurl = (args: NixValue): NixString => {
const { url, hash, name, executable } = normalizeUrlInput(args);
const result: FetchUrlResult = Deno.core.ops.op_fetch_url(
url,
hash ?? null,
name ?? null,
executable ?? false,
);
const context: NixStringContext = new Set();
addOpaqueContext(context, result.storePath);
return mkStringWithContext(result.storePath, context);
};
export const fetchTarball = (args: NixValue): NixString => {
const { url, name, sha256 } = normalizeTarballInput(args);
const result: FetchTarballResult = Deno.core.ops.op_fetch_tarball(url, name ?? null, sha256 ?? null);
const context: NixStringContext = new Set();
addOpaqueContext(context, result.storePath);
return mkStringWithContext(result.storePath, context);
};
export const fetchGit = (args: NixValue): NixAttrs => {
const forced = force(args);
const disposedContext: NixStringContext = new Set();
if (isString(forced) || isPath(forced)) {
const url = coerceToString(forced, StringCoercionMode.Base, false, disposedContext);
const result = Deno.core.ops.op_fetch_git(url, null, null, false, false, false, null);
const outContext: NixStringContext = new Set();
addOpaqueContext(outContext, result.outPath);
return new Map<string, NixValue>([
["outPath", mkStringWithContext(result.outPath, outContext)],
["rev", result.rev],
["shortRev", result.shortRev],
["revCount", BigInt(result.revCount)],
["lastModified", BigInt(result.lastModified)],
["lastModifiedDate", result.lastModifiedDate],
["submodules", result.submodules],
["narHash", result.narHash],
]);
}
const attrs = forceAttrs(args);
const url = forceStringValue(select(attrs, ["url"]));
const gitRef = attrs.has("ref") ? forceStringValue(attrs.get("ref") as NixValue) : null;
const rev = attrs.has("rev") ? forceStringValue(attrs.get("rev") as NixValue) : null;
const shallow = attrs.has("shallow") ? forceBool(attrs.get("shallow") as NixValue) : false;
const submodules = attrs.has("submodules") ? forceBool(attrs.get("submodules") as NixValue) : false;
const allRefs = attrs.has("allRefs") ? forceBool(attrs.get("allRefs") as NixValue) : false;
const name = attrs.has("name") ? forceStringValue(attrs.get("name") as NixValue) : null;
const result: FetchGitResult = Deno.core.ops.op_fetch_git(
url,
gitRef,
rev,
shallow,
submodules,
allRefs,
name,
);
const outContext: NixStringContext = new Set();
addOpaqueContext(outContext, result.outPath);
return new Map<string, NixValue>([
["outPath", mkStringWithContext(result.outPath, outContext)],
["rev", result.rev],
["shortRev", result.shortRev],
["revCount", BigInt(result.revCount)],
["lastModified", BigInt(result.lastModified)],
["lastModifiedDate", result.lastModifiedDate],
["submodules", result.submodules],
["narHash", result.narHash],
]);
};
export const fetchMercurial = (_args: NixValue): NixAttrs => {
throw new Error("Not implemented: fetchMercurial");
};
export const fetchTree = (args: NixValue): NixAttrs => {
const attrs = forceAttrs(args);
const type = attrs.has("type") ? forceStringValue(attrs.get("type") as NixValue) : "auto";
switch (type) {
case "git":
return fetchGit(args);
case "hg":
case "mercurial":
return fetchMercurial(args);
case "tarball":
return new Map<string, NixValue>([["outPath", fetchTarball(args)]]);
case "file":
return new Map<string, NixValue>([["outPath", fetchurl(args)]]);
case "path": {
const path = forceStringValue(select(attrs, ["path"]));
return new Map<string, NixValue>([["outPath", path]]);
}
case "github":
case "gitlab":
case "sourcehut":
return fetchGitForge(type, attrs);
default:
return autoDetectAndFetch(attrs);
}
};
const fetchGitForge = (forge: string, attrs: NixAttrs): NixAttrs => {
const owner = forceStringValue(select(attrs, ["owner"]));
const repo = forceStringValue(select(attrs, ["repo"]));
const rev = attrs.has("rev")
? forceStringValue(attrs.get("rev") as NixValue)
: attrs.has("ref")
? forceStringValue(attrs.get("ref") as NixValue)
: "HEAD";
const host = attrs.has("host") ? forceStringValue(attrs.get("host") as NixValue) : undefined;
let tarballUrl: string;
switch (forge) {
case "github": {
const apiHost = host || "github.com";
tarballUrl = `https://api.${apiHost}/repos/${owner}/${repo}/tarball/${rev}`;
break;
}
case "gitlab": {
const glHost = host || "gitlab.com";
tarballUrl = `https://${glHost}/api/v4/projects/${owner}%2F${repo}/repository/archive.tar.gz?sha=${rev}`;
break;
}
case "sourcehut": {
const shHost = host || "git.sr.ht";
tarballUrl = `https://${shHost}/${owner}/${repo}/archive/${rev}.tar.gz`;
break;
}
default:
throw new Error(`Unknown forge type: ${forge}`);
}
const outPath = fetchTarball(new Map<string, NixValue>([...attrs, ["url", tarballUrl]]));
return new Map<string, NixValue>([
["outPath", outPath],
["rev", rev],
["shortRev", rev.substring(0, 7)],
]);
};
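The three URL templates above can be sketched as a pure function (hypothetical helper name `forgeTarballUrl`; the host defaults mirror the switch cases):

```typescript
// Pure sketch of fetchGitForge's tarball URL construction; `forgeTarballUrl`
// is a hypothetical name, not part of the codebase above.
const forgeTarballUrl = (
  forge: "github" | "gitlab" | "sourcehut",
  owner: string,
  repo: string,
  rev: string,
  host?: string,
): string => {
  switch (forge) {
    case "github":
      return `https://api.${host || "github.com"}/repos/${owner}/${repo}/tarball/${rev}`;
    case "gitlab":
      // GitLab wants the project path URL-encoded (owner%2Frepo)
      return `https://${host || "gitlab.com"}/api/v4/projects/${owner}%2F${repo}/repository/archive.tar.gz?sha=${rev}`;
    case "sourcehut":
      return `https://${host || "git.sr.ht"}/${owner}/${repo}/archive/${rev}.tar.gz`;
  }
};

console.log(forgeTarballUrl("github", "NixOS", "nixpkgs", "HEAD"));
// → https://api.github.com/repos/NixOS/nixpkgs/tarball/HEAD
```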
const autoDetectAndFetch = (attrs: NixAttrs): NixAttrs => {
const url = forceStringValue(select(attrs, ["url"]));
if (url.endsWith(".git") || url.includes("github.com") || url.includes("gitlab.com")) {
return fetchGit(attrs);
}
if (
url.endsWith(".tar.gz") ||
url.endsWith(".tar.xz") ||
url.endsWith(".tar.bz2") ||
url.endsWith(".tgz")
) {
return new Map<string, NixValue>([["outPath", fetchTarball(attrs)]]);
}
return new Map<string, NixValue>([["outPath", fetchurl(attrs)]]);
};
export const readDir = (path: NixValue): NixAttrs => {
const pathStr = realisePath(path);
return Deno.core.ops.op_read_dir(pathStr);
};
export const readFile = (path: NixValue): string => {
const pathStr = realisePath(path);
return Deno.core.ops.op_read_file(pathStr);
};
export const readFileType = (path: NixValue): string => {
const pathStr = realisePath(path);
return Deno.core.ops.op_read_file_type(pathStr);
};
export const pathExists = (path: NixValue): boolean => {
try {
const pathStr = realisePath(path);
return Deno.core.ops.op_path_exists(pathStr);
} catch {
return false;
}
};
/**
* builtins.path
* Add a path to the Nix store with fine-grained control
*
* Parameters (attribute set):
* - path (required): Path to add to the store
* - name (optional): Name to use in store path (defaults to basename)
* - filter (optional): Function (path, type) -> bool
* - recursive (optional): Boolean, default true (NAR vs flat hashing)
* - sha256 (optional): Expected SHA-256 hash (hex-encoded)
*
* Returns: Store path string
*/
export const path = (args: NixValue): NixString => {
const attrs = forceAttrs(args);
if (!attrs.has("path")) {
throw new TypeError("builtins.path: 'path' attribute is required");
}
const pathValue = force(attrs.get("path") as NixValue);
let pathStr: string;
if (isNixPath(pathValue)) {
pathStr = getPathValue(pathValue);
} else {
pathStr = forceStringValue(pathValue);
}
const name = attrs.has("name") ? forceStringValue(attrs.get("name") as NixValue) : null;
const recursive = attrs.has("recursive") ? forceBool(attrs.get("recursive") as NixValue) : true;
const sha256 = attrs.has("sha256") ? forceStringValue(attrs.get("sha256") as NixValue) : null;
let storePath: string;
if (attrs.has("filter")) {
const filterFn = forceFunction(attrs.get("filter") as NixValue);
const entries: [string, string][] = Deno.core.ops.op_walk_dir(pathStr);
const includePaths: string[] = [];
for (const [relPath, fileType] of entries) {
const fullPath = `${pathStr}/${relPath}`;
const innerFn = forceFunction(filterFn(fullPath));
const shouldInclude = force(innerFn(fileType));
if (shouldInclude === true) {
includePaths.push(relPath);
}
}
storePath = Deno.core.ops.op_add_filtered_path(pathStr, name, recursive, sha256, includePaths);
} else {
storePath = Deno.core.ops.op_add_path(pathStr, name, recursive, sha256);
}
const context: NixStringContext = new Set();
addOpaqueContext(context, storePath);
return mkStringWithContext(storePath, context);
};
export const toFile =
(nameArg: NixValue) =>
(contentsArg: NixValue): StringWithContext => {
const name = forceStringValue(nameArg);
if (name.includes("/")) {
throw new Error("builtins.toFile: name cannot contain '/'");
}
if (name === "." || name === "..") {
throw new Error("builtins.toFile: invalid name");
}
const context: NixStringContext = new Set();
const contents = coerceToString(contentsArg, StringCoercionMode.ToString, false, context);
const references: string[] = Array.from(context);
const storePath: string = Deno.core.ops.op_to_file(name, contents, references);
return mkStringWithContext(storePath, new Set([storePath]));
};
export const filterSource =
(_filter: NixValue) =>
(_path: NixValue): never => {
throw new Error("Not implemented: filterSource");
};
const suffixIfPotentialMatch = (prefix: string, path: string): string | null => {
const n = prefix.length;
const needSeparator = n > 0 && n < path.length;
if (needSeparator && path[n] !== "/") {
return null;
}
if (!path.startsWith(prefix)) {
return null;
}
return needSeparator ? path.substring(n + 1) : path.substring(n);
};
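The helper above only matches a prefix on a path-component boundary. A standalone copy (hypothetical name `suffixIfMatch`) makes the boundary rule concrete:

```typescript
// Standalone copy of suffixIfPotentialMatch: a prefix only matches on a
// path-component boundary, so "nix" does not match "nixpkgs/lib".
const suffixIfMatch = (prefix: string, path: string): string | null => {
  const n = prefix.length;
  const needSeparator = n > 0 && n < path.length;
  if (needSeparator && path[n] !== "/") return null;
  if (!path.startsWith(prefix)) return null;
  return needSeparator ? path.substring(n + 1) : path.substring(n);
};

// suffixIfMatch("nixpkgs", "nixpkgs/lib") → "lib"
// suffixIfMatch("nix", "nixpkgs/lib")     → null (cut mid-component)
// suffixIfMatch("", "nixpkgs/lib")        → "nixpkgs/lib" (empty prefix matches all)
```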
export const findFile =
(searchPath: NixValue) =>
(lookupPath: NixValue): NixPath => {
const forcedSearchPath = forceList(searchPath);
const lookupPathStr = forceStringNoCtx(lookupPath);
for (const item of forcedSearchPath) {
const attrs = forceAttrs(item);
const prefix = attrs.has("prefix") ? forceStringNoCtx(attrs.get("prefix") as NixValue) : "";
if (!attrs.has("path")) {
throw new Error("findFile: search path element is missing 'path' attribute");
}
const suffix = suffixIfPotentialMatch(prefix, lookupPathStr);
if (suffix === null) {
continue;
}
const context: NixStringContext = new Set();
const pathVal = coerceToString(
attrs.get("path") as NixValue,
StringCoercionMode.Interpolation,
false,
context,
);
if (context.size > 0) {
throw new Error("findFile: path with string context is not yet supported");
}
const resolvedPath = Deno.core.ops.op_resolve_path(pathVal, "");
const candidatePath =
suffix.length > 0 ? Deno.core.ops.op_resolve_path(suffix, resolvedPath) : resolvedPath;
if (Deno.core.ops.op_path_exists(candidatePath)) {
return new NixPath(candidatePath);
}
}
if (lookupPathStr.startsWith("nix/")) {
// FIXME: special path type
return new NixPath(`<${lookupPathStr}>`);
}
throw new CatchableError(`file '${lookupPathStr}' was not found in the Nix search path`);
};
export const getEnv = (s: NixValue): string => {
return Deno.core.ops.op_get_env(forceStringValue(s));
};

/**
* List operation builtin functions
* All functions are properly curried
*/
import { op } from "../operators";
import { force } from "../thunk";
import { forceBool, forceFunction, forceInt, forceList } from "../type-assert";
import type { NixAttrs, NixList, NixValue } from "../types";
export const map =
(f: NixValue) =>
(list: NixValue): NixList => {
const forcedList = forceList(list);
if (forcedList.length) {
const func = forceFunction(f);
return forcedList.map(func);
}
return [];
};
export const filter =
(f: NixValue) =>
(list: NixValue): NixList => {
const forcedList = forceList(list);
if (forcedList.length) {
const func = forceFunction(f);
return forcedList.filter((e) => forceBool(func(e)));
}
return [];
};
export const length = (e: NixValue): bigint => {
const forced = force(e);
export const tail = (list: NixValue): NixList => forceList(list).slice(1);
export const elem =
(x: NixValue) =>
(xs: NixValue): boolean =>
forceList(xs).some((e) => op.eq(x, e));
export const elemAt =
(xs: NixValue) =>
};
export const foldlPrime =
(opFn: NixValue) =>
(nul: NixValue) =>
(list: NixValue): NixValue => {
const forcedOp = forceFunction(opFn);
return forceList(list).reduce((acc: NixValue, cur: NixValue) => {
return forceFunction(forcedOp(acc))(cur);
}, nul);
};
export const sort =
(cmp: NixValue) =>
(list: NixValue): NixList => {
const forcedList = [...forceList(list)];
const forcedCmp = forceFunction(cmp);
return forcedList.sort((a, b) => {
if (force(forceFunction(forcedCmp(a))(b))) return -1;
if (force(forceFunction(forcedCmp(b))(a))) return 1;
return 0;
});
};
export const partition =
(pred: NixValue) =>
(list: NixValue): NixAttrs => {
const forcedList = forceList(list);
const forcedPred = forceFunction(pred);
const right: NixList = [];
const wrong: NixList = [];
for (const elem of forcedList) {
if (force(forcedPred(elem))) {
right.push(elem);
} else {
wrong.push(elem);
}
}
return new Map<string, NixValue>([
["right", right],
["wrong", wrong],
]);
};
export const genList =
export const all =
(pred: NixValue) =>
(list: NixValue): boolean => {
const forcedList = forceList(list);
if (forcedList.length) {
const f = forceFunction(pred);
return forcedList.every((e) => forceBool(f(e)));
}
return true;
};
export const any =
(pred: NixValue) =>
(list: NixValue): boolean => {
// CppNix forces `pred` eagerly
const f = forceFunction(pred);
const forcedList = forceList(list);
// `false` when no element
return forcedList.some((e) => forceBool(f(e)));
};

/**
* Math builtin functions
*/
import { forceNumeric } from "../type-assert";
import type { NixValue } from "../types";
export const ceil = (x: NixValue): bigint => {
const val = forceNumeric(x);

/**
* Miscellaneous builtin functions
*/
import { OrderedSet } from "js-sdsl";
import { select } from "../helpers";
import { compareValues } from "../operators";
import {
getStringContext,
getStringValue,
mkStringWithContext,
type NixStringContext,
} from "../string-context";
import { force } from "../thunk";
import {
forceAttrs,
forceFunction,
forceList,
forceString,
forceStringNoCtx,
forceStringValue,
} from "../type-assert";
import type { NixAttrs, NixStrictValue, NixValue } from "../types";
import { ATTR_POSITIONS, CatchableError } from "../types";
import * as context from "./context";
import { isBool, isFloat, isInt, isList, isString, typeOf } from "./type-check";
export const addErrorContext =
(_e1: NixValue) =>
(e2: NixValue): NixValue => {
// FIXME:
// console.log("[WARNING]: addErrorContext not implemented");
return e2;
};
export const appendContext = context.appendContext;
export const getContext = context.getContext;
export const hasContext = context.hasContext;
export const hashFile =
(type: NixValue) =>
(p: NixValue): never => {
throw new Error("Not implemented: hashFile");
};
export const hashString =
(type: NixValue) =>
(p: NixValue): never => {
throw new Error("Not implemented: hashString");
};
export const convertHash = (args: NixValue): never => {
throw new Error("Not implemented: convertHash");
};
export const unsafeDiscardOutputDependency = context.unsafeDiscardOutputDependency;
export const unsafeDiscardStringContext = context.unsafeDiscardStringContext;
export const unsafeGetAttrPos = (s: NixValue): never => {
throw new Error("Not implemented: unsafeGetAttrPos");
};
export const addDrvOutputDependencies = context.addDrvOutputDependencies;
export const compareVersions =
(s1: NixValue) =>
(s2: NixValue): NixValue => {
const str1 = forceStringValue(s1);
const str2 = forceStringValue(s2);
let i1 = 0;
let i2 = 0;
while (i1 < str1.length || i2 < str2.length) {
const c1 = nextComponent(str1, i1);
const c2 = nextComponent(str2, i2);
i1 = c1.nextIndex;
i2 = c2.nextIndex;
if (componentsLt(c1.component, c2.component)) {
return -1n;
} else if (componentsLt(c2.component, c1.component)) {
return 1n;
}
}
return 0n;
};
interface ComponentResult {
component: string;
nextIndex: number;
}
function nextComponent(s: string, startIdx: number): ComponentResult {
let p = startIdx;
// Skip any dots and dashes (component separators)
while (p < s.length && (s[p] === "." || s[p] === "-")) {
p++;
}
if (p >= s.length) {
return { component: "", nextIndex: p };
}
const start = p;
// If the first character is a digit, consume the longest sequence of digits
if (s[p] >= "0" && s[p] <= "9") {
while (p < s.length && s[p] >= "0" && s[p] <= "9") {
p++;
}
} else {
// Otherwise, consume the longest sequence of non-digit, non-separator characters
while (p < s.length && !(s[p] >= "0" && s[p] <= "9") && s[p] !== "." && s[p] !== "-") {
p++;
}
}
return { component: s.substring(start, p), nextIndex: p };
}
function componentsLt(c1: string, c2: string): boolean {
const n1 = c1.match(/^[0-9]+$/) ? BigInt(c1) : null;
const n2 = c2.match(/^[0-9]+$/) ? BigInt(c2) : null;
// Both are numbers: compare numerically
if (n1 !== null && n2 !== null) {
return n1 < n2;
}
// Empty string < number
if (c1 === "" && n2 !== null) {
return true;
}
// Special case: "pre" comes before everything except another "pre"
if (c1 === "pre" && c2 !== "pre") {
return true;
}
if (c2 === "pre") {
return false;
}
// Assume that `2.3a' < `2.3.1'
if (n2 !== null) {
return true;
}
if (n1 !== null) {
return false;
}
// Both are strings: compare lexicographically
return c1 < c2;
}
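Taken together, `nextComponent` and `componentsLt` implement Nix's version ordering. A self-contained sketch (plain strings, hypothetical names `nextComp`/`compLt`/`cmpVersions`) reproduces the orderings the comments promise, including `2.3a < 2.3.1` and `pre` sorting before releases:

```typescript
// Self-contained copy of the component-based version comparison above,
// operating on plain strings instead of NixValue wrappers.
interface Comp {
  component: string;
  nextIndex: number;
}
const nextComp = (s: string, i: number): Comp => {
  let p = i;
  while (p < s.length && (s[p] === "." || s[p] === "-")) p++; // skip separators
  if (p >= s.length) return { component: "", nextIndex: p };
  const start = p;
  const isDigit = (c: string) => c >= "0" && c <= "9";
  if (isDigit(s[p])) {
    while (p < s.length && isDigit(s[p])) p++; // longest digit run
  } else {
    while (p < s.length && !isDigit(s[p]) && s[p] !== "." && s[p] !== "-") p++;
  }
  return { component: s.substring(start, p), nextIndex: p };
};
const compLt = (c1: string, c2: string): boolean => {
  const n1 = /^[0-9]+$/.test(c1) ? BigInt(c1) : null;
  const n2 = /^[0-9]+$/.test(c2) ? BigInt(c2) : null;
  if (n1 !== null && n2 !== null) return n1 < n2;
  if (c1 === "" && n2 !== null) return true;
  if (c1 === "pre" && c2 !== "pre") return true;
  if (c2 === "pre") return false;
  if (n2 !== null) return true; // "2.3a" < "2.3.1"
  if (n1 !== null) return false;
  return c1 < c2;
};
const cmpVersions = (a: string, b: string): number => {
  let i1 = 0;
  let i2 = 0;
  while (i1 < a.length || i2 < b.length) {
    const c1 = nextComp(a, i1);
    const c2 = nextComp(b, i2);
    i1 = c1.nextIndex;
    i2 = c2.nextIndex;
    if (compLt(c1.component, c2.component)) return -1;
    if (compLt(c2.component, c1.component)) return 1;
  }
  return 0;
};

// cmpVersions("2.3a", "2.3.1")  → -1
// cmpVersions("2.5pre1", "2.5") → -1
```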
export const functionArgs = (f: NixValue): NixAttrs => {
const func = forceFunction(f);
if (func.args) {
const ret: NixAttrs = new Map();
for (const key of func.args.required) {
ret.set(key, false);
}
for (const key of func.args.optional) {
ret.set(key, true);
}
const positions = func.args.positions;
if (positions) {
ret[ATTR_POSITIONS] = positions;
}
return ret;
}
return new Map();
};
const checkComparable = (value: NixStrictValue): void => {
if (isString(value) || isInt(value) || isFloat(value) || isBool(value) || isList(value)) {
return;
}
throw new Error(`Unsupported key type for genericClosure: ${typeOf(value)}`);
};
export const genericClosure = (args: NixValue): NixValue => {
const forcedArgs = forceAttrs(args);
const startSet = select(forcedArgs, ["startSet"]);
const operator = select(forcedArgs, ["operator"]);
const initialList = forceList(startSet);
const opFunction = forceFunction(operator);
const resultSet = new OrderedSet<NixStrictValue>(undefined, compareValues);
const resultList: NixStrictValue[] = [];
const queue: NixStrictValue[] = [];
for (const item of initialList) {
const itemAttrs = forceAttrs(item);
const key = force(select(itemAttrs, ["key"]));
checkComparable(key);
if (resultSet.find(key).equals(resultSet.end())) {
resultSet.insert(key);
resultList.push(itemAttrs);
queue.push(itemAttrs);
}
}
let head = 0;
while (head < queue.length) {
const currentItem = queue[head++];
const newItems = forceList(opFunction(currentItem));
for (const newItem of newItems) {
const newItemAttrs = forceAttrs(newItem);
const key = force(select(newItemAttrs, ["key"]));
checkComparable(key);
if (resultSet.find(key).equals(resultSet.end())) {
resultSet.insert(key);
resultList.push(newItemAttrs);
queue.push(newItemAttrs);
}
}
}
return resultList;
};
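The worklist traversal above can be sketched with numeric keys deduplicated by a plain `Set` (the real code uses js-sdsl's `OrderedSet` with the evaluator's `compareValues` to handle arbitrary comparable keys). `closureOf` is a hypothetical name for illustration:

```typescript
// Sketch of genericClosure's breadth-first worklist, specialised to items
// with numeric keys. Each key is visited at most once.
interface Item {
  key: number;
  [extra: string]: unknown;
}
const closureOf = (startSet: Item[], operator: (i: Item) => Item[]): Item[] => {
  const seen = new Set<number>();
  const result: Item[] = [];
  const queue: Item[] = [];
  for (const item of startSet) {
    if (!seen.has(item.key)) {
      seen.add(item.key);
      result.push(item);
      queue.push(item);
    }
  }
  let head = 0;
  while (head < queue.length) {
    for (const next of operator(queue[head++])) {
      if (!seen.has(next.key)) {
        seen.add(next.key);
        result.push(next);
        queue.push(next);
      }
    }
  }
  return result;
};

// Halving even keys starting from 8 visits 8, 4, 2, 1 exactly once each.
```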
export const outputOf =
(_drv: NixValue) =>
(_out: NixValue): never => {
throw new Error("Not implemented: outputOf (part of dynamic-derivation)");
};
export const parseDrvName = (s: NixValue): NixAttrs => {
const fullName = forceStringNoCtx(s);
let name = fullName;
let version = "";
for (let i = 0; i < fullName.length; ++i) {
if (fullName[i] === "-" && i + 1 < fullName.length && !/[a-zA-Z]/.test(fullName[i + 1])) {
name = fullName.substring(0, i);
version = fullName.substring(i + 1);
break;
}
}
return new Map<string, NixValue>([
["name", name],
["version", version],
]);
};
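The split rule above — cut at the first `-` followed by a non-letter — is easy to check in a standalone copy (hypothetical name `splitDrvName`):

```typescript
// Standalone copy of the parseDrvName split: the version starts at the
// first "-" whose next character is not a letter.
const splitDrvName = (fullName: string): { name: string; version: string } => {
  for (let i = 0; i < fullName.length; ++i) {
    if (fullName[i] === "-" && i + 1 < fullName.length && !/[a-zA-Z]/.test(fullName[i + 1])) {
      return { name: fullName.substring(0, i), version: fullName.substring(i + 1) };
    }
  }
  return { name: fullName, version: "" };
};

// splitDrvName("gcc-wrapper-13.2") → { name: "gcc-wrapper", version: "13.2" }
// (the dash before "wrapper" is followed by a letter, so it is skipped)
```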
export const parseFlakeName = (s: NixValue): never => {
throw new Error("Not implemented: parseFlakeName");
};
export const parseFlakeRef = (s: NixValue): never => {
throw new Error("Not implemented: parseFlakeRef");
};
export const placeholder = (output: NixValue): NixValue => {
const outputStr = forceStringNoCtx(output);
return Deno.core.ops.op_make_placeholder(outputStr);
};
export const replaceStrings =
(from: NixValue) =>
(to: NixValue) =>
(s: NixValue): NixValue => {
const fromList = forceList(from);
const toList = forceList(to);
const inputStr = forceString(s);
const inputStrValue = getStringValue(inputStr);
const resultContext: NixStringContext = getStringContext(inputStr);
if (fromList.length !== toList.length) {
throw new Error("'from' and 'to' arguments passed to builtins.replaceStrings have different lengths");
}
const toCache = new Map<number, string>();
const toContextCache = new Map<number, NixStringContext>();
let result = "";
let pos = 0;
while (pos <= inputStrValue.length) {
let found = false;
for (let i = 0; i < fromList.length; i++) {
const pattern = forceStringValue(fromList[i]);
if (inputStrValue.substring(pos).startsWith(pattern)) {
found = true;
if (!toCache.has(i)) {
const replacementStr = forceString(toList[i]);
const replacementValue = getStringValue(replacementStr);
const replacementContext = getStringContext(replacementStr);
toCache.set(i, replacementValue);
toContextCache.set(i, replacementContext);
for (const elem of replacementContext) {
resultContext.add(elem);
}
}
const replacement = toCache.get(i) as string;
result += replacement;
if (pattern.length === 0) {
if (pos < inputStrValue.length) {
result += inputStrValue[pos];
}
pos++;
} else {
pos += pattern.length;
}
break;
}
}
if (!found) {
if (pos < inputStrValue.length) {
result += inputStrValue[pos];
}
pos++;
}
}
if (resultContext.size === 0) {
return result;
}
return mkStringWithContext(result, resultContext);
};
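The scan above has one subtle case: an empty pattern matches at every position, inserting its replacement before each character and once at the end, which is why the loop runs while `pos <= length`. A context-free sketch (hypothetical name `replaceStringsPlain`) isolates that behaviour:

```typescript
// Context-free sketch of the replaceStrings scan above: at each position the
// patterns are tried in order; an empty pattern matches everywhere.
const replaceStringsPlain = (from: string[], to: string[], input: string): string => {
  if (from.length !== to.length) throw new Error("'from' and 'to' have different lengths");
  let result = "";
  let pos = 0;
  while (pos <= input.length) {
    let found = false;
    for (let i = 0; i < from.length; i++) {
      if (input.startsWith(from[i], pos)) {
        found = true;
        result += to[i];
        if (from[i].length === 0) {
          // empty pattern: emit one input character and advance
          if (pos < input.length) result += input[pos];
          pos++;
        } else {
          pos += from[i].length;
        }
        break;
      }
    }
    if (!found) {
      if (pos < input.length) result += input[pos];
      pos++;
    }
  }
  return result;
};

// replaceStringsPlain([""], ["-"], "ab") → "-a-b-"
```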
export const splitVersion = (s: NixValue): NixValue => {
const version = forceStringValue(s);
const components: string[] = [];
let idx = 0;
while (idx < version.length) {
const result = nextComponent(version, idx);
if (result.component === "") {
break;
}
components.push(result.component);
idx = result.nextIndex;
}
return components;
};
export const traceVerbose =
(_e1: NixValue) =>
(e2: NixValue): NixStrictValue => {
// TODO: implement traceVerbose
return force(e2);
};
export const tryEval = (e: NixValue): NixAttrs => {
try {
return new Map<string, NixValue>([
["success", true],
["value", force(e)],
]);
} catch (err) {
if (err instanceof CatchableError) {
return new Map<string, NixValue>([
["success", false],
["value", false],
]);
} else {
throw err;
}
}
};
export const zipAttrsWith =
(f: NixValue) =>
(list: NixValue): never => {
throw new Error("Not implemented: zipAttrsWith");
};

import { mkPath } from "../path";
import { mkStringWithContext, type NixStringContext } from "../string-context";
import { force } from "../thunk";
import type { NixPath, NixString, NixValue } from "../types";
import { isNixPath } from "../types";
import { coerceToPath, coerceToString, StringCoercionMode } from "./conversion";
/**
* builtins.baseNameOf
* Get the last component of a path or string
* Always returns a string (coerces paths)
* Preserves string context if present
*
* Implements Nix's legacyBaseNameOf logic:
* - If string ends with '/', removes only the final slash
* - Then returns everything after the last remaining '/'
*
* Examples:
* - baseNameOf ./foo/bar → "bar"
* - baseNameOf "/foo/bar/" → "bar" (trailing slash removed first)
* - baseNameOf "foo" → "foo"
*/
export const baseNameOf = (s: NixValue): NixString => {
const context: NixStringContext = new Set();
const pathStr = coerceToString(s, StringCoercionMode.Base, false, context);
if (pathStr.length === 0) {
if (context.size === 0) {
return "";
}
return mkStringWithContext("", context);
}
let last = pathStr.length - 1;
if (pathStr[last] === "/" && last > 0) {
last -= 1;
}
let pos = last;
while (pos >= 0 && pathStr[pos] !== "/") {
pos -= 1;
}
if (pos === -1) {
pos = 0;
} else {
pos += 1;
}
const result = pathStr.substring(pos, last + 1);
// Preserve string context if present
if (context.size > 0) {
return mkStringWithContext(result, context);
}
return result;
};
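The string scan above (drop one trailing slash, then take everything after the last remaining slash) can be checked with a context-free copy (hypothetical name `plainBaseNameOf`):

```typescript
// Context-free copy of the baseNameOf string logic above.
const plainBaseNameOf = (pathStr: string): string => {
  if (pathStr.length === 0) return "";
  let last = pathStr.length - 1;
  // A single trailing slash is dropped first ("/foo/bar/" → "bar")
  if (pathStr[last] === "/" && last > 0) last -= 1;
  let pos = last;
  while (pos >= 0 && pathStr[pos] !== "/") pos -= 1;
  pos = pos === -1 ? 0 : pos + 1;
  return pathStr.substring(pos, last + 1);
};

// plainBaseNameOf("/foo/bar/") → "bar"
```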
/**
* builtins.dirOf
* Get the directory part of a path or string
* TYPE-PRESERVING: path → path, string → string
*
* Examples:
* - dirOf ./foo/bar → ./foo (path)
* - dirOf "/foo/bar" → "/foo" (string)
* - dirOf "/" → "/" (same type as input)
*/
export const dirOf = (s: NixValue): NixPath | NixString => {
const forced = force(s);
// Path input → path output
if (isNixPath(forced)) {
const pathStr = forced.value;
const lastSlash = pathStr.lastIndexOf("/");
if (lastSlash === -1) {
return mkPath(".");
}
if (lastSlash === 0) {
return mkPath("/");
}
return mkPath(pathStr.slice(0, lastSlash));
}
// String input → string output
const outContext: NixStringContext = new Set();
const pathStr = coerceToString(s, StringCoercionMode.Base, false, outContext);
const lastSlash = pathStr.lastIndexOf("/");
if (lastSlash === -1) {
return ".";
}
if (lastSlash === 0) {
return "/";
}
const result = pathStr.slice(0, lastSlash);
if (outContext.size > 0) {
return mkStringWithContext(result, outContext);
}
return result;
};
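The string branch above reduces to three cases on the last slash; the path branch applies the same logic and wraps the result back into a `NixPath`. A standalone sketch (hypothetical name `plainDirOf`):

```typescript
// String half of the dirOf logic above.
const plainDirOf = (pathStr: string): string => {
  const lastSlash = pathStr.lastIndexOf("/");
  if (lastSlash === -1) return "."; // no directory component
  if (lastSlash === 0) return "/"; // direct child of the root
  return pathStr.slice(0, lastSlash);
};

// plainDirOf("/foo/bar") → "/foo"
```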
/**
* builtins.toPath
* Convert a value to an absolute path string.
* DEPRECATED: Use `/. + "/path"` to convert a string into an absolute path.
*
* This validates that the input can be coerced to an absolute path.
* Returns a **string** (not a NixPath), with context preserved.
*
* Examples:
* - toPath "/foo" → "/foo" (string)
* - toPath "/foo/bar" → "/foo/bar" (string)
* - toPath "foo" → ERROR (not absolute)
* - toPath "" → ERROR (empty)
*/
export const toPath = (s: NixValue): NixString => {
const context: NixStringContext = new Set();
const pathStr = coerceToPath(s, context);
if (context.size === 0) {
return pathStr;
}
return mkStringWithContext(pathStr, context);
};

/**
* String operation builtin functions
*/
import {
getStringContext,
getStringValue,
mkStringWithContext,
type NixStringContext,
} from "../string-context";
import { forceInt, forceList, forceString, forceStringValue } from "../type-assert";
import type { NixInt, NixString, NixValue } from "../types";
import { coerceToString, StringCoercionMode } from "./conversion";
export const stringLength = (e: NixValue): NixInt => BigInt(forceStringValue(e).length);
export const substring =
(start: NixValue) =>
(len: NixValue) =>
(s: NixValue): NixString => {
const startPos = Number(forceInt(start));
const length = Number(forceInt(len));
if (startPos < 0) {
throw new Error("negative start position in 'substring'");
}
const str = forceString(s);
const strValue = getStringValue(str);
const context = getStringContext(str);
if (length === 0) {
if (context.size === 0) {
return "";
}
return mkStringWithContext("", context);
}
const actualLength = length < 0 ? Number.MAX_SAFE_INTEGER : length;
const result = startPos >= strValue.length ? "" : strValue.substring(startPos, startPos + actualLength);
if (context.size === 0) {
return result;
}
return mkStringWithContext(result, context);
};
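The length handling above differs from JavaScript's `String.prototype.substring`: a negative length means "to the end of the string", and a start past the end yields the empty string. A context-free sketch (hypothetical name `nixSubstring`) captures those semantics:

```typescript
// Context-free sketch of the substring semantics above.
const nixSubstring = (start: number, len: number, s: string): string => {
  if (start < 0) throw new Error("negative start position in 'substring'");
  if (len === 0) return "";
  const actualLen = len < 0 ? Number.MAX_SAFE_INTEGER : len; // negative = rest of string
  return start >= s.length ? "" : s.substring(start, start + actualLen);
};

// nixSubstring(4, -1, "nixpkgs") → "kgs"
```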
export const concatStringsSep =
(sep: NixValue) =>
(list: NixValue): NixString => {
const context: NixStringContext = new Set();
const separator = coerceToString(sep, StringCoercionMode.Interpolation, false, context);
const parts = forceList(list).map((elem) =>
coerceToString(elem, StringCoercionMode.Interpolation, false, context),
);
let last = str.length - 1;
if (str[last] === "/" && last > 0) last -= 1;
const result = parts.join(separator);
let pos = last;
while (pos >= 0 && str[pos] !== "/") pos -= 1;
if (context.size === 0) {
return result;
}
return mkStringWithContext(result, context);
};
if (pos !== 0 || (pos === 0 && str[pos] === "/")) {
pos += 1;
}
export const match =
(regex: NixValue) =>
(str: NixValue): NixValue => {
const regexStr = forceStringValue(regex);
const inputStr = forceStringValue(str);
return str.substring(pos, last + 1);
};
const result = Deno.core.ops.op_match(regexStr, inputStr);
if (result === null) {
return null;
}
return result.map((g) => (g !== null ? g : null));
};
export const split =
(regex: NixValue) =>
(str: NixValue): NixValue => {
const regexStr = forceStringValue(regex);
const inputStr = forceString(str);
const inputStrValue = getStringValue(inputStr);
const result = Deno.core.ops.op_split(regexStr, inputStrValue);
if (result.length === 1 && typeof result[0] === "string") {
return [inputStr];
}
return result.map((item) => {
if (typeof item === "string") {
return item;
}
return item.map((g) => (g !== null ? g : null));
});
};

View File

@@ -1,61 +1,63 @@
/**
* Type checking builtin functions
*/
import type {
  NixAttrs,
  NixBool,
  NixFloat,
  NixFunction,
  NixInt,
  NixList,
  NixNull,
  NixStrictValue,
  NixString,
  NixValue,
} from "../types";
import { force } from "../thunk";
import {
  isNixPath,
  isStringWithContext,
  type NixAttrs,
  type NixBool,
  type NixFloat,
  type NixFunction,
  type NixInt,
  type NixList,
  type NixNull,
  type NixPath,
  type NixStrictValue,
  type NixString,
} from "../types";
export const isAttrs = (e: NixValue): e is NixAttrs => {
const val = force(e);
return typeof val === "object" && !Array.isArray(val) && val !== null;
export const isNixString = (v: NixStrictValue): v is NixString => {
return typeof v === "string" || isStringWithContext(v);
};
export const isAttrs = (e: NixStrictValue): e is NixAttrs => {
  return e instanceof Map;
};
export const isBool = (e: NixValue): e is NixBool => typeof force(e) === "boolean";
export const isBool = (e: NixStrictValue): e is NixBool => typeof e === "boolean";
export const isFloat = (e: NixValue): e is NixFloat => {
  const val = force(e);
  return typeof val === "number"; // Only number is float
};
export const isFloat = (e: NixStrictValue): e is NixFloat => {
  const val = e;
  return typeof val === "number"; // Only number is float
};
export const isFunction = (e: NixValue): e is NixFunction => typeof force(e) === "function";
export const isFunction = (e: NixStrictValue): e is NixFunction => typeof e === "function";
export const isInt = (e: NixValue): e is NixInt => {
  const val = force(e);
  return typeof val === "bigint"; // Only bigint is int
};
export const isInt = (e: NixStrictValue): e is NixInt => {
  const val = e;
  return typeof val === "bigint"; // Only bigint is int
};
export const isList = (e: NixValue): e is NixList => Array.isArray(force(e));
export const isList = (e: NixStrictValue): e is NixList => Array.isArray(e);
export const isNull = (e: NixValue): e is NixNull => force(e) === null;
export const isNull = (e: NixStrictValue): e is NixNull => e === null;
export const isPath = (e: NixValue): never => {
  throw new Error("Not implemented: isPath");
};
export const isPath = (e: NixStrictValue): e is NixPath => {
  const val = e;
  return isNixPath(val);
};
export const isString = (e: NixValue): e is NixString => typeof force(e) === "string";
export const isString = (e: NixStrictValue): e is NixString =>
  typeof e === "string" || isStringWithContext(e);
export const typeOf = (e: NixValue): string => {
  const val = force(e);
  if (typeof val === "bigint") return "int";
  if (typeof val === "number") return "float";
  if (typeof val === "boolean") return "bool";
  if (typeof val === "string") return "string";
  if (val === null) return "null";
  if (Array.isArray(val)) return "list";
  if (typeof val === "function") return "lambda";
  if (typeof val === "object") return "set";
  throw new TypeError(`Unknown Nix type: ${typeof val}`);
};
export type NixType = "int" | "float" | "bool" | "string" | "path" | "null" | "list" | "lambda" | "set";
export const typeOf = (e: NixStrictValue): NixType => {
  if (typeof e === "bigint") return "int";
  if (typeof e === "number") return "float";
  if (typeof e === "boolean") return "bool";
  if (e === null) return "null";
  if (isNixString(e)) return "string";
  if (isNixPath(e)) return "path";
  if (Array.isArray(e)) return "list";
  if (e instanceof Map) return "set";
  if (typeof e === "function") return "lambda";
  throw new TypeError(`Unknown Nix type: ${typeof e}`);
};
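The new strict-value dispatch above checks primitives before structural types, with `Map` doubling as the attrset representation. A simplified standalone sketch (omitting the path and string-with-context wrappers, which need the runtime's classes):

```typescript
// Simplified sketch of the typeOf dispatch on plain JS values:
// bigint maps to Nix int, number to float, Map to set.
type NixType = "int" | "float" | "bool" | "string" | "path" | "null" | "list" | "lambda" | "set";
const typeOfSketch = (v: unknown): NixType => {
  if (typeof v === "bigint") return "int";
  if (typeof v === "number") return "float";
  if (typeof v === "boolean") return "bool";
  if (v === null) return "null";
  if (typeof v === "string") return "string";
  if (Array.isArray(v)) return "list";
  if (v instanceof Map) return "set";
  if (typeof v === "function") return "lambda";
  throw new TypeError(`Unknown Nix type: ${typeof v}`);
};
```

Note the ordering matters: `Array.isArray` must run before the `Map` check only in the sense that both precede any generic object fallback; neither matches the other's values.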

View File

@@ -1,71 +0,0 @@
export interface OutputInfo {
path: string;
hashAlgo: string;
hash: string;
}
export interface DerivationData {
name: string;
outputs: Map<string, OutputInfo>;
inputDrvs: Map<string, Set<string>>;
inputSrcs: Set<string>;
platform: string;
builder: string;
args: string[];
env: Map<string, string>;
}
export const escapeString = (s: string): string => {
let result = "";
for (const char of s) {
switch (char) {
case '"':
result += '\\"';
break;
case "\\":
result += "\\\\";
break;
case "\n":
result += "\\n";
break;
case "\r":
result += "\\r";
break;
case "\t":
result += "\\t";
break;
default:
result += char;
}
}
return `"${result}"`;
};
const quoteString = (s: string): string => `"${s}"`;
export const generateAterm = (drv: DerivationData): string => {
const outputEntries: string[] = [];
const sortedOutputs = Array.from(drv.outputs.entries()).sort();
for (const [name, info] of sortedOutputs) {
outputEntries.push(
`(${quoteString(name)},${quoteString(info.path)},${quoteString(info.hashAlgo)},${quoteString(info.hash)})`,
);
}
const outputs = outputEntries.join(",");
const inputDrvEntries: string[] = [];
for (const [drvPath, outputs] of drv.inputDrvs) {
const outList = `[${Array.from(outputs).map(quoteString).join(",")}]`;
inputDrvEntries.push(`(${quoteString(drvPath)},${outList})`);
}
const inputDrvs = inputDrvEntries.join(",");
const inputSrcs = Array.from(drv.inputSrcs).map(quoteString).join(",");
const args = drv.args.map(escapeString).join(",");
const envs = Array.from(drv.env.entries())
.sort()
.map(([k, v]) => `(${escapeString(k)},${escapeString(v)})`);
return `Derive([${outputs}],[${inputDrvs}],[${inputSrcs}],${quoteString(drv.platform)},${quoteString(drv.builder)},[${args}],[${envs}])`;
};
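The deleted file above (derivation/ATerm generation moved to the Rust side per the `refactor: handle derivation generation on Rust side` commit) escaped strings for the ATerm format. A minimal standalone sketch of that escaping rule:

```typescript
// Sketch of ATerm string escaping as in the removed escapeString:
// backslash, double quote, and \n \r \t are escaped, all else passes through.
const escapeAterm = (s: string): string => {
  let out = "";
  for (const c of s) {
    if (c === '"') out += '\\"';
    else if (c === "\\") out += "\\\\";
    else if (c === "\n") out += "\\n";
    else if (c === "\r") out += "\\r";
    else if (c === "\t") out += "\\t";
    else out += c;
  }
  return `"${out}"`;
};
```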

View File

@@ -1,27 +1,122 @@
/**
* Helper functions for nix-js runtime
*/
import type { NixValue, NixAttrs, NixBool, NixString } from "./types";
import { forceAttrs, forceString } from "./type-assert";
import { isAttrs } from "./builtins/type-check";
import { coerceToString, StringCoercionMode } from "./builtins/conversion";
import { type NixStringContext, mkStringWithContext } from "./string-context";
import { isAttrs, typeOf } from "./builtins/type-check";
import { mkPath } from "./path";
import { isStringWithContext, mkStringWithContext, type NixStringContext } from "./string-context";
import { force } from "./thunk";
import { forceAttrs, forceBool, forceFunction, forceStringNoCtx, forceStringValue } from "./type-assert";
import type { NixAttrs, NixBool, NixPath, NixString, NixValue } from "./types";
import { CatchableError, isNixPath } from "./types";
interface StackFrame {
span: number;
message: string;
}
const callStack: StackFrame[] = [];
const MAX_STACK_DEPTH = 1000;
function enrichError(error: unknown): Error {
const err = error instanceof Error ? error : new Error(String(error));
if (callStack.length === 0) {
return err;
}
const nixStackLines = callStack.map((frame) => {
return `NIX_STACK_FRAME:${frame.span}:${frame.message}`;
});
// Prepend stack frames to error stack
err.stack = `${nixStackLines.join("\n")}\n${err.stack || ""}`;
return err;
}
const pushContext = (message: string, span: number): void => {
if (callStack.length >= MAX_STACK_DEPTH) {
callStack.shift();
}
callStack.push({ span, message });
};
const popContext = (): void => {
callStack.pop();
};
export const withContext = <T>(message: string, span: number, fn: () => T): T => {
pushContext(message, span);
try {
return fn();
} catch (error) {
throw enrichError(error);
} finally {
popContext();
}
};
/**
* Concatenate multiple values into a string with context
* Concatenate multiple values into a string or path with context
* This is used for string interpolation like "hello ${world}"
*
* IMPORTANT: String context handling:
* - All contexts from interpolated values are merged into the result
* - Path mode: Store contexts are forbidden (will throw error)
* - String mode: All contexts are preserved and merged
*
* @param parts - Array of values to concatenate
* @returns String with merged contexts from all parts
* @param forceString - If true, result is always a string (paths are copied to store)
* @returns String or Path with merged contexts from all parts
*/
export const concatStringsWithContext = (parts: NixValue[]): NixString => {
export const concatStringsWithContext = (parts: NixValue[], forceString: boolean): NixString | NixPath => {
if (parts.length === 0) {
return "";
}
const forced = parts.map(force);
const firstIsPath = !forceString && isNixPath(forced[0]);
if (firstIsPath) {
let result = (forced[0] as NixPath).value;
for (let i = 1; i < forced.length; i++) {
const part = forced[i];
if (isNixPath(part)) {
result += part.value;
} else if (typeof part === "string") {
result += part;
} else if (isStringWithContext(part)) {
if (part.context.size > 0) {
throw new TypeError("a string that refers to a store path cannot be appended to a path");
}
result += part.value;
} else {
const tempContext: NixStringContext = new Set();
const coerced = coerceToString(part, StringCoercionMode.Interpolation, false, tempContext);
if (tempContext.size > 0) {
throw new TypeError("a string that refers to a store path cannot be appended to a path");
}
result += coerced;
}
}
return mkPath(result);
}
const context: NixStringContext = new Set();
const strParts: string[] = [];
for (const part of parts) {
const str = coerceToString(part, StringCoercionMode.Interpolation, false, context);
strParts.push(str);
for (const part of forced) {
if (isNixPath(part)) {
const str = coerceToString(part, StringCoercionMode.Interpolation, true, context);
strParts.push(str);
} else {
const str = coerceToString(part, StringCoercionMode.Interpolation, false, context);
strParts.push(str);
}
}
const value = strParts.join("");
@@ -33,114 +128,199 @@ export const concatStringsWithContext = (parts: NixValue[]): NixString => {
return mkStringWithContext(value, context);
};
/**
* Resolve a path (handles both absolute and relative paths)
* For relative paths, resolves against current import stack
*
* @param path - Path string (may be relative or absolute)
* @returns Absolute path string
*/
export const resolvePath = (path: NixValue): string => {
  const path_str = forceString(path);
  return Deno.core.ops.op_resolve_path(path_str);
};
export const resolvePath = (currentDir: string, path: NixValue): NixPath => {
  const forced = force(path);
  let pathStr: string;
  if (isNixPath(forced)) {
    pathStr = forced.value;
  } else {
    pathStr = forceStringValue(path);
  }
  const resolved = Deno.core.ops.op_resolve_path(currentDir, pathStr);
  return mkPath(resolved);
};
/**
 * Select an attribute from an attribute set
 * Used by codegen for attribute access (e.g., obj.key)
 *
 * @param obj - Attribute set to select from
 * @param key - Key to select
 * @returns The value at obj[key]
 * @throws Error if obj is null/undefined or key not found
 */
export const select = (obj: NixValue, key: NixValue): NixValue => {
  const forced_obj = forceAttrs(obj);
  const forced_key = forceString(key);
  if (!(forced_key in forced_obj)) {
    throw new Error(`Attribute '${forced_key}' not found`);
  }
  return forced_obj[forced_key];
};
/**
* Select an attribute with a default value
* Used for Nix's `obj.key or default` syntax
*
* @param obj - Attribute set to select from
* @param key - Key to select
* @param default_val - Value to return if key not found
* @returns obj[key] if exists, otherwise default_val
*/
export const selectWithDefault = (obj: NixValue, key: NixValue, default_val: NixValue): NixValue => {
  const attrs = forceAttrs(obj);
  const forced_key = forceString(key);
  if (!(forced_key in attrs)) {
    return default_val;
  }
  return attrs[forced_key];
};
export const select = (obj: NixValue, attrpath: NixValue[], span?: number): NixValue => {
  if (span !== undefined) {
    if (callStack.length >= MAX_STACK_DEPTH) {
      callStack.shift();
    }
    const frame: StackFrame = { span, message: "while selecting attribute" };
    callStack.push(frame);
    try {
      return selectImpl(obj, attrpath);
    } catch (error) {
      try {
        const path = attrpath.map((a) => forceStringValue(a)).join(".");
        if (path) frame.message = `while selecting attribute [${path}]`;
      } catch {
        throw enrichError(error);
      }
      throw enrichError(error);
    } finally {
      callStack.pop();
    }
  } else {
    return selectImpl(obj, attrpath);
  }
};
function selectImpl(obj: NixValue, attrpath: NixValue[]): NixValue {
  let attrs = forceAttrs(obj);
  for (let i = 0; i < attrpath.length - 1; i++) {
    const key = forceStringValue(attrpath[i]);
    if (!attrs.has(key)) {
      throw new Error(`Attribute '${key}' not found`);
    }
    const cur = forceAttrs(attrs.get(key) as NixValue);
    attrs = cur;
  }
  const last = forceStringValue(attrpath[attrpath.length - 1]);
  if (!attrs.has(last)) {
    throw new Error(`Attribute '${last}' not found`);
  }
  return attrs.get(last) as NixValue;
}
export const selectWithDefault = (
obj: NixValue,
attrpath: NixValue[],
defaultVal: NixValue,
span?: number,
): NixValue => {
if (span !== undefined) {
if (callStack.length >= MAX_STACK_DEPTH) {
callStack.shift();
}
const frame: StackFrame = { span, message: "while selecting attribute" };
callStack.push(frame);
try {
return selectWithDefaultImpl(obj, attrpath, defaultVal);
} catch (error) {
try {
const path = attrpath.map((a) => forceStringValue(a)).join(".");
if (path) frame.message = `while selecting attribute [${path}]`;
} catch {
throw enrichError(error);
}
throw enrichError(error);
} finally {
callStack.pop();
}
} else {
return selectWithDefaultImpl(obj, attrpath, defaultVal);
}
};
function selectWithDefaultImpl(obj: NixValue, attrpath: NixValue[], defaultVal: NixValue): NixValue {
let attrs = force(obj);
if (!isAttrs(attrs)) {
return defaultVal;
}
for (let i = 0; i < attrpath.length - 1; i++) {
const key = forceStringValue(attrpath[i]);
if (!attrs.has(key)) {
return defaultVal;
}
const cur = force(attrs.get(key) as NixValue);
if (!isAttrs(cur)) {
return defaultVal;
}
attrs = cur;
}
const last = forceStringValue(attrpath[attrpath.length - 1]);
if (attrs.has(last)) {
return attrs.get(last) as NixValue;
}
return defaultVal;
}
export const hasAttr = (obj: NixValue, attrpath: NixValue[]): NixBool => {
  if (!isAttrs(obj)) {
    return false;
  }
  let attrs = obj;
  for (const attr of attrpath.slice(0, -1)) {
    const cur = attrs[forceString(attr)];
    if (!isAttrs(cur)) {
      return false;
    }
    attrs = cur;
  }
  return true;
};
export const hasAttr = (obj: NixValue, attrpath: NixValue[]): NixBool => {
  const forced = force(obj);
  if (!isAttrs(forced)) {
    return false;
  }
  let attrs = forced;
  for (let i = 0; i < attrpath.length - 1; i++) {
    const key = forceStringNoCtx(attrpath[i]);
    if (!attrs.has(key)) {
      return false;
    }
    const cur = force(attrs.get(key) as NixValue);
    if (!isAttrs(cur)) {
      return false;
    }
    attrs = cur;
  }
  return attrs.has(forceStringValue(attrpath[attrpath.length - 1]));
};
/**
* Validate function parameters
* Used for pattern matching in function parameters
*
* Example: { a, b ? 1, ... }: ...
* - required: ["a"]
* - allowed: ["a", "b"] (or null if ellipsis "..." present)
*
* @param arg - Argument object to validate
* @param required - Array of required parameter names (or null)
* @param allowed - Array of allowed parameter names (or null for ellipsis)
* @returns The forced argument object
* @throws Error if required param missing or unexpected param present
*/
export const validateParams = (
  arg: NixValue,
  required: string[] | null,
  allowed: string[] | null,
): NixAttrs => {
  const forced_arg = forceAttrs(arg);
  // Check required parameters
  if (required) {
    for (const key of required) {
      if (!Object.hasOwn(forced_arg, key)) {
        throw new Error(`Function called without required argument '${key}'`);
      }
    }
  }
  // Check allowed parameters (if not using ellipsis)
  if (allowed) {
    const allowed_set = new Set(allowed);
    for (const key in forced_arg) {
      if (!allowed_set.has(key)) {
        throw new Error(`Function called with unexpected argument '${key}'`);
      }
    }
  }
  return forced_arg;
};
export const call = (func: NixValue, arg: NixValue, span?: number): NixValue => {
  if (span !== undefined) {
    if (callStack.length >= MAX_STACK_DEPTH) {
      callStack.shift();
    }
    callStack.push({ span, message: "from call site" });
    try {
      return callImpl(func, arg);
    } catch (error) {
      throw enrichError(error);
    } finally {
      callStack.pop();
    }
  } else {
    return callImpl(func, arg);
  }
};
function callImpl(func: NixValue, arg: NixValue): NixValue {
const forced = force(func);
if (typeof forced === "function") {
forced.args?.check(arg);
return forced(arg);
}
if (forced instanceof Map && forced.has("__functor")) {
const functor = forceFunction(forced.get("__functor") as NixValue);
return call(callImpl(functor, forced), arg);
}
throw new Error(`attempt to call something which is not a function but ${typeOf(forced)}`);
}
export const assert = (assertion: NixValue, expr: NixValue, assertionRaw: string, span: number): NixValue => {
if (forceBool(assertion)) {
return expr;
}
withContext("while evaluating assertion", span, () => {
throw new CatchableError(`assertion '${assertionRaw}' failed`);
});
throw "unreachable";
};
export const mkPos = (span: number): NixAttrs => {
return Deno.core.ops.op_decode_span(span);
};
interface WithScope {
env: NixValue;
last: WithScope | null;
}
export const lookupWith = (name: string, withScope: WithScope): NixValue => {
let current: WithScope | null = withScope;
while (current !== null) {
const attrs = forceAttrs(current.env);
if (attrs.has(name)) {
return attrs.get(name) as NixValue;
}
current = current.last;
}
throw new Error(`undefined variable '${name}'`);
};
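The `lookupWith` chain above models nested `with` expressions as a linked list of environments searched innermost-first. A standalone sketch of that lookup (names here are illustrative, not the runtime's types):

```typescript
// Sketch of with-scope resolution: walk the chain from the innermost
// `with` outward, returning the first binding found.
interface Scope {
  env: Map<string, unknown>;
  last: Scope | null;
}
const lookupInScopes = (name: string, scope: Scope | null): unknown => {
  for (let cur = scope; cur !== null; cur = cur.last) {
    if (cur.env.has(name)) return cur.env.get(name);
  }
  throw new Error(`undefined variable '${name}'`);
};
```

An inner `with` shadows an outer one because its environment is checked first.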

View File

@@ -4,37 +4,95 @@
* All functionality is exported via the global `Nix` object
*/
import { createThunk, force, isThunk, IS_THUNK } from "./thunk";
import { select, selectWithDefault, validateParams, resolvePath, hasAttr, concatStringsWithContext } from "./helpers";
import { op } from "./operators";
import { builtins, PRIMOP_METADATA } from "./builtins";
import { coerceToString, StringCoercionMode } from "./builtins/conversion";
import {
assert,
call,
concatStringsWithContext,
hasAttr,
lookupWith,
mkPos,
resolvePath,
select,
selectWithDefault,
} from "./helpers";
import { op } from "./operators";
import { HAS_CONTEXT } from "./string-context";
import { createThunk, DEBUG_THUNKS, force, forceDeep, forceShallow, IS_CYCLE, IS_THUNK } from "./thunk";
import { forceBool } from "./type-assert";
import { IS_PATH, mkAttrs, mkFunction, type NixValue } from "./types";
import { execBytecode, execBytecodeScoped, vmStrings, vmConstants } from "./vm";
export type NixRuntime = typeof Nix;
/**
* The global Nix runtime object
*/
const replBindings: Map<string, NixValue> = new Map();
export const Nix = {
IS_THUNK,
IS_CYCLE,
HAS_CONTEXT,
IS_PATH,
PRIMOP_METADATA,
DEBUG_THUNKS,
createThunk,
force,
isThunk,
IS_THUNK,
HAS_CONTEXT,
forceBool,
forceShallow,
forceDeep,
assert,
call,
hasAttr,
select,
selectWithDefault,
validateParams,
lookupWith,
resolvePath,
coerceToString,
concatStringsWithContext,
StringCoercionMode,
mkAttrs,
mkFunction,
mkPos,
op,
builtins,
PRIMOP_METADATA,
strings: vmStrings,
constants: vmConstants,
execBytecode,
execBytecodeScoped,
replBindings,
setReplBinding: (name: string, value: NixValue) => {
replBindings.set(name, value);
},
getReplBinding: (name: string) => replBindings.get(name),
};
globalThis.Nix = Nix;
globalThis.$t = createThunk;
globalThis.$f = force;
globalThis.$fb = forceBool;
globalThis.$a = assert;
globalThis.$c = call;
globalThis.$h = hasAttr;
globalThis.$s = select;
globalThis.$sd = selectWithDefault;
globalThis.$l = lookupWith;
globalThis.$r = resolvePath;
globalThis.$cs = concatStringsWithContext;
globalThis.$ma = mkAttrs;
globalThis.$mf = mkFunction;
globalThis.$mp = mkPos;
globalThis.$gb = Nix.getReplBinding;
globalThis.$oa = op.add;
globalThis.$os = op.sub;
globalThis.$om = op.mul;
globalThis.$od = op.div;
globalThis.$oe = op.eq;
globalThis.$ol = op.lt;
globalThis.$og = op.gt;
globalThis.$oc = op.concat;
globalThis.$ou = op.update;
globalThis.$b = builtins;
globalThis.$e = new Map();

View File

@@ -1,33 +1,126 @@
/**
* Nix operators module
* Implements all binary and unary operators used by codegen
*/
import type { NixValue, NixList, NixAttrs, NixString } from "./types";
import { isStringWithContext } from "./types";
import { force } from "./thunk";
import { forceNumeric, forceList, forceAttrs, coerceNumeric } from "./type-assert";
import { coerceToString, StringCoercionMode } from "./builtins/conversion";
import { isNixString, typeOf } from "./builtins/type-check";
import { mkPath } from "./path";
import {
getStringValue,
getStringContext,
getStringValue,
mergeContexts,
mkStringWithContext,
type NixStringContext,
} from "./string-context";
import { force } from "./thunk";
import { coerceNumeric, forceAttrs, forceBool, forceList, forceNumeric } from "./type-assert";
import type { NixAttrs, NixList, NixPath, NixString, NixValue } from "./types";
import { isNixPath } from "./types";
const isNixString = (v: unknown): v is NixString => {
  return typeof v === "string" || isStringWithContext(v);
};
const canCoerceToString = (v: NixValue): boolean => {
  const forced = force(v);
  if (isNixString(forced)) return true;
  if (forced instanceof Map) {
    if (forced.has("outPath") || forced.has("__toString")) return true;
  }
  return false;
};
export const compareValues = (a: NixValue, b: NixValue): -1 | 0 | 1 => {
const av = force(a);
const bv = force(b);
// Handle float/int mixed comparisons
if (typeof av === "number" && typeof bv === "bigint") {
const cmp = av - Number(bv);
return cmp < 0 ? -1 : cmp > 0 ? 1 : 0;
}
if (typeof av === "bigint" && typeof bv === "number") {
const cmp = Number(av) - bv;
return cmp < 0 ? -1 : cmp > 0 ? 1 : 0;
}
const typeA = typeOf(av);
const typeB = typeOf(bv);
// Types must match (except float/int which is handled above)
if (typeA !== typeB) {
throw new TypeError(`cannot compare ${typeOf(av)} with ${typeOf(bv)}`);
}
if (typeA === "int" || typeA === "float") {
return (av as never) < (bv as never) ? -1 : av === bv ? 0 : 1;
}
if (typeA === "string") {
const strA = getStringValue(av as NixString);
const strB = getStringValue(bv as NixString);
return strA < strB ? -1 : strA > strB ? 1 : 0;
}
if (typeA === "path") {
const aPath = av as NixPath;
const bPath = bv as NixPath;
return aPath.value < bPath.value ? -1 : aPath.value > bPath.value ? 1 : 0;
}
if (typeA === "list") {
const aList = av as NixList;
const bList = bv as NixList;
for (let i = 0; ; i++) {
// Equal if same length, else aList > bList
if (i === bList.length) {
return i === aList.length ? 0 : 1;
} else if (i === aList.length) {
return -1; // aList < bList
} else if (!op.eq(aList[i], bList[i])) {
return compareValues(aList[i], bList[i]);
}
}
}
// Other types are incomparable
throw new TypeError(
`cannot compare ${typeOf(av)} with ${typeOf(bv)}; values of that type are incomparable`,
);
};
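The list branch of `compareValues` above implements Nix's lexicographic ordering: compare element-wise, and on a tied prefix the shorter list orders first. A standalone sketch over number lists:

```typescript
// Sketch of lexicographic list comparison as used by compareValues:
// element-wise compare; shorter list wins on a tied prefix.
const cmpList = (a: number[], b: number[]): -1 | 0 | 1 => {
  for (let i = 0; ; i++) {
    if (i === b.length) return i === a.length ? 0 : 1;
    if (i === a.length) return -1;
    if (a[i] !== b[i]) return a[i] < b[i] ? -1 : 1;
  }
};
```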
/**
* Operator object exported as Nix.op
* All operators referenced by codegen (e.g., Nix.op.add, Nix.op.eq)
*/
export const op = {
add: (a: NixValue, b: NixValue): bigint | number | NixString => {
add: (a: NixValue, b: NixValue): bigint | number | NixString | NixPath => {
const av = force(a);
const bv = force(b);
// Path concatenation: path + string = path
if (isNixPath(av)) {
if (isNixString(bv)) {
const strB = getStringValue(bv);
const ctxB = getStringContext(bv);
if (ctxB.size > 0) {
throw new TypeError("a string that refers to a store path cannot be appended to a path");
}
return mkPath(av.value + strB);
}
// FIXME: handle corepkgs
// path + path: concatenate
if (isNixPath(bv)) {
return mkPath(av.value + bv.value);
}
}
// String + path: result is string (path coerces to string)
if (isNixString(av) && isNixPath(bv)) {
const strA = getStringValue(av);
const ctxA = getStringContext(av);
const result = strA + bv.value;
if (ctxA.size === 0) {
return result;
}
return mkStringWithContext(result, ctxA);
}
// String concatenation
if (isNixString(av) && isNixString(bv)) {
// Merge string context
const strA = getStringValue(av);
const strB = getStringValue(bv);
const ctxA = getStringContext(av);
@@ -40,18 +133,31 @@ export const op = {
return mkStringWithContext(strA + strB, mergeContexts(ctxA, ctxB));
}
// Auto-coerce to string if possible
if (canCoerceToString(a) && canCoerceToString(b)) {
const context: NixStringContext = new Set();
const strA = coerceToString(a, StringCoercionMode.Interpolation, false, context);
const strB = coerceToString(b, StringCoercionMode.Interpolation, false, context);
const result = strA + strB;
if (context.size === 0) {
return result;
}
return mkStringWithContext(result, context);
}
// Perform numeric addition otherwise
const [numA, numB] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (numA as any) + (numB as any);
return (numA as never) + (numB as never);
},
sub: (a: NixValue, b: NixValue): bigint | number => {
const [av, bv] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (av as any) - (bv as any);
return (av as never) - (bv as never);
},
mul: (a: NixValue, b: NixValue): bigint | number => {
const [av, bv] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (av as any) * (bv as any);
return (av as never) * (bv as never);
},
div: (a: NixValue, b: NixValue): bigint | number => {
@@ -61,17 +167,17 @@ export const op = {
throw new RangeError("Division by zero");
}
return (av as any) / (bv as any);
return (av as never) / (bv as never);
},
eq: (a: NixValue, b: NixValue): boolean => {
const av = force(a);
const bv = force(b);
if (isNixString(av) && isNixString(bv)) {
return getStringValue(av) === getStringValue(bv);
}
// Pointer equality
if (av === bv) return true;
// Special case: int == float type compatibility
if (typeof av === "bigint" && typeof bv === "number") {
return Number(av) === bv;
}
@@ -79,63 +185,101 @@ export const op = {
return av === Number(bv);
}
return av === bv;
const typeA = typeOf(av);
const typeB = typeOf(bv);
// All other types must match exactly
if (typeA !== typeB) return false;
if (typeA === "int" || typeA === "float" || typeA === "bool" || typeA === "null") {
return av === bv;
}
// String comparison (handles both plain strings and StringWithContext)
if (typeA === "string") {
return getStringValue(av as NixString) === getStringValue(bv as NixString);
}
// Path comparison
if (typeA === "path") {
return (av as NixPath).value === (bv as NixPath).value;
}
if (typeA === "list") {
const aList = av as NixList;
const bList = bv as NixList;
if (aList.length !== bList.length) return false;
for (let i = 0; i < aList.length; i++) {
if (!op.eq(aList[i], bList[i])) return false;
}
return true;
}
if (typeA === "set") {
const attrsA = av as NixAttrs;
const attrsB = bv as NixAttrs;
if (attrsA.has("type") && attrsB.has("type")) {
const typeValA = force(attrsA.get("type") as NixValue);
const typeValB = force(attrsB.get("type") as NixValue);
if (typeValA === "derivation" && typeValB === "derivation") {
if (attrsA.has("outPath") && attrsB.has("outPath")) {
return op.eq(attrsA.get("outPath") as NixValue, attrsB.get("outPath") as NixValue);
}
}
}
const keysA = Array.from(attrsA.keys()).sort();
const keysB = Array.from(attrsB.keys()).sort();
if (keysA.length !== keysB.length) return false;
for (let i = 0; i < keysA.length; i++) {
if (keysA[i] !== keysB[i]) return false;
if (!op.eq(attrsA.get(keysA[i]) as NixValue, attrsB.get(keysB[i]) as NixValue)) return false;
}
return true;
}
// Other types are incomparable
return false;
},
neq: (a: NixValue, b: NixValue): boolean => {
return !op.eq(a, b);
},
lt: (a: NixValue, b: NixValue): boolean => {
const av = force(a);
const bv = force(b);
if (isNixString(av) && isNixString(bv)) {
return getStringValue(av) < getStringValue(bv);
}
const [numA, numB] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (numA as any) < (numB as any);
return compareValues(a, b) < 0;
},
lte: (a: NixValue, b: NixValue): boolean => {
const av = force(a);
const bv = force(b);
if (isNixString(av) && isNixString(bv)) {
return getStringValue(av) <= getStringValue(bv);
}
const [numA, numB] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (numA as any) <= (numB as any);
return compareValues(a, b) <= 0;
},
gt: (a: NixValue, b: NixValue): boolean => {
const av = force(a);
const bv = force(b);
if (isNixString(av) && isNixString(bv)) {
return getStringValue(av) > getStringValue(bv);
}
const [numA, numB] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (numA as any) > (numB as any);
return compareValues(a, b) > 0;
},
gte: (a: NixValue, b: NixValue): boolean => {
const av = force(a);
const bv = force(b);
if (isNixString(av) && isNixString(bv)) {
return getStringValue(av) >= getStringValue(bv);
}
const [numA, numB] = coerceNumeric(forceNumeric(a), forceNumeric(b));
return (numA as any) >= (numB as any);
return compareValues(a, b) >= 0;
},
bnot: (a: NixValue): boolean => !force(a),
bnot: (a: NixValue): boolean => !forceBool(a),
concat: (a: NixValue, b: NixValue): NixList => {
return Array.prototype.concat.call(forceList(a), forceList(b));
return forceList(a).concat(forceList(b));
},
update: (a: NixValue, b: NixValue): NixAttrs => {
return { ...forceAttrs(a), ...forceAttrs(b) };
const mapA = forceAttrs(a);
const mapB = forceAttrs(b);
if (mapA.size === 0) {
return mapB;
}
if (mapB.size === 0) {
return mapA;
}
const result: NixAttrs = new Map(mapA);
for (const [k, v] of mapB) {
result.set(k, v);
}
return result;
},
};
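The new `update` above (the `//` operator, per the `optimize: short-circuit update (//)` commit) avoids allocating when either side is empty by returning the other Map unchanged. A standalone sketch of that short-circuit:

```typescript
// Sketch of the short-circuiting attrset update (`//`): right side wins
// on key collisions; an empty side returns the other Map by reference.
const updateAttrs = <K, V>(a: Map<K, V>, b: Map<K, V>): Map<K, V> => {
  if (a.size === 0) return b;
  if (b.size === 0) return a;
  const result = new Map(a);
  for (const [k, v] of b) result.set(k, v);
  return result;
};
```

Returning the operand by reference is safe here only because attrsets are treated as immutable by the runtime.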

View File

@@ -0,0 +1,38 @@
import { NixPath } from "./types";
const canonicalizePath = (path: string): string => {
const parts: string[] = [];
let i = 0;
const len = path.length;
while (i < len) {
while (i < len && path[i] === "/") i++;
if (i >= len) break;
let j = i;
while (j < len && path[j] !== "/") j++;
const component = path.slice(i, j);
i = j;
if (component === "..") {
if (parts.length > 0) {
parts.pop();
}
} else if (component !== ".") {
parts.push(component);
}
}
if (parts.length === 0) {
return "/";
}
return `/${parts.join("/")}`;
};
export const mkPath = (value: string): NixPath => {
return new NixPath(canonicalizePath(value));
};
export const getPathValue = (p: NixPath): string => {
return p.value;
};
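`canonicalizePath` above normalizes `.`, `..`, and repeated slashes without touching the filesystem. The same algorithm can be sketched more compactly with `split` (a simplification of the index-based loop, same observable behavior):

```typescript
// Sketch of lexical path canonicalization: drop "." and empty components,
// let ".." pop the previous component, never escape the root.
const canonicalize = (path: string): string => {
  const parts: string[] = [];
  for (const component of path.split("/")) {
    if (component === "" || component === ".") continue;
    if (component === "..") parts.pop();
    else parts.push(component);
  }
  return parts.length === 0 ? "/" : `/${parts.join("/")}`;
};
```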

View File

@@ -0,0 +1,111 @@
import { getPrimopMetadata, isPrimop } from "./builtins/index";
import { isStringWithContext } from "./string-context";
import { IS_CYCLE, isThunk } from "./thunk";
import { isNixPath, type NixValue } from "./types";
export const printValue = (value: NixValue, seen: WeakSet<object> = new WeakSet()): string => {
if (isThunk(value)) {
return "«thunk»";
}
if (value === null) {
return "null";
}
if (typeof value === "boolean") {
return value ? "true" : "false";
}
if (typeof value === "bigint") {
return value.toString();
}
if (typeof value === "number") {
return value.toString();
}
if (typeof value === "string") {
return printString(value);
}
if (typeof value === "function") {
if (isPrimop(value)) {
const meta = getPrimopMetadata(value);
if (meta && meta.applied > 0) {
return "<PRIMOP-APP>";
}
return "<PRIMOP>";
}
return "<LAMBDA>";
}
if (IS_CYCLE in value) {
return "«repeated»";
}
if (isNixPath(value)) {
return value.value;
}
if (isStringWithContext(value)) {
return printString(value.value);
}
if (Array.isArray(value)) {
if (value.length > 0) {
if (seen.has(value)) {
return "«repeated»";
}
seen.add(value);
}
const items = value.map((v) => printValue(v, seen)).join(" ");
return `[ ${items} ]`;
}
if (seen.has(value)) {
return "«repeated»";
}
if (value.size > 0) {
seen.add(value);
}
const entries = [...value.entries()]
.map(([k, v]) => `${printSymbol(k)} = ${printValue(v, seen)};`)
.join(" ");
return `{${entries ? ` ${entries} ` : " "}}`;
};
const printString = (s: string): string => {
let result = '"';
for (const c of s) {
switch (c) {
case "\\":
result += "\\\\";
break;
case '"':
result += '\\"';
break;
case "\n":
result += "\\n";
break;
case "\r":
result += "\\r";
break;
case "\t":
result += "\\t";
break;
default:
result += c;
}
}
return `${result}"`;
};
const SYMBOL_REGEX = /^[a-zA-Z_][a-zA-Z0-9_'-]*$/;
const printSymbol = (s: string): string => {
if (SYMBOL_REGEX.test(s)) {
return s;
}
return printString(s);
};
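The `printString` helper above escapes exactly five characters (backslash, double quote, newline, carriage return, tab) and passes everything else through. A standalone sketch of the same quoting rules, independent of the runtime's types:

```typescript
// Standalone sketch of the quoting rules used by printString above:
// backslash, double quote, newline, carriage return and tab are escaped,
// all other characters pass through unchanged.
const quote = (s: string): string => {
  const escapes: Record<string, string> = {
    "\\": "\\\\",
    '"': '\\"',
    "\n": "\\n",
    "\r": "\\r",
    "\t": "\\t",
  };
  let out = '"';
  for (const c of s) out += escapes[c] ?? c;
  return out + '"';
};

console.log(quote('say "hi"\n')); // "say \"hi\"\n"
```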


@@ -1,3 +1,5 @@
import type { NixStrictValue } from "./types";
export const HAS_CONTEXT = Symbol("HAS_CONTEXT");
export interface StringContextOpaque {
@@ -20,18 +22,22 @@ export type StringContextElem = StringContextOpaque | StringContextDrvDeep | Str
export type NixStringContext = Set<string>;
-export interface StringWithContext {
-  readonly [HAS_CONTEXT]: true;
+export class StringWithContext {
+  readonly [HAS_CONTEXT] = true as const;
value: string;
context: NixStringContext;
constructor(value: string, context: NixStringContext) {
this.value = value;
this.context = context;
}
}
-export const isStringWithContext = (v: unknown): v is StringWithContext => {
-  return typeof v === "object" && v !== null && HAS_CONTEXT in v && (v as StringWithContext)[HAS_CONTEXT] === true;
+export const isStringWithContext = (v: NixStrictValue): v is StringWithContext => {
+  return v instanceof StringWithContext;
};
export const mkStringWithContext = (value: string, context: NixStringContext): StringWithContext => {
-  return { [HAS_CONTEXT]: true, value, context };
+  return new StringWithContext(value, context);
};
export const mkPlainString = (value: string): string => value;
@@ -43,11 +49,12 @@ export const getStringValue = (s: string | StringWithContext): string => {
return s;
};
const emptyContext: NixStringContext = new Set();
export const getStringContext = (s: string | StringWithContext): NixStringContext => {
if (isStringWithContext(s)) {
return s.context;
}
-  return new Set();
+  return emptyContext;
};
export const mergeContexts = (...contexts: NixStringContext[]): NixStringContext => {
@@ -60,28 +67,6 @@ export const mergeContexts = (...contexts: NixStringContext[]): NixStringContext
return result;
};
-export const concatStringsWithContext = (
-  strings: (string | StringWithContext)[],
-): string | StringWithContext => {
-  const parts: string[] = [];
-  const contexts: NixStringContext[] = [];
-  for (const s of strings) {
-    parts.push(getStringValue(s));
-    const ctx = getStringContext(s);
-    if (ctx.size > 0) {
-      contexts.push(ctx);
-    }
-  }
-  const value = parts.join("");
-  if (contexts.length === 0) {
-    return value;
-  }
-  return mkStringWithContext(value, mergeContexts(...contexts));
-};
export const encodeContextElem = (elem: StringContextElem): string => {
switch (elem.type) {
case "opaque":
@@ -160,35 +145,9 @@ export const parseContextToInfoMap = (context: NixStringContext): Map<string, Pa
}
}
return result;
};
-export const extractInputDrvsAndSrcs = (
-  context: NixStringContext,
-): { inputDrvs: Map<string, Set<string>>; inputSrcs: Set<string> } => {
-  const inputDrvs = new Map<string, Set<string>>();
-  const inputSrcs = new Set<string>();
-  for (const encoded of context) {
-    const elem = decodeContextElem(encoded);
-    switch (elem.type) {
-      case "opaque":
-        inputSrcs.add(elem.path);
-        break;
-      case "drvDeep":
-        inputSrcs.add(elem.drvPath);
-        break;
-      case "built": {
-        let outputs = inputDrvs.get(elem.drvPath);
-        if (!outputs) {
-          outputs = new Set<string>();
-          inputDrvs.set(elem.drvPath, outputs);
-        }
-        outputs.add(elem.output);
-        break;
-      }
-    }
+  for (const info of result.values()) {
+    info.outputs.sort();
+  }
-  return { inputDrvs, inputSrcs };
+  return result;
};
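`mergeContexts` in this file is a plain set union over encoded context-element strings, and `getStringContext` now returns a shared empty set for plain strings. A minimal standalone sketch of that union behaviour (the element strings below are hypothetical placeholders, not real store-path encodings):

```typescript
// Minimal sketch of context merging as a set union, mirroring mergeContexts
// above. The element strings are made-up placeholders for illustration only.
type Ctx = Set<string>;

const mergeCtx = (...contexts: Ctx[]): Ctx => {
  const result: Ctx = new Set();
  for (const ctx of contexts) {
    for (const elem of ctx) result.add(elem);
  }
  return result;
};

const a: Ctx = new Set(["!out!/nix/store/aaa.drv"]);
const b: Ctx = new Set(["!out!/nix/store/aaa.drv", "/nix/store/bbb-src"]);
console.log(mergeCtx(a, b).size); // 2 — the duplicate element collapses
```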


@@ -1,41 +1,48 @@
/**
* Lazy evaluation system for nix-js
* Implements thunks for lazy evaluation of Nix expressions
*/
+import { isAttrs, isList } from "./builtins/type-check";
+import { StringWithContext } from "./string-context";
+import type { NixAttrs, NixStrictValue, NixValue } from "./types";
+import { NixPath } from "./types";
-import type { NixValue, NixThunkInterface, NixStrictValue } from "./types";
/**
* Symbol used to mark objects as thunks
* This is exported to Rust via Nix.IS_THUNK
*/
export const IS_THUNK = Symbol("is_thunk");
const forceStack: NixThunk[] = [];
const MAX_FORCE_DEPTH = 1000;
export const DEBUG_THUNKS = { enabled: true };
/**
* NixThunk class - represents a lazy, unevaluated expression
*
* A thunk wraps a function that produces a value when called.
* Once evaluated, the result is cached to avoid recomputation.
*
* Thunk states:
* - Unevaluated: func is defined, result is undefined
* - Evaluating (blackhole): func is undefined, result is undefined
* - Evaluated: func is undefined, result is defined
*/
-export class NixThunk implements NixThunkInterface {
-  [key: symbol]: any;
+export class NixThunk {
+  readonly [IS_THUNK] = true as const;
func: (() => NixValue) | undefined;
result: NixStrictValue | undefined;
readonly label: string | undefined;
-  constructor(func: () => NixValue) {
+  constructor(func: () => NixValue, label?: string) {
this.func = func;
this.result = undefined;
this.label = label;
}
toString(): string {
if (this.label) {
return `«thunk ${this.label}»`;
}
return `«thunk»`;
}
}
/**
* Type guard to check if a value is a thunk
* @param value - Value to check
* @returns true if value is a NixThunk
*/
-export const isThunk = (value: unknown): value is NixThunkInterface => {
-  return value !== null && typeof value === "object" && IS_THUNK in value && value[IS_THUNK] === true;
+export const isThunk = (value: NixValue): value is NixThunk => {
+  return value instanceof NixThunk;
};
/**
@@ -43,32 +50,139 @@ export const isThunk = (value: unknown): value is NixThunkInterface => {
* If the value is a thunk, evaluate it and cache the result
* If already evaluated or not a thunk, return as-is
*
* Uses "blackhole" detection to catch infinite recursion:
* - Before evaluating, set func to undefined (entering blackhole state)
* - If we encounter a thunk with func=undefined and result=undefined, it's a blackhole
*
* @param value - Value to force (may be a thunk)
* @returns The forced/evaluated value
* @throws Error if infinite recursion is detected
*/
export const force = (value: NixValue): NixStrictValue => {
if (!isThunk(value)) {
return value;
}
// Already evaluated - return cached result
if (value.func === undefined) {
-    return value.result!;
+    if (value.result === undefined) {
const thunk = value as NixThunk;
let msg = `infinite recursion encountered at ${thunk}\n`;
if (DEBUG_THUNKS.enabled) {
msg += "Force chain (most recent first):\n";
for (let i = forceStack.length - 1; i >= 0; i--) {
const t = forceStack[i];
msg += ` ${i + 1}. ${t}`;
msg += "\n";
}
}
throw new Error(msg);
}
return value.result;
}
-  // Evaluate and cache
-  const result = force(value.func());
-  value.result = result;
-  value.func = undefined;
+  const thunk = value as NixThunk;
+  const func = thunk.func as () => NixValue;
+  thunk.func = undefined;
-  return result;
if (DEBUG_THUNKS.enabled) {
forceStack.push(thunk);
if (forceStack.length > MAX_FORCE_DEPTH) {
let msg = `force depth exceeded ${MAX_FORCE_DEPTH} at ${thunk}\n`;
msg += "Force chain (most recent first):\n";
for (let i = forceStack.length - 1; i >= Math.max(0, forceStack.length - 20); i--) {
const t = forceStack[i];
msg += ` ${i + 1}. ${t}`;
msg += "\n";
}
throw new Error(msg);
}
}
try {
const result = force(func());
thunk.result = result;
return result;
} catch (e) {
thunk.func = func;
throw e;
} finally {
if (DEBUG_THUNKS.enabled) {
forceStack.pop();
}
}
};
export const createThunk = (func: () => NixValue, label?: string): NixThunk => {
return new NixThunk(func, label);
};
export const IS_CYCLE = Symbol("is_cycle");
export const CYCLE_MARKER = { [IS_CYCLE]: true as const };
/**
* Create a new thunk from a function
* @param func - Function that produces a value when called
* @returns A new NixThunk wrapping the function
* Deeply force a value, handling cycles by returning a special marker.
* Uses WeakSet to track seen objects and avoid infinite recursion.
* Returns a fully forced value where thunks are replaced with their results.
* Cyclic references are replaced with CYCLE_MARKER, preserving the container type.
*/
-export const createThunk = (func: () => NixValue): NixThunkInterface => {
-  return new NixThunk(func);
export const forceDeep = (value: NixValue, seen: WeakSet<object> = new WeakSet()): NixStrictValue => {
const forced = force(value);
if (forced === null || typeof forced !== "object") {
return forced;
}
if (seen.has(forced)) {
return CYCLE_MARKER;
}
if ((isAttrs(forced) && forced.size > 0) || (isList(forced) && forced.length > 0)) {
seen.add(forced);
}
if (forced instanceof StringWithContext || forced instanceof NixPath) {
return forced;
}
if (Array.isArray(forced)) {
return forced.map((item) => forceDeep(item, seen));
}
if (forced instanceof Map) {
const result: NixAttrs = new Map();
for (const [key, val] of forced) {
result.set(key, forceDeep(val, seen));
}
return result;
}
return forced;
};
export const forceShallow = (value: NixValue): NixStrictValue => {
const forced = force(value);
if (forced === null || typeof forced !== "object") {
return forced;
}
if (Array.isArray(forced)) {
return forced.map((item) => {
const forcedItem = force(item);
if (typeof forcedItem === "object" && forcedItem === forced) {
return CYCLE_MARKER;
} else {
return forcedItem;
}
});
}
if (isAttrs(forced)) {
const result: NixAttrs = new Map();
for (const [key, val] of forced) {
const forcedVal = force(val as NixValue);
result.set(key, forcedVal === forced ? CYCLE_MARKER : forcedVal);
}
return result;
}
return forced;
};
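The thunk life cycle documented above (unevaluated → evaluating/"blackhole" → evaluated) can be sketched without the runtime's value types. This toy version shows the two properties the comments describe: results are cached after the first force, and forcing a thunk that is already mid-evaluation is detected as infinite recursion (it omits the real `force`'s error-recovery and debug stack):

```typescript
// Toy thunk mirroring the states used above: func set = unevaluated,
// func and result both unset = currently evaluating ("blackhole"),
// result set = evaluated and cached.
class Thunk<T> {
  func?: () => T;
  result?: T;
  constructor(func: () => T) {
    this.func = func;
  }
}

let evaluations = 0;

const forceT = <T>(t: Thunk<T>): T => {
  if (t.func === undefined) {
    // No function left: either cached, or we re-entered mid-evaluation.
    if (t.result === undefined) throw new Error("infinite recursion encountered");
    return t.result;
  }
  const f = t.func;
  t.func = undefined; // enter the blackhole state before running
  const r = f();
  t.result = r; // cache the result
  return r;
};

const t = new Thunk(() => {
  evaluations++;
  return 40 + 2;
});
console.log(forceT(t), forceT(t), evaluations); // 42 42 1
```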


@@ -1,70 +1,48 @@
/**
* Type assertion helpers for runtime type checking
* These functions force evaluation and verify the type, throwing errors on mismatch
*/
-import type { NixValue, NixList, NixAttrs, NixFunction, NixInt, NixFloat, NixNumber, NixString } from "./types";
-import { isStringWithContext } from "./types";
import { isAttrs, isFunction, typeOf } from "./builtins/type-check";
import { force } from "./thunk";
import { getStringValue } from "./string-context";
import type {
NixAttrs,
NixFloat,
NixFunction,
NixInt,
NixList,
NixNumber,
NixPath,
NixString,
NixValue,
} from "./types";
import { isNixPath, isStringWithContext } from "./types";
-const typeName = (value: NixValue): string => {
-  const val = force(value);
-  if (typeof val === "bigint") return "int";
-  if (typeof val === "number") return "float";
-  if (typeof val === "boolean") return "boolean";
-  if (typeof val === "string") return "string";
-  if (isStringWithContext(val)) return "string";
-  if (val === null) return "null";
-  if (Array.isArray(val)) return "list";
-  if (typeof val === "function") return "lambda";
-  if (typeof val === "object") return "attribute set";
-  throw new TypeError(`Unknown Nix type: ${typeof val}`);
-};
/**
* Force a value and assert it's a list
* @throws TypeError if value is not a list after forcing
*/
export const forceList = (value: NixValue): NixList => {
const forced = force(value);
if (!Array.isArray(forced)) {
-    throw new TypeError(`Expected list, got ${typeName(forced)}`);
+    throw new TypeError(`Expected list, got ${typeOf(forced)}`);
}
return forced;
};
/**
* Force a value and assert it's a function
* @throws TypeError if value is not a function after forcing
*/
export const forceFunction = (value: NixValue): NixFunction => {
const forced = force(value);
-  if (typeof forced !== "function") {
-    throw new TypeError(`Expected function, got ${typeName(forced)}`);
+  if (isFunction(forced)) {
+    return forced;
+  }
-  return forced;
if (forced instanceof Map && forced.has("__functor")) {
const functorSet = forced as NixAttrs;
const functor = forceFunction(functorSet.get("__functor") as NixValue);
return (arg: NixValue) => forceFunction(functor(functorSet))(arg);
}
throw new TypeError(`Expected function, got ${typeOf(forced)}`);
};
/**
* Force a value and assert it's an attribute set
* @throws TypeError if value is not an attribute set after forcing
*/
export const forceAttrs = (value: NixValue): NixAttrs => {
const forced = force(value);
-  if (typeof forced !== "object" || Array.isArray(forced) || forced === null || isStringWithContext(forced)) {
-    throw new TypeError(`Expected attribute set, got ${typeName(forced)}`);
+  if (!isAttrs(forced)) {
+    throw new TypeError(`Expected attribute set, got ${typeOf(forced)}`);
}
return forced;
};
/**
* Force a value and assert it's a string (plain or with context)
* @throws TypeError if value is not a string after forcing
*/
-export const forceString = (value: NixValue): string => {
+export const forceStringValue = (value: NixValue): string => {
const forced = force(value);
if (typeof forced === "string") {
return forced;
@@ -72,14 +50,10 @@ export const forceString = (value: NixValue): string => {
if (isStringWithContext(forced)) {
return forced.value;
}
-  throw new TypeError(`Expected string, got ${typeName(forced)}`);
+  throw new TypeError(`Expected string, got ${typeOf(forced)}`);
};
/**
* Force a value and assert it's a string, returning NixString (preserving context)
* @throws TypeError if value is not a string after forcing
*/
-export const forceNixString = (value: NixValue): NixString => {
+export const forceString = (value: NixValue): NixString => {
const forced = force(value);
if (typeof forced === "string") {
return forced;
@@ -87,78 +61,67 @@ export const forceNixString = (value: NixValue): NixString => {
if (isStringWithContext(forced)) {
return forced;
}
-  throw new TypeError(`Expected string, got ${typeName(forced)}`);
+  throw new TypeError(`Expected string, got ${typeOf(forced)}`);
};
/**
* Get the plain string value from any NixString
*/
-export const nixStringValue = (s: NixString): string => {
-  return getStringValue(s);
export const forceStringNoCtx = (value: NixValue): string => {
const forced = force(value);
if (typeof forced === "string") {
return forced;
}
if (isStringWithContext(forced)) {
throw new TypeError(`the string '${forced.value}' is not allowed to refer to a store path`);
}
throw new TypeError(`Expected string, got ${typeOf(forced)}`);
};
/**
* Force a value and assert it's a boolean
* @throws TypeError if value is not a boolean after forcing
*/
export const forceBool = (value: NixValue): boolean => {
const forced = force(value);
if (typeof forced !== "boolean") {
-    throw new TypeError(`Expected boolean, got ${typeName(forced)}`);
+    throw new TypeError(`Expected boolean, got ${typeOf(forced)}`);
}
return forced;
};
/**
* Force a value and extract int value
* @throws TypeError if value is not an int
*/
export const forceInt = (value: NixValue): NixInt => {
const forced = force(value);
if (typeof forced === "bigint") {
return forced;
}
-  throw new TypeError(`Expected int, got ${typeName(forced)}`);
+  throw new TypeError(`Expected int, got ${typeOf(forced)}`);
};
/**
* Force a value and extract float value
* @throws TypeError if value is not a float
*/
export const forceFloat = (value: NixValue): NixFloat => {
const forced = force(value);
if (typeof forced === "number") {
return forced;
}
-  throw new TypeError(`Expected float, got ${typeName(forced)}`);
+  throw new TypeError(`Expected float, got ${typeOf(forced)}`);
};
/**
* Force a value and extract numeric value (int or float)
* @throws TypeError if value is not a numeric type
*/
export const forceNumeric = (value: NixValue): NixNumber => {
const forced = force(value);
if (typeof forced === "bigint" || typeof forced === "number") {
return forced;
}
-  throw new TypeError(`Expected numeric type, got ${typeName(forced)}`);
+  throw new TypeError(`Expected numeric type, got ${typeOf(forced)}`);
};
/**
* Coerce two numeric values to a common type for arithmetic
* Rule: If either is float, convert both to float; otherwise keep as bigint
* @returns [a, b] tuple of coerced values
*/
export const coerceNumeric = (a: NixNumber, b: NixNumber): [NixFloat, NixFloat] | [NixInt, NixInt] => {
const aIsInt = typeof a === "bigint";
const bIsInt = typeof b === "bigint";
// If either is float, convert both to float
if (!aIsInt || !bIsInt) {
-    return [aIsInt ? Number(a) : a, bIsInt ? Number(b) : b];
+    return [Number(a), Number(b)];
}
// Both are integers
return [a, b];
};
export const forceNixPath = (value: NixValue): NixPath => {
const forced = force(value);
if (isNixPath(forced)) {
return forced;
}
throw new TypeError(`Expected path, got ${typeOf(forced)}`);
};
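The coercion rule documented above `coerceNumeric` (if either operand is a float, promote both to float; otherwise keep both as bigint ints) in a self-contained sketch:

```typescript
// Standalone sketch of the int/float coercion rule above: any float (number)
// operand promotes both sides to float; two ints (bigint) stay bigints.
type Num = bigint | number;

const coerce = (a: Num, b: Num): [number, number] | [bigint, bigint] => {
  if (typeof a === "bigint" && typeof b === "bigint") {
    return [a, b]; // both ints: no conversion
  }
  return [Number(a), Number(b)]; // at least one float: promote both
};

console.log(coerce(1n, 2n)); // [ 1n, 2n ]
console.log(coerce(1n, 2.5)); // [ 1, 2.5 ]
```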


@@ -1,13 +1,24 @@
/**
* Core TypeScript type definitions for nix-js runtime
*/
-import { IS_THUNK } from "./thunk";
-import { type StringWithContext, HAS_CONTEXT, isStringWithContext } from "./string-context";
+import { PRIMOP_METADATA, type PrimopMetadata } from "./builtins";
+import { HAS_CONTEXT, isStringWithContext, type StringWithContext } from "./string-context";
+import { type CYCLE_MARKER, force, type NixThunk } from "./thunk";
+import { forceAttrs, forceStringNoCtx } from "./type-assert";
export { HAS_CONTEXT, isStringWithContext };
export type { StringWithContext };
// Nix primitive types
export const IS_PATH = Symbol("IS_PATH");
export class NixPath {
readonly [IS_PATH] = true as const;
value: string;
constructor(value: string) {
this.value = value;
}
}
export const isNixPath = (v: NixStrictValue): v is NixPath => {
return v instanceof NixPath;
};
export type NixInt = bigint;
export type NixFloat = number;
export type NixNumber = NixInt | NixFloat;
@@ -15,49 +26,90 @@ export type NixBool = boolean;
export type NixString = string | StringWithContext;
export type NixNull = null;
// Nix composite types
export const ATTR_POSITIONS = Symbol("attrPositions");
export type NixList = NixValue[];
-export type NixAttrs = { [key: string]: NixValue };
-export type NixFunction = (arg: NixValue) => NixValue;
+export type NixAttrs = Map<string, NixValue> & { [ATTR_POSITIONS]?: Map<string, number> };
+export type NixFunction = ((arg: NixValue) => NixValue) & {
+  args?: NixArgs;
+  [PRIMOP_METADATA]?: PrimopMetadata;
+};
export class NixArgs {
required: string[];
optional: string[];
allowed: Set<string>;
ellipsis: boolean;
positions: Map<string, number>;
constructor(required: string[], optional: string[], positions: Map<string, number>, ellipsis: boolean) {
this.required = required;
this.optional = optional;
this.positions = positions;
this.ellipsis = ellipsis;
this.allowed = new Set(required.concat(optional));
}
check(arg: NixValue) {
const attrs = forceAttrs(arg);
-/**
- * Interface for lazy thunk values
- * Thunks delay evaluation until forced
- */
-export interface NixThunkInterface {
-  readonly [IS_THUNK]: true;
-  func: (() => NixValue) | undefined;
-  result: NixStrictValue | undefined;
for (const key of this.required) {
if (!attrs.has(key)) {
throw new Error(`Function called without required argument '${key}'`);
}
}
if (!this.ellipsis) {
for (const key of attrs.keys()) {
if (!this.allowed.has(key)) {
throw new Error(`Function called with unexpected argument '${key}'`);
}
}
}
}
}
export const mkFunction = (
f: (arg: NixValue) => NixValue,
required: string[],
optional: string[],
positions: Map<string, number>,
ellipsis: boolean,
): NixFunction => {
const func: NixFunction = f;
func.args = new NixArgs(required, optional, positions, ellipsis);
return func;
};
export const mkAttrs = (
attrs: NixAttrs,
positions: Map<string, number>,
dyns?: { dynKeys: NixValue[]; dynVals: NixValue[]; dynSpans: number[] },
): NixAttrs => {
if (dyns) {
const len = dyns.dynKeys.length;
for (let i = 0; i < len; i++) {
const key = force(dyns.dynKeys[i]);
if (key === null) {
continue;
}
const str = forceStringNoCtx(key);
attrs.set(str, dyns.dynVals[i]);
positions.set(str, dyns.dynSpans[i]);
}
}
if (positions.size > 0) {
attrs[ATTR_POSITIONS] = positions;
}
return attrs;
};
// Union of all Nix primitive types
export type NixPrimitive = NixNull | NixBool | NixInt | NixFloat | NixString;
export type NixValue =
| NixPrimitive
| NixPath
| NixList
| NixAttrs
| NixFunction
| NixThunk
| typeof CYCLE_MARKER;
export type NixStrictValue = Exclude<NixValue, NixThunk>;
-/**
- * NixValue: Union type representing any possible Nix value
- * This is the core type used throughout the runtime
- */
-export type NixValue = NixPrimitive | NixList | NixAttrs | NixFunction | NixThunkInterface;
-export type NixStrictValue = Exclude<NixValue, NixThunkInterface>;
/**
* CatchableError: Error type thrown by `builtins.throw`
* This can be caught by `builtins.tryEval`
*/
export class CatchableError extends Error {}
// Operator function signatures
export type BinaryOp<T = NixValue, U = NixValue, R = NixValue> = (a: T, b: U) => R;
export type UnaryOp<T = NixValue, R = NixValue> = (a: T) => R;
/**
* Curried function types - All Nix builtins must be curried!
*
* Examples:
* - add: Curried2<number, number, number> = (a) => (b) => a + b
* - map: Curried2<NixFunction, NixList, NixList> = (f) => (list) => list.map(f)
*/
export type Curried2<A, B, R> = (a: A) => (b: B) => R;
export type Curried3<A, B, C, R> = (a: A) => (b: B) => (c: C) => R;
export type Curried4<A, B, C, D, R> = (a: A) => (b: B) => (c: C) => (d: D) => R;
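`NixArgs.check` above validates a pattern call's attribute set: every required formal must be present, and unless the pattern has an ellipsis, no unexpected names may appear. A self-contained sketch of the same two checks, using a simplified `Pattern` class in place of the runtime's `NixArgs`:

```typescript
// Self-contained sketch of pattern-argument validation, mirroring
// NixArgs.check above: required names must be present; without an
// ellipsis (`...`), no names outside required+optional are allowed.
class Pattern {
  allowed: Set<string>;
  constructor(
    public required: string[],
    optional: string[],
    public ellipsis: boolean,
  ) {
    this.allowed = new Set([...required, ...optional]);
  }
  check(attrs: Map<string, unknown>): void {
    for (const key of this.required) {
      if (!attrs.has(key)) {
        throw new Error(`Function called without required argument '${key}'`);
      }
    }
    if (!this.ellipsis) {
      for (const key of attrs.keys()) {
        if (!this.allowed.has(key)) {
          throw new Error(`Function called with unexpected argument '${key}'`);
        }
      }
    }
  }
}

// Like the Nix pattern { pname, version ? "1.0", ... }:
const p = new Pattern(["pname"], ["version"], true);
p.check(new Map<string, unknown>([["pname", "hello"], ["extra", 1]])); // ok: ellipsis allows extras
```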


@@ -1,23 +1,133 @@
import type { NixRuntime } from "..";
import type { builtins } from "../builtins";
import type { FetchGitResult, FetchTarballResult, FetchUrlResult } from "../builtins/io";
import type {
assert,
call,
concatStringsWithContext,
hasAttr,
lookupWith,
mkPos,
resolvePath,
select,
selectWithDefault,
} from "../helpers";
import type { op } from "../operators";
import type { createThunk, force } from "../thunk";
import type { forceBool } from "../type-assert";
import type { mkAttrs, mkFunction, NixAttrs, NixStrictValue } from "../types";
declare global {
var Nix: NixRuntime;
var $t: typeof createThunk;
var $f: typeof force;
var $fb: typeof forceBool;
var $a: typeof assert;
var $c: typeof call;
var $h: typeof hasAttr;
var $s: typeof select;
var $sd: typeof selectWithDefault;
var $l: typeof lookupWith;
var $r: typeof resolvePath;
var $cs: typeof concatStringsWithContext;
var $ma: typeof mkAttrs;
var $mf: typeof mkFunction;
var $mp: typeof mkPos;
var $oa: typeof op.add;
var $os: typeof op.sub;
var $om: typeof op.mul;
var $od: typeof op.div;
var $oe: typeof op.eq;
var $ol: typeof op.lt;
var $og: typeof op.gt;
var $oc: typeof op.concat;
var $ou: typeof op.update;
var $b: typeof builtins;
var $e: NixAttrs;
var $gb: typeof Nix.getReplBinding;
namespace Deno {
namespace core {
namespace ops {
-      function op_resolve_path(path: string): string;
-      function op_import(path: string): string;
+      function op_import(path: string): [Uint8Array, string];
+      function op_scoped_import(path: string, scopeKeys: string[]): [Uint8Array, string];
+      function op_resolve_path(currentDir: string, path: string): string;
function op_read_file(path: string): string;
function op_read_file_type(path: string): string;
function op_read_dir(path: string): Map<string, string>;
function op_path_exists(path: string): boolean;
function op_sha256_hex(data: string): string;
-      function op_make_store_path(ty: string, hash_hex: string, name: string): string;
-      function op_output_path_name(drv_name: string, output_name: string): string;
-      function op_make_fixed_output_path(
-        hash_algo: string,
-        hash: string,
-        hash_mode: string,
-        name: string,
+      function op_walk_dir(path: string): [string, string][];
function op_make_placeholder(output: string): string;
function op_store_path(path: string): string;
function op_convert_hash(hash: string, hashAlgo: string | null, toHashFormat: string): string;
function op_hash_string(algo: string, data: string): string;
function op_hash_file(algo: string, path: string): string;
function op_parse_hash(hashStr: string, algo: string | null): { hex: string; algo: string };
function op_add_path(
path: string,
name: string | null,
recursive: boolean,
sha256: string | null,
): string;
function op_add_filtered_path(
path: string,
name: string | null,
recursive: boolean,
sha256: string | null,
includePaths: string[],
): string;
function op_decode_span(span: number): NixAttrs;
function op_to_file(name: string, contents: string, references: string[]): string;
function op_copy_path_to_store(path: string): string;
function op_get_env(key: string): string;
function op_match(regex: string, text: string): (string | null)[] | null;
function op_split(regex: string, text: string): (string | (string | null)[])[];
function op_from_json(json: string): NixStrictValue;
function op_from_toml(toml: string): NixStrictValue;
function op_to_xml(e: NixValue): [string, string[]];
function op_finalize_derivation(
name: string,
builder: string,
platform: string,
outputs: string[],
args: string[],
env: [string, string][],
context: string[],
fixedOutput: { hashAlgo: string; hash: string; hashMode: string } | null,
): { drvPath: string; outputs: [string, string][] };
function op_fetch_url(
url: string,
expectedHash: string | null,
name: string | null,
executable: boolean,
): FetchUrlResult;
function op_fetch_tarball(
url: string,
name: string | null,
sha256: string | null,
): FetchTarballResult;
function op_fetch_git(
url: string,
ref: string | null,
rev: string | null,
shallow: boolean,
submodules: boolean,
allRefs: boolean,
name: string | null,
): FetchGitResult;
}
}
}

nix-js/runtime-ts/src/vm.ts (new file, 617 lines)

@@ -0,0 +1,617 @@
import {
assert,
call,
concatStringsWithContext,
hasAttr,
lookupWith,
mkPos,
resolvePath,
select,
selectWithDefault,
} from "./helpers";
import { op } from "./operators";
import { NixThunk } from "./thunk";
import { forceBool } from "./type-assert";
import { mkAttrs, NixArgs, type NixAttrs, type NixFunction, type NixValue } from "./types";
import { builtins } from "./builtins";
enum Op {
PushConst = 0x01,
PushString = 0x02,
PushNull = 0x03,
PushTrue = 0x04,
PushFalse = 0x05,
LoadLocal = 0x06,
LoadOuter = 0x07,
StoreLocal = 0x08,
AllocLocals = 0x09,
MakeThunk = 0x0A,
MakeClosure = 0x0B,
MakePatternClosure = 0x0C,
Call = 0x0D,
CallNoSpan = 0x0E,
MakeAttrs = 0x0F,
MakeAttrsDyn = 0x10,
MakeEmptyAttrs = 0x11,
Select = 0x12,
SelectDefault = 0x13,
HasAttr = 0x14,
MakeList = 0x15,
OpAdd = 0x16,
OpSub = 0x17,
OpMul = 0x18,
OpDiv = 0x19,
OpEq = 0x20,
OpNeq = 0x21,
OpLt = 0x22,
OpGt = 0x23,
OpLeq = 0x24,
OpGeq = 0x25,
OpConcat = 0x26,
OpUpdate = 0x27,
OpNeg = 0x28,
OpNot = 0x29,
ForceBool = 0x30,
JumpIfFalse = 0x31,
JumpIfTrue = 0x32,
Jump = 0x33,
ConcatStrings = 0x34,
ResolvePath = 0x35,
Assert = 0x36,
PushWith = 0x37,
PopWith = 0x38,
WithLookup = 0x39,
LoadBuiltins = 0x40,
LoadBuiltin = 0x41,
MkPos = 0x43,
LoadReplBinding = 0x44,
LoadScopedBinding = 0x45,
Return = 0x46,
}
interface ScopeChain {
locals: NixValue[];
parent: ScopeChain | null;
}
interface WithScope {
env: NixValue;
last: WithScope | null;
}
const strings: string[] = [];
const constants: NixValue[] = [];
const $e: NixAttrs = new Map();
function readU16(code: Uint8Array, offset: number): number {
return code[offset] | (code[offset + 1] << 8);
}
function readU32(code: Uint8Array, offset: number): number {
return (
code[offset] |
(code[offset + 1] << 8) |
(code[offset + 2] << 16) |
(code[offset + 3] << 24)
) >>> 0;
}
function readI32(code: Uint8Array, offset: number): number {
return code[offset] | (code[offset + 1] << 8) | (code[offset + 2] << 16) | (code[offset + 3] << 24);
}
export function execBytecode(code: Uint8Array, currentDir: string): NixValue {
const chain: ScopeChain = { locals: [], parent: null };
return execFrame(code, 0, chain, currentDir, null, null);
}
export function execBytecodeScoped(
code: Uint8Array,
currentDir: string,
scopeMap: NixAttrs,
): NixValue {
const chain: ScopeChain = { locals: [], parent: null };
return execFrame(code, 0, chain, currentDir, null, scopeMap);
}
function execFrame(
code: Uint8Array,
startPc: number,
chain: ScopeChain,
currentDir: string,
withScope: WithScope | null,
scopeMap: NixAttrs | null,
): NixValue {
const locals = chain.locals;
const stack: NixValue[] = [];
let pc = startPc;
for (;;) {
const opcode = code[pc++];
switch (opcode) {
case Op.PushConst: {
const idx = readU32(code, pc);
pc += 4;
stack.push(constants[idx]);
break;
}
case Op.PushString: {
const idx = readU32(code, pc);
pc += 4;
stack.push(strings[idx]);
break;
}
case Op.PushNull:
stack.push(null);
break;
case Op.PushTrue:
stack.push(true);
break;
case Op.PushFalse:
stack.push(false);
break;
case Op.LoadLocal: {
const idx = readU32(code, pc);
pc += 4;
stack.push(locals[idx]);
break;
}
case Op.LoadOuter: {
const layer = code[pc++];
const idx = readU32(code, pc);
pc += 4;
let c: ScopeChain = chain;
for (let i = 0; i < layer; i++) c = c.parent!;
stack.push(c.locals[idx]);
break;
}
case Op.StoreLocal: {
const idx = readU32(code, pc);
pc += 4;
locals[idx] = stack.pop()!;
break;
}
case Op.AllocLocals: {
const n = readU32(code, pc);
pc += 4;
for (let i = 0; i < n; i++) locals.push(null);
break;
}
case Op.MakeThunk: {
const bodyPc = readU32(code, pc);
pc += 4;
const labelIdx = readU32(code, pc);
pc += 4;
const label = strings[labelIdx];
const scopeChain = chain;
const scopeCode = code;
const scopeDir = currentDir;
const scopeWith = withScope;
stack.push(
new NixThunk(
() => execFrame(scopeCode, bodyPc, scopeChain, scopeDir, scopeWith, null),
label,
),
);
break;
}
case Op.MakeClosure: {
const bodyPc = readU32(code, pc);
pc += 4;
const nSlots = readU32(code, pc);
pc += 4;
const closureChain = chain;
const closureCode = code;
const closureDir = currentDir;
const closureWith = withScope;
const func: NixFunction = (arg: NixValue) => {
const innerLocals = new Array<NixValue>(1 + nSlots).fill(null);
innerLocals[0] = arg;
const innerChain: ScopeChain = { locals: innerLocals, parent: closureChain };
return execFrame(closureCode, bodyPc, innerChain, closureDir, closureWith, null);
};
stack.push(func);
break;
}
case Op.MakePatternClosure: {
const bodyPc = readU32(code, pc);
pc += 4;
const nSlots = readU32(code, pc);
pc += 4;
const nRequired = readU16(code, pc);
pc += 2;
const nOptional = readU16(code, pc);
pc += 2;
const hasEllipsis = code[pc++] !== 0;
const required: string[] = [];
for (let i = 0; i < nRequired; i++) {
required.push(strings[readU32(code, pc)]);
pc += 4;
}
const optional: string[] = [];
for (let i = 0; i < nOptional; i++) {
optional.push(strings[readU32(code, pc)]);
pc += 4;
}
const positions = new Map<string, number>();
const nTotal = nRequired + nOptional;
for (let i = 0; i < nTotal; i++) {
const nameIdx = readU32(code, pc);
pc += 4;
const spanId = readU32(code, pc);
pc += 4;
positions.set(strings[nameIdx], spanId);
}
const closureChain = chain;
const closureCode = code;
const closureDir = currentDir;
const closureWith = withScope;
const func: NixFunction = (arg: NixValue) => {
const innerLocals = new Array<NixValue>(1 + nSlots).fill(null);
innerLocals[0] = arg;
const innerChain: ScopeChain = { locals: innerLocals, parent: closureChain };
return execFrame(closureCode, bodyPc, innerChain, closureDir, closureWith, null);
};
func.args = new NixArgs(required, optional, positions, hasEllipsis);
stack.push(func);
break;
}
case Op.Call: {
const spanId = readU32(code, pc);
pc += 4;
const arg = stack.pop()!;
const func = stack.pop()!;
stack.push(call(func, arg, spanId));
break;
}
case Op.CallNoSpan: {
const arg = stack.pop()!;
const func = stack.pop()!;
stack.push(call(func, arg));
break;
}
case Op.MakeAttrs: {
const n = readU32(code, pc);
pc += 4;
const spanValues: number[] = [];
for (let i = 0; i < n; i++) {
spanValues.push(stack.pop() as number);
}
spanValues.reverse();
const map: NixAttrs = new Map();
const posMap = new Map<string, number>();
const pairs: [string, NixValue][] = [];
for (let i = 0; i < n; i++) {
const val = stack.pop()!;
const key = stack.pop() as string;
pairs.push([key, val]);
}
pairs.reverse();
for (let i = 0; i < n; i++) {
map.set(pairs[i][0], pairs[i][1]);
posMap.set(pairs[i][0], spanValues[i]);
}
stack.push(mkAttrs(map, posMap));
break;
}
case Op.MakeAttrsDyn: {
const nStatic = readU32(code, pc);
pc += 4;
const nDyn = readU32(code, pc);
pc += 4;
const dynTriples: [NixValue, NixValue, number][] = [];
for (let i = 0; i < nDyn; i++) {
const dynSpan = stack.pop() as number;
const dynVal = stack.pop()!;
const dynKey = stack.pop()!;
dynTriples.push([dynKey, dynVal, dynSpan]);
}
dynTriples.reverse();
const spanValues: number[] = [];
for (let i = 0; i < nStatic; i++) {
spanValues.push(stack.pop() as number);
}
spanValues.reverse();
const map: NixAttrs = new Map();
const posMap = new Map<string, number>();
const pairs: [string, NixValue][] = [];
for (let i = 0; i < nStatic; i++) {
const val = stack.pop()!;
const key = stack.pop() as string;
pairs.push([key, val]);
}
pairs.reverse();
for (let i = 0; i < nStatic; i++) {
map.set(pairs[i][0], pairs[i][1]);
posMap.set(pairs[i][0], spanValues[i]);
}
const dynKeys: NixValue[] = [];
const dynVals: NixValue[] = [];
const dynSpans: number[] = [];
for (const [k, v, s] of dynTriples) {
dynKeys.push(k);
dynVals.push(v);
dynSpans.push(s);
}
stack.push(mkAttrs(map, posMap, { dynKeys, dynVals, dynSpans }));
break;
}
case Op.MakeEmptyAttrs:
stack.push($e);
break;
case Op.Select: {
const nKeys = readU16(code, pc);
pc += 2;
const spanId = readU32(code, pc);
pc += 4;
const keys: NixValue[] = [];
for (let i = 0; i < nKeys; i++) keys.push(stack.pop()!);
keys.reverse();
const obj = stack.pop()!;
stack.push(select(obj, keys, spanId));
break;
}
case Op.SelectDefault: {
const nKeys = readU16(code, pc);
pc += 2;
const spanId = readU32(code, pc);
pc += 4;
const defaultVal = stack.pop()!;
const keys: NixValue[] = [];
for (let i = 0; i < nKeys; i++) keys.push(stack.pop()!);
keys.reverse();
const obj = stack.pop()!;
stack.push(selectWithDefault(obj, keys, defaultVal, spanId));
break;
}
case Op.HasAttr: {
const nKeys = readU16(code, pc);
pc += 2;
const keys: NixValue[] = [];
for (let i = 0; i < nKeys; i++) keys.push(stack.pop()!);
keys.reverse();
const obj = stack.pop()!;
stack.push(hasAttr(obj, keys));
break;
}
case Op.MakeList: {
const count = readU32(code, pc);
pc += 4;
const items: NixValue[] = new Array(count);
for (let i = count - 1; i >= 0; i--) {
items[i] = stack.pop()!;
}
stack.push(items);
break;
}
case Op.OpAdd: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.add(a, b));
break;
}
case Op.OpSub: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.sub(a, b));
break;
}
case Op.OpMul: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.mul(a, b));
break;
}
case Op.OpDiv: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.div(a, b));
break;
}
case Op.OpEq: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.eq(a, b));
break;
}
case Op.OpNeq: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(!op.eq(a, b));
break;
}
case Op.OpLt: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.lt(a, b));
break;
}
case Op.OpGt: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.gt(a, b));
break;
}
case Op.OpLeq: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(!op.gt(a, b));
break;
}
case Op.OpGeq: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(!op.lt(a, b));
break;
}
case Op.OpConcat: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.concat(a, b));
break;
}
case Op.OpUpdate: {
const b = stack.pop()!;
const a = stack.pop()!;
stack.push(op.update(a, b));
break;
}
case Op.OpNeg: {
const a = stack.pop()!;
stack.push(op.sub(0n, a));
break;
}
case Op.OpNot: {
const a = stack.pop()!;
stack.push(!forceBool(a));
break;
}
case Op.ForceBool: {
const val = stack.pop()!;
stack.push(forceBool(val));
break;
}
case Op.JumpIfFalse: {
const offset = readI32(code, pc);
pc += 4;
const val = stack.pop()!;
if (val === false) {
pc += offset;
}
break;
}
case Op.JumpIfTrue: {
const offset = readI32(code, pc);
pc += 4;
const val = stack.pop()!;
if (val === true) {
pc += offset;
}
break;
}
case Op.Jump: {
const offset = readI32(code, pc);
pc += 4;
pc += offset;
break;
}
case Op.ConcatStrings: {
const nParts = readU16(code, pc);
pc += 2;
const forceString = code[pc++] !== 0;
const parts: NixValue[] = new Array(nParts);
for (let i = nParts - 1; i >= 0; i--) {
parts[i] = stack.pop()!;
}
stack.push(concatStringsWithContext(parts, forceString));
break;
}
case Op.ResolvePath: {
const pathExpr = stack.pop()!;
stack.push(resolvePath(currentDir, pathExpr));
break;
}
case Op.Assert: {
const rawIdx = readU32(code, pc);
pc += 4;
const spanId = readU32(code, pc);
pc += 4;
const expr = stack.pop()!;
const assertion = stack.pop()!;
stack.push(assert(assertion, expr, strings[rawIdx], spanId));
break;
}
case Op.PushWith: {
const namespace = stack.pop()!;
withScope = { env: namespace, last: withScope };
break;
}
case Op.PopWith:
withScope = withScope!.last;
break;
case Op.WithLookup: {
const nameIdx = readU32(code, pc);
pc += 4;
stack.push(lookupWith(strings[nameIdx], withScope!));
break;
}
case Op.LoadBuiltins:
stack.push(builtins);
break;
case Op.LoadBuiltin: {
const idx = readU32(code, pc);
pc += 4;
stack.push(builtins.get(strings[idx])!);
break;
}
case Op.MkPos: {
const spanId = readU32(code, pc);
pc += 4;
stack.push(mkPos(spanId));
break;
}
case Op.LoadReplBinding: {
const idx = readU32(code, pc);
pc += 4;
stack.push(Nix.getReplBinding(strings[idx]));
break;
}
case Op.LoadScopedBinding: {
const idx = readU32(code, pc);
pc += 4;
stack.push(scopeMap!.get(strings[idx])!);
break;
}
case Op.Return:
return stack.pop()!;
default:
throw new Error(`Unknown bytecode opcode: ${opcode === undefined ? "undefined" : `0x${opcode.toString(16)}`} at pc=${pc - 1}`);
}
}
}
declare const Nix: {
getReplBinding: (name: string) => NixValue;
};
export { strings as vmStrings, constants as vmConstants };
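
Every multi-byte operand the VM decodes above (`readU32`, `readI32`, `readU16`) is little-endian, which must match the emitter's `emit_u32`/`emit_u16` on the Rust side (`to_le_bytes`). A minimal stdlib-only sketch of that round trip — the free functions `emit_u32`/`read_u32` here are illustrative stand-ins for the `BytecodeEmitter` method and the VM helper:

```rust
// Append a u32 operand in little-endian byte order, as BytecodeEmitter::emit_u32 does.
fn emit_u32(code: &mut Vec<u8>, val: u32) {
    code.extend_from_slice(&val.to_le_bytes());
}

// Decode it back at program counter `pc`, as the VM's readU32 must.
fn read_u32(code: &[u8], pc: usize) -> u32 {
    u32::from_le_bytes(code[pc..pc + 4].try_into().unwrap())
}

fn main() {
    let mut code = Vec::new();
    emit_u32(&mut code, 0xDEAD_BEEF);
    // Least-significant byte first: LE encoding.
    assert_eq!(code, [0xEF, 0xBE, 0xAD, 0xDE]);
    assert_eq!(read_u32(&code, 0), 0xDEAD_BEEF);
}
```

If either side used big-endian order the bytecode would still load, but every operand (constant indices, span ids, jump offsets) would silently decode to the wrong value, so the two halves must agree byte-for-byte.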

@@ -1,23 +0,0 @@
use anyhow::Result;
use nix_js::context::Context;
use std::process::exit;
fn main() -> Result<()> {
let mut args = std::env::args();
if args.len() != 2 {
eprintln!("Usage: {} expr", args.next().unwrap());
exit(1);
}
args.next();
let expr = args.next().unwrap();
match Context::new()?.eval_code(&expr) {
Ok(value) => {
println!("{value}");
Ok(())
}
Err(err) => {
eprintln!("Error: {err}");
Err(anyhow::anyhow!("{err}"))
}
}
}

@@ -1,53 +0,0 @@
use anyhow::Result;
use regex::Regex;
use rustyline::DefaultEditor;
use rustyline::error::ReadlineError;
use nix_js::context::Context;
fn main() -> Result<()> {
let mut rl = DefaultEditor::new()?;
let mut context = Context::new()?;
let re = Regex::new(r"^\s*([a-zA-Z_][a-zA-Z0-9_'-]*)\s*=(.*)$").unwrap();
loop {
let readline = rl.readline("nix-js-repl> ");
match readline {
Ok(line) => {
if line.trim().is_empty() {
continue;
}
let _ = rl.add_history_entry(line.as_str());
if let Some(_caps) = re.captures(&line) {
eprintln!("Error: binding not implemented yet");
continue;
/* let ident = caps.get(1).unwrap().as_str();
let expr = caps.get(2).unwrap().as_str().trim();
if expr.is_empty() {
eprintln!("Error: missing expression after '='");
continue;
}
if let Err(err) = context.add_binding(ident, expr) {
eprintln!("Error: {}", err);
} */
} else {
match context.eval_code(&line) {
Ok(value) => println!("{value}"),
Err(err) => eprintln!("Error: {err}"),
}
}
}
Err(ReadlineError::Interrupted) => {
println!();
}
Err(ReadlineError::Eof) => {
println!("CTRL-D");
break;
}
Err(err) => {
eprintln!("Error: {err:?}");
break;
}
}
}
Ok(())
}

nix-js/src/bytecode.rs Normal file
@@ -0,0 +1,906 @@
use std::ops::Deref;
use std::path::Path;
use hashbrown::HashMap;
use num_enum::TryFromPrimitive;
use rnix::TextRange;
use crate::ir::{ArgId, Attr, BinOpKind, Ir, Param, RawIrRef, SymId, ThunkId, UnOpKind};
#[derive(Clone, Hash, Eq, PartialEq)]
pub(crate) enum Constant {
Int(i64),
Float(u64),
}
pub struct Bytecode {
pub code: Box<[u8]>,
pub current_dir: String,
}
pub(crate) trait BytecodeContext {
fn intern_string(&mut self, s: &str) -> u32;
fn intern_constant(&mut self, c: Constant) -> u32;
fn register_span(&self, range: TextRange) -> u32;
fn get_sym(&self, id: SymId) -> &str;
fn get_current_dir(&self) -> &Path;
}
#[repr(u8)]
#[derive(Clone, Copy, TryFromPrimitive)]
#[allow(clippy::enum_variant_names)]
pub enum Op {
PushConst = 0x01,
PushString = 0x02,
PushNull = 0x03,
PushTrue = 0x04,
PushFalse = 0x05,
LoadLocal = 0x06,
LoadOuter = 0x07,
StoreLocal = 0x08,
AllocLocals = 0x09,
MakeThunk = 0x0A,
MakeClosure = 0x0B,
MakePatternClosure = 0x0C,
Call = 0x0D,
CallNoSpan = 0x0E,
MakeAttrs = 0x0F,
MakeAttrsDyn = 0x10,
MakeEmptyAttrs = 0x11,
Select = 0x12,
SelectDefault = 0x13,
HasAttr = 0x14,
MakeList = 0x15,
OpAdd = 0x16,
OpSub = 0x17,
OpMul = 0x18,
OpDiv = 0x19,
OpEq = 0x20,
OpNeq = 0x21,
OpLt = 0x22,
OpGt = 0x23,
OpLeq = 0x24,
OpGeq = 0x25,
OpConcat = 0x26,
OpUpdate = 0x27,
OpNeg = 0x28,
OpNot = 0x29,
ForceBool = 0x30,
JumpIfFalse = 0x31,
JumpIfTrue = 0x32,
Jump = 0x33,
ConcatStrings = 0x34,
ResolvePath = 0x35,
Assert = 0x36,
PushWith = 0x37,
PopWith = 0x38,
WithLookup = 0x39,
LoadBuiltins = 0x40,
LoadBuiltin = 0x41,
MkPos = 0x43,
LoadReplBinding = 0x44,
LoadScopedBinding = 0x45,
Return = 0x46,
}
struct ScopeInfo {
depth: u16,
arg_id: Option<ArgId>,
thunk_map: HashMap<ThunkId, u32>,
}
struct BytecodeEmitter<'a, Ctx: BytecodeContext> {
ctx: &'a mut Ctx,
code: Vec<u8>,
scope_stack: Vec<ScopeInfo>,
}
pub(crate) fn compile_bytecode(ir: RawIrRef<'_>, ctx: &mut impl BytecodeContext) -> Bytecode {
let current_dir = ctx.get_current_dir().to_string_lossy().to_string();
let mut emitter = BytecodeEmitter::new(ctx);
emitter.emit_toplevel(ir);
Bytecode {
code: emitter.code.into_boxed_slice(),
current_dir,
}
}
pub(crate) fn compile_bytecode_scoped(
ir: RawIrRef<'_>,
ctx: &mut impl BytecodeContext,
) -> Bytecode {
let current_dir = ctx.get_current_dir().to_string_lossy().to_string();
let mut emitter = BytecodeEmitter::new(ctx);
emitter.emit_toplevel_scoped(ir);
Bytecode {
code: emitter.code.into_boxed_slice(),
current_dir,
}
}
impl<'a, Ctx: BytecodeContext> BytecodeEmitter<'a, Ctx> {
fn new(ctx: &'a mut Ctx) -> Self {
Self {
ctx,
code: Vec::with_capacity(4096),
scope_stack: Vec::with_capacity(32),
}
}
#[inline]
fn emit_op(&mut self, op: Op) {
self.code.push(op as u8);
}
#[inline]
fn emit_u8(&mut self, val: u8) {
self.code.push(val);
}
#[inline]
fn emit_u16(&mut self, val: u16) {
self.code.extend_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_u32(&mut self, val: u32) {
self.code.extend_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_i32_placeholder(&mut self) -> usize {
let offset = self.code.len();
self.code.extend_from_slice(&[0u8; 4]);
offset
}
#[inline]
fn patch_i32(&mut self, offset: usize, val: i32) {
self.code[offset..offset + 4].copy_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_jump_placeholder(&mut self) -> usize {
self.emit_op(Op::Jump);
self.emit_i32_placeholder()
}
#[inline]
fn patch_jump_target(&mut self, placeholder_offset: usize) {
let current_pos = self.code.len();
let relative_offset = (current_pos as i32) - (placeholder_offset as i32) - 4;
self.patch_i32(placeholder_offset, relative_offset);
}
fn current_depth(&self) -> u16 {
self.scope_stack.last().map_or(0, |s| s.depth)
}
fn resolve_thunk(&self, id: ThunkId) -> (u16, u32) {
for scope in self.scope_stack.iter().rev() {
if let Some(&local_idx) = scope.thunk_map.get(&id) {
let layer = self.current_depth() - scope.depth;
return (layer, local_idx);
}
}
panic!("ThunkId {:?} not found in any scope", id);
}
fn resolve_arg(&self, id: ArgId) -> (u16, u32) {
for scope in self.scope_stack.iter().rev() {
if scope.arg_id == Some(id) {
let layer = self.current_depth() - scope.depth;
return (layer, 0);
}
}
panic!("ArgId {:?} not found in any scope", id);
}
fn emit_load(&mut self, layer: u16, local: u32) {
if layer == 0 {
self.emit_op(Op::LoadLocal);
self.emit_u32(local);
} else {
self.emit_op(Op::LoadOuter);
self.emit_u8(layer as u8);
self.emit_u32(local);
}
}
fn count_with_thunks(&self, ir: RawIrRef<'_>) -> usize {
match ir.deref() {
Ir::With { thunks, body, .. } => thunks.len() + self.count_with_thunks(*body),
Ir::TopLevel { thunks, body } => thunks.len() + self.count_with_thunks(*body),
Ir::If { cond, consq, alter } => {
self.count_with_thunks(*cond)
+ self.count_with_thunks(*consq)
+ self.count_with_thunks(*alter)
}
Ir::BinOp { lhs, rhs, .. } => {
self.count_with_thunks(*lhs) + self.count_with_thunks(*rhs)
}
Ir::UnOp { rhs, .. } => self.count_with_thunks(*rhs),
Ir::Call { func, arg, .. } => {
self.count_with_thunks(*func) + self.count_with_thunks(*arg)
}
Ir::Assert {
assertion, expr, ..
} => self.count_with_thunks(*assertion) + self.count_with_thunks(*expr),
Ir::Select { expr, default, .. } => {
self.count_with_thunks(*expr) + default.map_or(0, |d| self.count_with_thunks(d))
}
Ir::HasAttr { lhs, .. } => self.count_with_thunks(*lhs),
Ir::ConcatStrings { parts, .. } => {
parts.iter().map(|p| self.count_with_thunks(*p)).sum()
}
Ir::Path(p) => self.count_with_thunks(*p),
Ir::List { items } => items.iter().map(|item| self.count_with_thunks(*item)).sum(),
Ir::AttrSet { stcs, dyns } => {
stcs.iter()
.map(|(_, &(val, _))| self.count_with_thunks(val))
.sum::<usize>()
+ dyns
.iter()
.map(|&(k, v, _)| self.count_with_thunks(k) + self.count_with_thunks(v))
.sum::<usize>()
}
_ => 0,
}
}
fn collect_all_thunks<'ir>(
&self,
own_thunks: &[(ThunkId, RawIrRef<'ir>)],
body: RawIrRef<'ir>,
) -> Vec<(ThunkId, RawIrRef<'ir>)> {
let mut all = Vec::from(own_thunks);
self.collect_with_thunks_recursive(body, &mut all);
let mut i = 0;
while i < all.len() {
let thunk_body = all[i].1;
self.collect_with_thunks_recursive(thunk_body, &mut all);
i += 1;
}
all
}
fn collect_with_thunks_recursive<'ir>(
&self,
ir: RawIrRef<'ir>,
out: &mut Vec<(ThunkId, RawIrRef<'ir>)>,
) {
match ir.deref() {
Ir::With { thunks, body, .. } => {
for &(id, inner) in thunks.iter() {
out.push((id, inner));
}
self.collect_with_thunks_recursive(*body, out);
}
Ir::TopLevel { thunks, body } => {
for &(id, inner) in thunks.iter() {
out.push((id, inner));
}
self.collect_with_thunks_recursive(*body, out);
}
Ir::If { cond, consq, alter } => {
self.collect_with_thunks_recursive(*cond, out);
self.collect_with_thunks_recursive(*consq, out);
self.collect_with_thunks_recursive(*alter, out);
}
Ir::BinOp { lhs, rhs, .. } => {
self.collect_with_thunks_recursive(*lhs, out);
self.collect_with_thunks_recursive(*rhs, out);
}
Ir::UnOp { rhs, .. } => self.collect_with_thunks_recursive(*rhs, out),
Ir::Call { func, arg, .. } => {
self.collect_with_thunks_recursive(*func, out);
self.collect_with_thunks_recursive(*arg, out);
}
Ir::Assert {
assertion, expr, ..
} => {
self.collect_with_thunks_recursive(*assertion, out);
self.collect_with_thunks_recursive(*expr, out);
}
Ir::Select { expr, default, .. } => {
self.collect_with_thunks_recursive(*expr, out);
if let Some(d) = default {
self.collect_with_thunks_recursive(*d, out);
}
}
Ir::HasAttr { lhs, .. } => self.collect_with_thunks_recursive(*lhs, out),
Ir::ConcatStrings { parts, .. } => {
for p in parts.iter() {
self.collect_with_thunks_recursive(*p, out);
}
}
Ir::Path(p) => self.collect_with_thunks_recursive(*p, out),
Ir::List { items } => {
for item in items.iter() {
self.collect_with_thunks_recursive(*item, out);
}
}
Ir::AttrSet { stcs, dyns } => {
for (_, &(val, _)) in stcs.iter() {
self.collect_with_thunks_recursive(val, out);
}
for &(key, val, _) in dyns.iter() {
self.collect_with_thunks_recursive(key, out);
self.collect_with_thunks_recursive(val, out);
}
}
_ => {}
}
}
fn push_scope(&mut self, has_arg: bool, arg_id: Option<ArgId>, thunk_ids: &[ThunkId]) {
let depth = self.scope_stack.len() as u16;
let thunk_base = if has_arg { 1u32 } else { 0u32 };
let thunk_map = thunk_ids
.iter()
.enumerate()
.map(|(i, &id)| (id, thunk_base + i as u32))
.collect();
self.scope_stack.push(ScopeInfo {
depth,
arg_id,
thunk_map,
});
}
fn pop_scope(&mut self) {
self.scope_stack.pop();
}
fn emit_toplevel(&mut self, ir: RawIrRef<'_>) {
match ir.deref() {
Ir::TopLevel { body, thunks } => {
let with_thunk_count = self.count_with_thunks(*body);
let total_slots = thunks.len() + with_thunk_count;
let all_thunks = self.collect_all_thunks(thunks, *body);
let thunk_ids: Vec<ThunkId> = all_thunks.iter().map(|&(id, _)| id).collect();
self.push_scope(false, None, &thunk_ids);
if total_slots > 0 {
self.emit_op(Op::AllocLocals);
self.emit_u32(total_slots as u32);
}
self.emit_scope_thunks(thunks);
self.emit_expr(*body);
self.emit_op(Op::Return);
self.pop_scope();
}
_ => {
self.push_scope(false, None, &[]);
self.emit_expr(ir);
self.emit_op(Op::Return);
self.pop_scope();
}
}
}
fn emit_toplevel_scoped(&mut self, ir: RawIrRef<'_>) {
// Identical lowering to `emit_toplevel`; kept as a separate entry point
// to mirror `compile_bytecode_scoped`.
self.emit_toplevel(ir);
}
fn emit_scope_thunks(&mut self, thunks: &[(ThunkId, RawIrRef<'_>)]) {
for &(id, inner) in thunks {
let label = format!("e{}", id.0);
let label_idx = self.ctx.intern_string(&label);
let skip_patch = self.emit_jump_placeholder();
let entry_point = self.code.len() as u32;
self.emit_expr(inner);
self.emit_op(Op::Return);
self.patch_jump_target(skip_patch);
self.emit_op(Op::MakeThunk);
self.emit_u32(entry_point);
self.emit_u32(label_idx);
let (_, local_idx) = self.resolve_thunk(id);
self.emit_op(Op::StoreLocal);
self.emit_u32(local_idx);
}
}
fn emit_expr(&mut self, ir: RawIrRef<'_>) {
match ir.deref() {
&Ir::Int(x) => {
let idx = self.ctx.intern_constant(Constant::Int(x));
self.emit_op(Op::PushConst);
self.emit_u32(idx);
}
&Ir::Float(x) => {
let idx = self.ctx.intern_constant(Constant::Float(x.to_bits()));
self.emit_op(Op::PushConst);
self.emit_u32(idx);
}
&Ir::Bool(true) => self.emit_op(Op::PushTrue),
&Ir::Bool(false) => self.emit_op(Op::PushFalse),
Ir::Null => self.emit_op(Op::PushNull),
Ir::Str(s) => {
let idx = self.ctx.intern_string(s.deref());
self.emit_op(Op::PushString);
self.emit_u32(idx);
}
&Ir::Path(p) => {
self.emit_expr(p);
self.emit_op(Op::ResolvePath);
}
&Ir::If { cond, consq, alter } => {
self.emit_expr(cond);
self.emit_op(Op::ForceBool);
self.emit_op(Op::JumpIfFalse);
let else_placeholder = self.emit_i32_placeholder();
let after_jif = self.code.len();
self.emit_expr(consq);
self.emit_op(Op::Jump);
let end_placeholder = self.emit_i32_placeholder();
let after_jump = self.code.len();
let else_offset = (after_jump as i32) - (after_jif as i32);
self.patch_i32(else_placeholder, else_offset);
self.emit_expr(alter);
let end_offset = (self.code.len() as i32) - (after_jump as i32);
self.patch_i32(end_placeholder, end_offset);
}
&Ir::BinOp { lhs, rhs, kind } => {
self.emit_binop(lhs, rhs, kind);
}
&Ir::UnOp { rhs, kind } => match kind {
UnOpKind::Neg => {
self.emit_expr(rhs);
self.emit_op(Op::OpNeg);
}
UnOpKind::Not => {
self.emit_expr(rhs);
self.emit_op(Op::OpNot);
}
},
&Ir::Func {
body,
ref param,
arg,
ref thunks,
} => {
self.emit_func(arg, thunks, param, body);
}
Ir::AttrSet { stcs, dyns } => {
self.emit_attrset(stcs, dyns);
}
Ir::List { items } => {
for &item in items.iter() {
self.emit_expr(item);
}
self.emit_op(Op::MakeList);
self.emit_u32(items.len() as u32);
}
&Ir::Call { func, arg, span } => {
self.emit_expr(func);
self.emit_expr(arg);
let span_id = self.ctx.register_span(span);
self.emit_op(Op::Call);
self.emit_u32(span_id);
}
&Ir::Arg(id) => {
let (layer, local) = self.resolve_arg(id);
self.emit_load(layer, local);
}
&Ir::TopLevel { body, ref thunks } => {
self.emit_toplevel_inner(body, thunks);
}
&Ir::Select {
expr,
ref attrpath,
default,
span,
} => {
self.emit_select(expr, attrpath, default, span);
}
&Ir::Thunk(id) => {
let (layer, local) = self.resolve_thunk(id);
self.emit_load(layer, local);
}
Ir::Builtins => {
self.emit_op(Op::LoadBuiltins);
}
&Ir::Builtin(name) => {
let sym = self.ctx.get_sym(name).to_string();
let idx = self.ctx.intern_string(&sym);
self.emit_op(Op::LoadBuiltin);
self.emit_u32(idx);
}
&Ir::ConcatStrings {
ref parts,
force_string,
} => {
for &part in parts.iter() {
self.emit_expr(part);
}
self.emit_op(Op::ConcatStrings);
self.emit_u16(parts.len() as u16);
self.emit_u8(if force_string { 1 } else { 0 });
}
&Ir::HasAttr { lhs, ref rhs } => {
self.emit_has_attr(lhs, rhs);
}
Ir::Assert {
assertion,
expr,
assertion_raw,
span,
} => {
let raw_idx = self.ctx.intern_string(assertion_raw);
let span_id = self.ctx.register_span(*span);
self.emit_expr(*assertion);
self.emit_expr(*expr);
self.emit_op(Op::Assert);
self.emit_u32(raw_idx);
self.emit_u32(span_id);
}
&Ir::CurPos(span) => {
let span_id = self.ctx.register_span(span);
self.emit_op(Op::MkPos);
self.emit_u32(span_id);
}
&Ir::ReplBinding(name) => {
let sym = self.ctx.get_sym(name).to_string();
let idx = self.ctx.intern_string(&sym);
self.emit_op(Op::LoadReplBinding);
self.emit_u32(idx);
}
&Ir::ScopedImportBinding(name) => {
let sym = self.ctx.get_sym(name).to_string();
let idx = self.ctx.intern_string(&sym);
self.emit_op(Op::LoadScopedBinding);
self.emit_u32(idx);
}
&Ir::With {
namespace,
body,
ref thunks,
} => {
self.emit_with(namespace, body, thunks);
}
&Ir::WithLookup(name) => {
let sym = self.ctx.get_sym(name).to_string();
let idx = self.ctx.intern_string(&sym);
self.emit_op(Op::WithLookup);
self.emit_u32(idx);
}
}
}
fn emit_binop(&mut self, lhs: RawIrRef<'_>, rhs: RawIrRef<'_>, kind: BinOpKind) {
use BinOpKind::*;
match kind {
And => {
self.emit_expr(lhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::JumpIfFalse);
let skip_placeholder = self.emit_i32_placeholder();
let after_jif = self.code.len();
self.emit_expr(rhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::Jump);
let end_placeholder = self.emit_i32_placeholder();
let after_jump = self.code.len();
let false_offset = (after_jump as i32) - (after_jif as i32);
self.patch_i32(skip_placeholder, false_offset);
self.emit_op(Op::PushFalse);
let end_offset = (self.code.len() as i32) - (after_jump as i32);
self.patch_i32(end_placeholder, end_offset);
}
Or => {
self.emit_expr(lhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::JumpIfTrue);
let skip_placeholder = self.emit_i32_placeholder();
let after_jit = self.code.len();
self.emit_expr(rhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::Jump);
let end_placeholder = self.emit_i32_placeholder();
let after_jump = self.code.len();
let true_offset = (after_jump as i32) - (after_jit as i32);
self.patch_i32(skip_placeholder, true_offset);
self.emit_op(Op::PushTrue);
let end_offset = (self.code.len() as i32) - (after_jump as i32);
self.patch_i32(end_placeholder, end_offset);
}
Impl => {
self.emit_expr(lhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::JumpIfFalse);
let skip_placeholder = self.emit_i32_placeholder();
let after_jif = self.code.len();
self.emit_expr(rhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::Jump);
let end_placeholder = self.emit_i32_placeholder();
let after_jump = self.code.len();
let true_offset = (after_jump as i32) - (after_jif as i32);
self.patch_i32(skip_placeholder, true_offset);
self.emit_op(Op::PushTrue);
let end_offset = (self.code.len() as i32) - (after_jump as i32);
self.patch_i32(end_placeholder, end_offset);
}
PipeL => {
self.emit_expr(rhs);
self.emit_expr(lhs);
self.emit_op(Op::CallNoSpan);
}
PipeR => {
self.emit_expr(lhs);
self.emit_expr(rhs);
self.emit_op(Op::CallNoSpan);
}
_ => {
self.emit_expr(lhs);
self.emit_expr(rhs);
self.emit_op(match kind {
Add => Op::OpAdd,
Sub => Op::OpSub,
Mul => Op::OpMul,
Div => Op::OpDiv,
Eq => Op::OpEq,
Neq => Op::OpNeq,
Lt => Op::OpLt,
Gt => Op::OpGt,
Leq => Op::OpLeq,
Geq => Op::OpGeq,
Con => Op::OpConcat,
Upd => Op::OpUpdate,
_ => unreachable!(),
});
}
}
}
fn emit_func(
&mut self,
arg: ArgId,
thunks: &[(ThunkId, RawIrRef<'_>)],
param: &Option<Param<'_>>,
body: RawIrRef<'_>,
) {
let with_thunk_count = self.count_with_thunks(body);
let total_slots = thunks.len() + with_thunk_count;
let all_thunks = self.collect_all_thunks(thunks, body);
let thunk_ids: Vec<ThunkId> = all_thunks.iter().map(|&(id, _)| id).collect();
let skip_patch = self.emit_jump_placeholder();
let entry_point = self.code.len() as u32;
self.push_scope(true, Some(arg), &thunk_ids);
self.emit_scope_thunks(thunks);
self.emit_expr(body);
self.emit_op(Op::Return);
self.pop_scope();
self.patch_jump_target(skip_patch);
if let Some(Param {
required,
optional,
ellipsis,
}) = param
{
self.emit_op(Op::MakePatternClosure);
self.emit_u32(entry_point);
self.emit_u32(total_slots as u32);
self.emit_u16(required.len() as u16);
self.emit_u16(optional.len() as u16);
self.emit_u8(if *ellipsis { 1 } else { 0 });
for &(sym, _) in required.iter() {
let name = self.ctx.get_sym(sym).to_string();
let idx = self.ctx.intern_string(&name);
self.emit_u32(idx);
}
for &(sym, _) in optional.iter() {
let name = self.ctx.get_sym(sym).to_string();
let idx = self.ctx.intern_string(&name);
self.emit_u32(idx);
}
for &(sym, span) in required.iter().chain(optional.iter()) {
let name = self.ctx.get_sym(sym).to_string();
let name_idx = self.ctx.intern_string(&name);
let span_id = self.ctx.register_span(span);
self.emit_u32(name_idx);
self.emit_u32(span_id);
}
} else {
self.emit_op(Op::MakeClosure);
self.emit_u32(entry_point);
self.emit_u32(total_slots as u32);
}
}
fn emit_attrset(
&mut self,
stcs: &crate::ir::HashMap<'_, SymId, (RawIrRef<'_>, TextRange)>,
dyns: &[(RawIrRef<'_>, RawIrRef<'_>, TextRange)],
) {
if stcs.is_empty() && dyns.is_empty() {
self.emit_op(Op::MakeEmptyAttrs);
return;
}
if !dyns.is_empty() {
for (&sym, &(val, _)) in stcs.iter() {
let key = self.ctx.get_sym(sym).to_string();
let idx = self.ctx.intern_string(&key);
self.emit_op(Op::PushString);
self.emit_u32(idx);
self.emit_expr(val);
}
for (_, &(_, span)) in stcs.iter() {
let span_id = self.ctx.register_span(span);
let idx = self.ctx.intern_constant(Constant::Int(span_id as i64));
self.emit_op(Op::PushConst);
self.emit_u32(idx);
}
for &(key, val, span) in dyns.iter() {
self.emit_expr(key);
self.emit_expr(val);
let span_id = self.ctx.register_span(span);
let idx = self.ctx.intern_constant(Constant::Int(span_id as i64));
self.emit_op(Op::PushConst);
self.emit_u32(idx);
}
self.emit_op(Op::MakeAttrsDyn);
self.emit_u32(stcs.len() as u32);
self.emit_u32(dyns.len() as u32);
} else {
for (&sym, &(val, _)) in stcs.iter() {
let key = self.ctx.get_sym(sym).to_string();
let idx = self.ctx.intern_string(&key);
self.emit_op(Op::PushString);
self.emit_u32(idx);
self.emit_expr(val);
}
for (_, &(_, span)) in stcs.iter() {
let span_id = self.ctx.register_span(span);
let idx = self.ctx.intern_constant(Constant::Int(span_id as i64));
self.emit_op(Op::PushConst);
self.emit_u32(idx);
}
self.emit_op(Op::MakeAttrs);
self.emit_u32(stcs.len() as u32);
}
}
fn emit_select(
&mut self,
expr: RawIrRef<'_>,
attrpath: &[Attr<RawIrRef<'_>>],
default: Option<RawIrRef<'_>>,
span: TextRange,
) {
self.emit_expr(expr);
for attr in attrpath.iter() {
match attr {
Attr::Str(sym, _) => {
let key = self.ctx.get_sym(*sym).to_string();
let idx = self.ctx.intern_string(&key);
self.emit_op(Op::PushString);
self.emit_u32(idx);
}
Attr::Dynamic(expr, _) => {
self.emit_expr(*expr);
}
}
}
if let Some(default) = default {
self.emit_expr(default);
let span_id = self.ctx.register_span(span);
self.emit_op(Op::SelectDefault);
self.emit_u16(attrpath.len() as u16);
self.emit_u32(span_id);
} else {
let span_id = self.ctx.register_span(span);
self.emit_op(Op::Select);
self.emit_u16(attrpath.len() as u16);
self.emit_u32(span_id);
}
}
fn emit_has_attr(&mut self, lhs: RawIrRef<'_>, rhs: &[Attr<RawIrRef<'_>>]) {
self.emit_expr(lhs);
for attr in rhs.iter() {
match attr {
Attr::Str(sym, _) => {
let key = self.ctx.get_sym(*sym).to_string();
let idx = self.ctx.intern_string(&key);
self.emit_op(Op::PushString);
self.emit_u32(idx);
}
Attr::Dynamic(expr, _) => {
self.emit_expr(*expr);
}
}
}
self.emit_op(Op::HasAttr);
self.emit_u16(rhs.len() as u16);
}
fn emit_with(
&mut self,
namespace: RawIrRef<'_>,
body: RawIrRef<'_>,
thunks: &[(ThunkId, RawIrRef<'_>)],
) {
self.emit_expr(namespace);
self.emit_op(Op::PushWith);
self.emit_scope_thunks(thunks);
self.emit_expr(body);
self.emit_op(Op::PopWith);
}
fn emit_toplevel_inner(&mut self, body: RawIrRef<'_>, thunks: &[(ThunkId, RawIrRef<'_>)]) {
self.emit_scope_thunks(thunks);
self.emit_expr(body);
}
}
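
The jump-patching protocol in `emit_jump_placeholder`/`patch_jump_target` above stores each i32 offset relative to the end of the 4-byte operand — exactly where the VM's `pc` points after `readI32(code, pc); pc += 4`, so the interpreter can execute `pc += offset` directly. A stdlib-only sketch of that invariant with a toy emitter (opcode byte `0x33` matches `Op::Jump`; the surrounding filler bytes are arbitrary):

```rust
// Emit Op::Jump followed by a 4-byte placeholder; return the placeholder's offset.
fn emit_jump_placeholder(code: &mut Vec<u8>) -> usize {
    code.push(0x33); // Op::Jump
    let placeholder = code.len();
    code.extend_from_slice(&[0u8; 4]);
    placeholder
}

// Back-patch the placeholder so the jump lands at the current end of `code`.
// The offset is relative to `placeholder + 4`, i.e. the byte after the operand.
fn patch_jump_target(code: &mut Vec<u8>, placeholder: usize) {
    let rel = (code.len() as i32) - (placeholder as i32) - 4;
    code[placeholder..placeholder + 4].copy_from_slice(&rel.to_le_bytes());
}

fn main() {
    let mut code = vec![0x01, 0x00, 0x00, 0x00, 0x00]; // arbitrary preceding bytes
    let ph = emit_jump_placeholder(&mut code);
    code.extend_from_slice(&[0x04, 0x05]); // bytes the jump skips over
    patch_jump_target(&mut code, ph);

    // Simulate the VM's Op.Jump handler: read the operand, advance past it, jump.
    let mut pc = ph;
    let offset = i32::from_le_bytes(code[pc..pc + 4].try_into().unwrap());
    pc += 4;
    pc = (pc as i32 + offset) as usize;
    assert_eq!(pc, code.len()); // lands just past the skipped bytes
}
```

The same arithmetic underlies the hand-patched branches in `emit_binop` (`And`/`Or`/`Impl`) and `Ir::If`, which is why each of those sites subtracts the position immediately after the placeholder rather than the placeholder itself.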

@@ -1,309 +1,618 @@
use itertools::Itertools as _;
use std::fmt::{self, Write as _};
use std::ops::Deref;
use std::path::Path;
use rnix::TextRange;
use crate::ir::*;
use crate::value::Symbol;
pub(crate) trait Compile<Ctx: CodegenContext> {
fn compile(&self, ctx: &Ctx) -> String;
macro_rules! code {
($buf:expr, $ctx:expr; $($item:expr)*) => {{
$(
($item).compile($ctx, $buf);
)*
}};
($buf:expr, $fmt:literal, $($arg:tt)*) => {
write!($buf, $fmt, $($arg)*).unwrap()
};
($buf:expr, $fmt:literal) => {
write!($buf, $fmt).unwrap()
};
}
pub(crate) fn compile<const SCOPED: bool>(expr: RawIrRef<'_>, ctx: &impl CodegenContext) -> String {
let mut buf = CodeBuffer::with_capacity(8192);
code!(
&mut buf, ctx;
"((" { if SCOPED { "_s" } else { "" } } ")=>{"
"const _d="
quoted(&ctx.get_current_dir().display().to_string())
",_w=null;"
"return " expr
"})" { if SCOPED { "" } else { "()" } }
);
buf.into_string()
}
struct CodeBuffer {
buf: String,
}
impl fmt::Write for CodeBuffer {
#[inline]
fn write_str(&mut self, s: &str) -> fmt::Result {
self.buf.push_str(s);
Ok(())
}
}
impl CodeBuffer {
#[inline]
fn with_capacity(capacity: usize) -> Self {
Self {
buf: String::with_capacity(capacity),
}
}
#[inline]
fn push_str(&mut self, s: &str) {
self.buf.push_str(s);
}
#[inline]
fn into_string(self) -> String {
self.buf
}
}
struct Quoted<'a>(&'a str);
#[inline]
fn quoted(s: &str) -> Quoted<'_> {
Quoted(s)
}
struct Escaped<'a>(&'a str);
impl<Ctx: CodegenContext> Compile<Ctx> for Escaped<'_> {
fn compile(&self, _ctx: &Ctx, buf: &mut CodeBuffer) {
for c in self.0.chars() {
let _ = match c {
'\\' => buf.write_str("\\\\"),
'"' => buf.write_str("\\\""),
'\n' => buf.write_str("\\n"),
'\r' => buf.write_str("\\r"),
'\t' => buf.write_str("\\t"),
_ => buf.write_char(c),
};
}
}
}
#[inline]
fn escaped(s: &str) -> Escaped<'_> {
Escaped(s)
}
struct Joined<I, F> {
items: I,
sep: &'static str,
write_fn: F,
}
#[inline]
fn joined<Ctx: CodegenContext, I: Iterator, F: Fn(&Ctx, &mut CodeBuffer, I::Item)>(
items: I,
sep: &'static str,
write_fn: F,
) -> Joined<I, F> {
Joined {
items,
sep,
write_fn,
}
}
trait Compile<Ctx: CodegenContext> {
fn compile(&self, ctx: &Ctx, buf: &mut CodeBuffer);
}
impl<Ctx: CodegenContext> Compile<Ctx> for str {
fn compile(&self, _ctx: &Ctx, buf: &mut CodeBuffer) {
buf.push_str(self);
}
}
impl<Ctx: CodegenContext> Compile<Ctx> for usize {
fn compile(&self, _ctx: &Ctx, buf: &mut CodeBuffer) {
let _ = write!(buf, "{self}");
}
}
impl<Ctx: CodegenContext> Compile<Ctx> for bool {
fn compile(&self, _ctx: &Ctx, buf: &mut CodeBuffer) {
let _ = write!(buf, "{self}");
}
}
impl<Ctx: CodegenContext> Compile<Ctx> for Quoted<'_> {
fn compile(&self, ctx: &Ctx, buf: &mut CodeBuffer) {
code!(buf, ctx; "\"" escaped(self.0) "\"")
}
}
impl<Ctx: CodegenContext, I, F> Compile<Ctx> for Joined<I, F>
where
I: IntoIterator + Clone,
F: Fn(&Ctx, &mut CodeBuffer, I::Item) + Clone,
{
fn compile(&self, ctx: &Ctx, buf: &mut CodeBuffer) {
let mut iter = self.items.clone().into_iter();
if let Some(first) = iter.next() {
(self.write_fn)(ctx, buf, first);
for item in iter {
buf.push_str(self.sep);
(self.write_fn)(ctx, buf, item);
}
}
}
}
impl<Ctx: CodegenContext> Compile<Ctx> for rnix::TextRange {
fn compile(&self, ctx: &Ctx, buf: &mut CodeBuffer) {
code!(buf, "{}", ctx.register_span(*self));
}
}
pub(crate) trait CodegenContext {
fn get_ir(&self, id: ExprId) -> &Ir;
fn get_sym(&self, id: SymId) -> &str;
fn get_sym(&self, id: SymId) -> Symbol<'_>;
fn get_current_dir(&self) -> &Path;
fn get_store_dir(&self) -> &str;
fn get_current_source_id(&self) -> usize;
fn register_span(&self, range: rnix::TextRange) -> usize;
}
impl<Ctx: CodegenContext> Compile<Ctx> for Symbol<'_> {
fn compile(&self, ctx: &Ctx, buf: &mut CodeBuffer) {
quoted(self).compile(ctx, buf);
}
}
impl<Ctx: CodegenContext> Compile<Ctx> for RawIrRef<'_> {
fn compile(&self, ctx: &Ctx, buf: &mut CodeBuffer) {
match self.deref() {
Ir::Int(int) => {
code!(buf, "{}n", int);
}
Ir::Float(float) => {
code!(buf, "{}", float);
}
Ir::Bool(bool) => {
code!(buf, "{}", bool);
}
Ir::Null => {
code!(buf, ctx; "null");
}
Ir::Str(s) => {
code!(buf, ctx; quoted(s));
}
Ir::Path(p) => {
// Nix.resolvePath
code!(buf, ctx; "$r(_d," p ")");
}
Ir::If { cond, consq, alter } => {
code!(buf, ctx; "$fb(" cond ")?(" consq "):(" alter ")");
}
&Ir::BinOp { lhs, rhs, kind } => compile_binop(lhs, rhs, kind, ctx, buf),
&Ir::UnOp { rhs, kind } => compile_unop(rhs, kind, ctx, buf),
&Ir::Func {
body,
ref param,
arg,
ref thunks,
} => compile_func(arg, thunks, param, body, ctx, buf),
Ir::AttrSet { stcs, dyns } => compile_attrset(stcs, dyns, ctx, buf),
Ir::List { items } => compile_list(items, ctx, buf),
Ir::Call { func, arg, span } => {
code!(buf, ctx;
"$c("
func
","
arg
","
span
")"
);
}
Ir::Arg(x) => {
code!(buf, "a{}", x.0);
}
&Ir::TopLevel { body, ref thunks } => compile_toplevel(body, thunks, ctx, buf),
&Ir::Select {
expr,
ref attrpath,
default,
span,
} => compile_select(expr, attrpath, default, span, ctx, buf),
Ir::Thunk(ThunkId(id)) => {
code!(buf, "e{}", id);
}
Ir::Builtins => {
// Nix.builtins
code!(buf, ctx; "$b");
}
&Ir::Builtin(name) => {
// Nix.builtins
code!(buf, ctx; "$b.get(" ctx.get_sym(name) ")");
}
&Ir::ConcatStrings {
ref parts,
force_string,
} => compile_concat_strings(parts, force_string, ctx, buf),
&Ir::HasAttr { lhs, ref rhs } => compile_has_attr(lhs, rhs, ctx, buf),
Ir::Assert {
assertion,
expr,
assertion_raw,
span: assert_span,
} => {
// Nix.assert
code!(buf, ctx;
"$a("
assertion
","
expr
","
quoted(assertion_raw)
","
assert_span
")"
);
}
Ir::CurPos(span) => {
// Nix.mkPos
code!(buf, ctx; "$mp(" span ")");
}
&Ir::ReplBinding(name) => {
// Nix.getReplBinding
code!(buf, ctx; "$gb(" ctx.get_sym(name) ")");
}
&Ir::ScopedImportBinding(name) => {
code!(buf, ctx; "_s.get(" ctx.get_sym(name) ")");
}
&Ir::With {
namespace,
body,
ref thunks,
} => compile_with(namespace, body, thunks, ctx, buf),
&Ir::WithLookup(name) => {
// Nix.lookupWith
code!(buf, ctx; "$l(" ctx.get_sym(name) ",_w)");
}
}
}
}
fn compile_binop<'ir>(
lhs: RawIrRef<'ir>,
rhs: RawIrRef<'ir>,
kind: BinOpKind,
ctx: &impl CodegenContext,
buf: &mut CodeBuffer,
) {
use BinOpKind::*;
match kind {
Add | Sub | Mul | Div | Eq | Neq | Lt | Gt | Leq | Geq | Con | Upd => {
let op_func = match kind {
Add => "$oa",
Sub => "$os",
Mul => "$om",
Div => "$od",
Eq => "$oe",
Neq => "!$oe",
Lt => "$ol",
Gt => "$og",
Leq => "!$og",
Geq => "!$ol",
Con => "$oc",
Upd => "$ou",
_ => unreachable!(),
};
code!(
buf, ctx;
op_func "(" lhs "," rhs ")"
);
}
And => {
code!(
buf, ctx;
"$fb(" lhs ")" "&&" "$fb(" rhs ")"
);
}
Or => {
code!(
buf, ctx;
"$fb(" lhs ")" "||" "$fb(" rhs ")"
);
}
Impl => {
code!(
buf, ctx;
"!$fb(" lhs ")" "||" "$fb(" rhs ")"
);
}
PipeL => {
code!(buf, ctx; "$c(" rhs "," lhs ")");
}
PipeR => {
code!(buf, ctx; "$c(" lhs "," rhs ")");
}
}
}
fn compile_unop(
rhs: RawIrRef<'_>,
kind: UnOpKind,
ctx: &impl CodegenContext,
buf: &mut CodeBuffer,
) {
use UnOpKind::*;
match kind {
Neg => {
// 0 - rhs
code!(buf, ctx; "$os(0n," rhs ")");
}
Not => {
code!(buf, ctx; "!$fb(" rhs ")");
}
}
}
fn compile_func<'ir, Ctx: CodegenContext>(
ArgId(id): ArgId,
thunks: &[(ThunkId, RawIrRef<'ir>)],
param: &Option<Param<'ir>>,
body: RawIrRef<'ir>,
ctx: &Ctx,
buf: &mut CodeBuffer,
) {
let has_thunks = !thunks.is_empty();
if let Some(Param {
required,
optional,
ellipsis,
}) = &param
{
code!(buf, "$mf(a{}=>", id);
if has_thunks {
code!(buf, ctx; "{" thunks "return " body "}");
} else {
code!(buf, ctx; "(" body ")");
}
code!(buf, ctx;
",["
joined(required.iter(), ",", |ctx: &Ctx, buf, &(sym, _)| {
code!(buf, ctx; ctx.get_sym(sym));
})
"],["
joined(optional.iter(), ",", |ctx: &Ctx, buf, &(sym, _)| {
code!(buf, ctx; ctx.get_sym(sym));
})
"],new Map(["
joined(required.iter().chain(optional.iter()), ",", |ctx: &Ctx, buf, &(sym, span)| {
code!(buf, ctx; "[" ctx.get_sym(sym) "," span "]");
})
"]),"
ellipsis
")"
);
} else {
code!(buf, "a{}=>", id);
if has_thunks {
code!(buf, ctx; "{" thunks "return " body "}");
} else {
code!(buf, ctx; "(" body ")");
}
}
}
impl<'ir, Ctx: CodegenContext> Compile<Ctx> for [(ThunkId, RawIrRef<'ir>)] {
fn compile(&self, ctx: &Ctx, buf: &mut CodeBuffer) {
if self.is_empty() {
return;
}
code!(
buf, ctx;
"const "
joined(self.iter(), ",", |ctx: &Ctx, buf, &(slot, inner)| {
code!(buf, ctx; "e" slot.0 "=$t(()=>(" inner ")," "'e" slot.0 "')");
})
";"
);
}
}
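Each thunk slot compiles to `$t(() => ...)`: the closure runs at most once and the result is cached for later forces. A rough Rust analogue of that semantics, using `std::cell::OnceCell` (the `Thunk` type below is a sketch, not the runtime's actual representation):

```rust
use std::cell::{Cell, OnceCell};

// A lazily evaluated, memoized value: `force` runs the closure
// on first call and returns the cached result afterwards.
struct Thunk<T, F: FnOnce() -> T> {
    cell: OnceCell<T>,
    init: Cell<Option<F>>,
}

impl<T, F: FnOnce() -> T> Thunk<T, F> {
    fn new(f: F) -> Self {
        Thunk { cell: OnceCell::new(), init: Cell::new(Some(f)) }
    }
    fn force(&self) -> &T {
        self.cell
            .get_or_init(|| (self.init.take().expect("thunk already running"))())
    }
}

fn main() {
    let runs = Cell::new(0);
    let t = Thunk::new(|| {
        runs.set(runs.get() + 1);
        21 * 2
    });
    assert_eq!(*t.force(), 42);
    assert_eq!(*t.force(), 42); // second force hits the cache
    assert_eq!(runs.get(), 1); // the closure ran exactly once
}
```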
fn compile_toplevel<'ir, Ctx: CodegenContext>(
body: RawIrRef<'ir>,
thunks: &[(ThunkId, RawIrRef<'ir>)],
ctx: &Ctx,
buf: &mut CodeBuffer,
) {
if thunks.is_empty() {
body.compile(ctx, buf);
} else {
code!(buf, ctx; "(()=>{" thunks "return " body "})()");
}
}
fn compile_with<'ir>(
namespace: RawIrRef<'ir>,
body: RawIrRef<'ir>,
thunks: &[(ThunkId, RawIrRef<'ir>)],
ctx: &impl CodegenContext,
buf: &mut CodeBuffer,
) {
let has_thunks = !thunks.is_empty();
if has_thunks {
code!(buf, ctx; "((_w)=>{" thunks "return " body "})({env:" namespace ",last:_w})");
} else {
code!(buf, ctx; "((_w)=>(" body "))({env:" namespace ",last:_w})");
}
}
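`compile_with` threads the scope chain through `_w` as `{env, last}` pairs, and `$l` resolves a name by walking from the innermost `with` outward. A standalone sketch of that lookup, with a linked chain of maps standing in for the JS objects (names and value types here are illustrative):

```rust
use std::collections::HashMap;

// Each `with` pushes a new scope that links back to the previous one.
struct WithScope<'a> {
    env: HashMap<&'a str, i64>,
    last: Option<&'a WithScope<'a>>,
}

// Walk the chain innermost-first; the nearest binding wins.
fn lookup(scope: Option<&WithScope>, name: &str) -> Option<i64> {
    let mut cur = scope;
    while let Some(s) = cur {
        if let Some(&v) = s.env.get(name) {
            return Some(v);
        }
        cur = s.last;
    }
    None
}

fn main() {
    let outer = WithScope { env: HashMap::from([("x", 1), ("y", 2)]), last: None };
    let inner = WithScope { env: HashMap::from([("x", 10)]), last: Some(&outer) };
    assert_eq!(lookup(Some(&inner), "x"), Some(10)); // inner shadows outer
    assert_eq!(lookup(Some(&inner), "y"), Some(2));
    assert_eq!(lookup(Some(&inner), "z"), None);
}
```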
fn compile_select<'ir, Ctx: CodegenContext>(
expr: RawIrRef<'ir>,
attrpath: &[Attr<RawIrRef<'ir>>],
default: Option<RawIrRef<'ir>>,
span: TextRange,
ctx: &Ctx,
buf: &mut CodeBuffer,
) {
if let Some(default) = default {
code!(buf, ctx;
"$sd("
expr
",["
joined(attrpath.iter(), ",", |ctx: &Ctx, buf, attr| {
match attr {
Attr::Str(sym, _) => code!(buf, ctx; ctx.get_sym(*sym)),
Attr::Dynamic(expr_id, _) => code!(buf, ctx; *expr_id),
}
})
"],"
default
","
span
")"
);
} else {
code!(buf, ctx;
"$s("
expr
",["
joined(attrpath.iter(), ",", |ctx: &Ctx, buf, attr| {
match attr {
Attr::Str(sym, _) => code!(buf, ctx; ctx.get_sym(*sym)),
Attr::Dynamic(expr, _) => code!(buf, ctx; expr),
}
})
"],"
span
")"
);
}
}
fn compile_attrset<'ir, Ctx: CodegenContext>(
stcs: &HashMap<'ir, SymId, (RawIrRef<'ir>, TextRange)>,
dyns: &[(RawIrRef<'ir>, RawIrRef<'ir>, TextRange)],
ctx: &Ctx,
buf: &mut CodeBuffer,
) {
if !dyns.is_empty() {
code!(buf, ctx;
"$ma(new Map(["
joined(stcs.iter(), ",", |ctx: &Ctx, buf, (&sym, &(val, _))| {
let key = ctx.get_sym(sym);
code!(
buf, ctx;
"[" key "," val "]"
);
})
"]),new Map(["
joined(stcs.iter(), ",", |ctx: &Ctx, buf, (&sym, &(_, span))| {
code!(buf, ctx; "[" ctx.get_sym(sym) "," span "]");
})
"]),{dynKeys:["
joined(dyns.iter(), ",", |ctx: &Ctx, buf, (key, _, _)| {
code!(buf, ctx; key);
})
"],dynVals:["
joined(dyns.iter(), ",", |ctx: &Ctx, buf, (_, val, _)| {
code!(buf, ctx; val);
})
"],dynSpans:["
joined(dyns.iter(), ",", |ctx: &Ctx, buf, (_, _, attr_span)| {
code!(buf, ctx; attr_span);
})
"]})"
);
} else if !stcs.is_empty() {
code!(buf, ctx;
"$ma(new Map(["
joined(stcs.iter(), ",", |ctx: &Ctx, buf, (&sym, &(val, _))| {
let key = ctx.get_sym(sym);
code!(
buf, ctx;
"[" key "," val "]"
);
})
"]),new Map(["
joined(stcs.iter(), ",", |ctx: &Ctx, buf, (&sym, &(_, span))| {
code!(buf, ctx; "[" ctx.get_sym(sym) "," span "]");
})
"]))"
);
} else {
code!(buf, ctx; "$e");
}
}
fn compile_list<Ctx: CodegenContext>(items: &[RawIrRef<'_>], ctx: &Ctx, buf: &mut CodeBuffer) {
code!(buf, ctx;
"["
joined(items.iter(), ",", |ctx: &Ctx, buf, item| {
code!(buf, ctx; item);
})
"]"
);
}
fn compile_concat_strings<Ctx: CodegenContext>(
parts: &[RawIrRef<'_>],
force_string: bool,
ctx: &Ctx,
buf: &mut CodeBuffer,
) {
code!(buf, ctx;
"$cs(["
joined(parts.iter(), ",", |ctx: &Ctx, buf, part| {
code!(buf, ctx; part);
})
"]," force_string ")"
);
}
fn compile_has_attr<'ir, Ctx: CodegenContext>(
lhs: RawIrRef<'ir>,
rhs: &[Attr<RawIrRef<'ir>>],
ctx: &Ctx,
buf: &mut CodeBuffer,
) {
code!(buf, ctx;
"$h("
lhs
",["
joined(rhs.iter(), ",", |ctx: &Ctx, buf, attr| {
match attr {
Attr::Str(sym, _) => code!(buf, ctx; ctx.get_sym(*sym)),
Attr::Dynamic(expr, _) => code!(buf, ctx; expr),
}
})
"])"
);
}

use std::cell::UnsafeCell;
use std::hash::BuildHasher;
use std::path::Path;
use bumpalo::Bump;
use ghost_cell::{GhostCell, GhostToken};
use hashbrown::{DefaultHashBuilder, HashMap, HashSet, HashTable};
use rnix::TextRange;
use string_interner::DefaultStringInterner;
use crate::bytecode::{self, Bytecode, BytecodeContext, Constant};
use crate::codegen::{CodegenContext, compile};
use crate::disassembler::{Disassembler, DisassemblerContext};
use crate::downgrade::*;
use crate::error::{Error, Result, Source};
use crate::ir::{ArgId, Ir, IrKey, IrRef, RawIrRef, SymId, ThunkId, ir_content_eq};
#[cfg(feature = "inspector")]
use crate::runtime::inspector::InspectorServer;
use crate::runtime::{ForceMode, Runtime, RuntimeContext};
use crate::store::{DaemonStore, Store, StoreConfig};
use crate::value::{Symbol, Value};
fn parse_error_span(error: &rnix::ParseError) -> Option<rnix::TextRange> {
use rnix::ParseError::*;
match error {
Unexpected(range)
| UnexpectedExtra(range)
| UnexpectedWanted(_, range, _)
| UnexpectedDoubleBind(range)
| DuplicatedArgs(range, _) => Some(*range),
_ => None,
}
}
fn handle_parse_error<'a>(
errors: impl IntoIterator<Item = &'a rnix::ParseError>,
source: Source,
) -> Option<Box<Error>> {
for err in errors {
if let Some(span) = parse_error_span(err) {
return Some(
Error::parse_error(err.to_string())
.with_source(source)
.with_span(span),
);
}
}
None
}
pub struct Context {
ctx: Ctx,
runtime: Runtime<Ctx>,
#[cfg(feature = "inspector")]
_inspector_server: Option<InspectorServer>,
}
macro_rules! eval_bc {
($name:ident, $mode:expr) => {
pub fn $name(&mut self, source: Source) -> Result<Value> {
tracing::info!("Starting evaluation");
tracing::debug!("Compiling bytecode");
let bytecode = self.ctx.compile_bytecode(source)?;
tracing::debug!("Executing bytecode");
self.runtime.eval_bytecode(bytecode, &mut self.ctx, $mode)
}
};
}
impl Context {
pub fn new() -> Result<Self> {
let ctx = Ctx::new()?;
#[cfg(feature = "inspector")]
let runtime = Runtime::new(Default::default())?;
#[cfg(not(feature = "inspector"))]
let runtime = Runtime::new()?;
let mut context = Self {
ctx,
runtime,
#[cfg(feature = "inspector")]
_inspector_server: None,
};
context.init()?;
Ok(context)
}
#[cfg(feature = "inspector")]
pub fn new_with_inspector(addr: std::net::SocketAddr, wait_for_session: bool) -> Result<Self> {
use crate::runtime::InspectorOptions;
let ctx = Ctx::new()?;
let runtime = Runtime::new(InspectorOptions {
enable: true,
wait: wait_for_session,
})?;
let server = crate::runtime::inspector::InspectorServer::new(addr, "nix-js")
.map_err(|e| Error::internal(e.to_string()))?;
server.register_inspector("nix-js".to_string(), runtime.inspector(), wait_for_session);
let mut context = Self {
ctx,
runtime,
_inspector_server: Some(server),
};
context.init()?;
Ok(context)
}
#[cfg(feature = "inspector")]
pub fn wait_for_inspector_disconnect(&mut self) {
self.runtime.wait_for_inspector_disconnect();
}
fn init(&mut self) -> Result<()> {
const DERIVATION_NIX: &str = include_str!("runtime/corepkgs/derivation.nix");
let source = Source::new_virtual(
"<nix/derivation-internal.nix>".into(),
DERIVATION_NIX.to_string(),
);
let code = self.ctx.compile(source, None)?;
self.runtime.eval(
format!(
"Nix.builtins.set('derivation',({}));Nix.builtins.set('storeDir','{}');{}0n",
code,
self.get_store_dir(),
if std::env::var("NIX_JS_DEBUG_THUNKS").is_ok() {
"Nix.DEBUG_THUNKS.enabled=true;"
} else {
""
}
),
&mut self.ctx,
)?;
Ok(())
}
eval_bc!(eval, ForceMode::Force);
eval_bc!(eval_shallow, ForceMode::ForceShallow);
eval_bc!(eval_deep, ForceMode::ForceDeep);
pub fn eval_repl<'a>(&'a mut self, source: Source, scope: &'a HashSet<SymId>) -> Result<Value> {
tracing::info!("Starting evaluation");
tracing::debug!("Compiling code");
let code = self.ctx.compile(source, Some(Scope::Repl(scope)))?;
tracing::debug!("Executing JavaScript");
self.runtime
.eval(format!("Nix.forceShallow({})", code), &mut self.ctx)
}
pub fn compile(&mut self, source: Source) -> Result<String> {
self.ctx.compile(source, None)
}
pub fn compile_bytecode(&mut self, source: Source) -> Result<Bytecode> {
self.ctx.compile_bytecode(source)
}
pub fn disassemble(&self, bytecode: &Bytecode) -> String {
Disassembler::new(bytecode, &self.ctx).disassemble()
}
pub fn disassemble_colored(&self, bytecode: &Bytecode) -> String {
Disassembler::new(bytecode, &self.ctx).disassemble_colored()
}
pub fn get_store_dir(&self) -> &str {
self.ctx.get_store_dir()
}
pub fn add_binding<'a>(
&'a mut self,
name: &str,
expr: &str,
scope: &'a mut HashSet<SymId>,
) -> Result<Value> {
let source = Source::new_repl(expr.to_string())?;
let code = self.ctx.compile(source, Some(Scope::Repl(scope)))?;
let sym = self.ctx.symbols.get_or_intern(name);
let eval_and_store = format!(
"(()=>{{const __v=Nix.forceShallow({});Nix.setReplBinding(\"{}\",__v);return __v}})()",
code, name
);
scope.insert(sym);
self.runtime.eval(eval_and_store, &mut self.ctx)
}
}
struct Ctx {
symbols: DefaultStringInterner,
global: HashMap<SymId, Ir<'static, RawIrRef<'static>>>,
sources: Vec<Source>,
store: DaemonStore,
spans: UnsafeCell<Vec<(usize, TextRange)>>,
thunk_count: usize,
global_strings: Vec<String>,
global_string_map: HashMap<String, u32>,
global_constants: Vec<Constant>,
global_constant_map: HashMap<Constant, u32>,
synced_strings: usize,
synced_constants: usize,
}
/// Owns the bump allocator and a read-only reference into it.
///
/// # Safety
/// The `ir` field points into `_bump`'s storage. We use `'static` as a sentinel
/// lifetime because the struct owns the backing memory. The `as_ref` method
/// re-binds the lifetime to `&self`, preventing use-after-free.
struct OwnedIr {
_bump: Bump,
ir: RawIrRef<'static>,
}
impl OwnedIr {
fn as_ref(&self) -> RawIrRef<'_> {
self.ir
}
}
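`OwnedIr` bundles the bump arena with a reference into it, using `'static` only as a sentinel and re-binding the lifetime to `&self` on access. A simplified, self-contained illustration of the same ownership pattern — here a heap `String` plays the role of the arena and `str` the role of the IR (this is a sketch of the idea, not the actual `OwnedIr` code):

```rust
// A struct that owns its backing storage and hands out references
// whose lifetime is re-bound to `&self`, so the pointee can never
// outlive the allocation.
struct OwnedStr {
    _storage: Box<String>, // owns the backing memory
    ptr: *const str,       // points into `_storage`
}

impl OwnedStr {
    fn new(s: String) -> Self {
        let storage = Box::new(s);
        let ptr: *const str = storage.as_str();
        OwnedStr { _storage: storage, ptr }
    }
    fn as_ref(&self) -> &str {
        // SAFETY: `ptr` points into `_storage`, which lives exactly as
        // long as `self`, and the heap allocation does not move when
        // `OwnedStr` itself is moved.
        unsafe { &*self.ptr }
    }
}

fn main() {
    let owned = OwnedStr::new("hello".to_string());
    assert_eq!(owned.as_ref(), "hello");
}
```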
impl Ctx {
fn new() -> Result<Self> {
let mut symbols = DefaultStringInterner::new();
let mut global = HashMap::new();
let builtins_sym = symbols.get_or_intern("builtins");
global.insert(builtins_sym, Ir::Builtins);
let free_globals = [
"true",
"false",
"null",
"abort",
"baseNameOf",
"break",
// …
"throw",
"toString",
];
let consts = [
("true", Ir::Bool(true)),
("false", Ir::Bool(false)),
("null", Ir::Null),
];
for name in free_globals {
let name = symbols.get_or_intern(name);
let value = Ir::Builtin(name);
global.insert(name, value);
}
for (name, value) in consts {
let name = symbols.get_or_intern(name);
global.insert(name, value);
}
let config = StoreConfig::from_env();
let store = DaemonStore::connect(&config.daemon_socket)?;
Ok(Self {
symbols,
global,
sources: Vec::new(),
store,
spans: UnsafeCell::new(Vec::new()),
thunk_count: 0,
global_strings: Vec::new(),
global_string_map: HashMap::new(),
global_constants: Vec::new(),
global_constant_map: HashMap::new(),
synced_strings: 0,
synced_constants: 0,
})
}
fn downgrade_ctx<'ctx, 'id, 'ir>(
&'ctx mut self,
bump: &'ir Bump,
token: GhostToken<'id>,
extra_scope: Option<Scope<'ctx>>,
) -> DowngradeCtx<'ctx, 'id, 'ir> {
let source = self.get_current_source();
DowngradeCtx::new(
bump,
token,
&mut self.symbols,
&self.global,
extra_scope,
&mut self.thunk_count,
source,
)
}
fn get_current_dir(&self) -> &Path {
self.sources
.last()
.as_ref()
.expect("current_source is not set")
.get_dir()
}
fn get_current_source(&self) -> Source {
self.sources
.last()
.expect("current_source is not set")
.clone()
}
fn downgrade<'ctx>(
&'ctx mut self,
source: Source,
extra_scope: Option<Scope<'ctx>>,
) -> Result<OwnedIr> {
tracing::debug!("Parsing Nix expression");
self.sources.push(source.clone());
let root = rnix::Root::parse(&source.src);
handle_parse_error(root.errors(), source).map_or(Ok(()), Err)?;
tracing::debug!("Downgrading Nix expression");
let expr = root
.tree()
.expr()
.ok_or_else(|| Error::parse_error("unexpected EOF".into()))?;
let bump = Bump::new();
GhostToken::new(|token| {
let ir = self
.downgrade_ctx(&bump, token, extra_scope)
.downgrade_toplevel(expr)?;
let ir = unsafe { std::mem::transmute::<RawIrRef<'_>, RawIrRef<'static>>(ir) };
Ok(OwnedIr { _bump: bump, ir })
})
}
fn compile<'ctx>(
&'ctx mut self,
source: Source,
extra_scope: Option<Scope<'ctx>>,
) -> Result<String> {
let root = self.downgrade(source, extra_scope)?;
tracing::debug!("Generating JavaScript code");
let code = compile::<false>(root.as_ref(), self);
tracing::debug!("Generated code: {}", &code);
Ok(code)
}
fn compile_scoped(&mut self, source: Source, scope: Vec<String>) -> Result<String> {
let scope = Scope::ScopedImport(
scope
.into_iter()
.map(|k| self.symbols.get_or_intern(k))
.collect(),
);
let root = self.downgrade(source, Some(scope))?;
tracing::debug!("Generating JavaScript code for scoped import");
let code = compile::<true>(root.as_ref(), self);
tracing::debug!("Generated scoped code: {}", &code);
Ok(code)
}
fn compile_bytecode(&mut self, source: Source) -> Result<Bytecode> {
let root = self.downgrade(source, None)?;
tracing::debug!("Generating bytecode");
let bytecode = bytecode::compile_bytecode(root.as_ref(), self);
tracing::debug!("Compiled bytecode: {:#04X?}", bytecode.code);
Ok(bytecode)
}
fn compile_bytecode_scoped(&mut self, source: Source, scope: Vec<String>) -> Result<Bytecode> {
let scope = Scope::ScopedImport(
scope
.into_iter()
.map(|k| self.symbols.get_or_intern(k))
.collect(),
);
let root = self.downgrade(source, Some(scope))?;
tracing::debug!("Generating bytecode for scoped import");
Ok(bytecode::compile_bytecode_scoped(root.as_ref(), self))
}
}
impl CodegenContext for Ctx {
fn get_sym(&self, id: SymId) -> Symbol<'_> {
self.symbols
.resolve(id)
.expect("SymId out of bounds")
.into()
}
fn get_current_dir(&self) -> &std::path::Path {
self.get_current_dir()
}
fn get_current_source_id(&self) -> usize {
self.sources
.len()
.checked_sub(1)
.expect("current_source not set")
}
fn get_store_dir(&self) -> &str {
self.store.get_store_dir()
}
fn register_span(&self, range: rnix::TextRange) -> usize {
let spans = unsafe { &mut *self.spans.get() };
let id = spans.len();
spans.push((self.get_current_source_id(), range));
id
}
}
impl BytecodeContext for Ctx {
fn intern_string(&mut self, s: &str) -> u32 {
if let Some(&idx) = self.global_string_map.get(s) {
return idx;
}
let idx = self.global_strings.len() as u32;
self.global_strings.push(s.to_string());
self.global_string_map.insert(s.to_string(), idx);
idx
}
fn intern_constant(&mut self, c: Constant) -> u32 {
if let Some(&idx) = self.global_constant_map.get(&c) {
return idx;
}
let idx = self.global_constants.len() as u32;
self.global_constants.push(c.clone());
self.global_constant_map.insert(c, idx);
idx
}
fn register_span(&self, range: TextRange) -> u32 {
CodegenContext::register_span(self, range) as u32
}
fn get_sym(&self, id: SymId) -> &str {
self.symbols.resolve(id).expect("SymId out of bounds")
}
fn get_current_dir(&self) -> &Path {
Ctx::get_current_dir(self)
}
}
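`intern_string` and `intern_constant` both follow the classic interning scheme: a `Vec` assigns each distinct value a dense `u32` id, and a `HashMap` deduplicates repeat insertions. A minimal standalone version of the same scheme (the `Interner` type is illustrative):

```rust
use std::collections::HashMap;

// Dense ids for distinct strings; repeat insertions return the same id.
#[derive(Default)]
struct Interner {
    values: Vec<String>,
    map: HashMap<String, u32>,
}

impl Interner {
    fn intern(&mut self, s: &str) -> u32 {
        if let Some(&id) = self.map.get(s) {
            return id; // already interned
        }
        let id = self.values.len() as u32;
        self.values.push(s.to_string());
        self.map.insert(s.to_string(), id);
        id
    }
}

fn main() {
    let mut i = Interner::default();
    let a = i.intern("foo");
    let b = i.intern("bar");
    assert_eq!(i.intern("foo"), a); // same string, same id
    assert_eq!((a, b), (0, 1));
    assert_eq!(i.values.len(), 2); // no duplicate storage
}
```

Keeping `synced_strings`/`synced_constants` as high-water marks then lets the runtime ship only the entries added since the last sync.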
impl RuntimeContext for Ctx {
fn get_current_dir(&self) -> &Path {
self.get_current_dir()
}
fn add_source(&mut self, source: Source) {
self.sources.push(source);
}
fn compile(&mut self, source: Source) -> Result<String> {
self.compile(source, None)
}
fn compile_scoped(&mut self, source: Source, scope: Vec<String>) -> Result<String> {
self.compile_scoped(source, scope)
}
fn compile_bytecode(&mut self, source: Source) -> Result<Bytecode> {
self.compile_bytecode(source)
}
fn compile_bytecode_scoped(&mut self, source: Source, scope: Vec<String>) -> Result<Bytecode> {
self.compile_bytecode_scoped(source, scope)
}
fn get_source(&self, id: usize) -> Source {
self.sources.get(id).expect("source not found").clone()
}
fn get_store(&self) -> &DaemonStore {
&self.store
}
fn get_span(&self, id: usize) -> (usize, TextRange) {
let spans = unsafe { &*self.spans.get() };
spans[id]
}
fn get_unsynced(&mut self) -> (&[String], &[Constant], usize, usize) {
let strings_base = self.synced_strings;
let constants_base = self.synced_constants;
let new_strings = &self.global_strings[strings_base..];
let new_constants = &self.global_constants[constants_base..];
self.synced_strings = self.global_strings.len();
self.synced_constants = self.global_constants.len();
(new_strings, new_constants, strings_base, constants_base)
}
}
impl DisassemblerContext for Ctx {
fn lookup_string(&self, id: u32) -> &str {
self.global_strings
.get(id as usize)
.expect("string not found")
}
fn lookup_constant(&self, id: u32) -> &Constant {
self.global_constants
.get(id as usize)
.expect("constant not found")
}
}
enum Scope<'ctx> {
Global(&'ctx HashMap<SymId, Ir<'static, RawIrRef<'static>>>),
Repl(&'ctx HashSet<SymId>),
ScopedImport(HashSet<SymId>),
Let(HashMap<SymId, ThunkId>),
Param(SymId, ArgId),
}
struct ScopeGuard<'a, 'ctx, 'id, 'ir> {
ctx: &'a mut DowngradeCtx<'ctx, 'id, 'ir>,
}
impl Drop for ScopeGuard<'_, '_, '_, '_> {
fn drop(&mut self) {
self.ctx.scopes.pop();
}
}
impl<'id, 'ir, 'ctx> ScopeGuard<'_, 'ctx, 'id, 'ir> {
fn as_ctx(&mut self) -> &mut DowngradeCtx<'ctx, 'id, 'ir> {
self.ctx
}
}
struct ThunkScope<'id, 'ir> {
bindings: bumpalo::collections::Vec<'ir, (ThunkId, IrRef<'id, 'ir>)>,
cache: HashTable<(IrRef<'id, 'ir>, ThunkId)>,
hasher: DefaultHashBuilder,
}
impl<'id, 'ir> ThunkScope<'id, 'ir> {
fn new_in(bump: &'ir Bump) -> Self {
Self {
bindings: bumpalo::collections::Vec::new_in(bump),
cache: HashTable::new(),
hasher: DefaultHashBuilder::default(),
}
}
fn lookup_cache(&self, key: IrRef<'id, 'ir>, token: &GhostToken<'id>) -> Option<ThunkId> {
let hash = self.hasher.hash_one(IrKey(key, token));
self.cache
.find(hash, |&(ir, _)| ir_content_eq(key, ir, token))
.map(|&(_, id)| id)
}
fn add_binding(&mut self, id: ThunkId, ir: IrRef<'id, 'ir>, token: &GhostToken<'id>) {
self.bindings.push((id, ir));
let hash = self.hasher.hash_one(IrKey(ir, token));
self.cache.insert_unique(hash, (ir, id), |&(ir, _)| {
self.hasher.hash_one(IrKey(ir, token))
});
}
fn extend_bindings(&mut self, iter: impl IntoIterator<Item = (ThunkId, IrRef<'id, 'ir>)>) {
self.bindings.extend(iter);
}
}
struct DowngradeCtx<'ctx, 'id, 'ir> {
bump: &'ir Bump,
token: GhostToken<'id>,
symbols: &'ctx mut DefaultStringInterner,
source: Source,
scopes: Vec<Scope<'ctx>>,
with_scope_count: usize,
arg_count: usize,
thunk_count: &'ctx mut usize,
thunk_scopes: Vec<ThunkScope<'id, 'ir>>,
}
fn should_thunk<'id>(ir: IrRef<'id, '_>, token: &GhostToken<'id>) -> bool {
!matches!(
ir.borrow(token),
Ir::Builtin(_)
| Ir::Builtins
| Ir::Int(_)
| Ir::Float(_)
| Ir::Bool(_)
| Ir::Null
| Ir::Str(_)
| Ir::Thunk(_)
)
}
impl<'ctx, 'id, 'ir> DowngradeCtx<'ctx, 'id, 'ir> {
fn new(
bump: &'ir Bump,
token: GhostToken<'id>,
symbols: &'ctx mut DefaultStringInterner,
global: &'ctx HashMap<SymId, Ir<'static, RawIrRef<'static>>>,
extra_scope: Option<Scope<'ctx>>,
thunk_count: &'ctx mut usize,
source: Source,
) -> Self {
Self {
bump,
token,
symbols,
source,
scopes: std::iter::once(Scope::Global(global))
.chain(extra_scope)
.collect(),
thunk_count,
arg_count: 0,
with_scope_count: 0,
thunk_scopes: vec![ThunkScope::new_in(bump)],
}
}
}
impl<'ctx: 'ir, 'id, 'ir> DowngradeContext<'id, 'ir> for DowngradeCtx<'ctx, 'id, 'ir> {
fn new_expr(&self, expr: Ir<'ir, IrRef<'id, 'ir>>) -> IrRef<'id, 'ir> {
IrRef::new(self.bump.alloc(GhostCell::new(expr)))
}
fn new_arg(&mut self) -> ArgId {
self.arg_count += 1;
ArgId(self.arg_count - 1)
}
fn maybe_thunk(&mut self, ir: IrRef<'id, 'ir>) -> IrRef<'id, 'ir> {
if !should_thunk(ir, &self.token) {
return ir;
}
let cached = self
.thunk_scopes
.last()
.expect("no active cache scope")
.lookup_cache(ir, &self.token);
if let Some(id) = cached {
return IrRef::alloc(self.bump, Ir::Thunk(id));
}
let id = ThunkId(*self.thunk_count);
*self.thunk_count = self.thunk_count.checked_add(1).expect("thunk id overflow");
self.thunk_scopes
.last_mut()
.expect("no active cache scope")
.add_binding(id, ir, &self.token);
IrRef::alloc(self.bump, Ir::Thunk(id))
}
fn new_sym(&mut self, sym: String) -> SymId {
self.symbols.get_or_intern(sym)
}
fn get_sym(&self, id: SymId) -> Symbol<'_> {
self.symbols.resolve(id).expect("symbol not found").into()
}
fn lookup(&self, sym: SymId, span: TextRange) -> Result<IrRef<'id, 'ir>> {
for scope in self.scopes.iter().rev() {
match scope {
&Scope::Global(global_scope) => {
if let Some(expr) = global_scope.get(&sym) {
let ir = match expr {
Ir::Builtins => Ir::Builtins,
Ir::Builtin(s) => Ir::Builtin(*s),
Ir::Bool(b) => Ir::Bool(*b),
Ir::Null => Ir::Null,
_ => unreachable!("globals should only contain leaf IR nodes"),
};
return Ok(self.new_expr(ir));
}
}
&Scope::Repl(repl_bindings) => {
if repl_bindings.contains(&sym) {
return Ok(self.new_expr(Ir::ReplBinding(sym)));
}
}
Scope::ScopedImport(scoped_bindings) => {
if scoped_bindings.contains(&sym) {
return Ok(self.new_expr(Ir::ScopedImportBinding(sym)));
}
}
Scope::Let(let_scope) => {
if let Some(&expr) = let_scope.get(&sym) {
return Ok(self.new_expr(Ir::Thunk(expr)));
}
}
&Scope::Param(param_sym, id) => {
if param_sym == sym {
return Ok(self.new_expr(Ir::Arg(id)));
}
}
}
}
if self.with_scope_count > 0 {
Ok(self.new_expr(Ir::WithLookup(sym)))
} else {
Err(Error::downgrade_error(
format!("'{}' not found", self.get_sym(sym)),
self.get_current_source(),
span,
))
}
}
fn get_current_source(&self) -> Source {
self.source.clone()
}
fn with_let_scope<F, R>(&mut self, keys: &[SymId], f: F) -> Result<R>
where
F: FnOnce(&mut Self) -> Result<(bumpalo::collections::Vec<'ir, IrRef<'id, 'ir>>, R)>,
{
let base = *self.thunk_count;
*self.thunk_count = self
.thunk_count
.checked_add(keys.len())
.expect("thunk id overflow");
let iter = keys.iter().enumerate().map(|(offset, &key)| {
(
key,
ThunkId(unsafe { base.checked_add(offset).unwrap_unchecked() }),
)
});
self.scopes.push(Scope::Let(iter.collect()));
let (vals, ret) = {
let mut guard = ScopeGuard { ctx: self };
f(guard.as_ctx())?
};
assert_eq!(keys.len(), vals.len());
let scope = self.thunk_scopes.last_mut().expect("no active thunk scope");
scope.extend_bindings((base..base + keys.len()).map(ThunkId).zip(vals));
Ok(ret)
}
fn with_param_scope<F, R>(&mut self, param: SymId, arg: ArgId, f: F) -> R
where
F: FnOnce(&mut Self) -> R,
{
self.scopes.push(Scope::Param(param, arg));
let mut guard = ScopeGuard { ctx: self };
f(guard.as_ctx())
}
fn with_with_scope<F, R>(&mut self, f: F) -> R
where
F: FnOnce(&mut Self) -> R,
{
self.with_scope_count += 1;
let ret = f(self);
self.with_scope_count -= 1;
ret
}
fn with_thunk_scope<F, R>(
&mut self,
f: F,
) -> (
R,
bumpalo::collections::Vec<'ir, (ThunkId, IrRef<'id, 'ir>)>,
)
where
F: FnOnce(&mut Self) -> R,
{
self.thunk_scopes.push(ThunkScope::new_in(self.bump));
let ret = f(self);
(
ret,
self.thunk_scopes
.pop()
.expect("no thunk scope left")
.bindings,
)
}
fn bump(&self) -> &'ir bumpalo::Bump {
self.bump
}
}
impl<'id, 'ir, 'ctx: 'ir> DowngradeCtx<'ctx, 'id, 'ir> {
fn downgrade_toplevel(mut self, root: rnix::ast::Expr) -> Result<RawIrRef<'ir>> {
let body = root.downgrade(&mut self)?;
let thunks = self
.thunk_scopes
.pop()
.expect("no thunk scope left")
.bindings;
let ir = IrRef::alloc(self.bump, Ir::TopLevel { body, thunks });
Ok(ir.freeze(self.token))
}
}
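The `maybe_thunk` path above dedupes structurally equal IR nodes by content hash before allocating a fresh `ThunkId`, so identical subexpressions in one scope share a single thunk. A minimal sketch of the same idea, using a plain `std::collections::HashMap` instead of the crate's `HashTable` + `GhostCell` machinery (`MiniIr` and `ThunkCache` are illustrative stand-ins, not the crate's types):

```rust
use std::collections::HashMap;

// Illustrative stand-in for the crate's IR type.
#[derive(Clone, PartialEq, Eq, Hash, Debug)]
enum MiniIr {
    Int(i64),
    Add(Box<MiniIr>, Box<MiniIr>),
}

#[derive(Default)]
struct ThunkCache {
    next_id: usize,
    // Content-keyed cache: structurally equal IR maps to the same thunk id.
    by_content: HashMap<MiniIr, usize>,
}

impl ThunkCache {
    fn maybe_thunk(&mut self, ir: &MiniIr) -> usize {
        // The real code skips leaves entirely (see `should_thunk`);
        // here we always thunk for simplicity.
        if let Some(&id) = self.by_content.get(ir) {
            return id; // reuse: same content, same thunk
        }
        let id = self.next_id;
        self.next_id += 1;
        self.by_content.insert(ir.clone(), id);
        id
    }
}
```

With this shape, an expression like `x + 1` occurring twice in the same scope is bound to one thunk and evaluated at most once.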


@@ -1,257 +0,0 @@
use hashbrown::HashMap;
use hashbrown::HashSet;
use petgraph::Directed;
use petgraph::Graph;
use petgraph::graph::NodeIndex;
use crate::codegen::CodegenContext;
use crate::error::{Error, Result};
use crate::ir::{ArgId, Downgrade, DowngradeContext, ExprId, Ir, SymId, ToIr};
use super::{Ctx, SccInfo};
struct DependencyTracker {
expr_to_node: HashMap<ExprId, NodeIndex>,
graph: Graph<ExprId, (), Directed>,
current_binding: Option<ExprId>,
let_scope_exprs: HashSet<ExprId>,
}
enum Scope<'ctx> {
Global(&'ctx HashMap<SymId, ExprId>),
Let(HashMap<SymId, ExprId>),
Param(SymId, ExprId),
With(ExprId),
}
struct ScopeGuard<'a, 'ctx> {
ctx: &'a mut DowngradeCtx<'ctx>,
}
impl<'a, 'ctx> Drop for ScopeGuard<'a, 'ctx> {
fn drop(&mut self) {
self.ctx.scopes.pop();
}
}
impl<'a, 'ctx> ScopeGuard<'a, 'ctx> {
fn as_ctx(&mut self) -> &mut DowngradeCtx<'ctx> {
self.ctx
}
}
pub struct DowngradeCtx<'ctx> {
ctx: &'ctx mut Ctx,
irs: Vec<Option<Ir>>,
scopes: Vec<Scope<'ctx>>,
arg_id: usize,
dep_tracker_stack: Vec<DependencyTracker>,
}
impl<'ctx> DowngradeCtx<'ctx> {
pub fn new(ctx: &'ctx mut Ctx, global: &'ctx HashMap<SymId, ExprId>) -> Self {
Self {
scopes: vec![Scope::Global(global)],
irs: vec![],
arg_id: 0,
dep_tracker_stack: Vec::new(),
ctx,
}
}
}
impl DowngradeContext for DowngradeCtx<'_> {
fn new_expr(&mut self, expr: Ir) -> ExprId {
self.irs.push(Some(expr));
ExprId(self.ctx.irs.len() + self.irs.len() - 1)
}
fn new_arg(&mut self) -> ExprId {
self.irs.push(Some(Ir::Arg(ArgId(self.arg_id))));
self.arg_id += 1;
ExprId(self.ctx.irs.len() + self.irs.len() - 1)
}
fn new_sym(&mut self, sym: String) -> SymId {
self.ctx.symbols.get_or_intern(sym)
}
fn get_sym(&self, id: SymId) -> &str {
self.ctx.get_sym(id)
}
fn lookup(&mut self, sym: SymId) -> Result<ExprId> {
for scope in self.scopes.iter().rev() {
match scope {
&Scope::Global(global_scope) => {
if let Some(&expr) = global_scope.get(&sym) {
return Ok(expr);
}
}
Scope::Let(let_scope) => {
if let Some(&expr) = let_scope.get(&sym) {
if let Some(tracker) = self.dep_tracker_stack.last_mut()
&& let Some(current) = tracker.current_binding
&& tracker.let_scope_exprs.contains(&current)
&& tracker.let_scope_exprs.contains(&expr)
{
let from = tracker.expr_to_node[&current];
let to = tracker.expr_to_node[&expr];
tracker.graph.add_edge(from, to, ());
}
return Ok(self.new_expr(Ir::ExprRef(expr)));
}
}
&Scope::Param(param_sym, expr) => {
if param_sym == sym {
return Ok(expr);
}
}
&Scope::With(_) => (),
}
}
let namespaces: Vec<ExprId> = self
.scopes
.iter()
.filter_map(|scope| {
if let &Scope::With(namespace) = scope {
Some(namespace)
} else {
None
}
})
.collect();
let mut result = None;
for namespace in namespaces {
use crate::ir::{Attr, Select};
let select = Select {
expr: namespace,
attrpath: vec![Attr::Str(sym)],
default: result, // Link to outer With or None
};
result = Some(self.new_expr(select.to_ir()));
}
result.ok_or_else(|| Error::downgrade_error(format!("'{}' not found", self.get_sym(sym))))
}
fn extract_expr(&mut self, id: ExprId) -> Ir {
let local_id = id.0 - self.ctx.irs.len();
self.irs
.get_mut(local_id)
.expect("ExprId out of bounds")
.take()
.expect("extract_expr called on an already extracted expr")
}
fn replace_expr(&mut self, id: ExprId, expr: Ir) {
let local_id = id.0 - self.ctx.irs.len();
let _ = self
.irs
.get_mut(local_id)
.expect("ExprId out of bounds")
.insert(expr);
}
#[allow(refining_impl_trait)]
fn reserve_slots(&mut self, slots: usize) -> impl Iterator<Item = ExprId> + Clone + use<> {
let start = self.ctx.irs.len() + self.irs.len();
self.irs.extend(std::iter::repeat_with(|| None).take(slots));
(start..start + slots).map(ExprId)
}
fn downgrade(mut self, root: rnix::ast::Expr) -> Result<ExprId> {
let root = root.downgrade(&mut self)?;
self.ctx
.irs
.extend(self.irs.into_iter().map(Option::unwrap));
Ok(root)
}
fn with_let_scope<F, R>(&mut self, bindings: HashMap<SymId, ExprId>, f: F) -> R
where
F: FnOnce(&mut Self) -> R,
{
self.scopes.push(Scope::Let(bindings));
let mut guard = ScopeGuard { ctx: self };
f(guard.as_ctx())
}
fn with_param_scope<F, R>(&mut self, param: SymId, arg: ExprId, f: F) -> R
where
F: FnOnce(&mut Self) -> R,
{
self.scopes.push(Scope::Param(param, arg));
let mut guard = ScopeGuard { ctx: self };
f(guard.as_ctx())
}
fn with_with_scope<F, R>(&mut self, namespace: ExprId, f: F) -> R
where
F: FnOnce(&mut Self) -> R,
{
self.scopes.push(Scope::With(namespace));
let mut guard = ScopeGuard { ctx: self };
f(guard.as_ctx())
}
fn get_current_dir(&self) -> std::path::PathBuf {
self.ctx.get_current_dir()
}
fn push_dep_tracker(&mut self, slots: &[ExprId]) {
let mut graph = Graph::new();
let mut expr_to_node = HashMap::new();
let mut let_scope_exprs = HashSet::new();
for &expr in slots.iter() {
let node = graph.add_node(expr);
expr_to_node.insert(expr, node);
let_scope_exprs.insert(expr);
}
self.dep_tracker_stack.push(DependencyTracker {
expr_to_node,
graph,
current_binding: None,
let_scope_exprs,
});
}
fn set_current_binding(&mut self, expr: Option<ExprId>) {
if let Some(tracker) = self.dep_tracker_stack.last_mut() {
tracker.current_binding = expr;
}
}
fn pop_dep_tracker(&mut self) -> Result<SccInfo> {
let tracker = self
.dep_tracker_stack
.pop()
.expect("pop_dep_tracker without active tracker");
use petgraph::algo::kosaraju_scc;
let sccs = kosaraju_scc(&tracker.graph);
let mut sccs_topo = Vec::new();
for scc_nodes in sccs.iter() {
let mut scc_exprs = Vec::new();
let mut is_recursive = scc_nodes.len() > 1;
for &node_idx in scc_nodes {
let expr = tracker.graph[node_idx];
scc_exprs.push(expr);
if !is_recursive && tracker.graph.contains_edge(node_idx, node_idx) {
is_recursive = true;
}
}
sccs_topo.push((scc_exprs, is_recursive));
}
Ok(SccInfo { sccs: sccs_topo })
}
}
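The dependency tracker above feeds `petgraph`'s `kosaraju_scc` to decide which `let` bindings are mutually recursive: an SCC with more than one node, or a node with a self-edge. A dependency-free sketch of the same classification using plain reachability over an adjacency list (a simplification for illustration, not the crate's code):

```rust
// adj: adjacency list; edge i -> j means binding i refers to binding j.
// A binding is "recursive" if it can reach itself through its references.
fn recursive_bindings(adj: &[Vec<usize>]) -> Vec<bool> {
    let n = adj.len();
    let mut result = vec![false; n];
    for start in 0..n {
        // Iterative DFS from `start`; recursive iff we come back to `start`.
        let mut seen = vec![false; n];
        let mut stack: Vec<usize> = adj[start].clone();
        while let Some(node) = stack.pop() {
            if node == start {
                result[start] = true;
                break;
            }
            if !seen[node] {
                seen[node] = true;
                stack.extend(adj[node].iter().copied());
            }
        }
    }
    result
}
```

For `let a = b; b = a; c = 1;` the edges are 0→1 and 1→0 with none from 2, so `a` and `b` are flagged recursive and `c` is not — the same partition an SCC pass would produce.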


@@ -1,41 +0,0 @@
use std::ops::{Deref, DerefMut};
use std::path::PathBuf;
use crate::error::{Error, Result};
pub trait PathStackProvider {
fn path_stack(&mut self) -> &mut Vec<PathBuf>;
}
pub struct PathDropGuard<'ctx, Ctx: PathStackProvider> {
ctx: &'ctx mut Ctx,
}
impl<'ctx, Ctx: PathStackProvider> PathDropGuard<'ctx, Ctx> {
pub fn new(path: PathBuf, ctx: &'ctx mut Ctx) -> Self {
ctx.path_stack().push(path);
Self { ctx }
}
pub fn new_cwd(ctx: &'ctx mut Ctx) -> Result<Self> {
let cwd = std::env::current_dir()
.map_err(|err| Error::downgrade_error(format!("cannot get cwd: {err}")))?;
let virtual_file = cwd.join("__eval__.nix");
ctx.path_stack().push(virtual_file);
Ok(Self { ctx })
}
pub fn as_ctx(&mut self) -> &mut Ctx {
self.ctx
}
}
impl<Ctx: PathStackProvider> Deref for PathDropGuard<'_, Ctx> {
type Target = Ctx;
fn deref(&self) -> &Self::Target {
self.ctx
}
}
impl<Ctx: PathStackProvider> DerefMut for PathDropGuard<'_, Ctx> {
fn deref_mut(&mut self) -> &mut Self::Target {
self.ctx
}
}
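`PathDropGuard` is the classic RAII pattern: push on construction, pop in `Drop`, so the path stack stays balanced on every exit path, including `?` early returns. A self-contained sketch of the same shape (simplified, illustrative types only):

```rust
struct PathStack {
    stack: Vec<String>,
}

struct Guard<'a> {
    stack: &'a mut PathStack,
}

impl<'a> Guard<'a> {
    fn push(stack: &'a mut PathStack, path: String) -> Self {
        stack.stack.push(path);
        Guard { stack }
    }
}

impl Drop for Guard<'_> {
    fn drop(&mut self) {
        // Pop unconditionally: runs on every exit path, including panic unwind.
        self.stack.stack.pop();
    }
}

fn with_path(stack: &mut PathStack) -> usize {
    let g = Guard::push(stack, "/tmp/a.nix".into());
    g.stack.stack.len() // depth observed while the guard is alive
}
```

After `with_path` returns, the guard has already popped its entry, so the stack is back to its previous depth.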

nix-js/src/derivation.rs Normal file

@@ -0,0 +1,142 @@
use std::collections::{BTreeMap, BTreeSet};
pub struct OutputInfo {
pub path: String,
pub hash_algo: String,
pub hash: String,
}
pub struct DerivationData {
pub name: String,
pub outputs: BTreeMap<String, OutputInfo>,
pub input_drvs: BTreeMap<String, BTreeSet<String>>,
pub input_srcs: BTreeSet<String>,
pub platform: String,
pub builder: String,
pub args: Vec<String>,
pub env: BTreeMap<String, String>,
}
fn escape_string(s: &str) -> String {
let mut result = String::with_capacity(s.len() + 2);
result.push('"');
for c in s.chars() {
match c {
'"' => result.push_str("\\\""),
'\\' => result.push_str("\\\\"),
'\n' => result.push_str("\\n"),
'\r' => result.push_str("\\r"),
'\t' => result.push_str("\\t"),
_ => result.push(c),
}
}
result.push('"');
result
}
fn quote_string(s: &str) -> String {
format!("\"{}\"", s)
}
impl DerivationData {
pub fn generate_aterm(&self) -> String {
let mut output_entries = Vec::new();
for (name, info) in &self.outputs {
output_entries.push(format!(
"({},{},{},{})",
quote_string(name),
quote_string(&info.path),
quote_string(&info.hash_algo),
quote_string(&info.hash),
));
}
let outputs = output_entries.join(",");
let mut input_drv_entries = Vec::new();
for (drv_path, output_names) in &self.input_drvs {
let sorted_outs: Vec<String> = output_names.iter().map(|s| quote_string(s)).collect();
let out_list = format!("[{}]", sorted_outs.join(","));
input_drv_entries.push(format!("({},{})", quote_string(drv_path), out_list));
}
let input_drvs = input_drv_entries.join(",");
let input_srcs: Vec<String> = self.input_srcs.iter().map(|s| quote_string(s)).collect();
let input_srcs = input_srcs.join(",");
let args: Vec<String> = self.args.iter().map(|s| escape_string(s)).collect();
let args = args.join(",");
let mut env_entries: Vec<String> = Vec::new();
for (k, v) in &self.env {
env_entries.push(format!("({},{})", escape_string(k), escape_string(v)));
}
format!(
"Derive([{}],[{}],[{}],{},{},[{}],[{}])",
outputs,
input_drvs,
input_srcs,
quote_string(&self.platform),
escape_string(&self.builder),
args,
env_entries.join(","),
)
}
pub fn generate_aterm_modulo(&self, input_drv_hashes: &BTreeMap<String, String>) -> String {
let mut output_entries = Vec::new();
for (name, info) in &self.outputs {
output_entries.push(format!(
"({},{},{},{})",
quote_string(name),
quote_string(&info.path),
quote_string(&info.hash_algo),
quote_string(&info.hash),
));
}
let outputs = output_entries.join(",");
let mut input_drv_entries = Vec::new();
for (drv_hash, outputs_csv) in input_drv_hashes {
let mut sorted_outs: Vec<&str> = outputs_csv.split(',').collect();
sorted_outs.sort();
let out_list: Vec<String> = sorted_outs.iter().map(|s| quote_string(s)).collect();
let out_list = format!("[{}]", out_list.join(","));
input_drv_entries.push(format!("({},{})", quote_string(drv_hash), out_list));
}
let input_drvs = input_drv_entries.join(",");
let input_srcs: Vec<String> = self.input_srcs.iter().map(|s| quote_string(s)).collect();
let input_srcs = input_srcs.join(",");
let args: Vec<String> = self.args.iter().map(|s| escape_string(s)).collect();
let args = args.join(",");
let mut env_entries: Vec<String> = Vec::new();
for (k, v) in &self.env {
env_entries.push(format!("({},{})", escape_string(k), escape_string(v)));
}
format!(
"Derive([{}],[{}],[{}],{},{},[{}],[{}])",
outputs,
input_drvs,
input_srcs,
quote_string(&self.platform),
escape_string(&self.builder),
args,
env_entries.join(","),
)
}
pub fn collect_references(&self) -> Vec<String> {
let mut refs = BTreeSet::new();
for src in &self.input_srcs {
refs.insert(src.clone());
}
for drv_path in self.input_drvs.keys() {
refs.insert(drv_path.clone());
}
refs.into_iter().collect()
}
}
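The ATerm text emitted by `generate_aterm` escapes exactly five characters — `"`, `\`, newline, carriage return, and tab — and passes everything else through verbatim. A standalone copy of that escaping rule (mirroring `escape_string` above) for experimenting with the output format:

```rust
// Escape a string for embedding in a .drv ATerm, matching `escape_string`.
fn escape_aterm(s: &str) -> String {
    let mut out = String::with_capacity(s.len() + 2);
    out.push('"');
    for c in s.chars() {
        match c {
            '"' => out.push_str("\\\""),
            '\\' => out.push_str("\\\\"),
            '\n' => out.push_str("\\n"),
            '\r' => out.push_str("\\r"),
            '\t' => out.push_str("\\t"),
            _ => out.push(c), // everything else is emitted verbatim
        }
    }
    out.push('"');
    out
}
```

Note the asymmetry in the caller above: builder, args, and env values go through the escaper, while output names and store paths use the bare `quote_string`, relying on store paths never containing the escaped characters.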

nix-js/src/disassembler.rs Normal file

@@ -0,0 +1,354 @@
use std::fmt::Write;
use colored::Colorize;
use num_enum::TryFromPrimitive;
use crate::bytecode::{Bytecode, Constant, Op};
pub(crate) trait DisassemblerContext {
fn lookup_string(&self, id: u32) -> &str;
fn lookup_constant(&self, id: u32) -> &Constant;
}
pub(crate) struct Disassembler<'a, Ctx> {
code: &'a [u8],
ctx: &'a Ctx,
pos: usize,
}
impl<'a, Ctx: DisassemblerContext> Disassembler<'a, Ctx> {
pub fn new(bytecode: &'a Bytecode, ctx: &'a Ctx) -> Self {
Self {
code: &bytecode.code,
ctx,
pos: 0,
}
}
fn read_u8(&mut self) -> u8 {
let b = self.code[self.pos];
self.pos += 1;
b
}
fn read_u16(&mut self) -> u16 {
let bytes = self.code[self.pos..self.pos + 2]
.try_into()
.expect("not enough bytes");
self.pos += 2;
u16::from_le_bytes(bytes)
}
fn read_u32(&mut self) -> u32 {
let bytes = self.code[self.pos..self.pos + 4]
.try_into()
.expect("not enough bytes");
self.pos += 4;
u32::from_le_bytes(bytes)
}
fn read_i32(&mut self) -> i32 {
let bytes = self.code[self.pos..self.pos + 4]
.try_into()
.expect("not enough bytes");
self.pos += 4;
i32::from_le_bytes(bytes)
}
pub fn disassemble(&mut self) -> String {
self.disassemble_impl(false)
}
pub fn disassemble_colored(&mut self) -> String {
self.disassemble_impl(true)
}
fn disassemble_impl(&mut self, color: bool) -> String {
let mut out = String::new();
if color {
let _ = writeln!(out, "{}", "=== Bytecode Disassembly ===".bold().white());
let _ = writeln!(
out,
"{} {}",
"Length:".white(),
format!("{} bytes", self.code.len()).cyan()
);
} else {
let _ = writeln!(out, "=== Bytecode Disassembly ===");
let _ = writeln!(out, "Length: {} bytes", self.code.len());
}
while self.pos < self.code.len() {
let start_pos = self.pos;
let op_byte = self.read_u8();
let (mnemonic, args) = self.decode_instruction(op_byte, start_pos);
let bytes_slice = &self.code[start_pos + 1..self.pos];
for (i, chunk) in bytes_slice.chunks(4).enumerate() {
let bytes_str = {
let mut temp = String::new();
if i == 0 {
let _ = write!(&mut temp, "{:02x}", self.code[start_pos]);
} else {
let _ = write!(&mut temp, " ");
}
for b in chunk.iter() {
let _ = write!(&mut temp, " {:02x}", b);
}
temp
};
if i == 0 {
if color {
let sep = if args.is_empty() { "" } else { " " };
let _ = writeln!(
out,
"{} {:<14} | {}{}{}",
format!("{:04x}", start_pos).dimmed(),
bytes_str.green(),
mnemonic.yellow().bold(),
sep,
args.cyan()
);
} else {
let op_str = if args.is_empty() {
mnemonic.to_string()
} else {
format!("{} {}", mnemonic, args)
};
let _ = writeln!(out, "{:04x} {:<14} | {}", start_pos, bytes_str, op_str);
}
} else {
let extra_width = start_pos.ilog2() >> 4;
if color {
let _ = write!(out, " ");
for _ in 0..extra_width {
let _ = write!(out, " ");
}
let _ = writeln!(out, " {:<14} |", bytes_str.green());
} else {
let _ = write!(out, " ");
for _ in 0..extra_width {
let _ = write!(out, " ");
}
let _ = writeln!(out, " {:<14} |", bytes_str);
}
}
}
}
out
}
fn decode_instruction(&mut self, op_byte: u8, current_pc: usize) -> (&'static str, String) {
let op = Op::try_from_primitive(op_byte).expect("invalid op code");
match op {
Op::PushConst => {
let idx = self.read_u32();
let val = self.ctx.lookup_constant(idx);
let val_str = match val {
Constant::Int(i) => format!("Int({})", i),
Constant::Float(f) => format!("Float(bits: {})", f),
};
("PushConst", format!("@{} ({})", idx, val_str))
}
Op::PushString => {
let idx = self.read_u32();
let s = self.ctx.lookup_string(idx);
let len = s.len();
let mut s_fmt = format!("{:?}", s);
if s_fmt.len() > 60 {
s_fmt.truncate(57);
#[allow(clippy::unwrap_used)]
write!(s_fmt, "...\" (total {len} bytes)").unwrap();
}
("PushString", format!("@{} {}", idx, s_fmt))
}
Op::PushNull => ("PushNull", String::new()),
Op::PushTrue => ("PushTrue", String::new()),
Op::PushFalse => ("PushFalse", String::new()),
Op::LoadLocal => {
let idx = self.read_u32();
("LoadLocal", format!("[{}]", idx))
}
Op::LoadOuter => {
let depth = self.read_u8();
let idx = self.read_u32();
("LoadOuter", format!("depth={} [{}]", depth, idx))
}
Op::StoreLocal => {
let idx = self.read_u32();
("StoreLocal", format!("[{}]", idx))
}
Op::AllocLocals => {
let count = self.read_u32();
("AllocLocals", format!("count={}", count))
}
Op::MakeThunk => {
let offset = self.read_u32();
let label_idx = self.read_u32();
let label = self.ctx.lookup_string(label_idx);
("MakeThunk", format!("-> {:04x} label={}", offset, label))
}
Op::MakeClosure => {
let offset = self.read_u32();
let slots = self.read_u32();
("MakeClosure", format!("-> {:04x} slots={}", offset, slots))
}
Op::MakePatternClosure => {
let offset = self.read_u32();
let slots = self.read_u32();
let req_count = self.read_u16();
let opt_count = self.read_u16();
let ellipsis = self.read_u8() != 0;
let mut arg_str = format!(
"-> {:04x} slots={} req={} opt={} ...={}",
offset, slots, req_count, opt_count, ellipsis
);
arg_str.push_str(" Args=[");
for _ in 0..req_count {
let idx = self.read_u32();
arg_str.push_str(&format!("Req({}) ", self.ctx.lookup_string(idx)));
}
for _ in 0..opt_count {
let idx = self.read_u32();
arg_str.push_str(&format!("Opt({}) ", self.ctx.lookup_string(idx)));
}
let total_args = req_count + opt_count;
for _ in 0..total_args {
let _name_idx = self.read_u32();
let _span_id = self.read_u32();
}
arg_str.push(']');
("MakePatternClosure", arg_str)
}
Op::Call => {
let span_id = self.read_u32();
("Call", format!("span={}", span_id))
}
Op::CallNoSpan => ("CallNoSpan", String::new()),
Op::MakeAttrs => {
let count = self.read_u32();
("MakeAttrs", format!("size={}", count))
}
Op::MakeAttrsDyn => {
let static_count = self.read_u32();
let dyn_count = self.read_u32();
(
"MakeAttrsDyn",
format!("static={} dyn={}", static_count, dyn_count),
)
}
Op::MakeEmptyAttrs => ("MakeEmptyAttrs", String::new()),
Op::Select => {
let path_len = self.read_u16();
let span_id = self.read_u32();
("Select", format!("path_len={} span={}", path_len, span_id))
}
Op::SelectDefault => {
let path_len = self.read_u16();
let span_id = self.read_u32();
(
"SelectDefault",
format!("path_len={} span={}", path_len, span_id),
)
}
Op::HasAttr => {
let path_len = self.read_u16();
("HasAttr", format!("path_len={}", path_len))
}
Op::MakeList => {
let count = self.read_u32();
("MakeList", format!("size={}", count))
}
Op::OpAdd => ("OpAdd", String::new()),
Op::OpSub => ("OpSub", String::new()),
Op::OpMul => ("OpMul", String::new()),
Op::OpDiv => ("OpDiv", String::new()),
Op::OpEq => ("OpEq", String::new()),
Op::OpNeq => ("OpNeq", String::new()),
Op::OpLt => ("OpLt", String::new()),
Op::OpGt => ("OpGt", String::new()),
Op::OpLeq => ("OpLeq", String::new()),
Op::OpGeq => ("OpGeq", String::new()),
Op::OpConcat => ("OpConcat", String::new()),
Op::OpUpdate => ("OpUpdate", String::new()),
Op::OpNeg => ("OpNeg", String::new()),
Op::OpNot => ("OpNot", String::new()),
Op::ForceBool => ("ForceBool", String::new()),
Op::JumpIfFalse => {
let offset = self.read_i32();
let target = (current_pc as isize + 1 + 4 + offset as isize) as usize;
(
"JumpIfFalse",
format!("-> {:04x} offset={}", target, offset),
)
}
Op::JumpIfTrue => {
let offset = self.read_i32();
let target = (current_pc as isize + 1 + 4 + offset as isize) as usize;
("JumpIfTrue", format!("-> {:04x} offset={}", target, offset))
}
Op::Jump => {
let offset = self.read_i32();
let target = (current_pc as isize + 1 + 4 + offset as isize) as usize;
("Jump", format!("-> {:04x} offset={}", target, offset))
}
Op::ConcatStrings => {
let count = self.read_u16();
let force = self.read_u8();
("ConcatStrings", format!("count={} force={}", count, force))
}
Op::ResolvePath => ("ResolvePath", String::new()),
Op::Assert => {
let raw_idx = self.read_u32();
let span_id = self.read_u32();
("Assert", format!("text_id={} span={}", raw_idx, span_id))
}
Op::PushWith => ("PushWith", String::new()),
Op::PopWith => ("PopWith", String::new()),
Op::WithLookup => {
let idx = self.read_u32();
let name = self.ctx.lookup_string(idx);
("WithLookup", format!("{:?}", name))
}
Op::LoadBuiltins => ("LoadBuiltins", String::new()),
Op::LoadBuiltin => {
let idx = self.read_u32();
let name = self.ctx.lookup_string(idx);
("LoadBuiltin", format!("{:?}", name))
}
Op::MkPos => {
let span_id = self.read_u32();
("MkPos", format!("id={}", span_id))
}
Op::LoadReplBinding => {
let idx = self.read_u32();
let name = self.ctx.lookup_string(idx);
("LoadReplBinding", format!("{:?}", name))
}
Op::LoadScopedBinding => {
let idx = self.read_u32();
let name = self.ctx.lookup_string(idx);
("LoadScopedBinding", format!("{:?}", name))
}
Op::Return => ("Return", String::new()),
}
}
}
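All three jump opcodes compute their target the same way: the `i32` operand is relative to the end of the 5-byte instruction (1 opcode byte + 4 operand bytes), which is what the disassembler's `current_pc as isize + 1 + 4 + offset` expresses. A minimal sketch of that decoding:

```rust
// Resolve a relative jump target; `pc` is the address of the opcode byte.
// Offset 0 points at the next instruction; offset -5 loops back to the jump itself.
fn jump_target(pc: usize, offset: i32) -> usize {
    (pc as isize + 1 + 4 + offset as isize) as usize
}
```

Anchoring the offset after the operand bytes means a forward jump over `n` bytes of code is encoded simply as `n`, with no correction for the instruction's own length.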

nix-js/src/downgrade.rs Normal file

File diff suppressed because it is too large

@@ -1,129 +1,355 @@
use std::path::{Path, PathBuf};
use std::sync::Arc;
use deno_core::error::JsError;
use deno_error::JsErrorClass as _;
use itertools::Itertools as _;
use miette::{Diagnostic, NamedSource, SourceSpan};
use thiserror::Error;
pub type Result<T> = core::result::Result<T, Error>;
use crate::runtime::RuntimeContext;
#[derive(Error, Debug)]
pub enum ErrorKind {
#[error("error occurred during parse stage: {0}")]
ParseError(String),
#[error("error occurred during downgrade stage: {0}")]
DowngradeError(String),
#[error("error occurred during evaluation stage: {0}")]
EvalError(String),
#[error("internal error occurred: {0}")]
InternalError(String),
#[error("{0}")]
Catchable(String),
#[error("an unknown or unexpected error occurred")]
pub type Result<T> = core::result::Result<T, Box<Error>>;
#[derive(Clone, Debug)]
pub enum SourceType {
/// evaluated from a string; holds the working directory
Eval(Arc<PathBuf>),
/// REPL input; holds the working directory
Repl(Arc<PathBuf>),
/// loaded from a file; holds the file path
File(Arc<PathBuf>),
/// virtual source identified by name only, with no backing path
Virtual(Arc<str>),
}
#[derive(Clone, Debug)]
pub struct Source {
pub ty: SourceType,
pub src: Arc<str>,
}
impl TryFrom<&str> for Source {
type Error = Box<Error>;
fn try_from(value: &str) -> Result<Self> {
Source::new_eval(value.into())
}
}
impl From<Source> for NamedSource<Arc<str>> {
fn from(value: Source) -> Self {
let name = value.get_name();
NamedSource::new(name, value.src.clone())
}
}
impl Source {
pub fn new_file(path: PathBuf) -> std::io::Result<Self> {
Ok(Source {
src: std::fs::read_to_string(&path)?.into(),
ty: crate::error::SourceType::File(Arc::new(path)),
})
}
pub fn new_eval(src: String) -> Result<Self> {
Ok(Self {
ty: std::env::current_dir()
.map_err(|err| Error::internal(format!("Failed to get current working dir: {err}")))
.map(Arc::new)
.map(SourceType::Eval)?,
src: src.into(),
})
}
pub fn new_repl(src: String) -> Result<Self> {
Ok(Self {
ty: std::env::current_dir()
.map_err(|err| Error::internal(format!("Failed to get current working dir: {err}")))
.map(Arc::new)
.map(SourceType::Repl)?,
src: src.into(),
})
}
pub fn new_virtual(name: Arc<str>, src: String) -> Self {
Self {
ty: SourceType::Virtual(name),
src: src.into(),
}
}
pub fn get_dir(&self) -> &Path {
use SourceType::*;
match &self.ty {
Eval(dir) | Repl(dir) => dir.as_ref(),
File(file) => file
.as_path()
.parent()
.expect("source file must have a parent dir"),
Virtual(_) => Path::new("/"),
}
}
pub fn get_name(&self) -> String {
match &self.ty {
SourceType::Eval(_) => "«eval»".into(),
SourceType::Repl(_) => "«repl»".into(),
SourceType::File(path) => path.as_os_str().to_string_lossy().to_string(),
SourceType::Virtual(name) => name.to_string(),
}
}
}
#[derive(Error, Debug, Diagnostic)]
pub enum Error {
#[error("Parse error: {message}")]
#[diagnostic(code(nix::parse))]
ParseError {
#[source_code]
src: Option<NamedSource<Arc<str>>>,
#[label("error occurred here")]
span: Option<SourceSpan>,
message: String,
},
#[error("Downgrade error: {message}")]
#[diagnostic(code(nix::downgrade))]
DowngradeError {
#[source_code]
src: Option<NamedSource<Arc<str>>>,
#[label("{message}")]
span: Option<SourceSpan>,
message: String,
},
#[error("Evaluation error: {message}")]
#[diagnostic(code(nix::eval))]
EvalError {
#[source_code]
src: Option<NamedSource<Arc<str>>>,
#[label("error occurred here")]
span: Option<SourceSpan>,
message: String,
#[help]
js_backtrace: Option<String>,
#[related]
stack_trace: Vec<StackFrame>,
},
#[error("Internal error: {message}")]
#[diagnostic(code(nix::internal))]
InternalError { message: String },
#[error("{message}")]
#[diagnostic(code(nix::catchable))]
Catchable { message: String },
#[error("Unknown error")]
#[diagnostic(code(nix::unknown))]
Unknown,
}
#[derive(Debug)]
pub struct Error {
pub kind: ErrorKind,
pub span: Option<rnix::TextRange>,
pub source: Option<Arc<str>>,
}
impl std::fmt::Display for Error {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
// Basic display
write!(f, "{}", self.kind)?;
// If we have source and span, print context
if let (Some(source), Some(span)) = (&self.source, self.span) {
let start_byte = usize::from(span.start());
let end_byte = usize::from(span.end());
if start_byte > source.len() || end_byte > source.len() {
return Ok(()); // Span is out of bounds
}
let mut start_line = 1;
let mut start_col = 1usize;
let mut line_start_byte = 0;
for (i, c) in source.char_indices() {
if i >= start_byte {
break;
}
if c == '\n' {
start_line += 1;
start_col = 1;
line_start_byte = i + 1;
} else {
start_col += 1;
}
}
let line_end_byte = source[line_start_byte..]
.find('\n')
.map(|i| line_start_byte + i)
.unwrap_or(source.len());
let line_str = &source[line_start_byte..line_end_byte];
let underline_len = if end_byte > start_byte {
end_byte - start_byte
} else {
1
};
write!(f, "\n --> {}:{}", start_line, start_col)?;
write!(f, "\n |\n")?;
writeln!(f, "{:4} | {}", start_line, line_str)?;
write!(
f,
" | {}{}",
" ".repeat(start_col.saturating_sub(1)),
"^".repeat(underline_len)
)?;
}
Ok(())
}
}
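The `Display` impl above converts a byte offset into a 1-based line and column by scanning `char_indices` up to the span start. The same conversion in isolation (a sketch assuming UTF-8 input and counting columns in characters, as the code above does):

```rust
// Returns (line, column), both 1-based, for byte offset `pos` in `src`.
fn line_col(src: &str, pos: usize) -> (usize, usize) {
    let mut line = 1;
    let mut col = 1;
    for (i, c) in src.char_indices() {
        if i >= pos {
            break; // reached the span start
        }
        if c == '\n' {
            line += 1;
            col = 1;
        } else {
            col += 1;
        }
    }
    (line, col)
}
```

A linear scan per error is fine here since it only runs when formatting a diagnostic; evaluators that report many spans typically precompute a sorted table of line-start offsets and binary-search it instead.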
impl std::error::Error for Error {
fn source(&self) -> Option<&(dyn std::error::Error + 'static)> {
Some(&self.kind)
}
}
impl Error {
pub fn new(kind: ErrorKind) -> Self {
Self {
kind,
pub fn parse_error(msg: String) -> Box<Self> {
Error::ParseError {
src: None,
span: None,
source: None,
message: msg,
}
.into()
}
pub fn with_span(mut self, span: rnix::TextRange) -> Self {
self.span = Some(span);
pub fn downgrade_error(msg: String, src: Source, span: rnix::TextRange) -> Box<Self> {
Error::DowngradeError {
src: Some(src.into()),
span: Some(text_range_to_source_span(span)),
message: msg,
}
.into()
}
pub fn eval_error(msg: String, backtrace: Option<String>) -> Box<Self> {
Error::EvalError {
src: None,
span: None,
message: msg,
js_backtrace: backtrace,
stack_trace: Vec::new(),
}
.into()
}
pub fn internal(msg: String) -> Box<Self> {
Error::InternalError { message: msg }.into()
}
pub fn catchable(msg: String) -> Box<Self> {
Error::Catchable { message: msg }.into()
}
pub fn with_span(mut self: Box<Self>, span: rnix::TextRange) -> Box<Self> {
use Error::*;
let source_span = Some(text_range_to_source_span(span));
let (ParseError { span, .. } | DowngradeError { span, .. } | EvalError { span, .. }) =
self.as_mut()
else {
return self;
};
*span = source_span;
self
}
pub fn with_source(mut self, source: Arc<str>) -> Self {
self.source = Some(source);
pub fn with_source(mut self: Box<Self>, source: Source) -> Box<Self> {
use Error::*;
let new_src = Some(source.into());
let (ParseError { src, .. } | DowngradeError { src, .. } | EvalError { src, .. }) =
self.as_mut()
else {
return self;
};
*src = new_src;
self
}
}
pub fn text_range_to_source_span(range: rnix::TextRange) -> SourceSpan {
let start = usize::from(range.start());
let len = usize::from(range.end()) - start;
SourceSpan::new(start.into(), len)
}
/// A single stack frame captured during Nix evaluation, rendered as a diagnostic
#[derive(Debug, Clone, Error, Diagnostic)]
#[error("{message}")]
pub struct StackFrame {
#[label]
pub span: SourceSpan,
#[help]
pub message: String,
#[source_code]
pub src: NamedSource<Arc<str>>,
}
const MAX_STACK_FRAMES: usize = 20;
const FRAMES_AT_START: usize = 15;
const FRAMES_AT_END: usize = 5;
pub(crate) fn parse_js_error(error: Box<JsError>, ctx: &impl RuntimeContext) -> Error {
let (span, src, frames) = if let Some(stack) = &error.stack {
let mut frames = parse_frames(stack, ctx);
if let Some(last_frame) = frames.pop() {
(
Some(text_range_to_source_span(last_frame.span)),
Some(last_frame.src.into()),
frames,
)
} else {
(None, None, frames)
}
} else {
(None, None, Vec::new())
};
let stack_trace = if std::env::var("NIX_JS_STACK_TRACE").is_ok() {
truncate_stack_trace(frames)
} else {
Vec::new()
};
let message = error.get_message().to_string();
let js_backtrace = error.stack.map(|stack| {
stack
.lines()
.filter(|line| !line.starts_with("NIX_STACK_FRAME:"))
.collect::<Vec<_>>()
.join("\n")
});
Error::EvalError {
src,
span,
message,
js_backtrace,
stack_trace,
}
}
struct NixStackFrame {
span: rnix::TextRange,
message: String,
src: Source,
}
impl From<NixStackFrame> for StackFrame {
fn from(NixStackFrame { span, message, src }: NixStackFrame) -> Self {
StackFrame {
span: text_range_to_source_span(span),
message,
src: src.into(),
}
}
}
fn parse_frames(stack: &str, ctx: &impl RuntimeContext) -> Vec<NixStackFrame> {
let mut frames = Vec::new();
for line in stack.lines() {
// Format: NIX_STACK_FRAME:span_id:message
let Some(rest) = line.strip_prefix("NIX_STACK_FRAME:") else {
continue;
};
let parts: Vec<&str> = rest.splitn(2, ':').collect();
if parts.is_empty() {
continue;
}
let span_id: usize = match parts[0].parse() {
Ok(id) => id,
Err(_) => continue,
};
let (source_id, span) = ctx.get_span(span_id);
let src = ctx.get_source(source_id);
let message = if parts.len() == 2 {
parts[1].to_string()
} else {
String::new()
};
frames.push(NixStackFrame { span, message, src });
}
frames.dedup_by(|a, b| a.span == b.span && a.message == b.message);
frames
}
fn truncate_stack_trace(frames: Vec<NixStackFrame>) -> Vec<StackFrame> {
let reversed: Vec<_> = frames.into_iter().rev().collect();
let total = reversed.len();
if total <= MAX_STACK_FRAMES {
return reversed.into_iter().map(Into::into).collect();
}
let omitted_count = total - FRAMES_AT_START - FRAMES_AT_END;
reversed
.into_iter()
.enumerate()
.filter_map(|(i, frame)| {
if i < FRAMES_AT_START {
Some(frame.into())
} else if i == FRAMES_AT_START {
Some(StackFrame {
span: text_range_to_source_span(frame.span),
message: format!("... ({} more frames omitted)", omitted_count),
src: frame.src.into(),
})
} else if i >= total - FRAMES_AT_END {
Some(frame.into())
} else {
None
}
})
.collect()
}
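The truncation policy above (keep the first `FRAMES_AT_START` and last `FRAMES_AT_END` frames of the reversed trace, collapsing the middle into one marker) can be sketched standalone over plain strings; a minimal sketch mirroring the constants and the `filter_map` above, with the frame type simplified to `String`:

```rust
const MAX_STACK_FRAMES: usize = 20;
const FRAMES_AT_START: usize = 15;
const FRAMES_AT_END: usize = 5;

// Mirrors truncate_stack_trace: short traces pass through; long ones keep the
// head and tail and replace the middle with a single "omitted" marker.
fn truncate(frames: Vec<String>) -> Vec<String> {
    let total = frames.len();
    if total <= MAX_STACK_FRAMES {
        return frames;
    }
    let omitted = total - FRAMES_AT_START - FRAMES_AT_END;
    frames
        .into_iter()
        .enumerate()
        .filter_map(|(i, f)| {
            if i < FRAMES_AT_START {
                Some(f)
            } else if i == FRAMES_AT_START {
                Some(format!("... ({omitted} more frames omitted)"))
            } else if i >= total - FRAMES_AT_END {
                Some(f)
            } else {
                None
            }
        })
        .collect()
}

fn main() {
    let frames: Vec<String> = (0..30).map(|i| format!("frame{i}")).collect();
    let out = truncate(frames);
    // 15 head frames + 1 marker + 5 tail frames = 21 entries.
    assert_eq!(out.len(), 21);
    assert_eq!(out[15], "... (10 more frames omitted)");
    assert_eq!(out.last().unwrap(), "frame29");
    println!("ok");
}
```

Note the marker consumes the frame at index `FRAMES_AT_START`, so a 30-frame trace reports 10 omitted frames, not 9.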

nix-js/src/fetcher.rs Normal file

@@ -0,0 +1,316 @@
use deno_core::OpState;
use deno_core::ToV8;
use deno_core::op2;
use nix_compat::nixhash::HashAlgo;
use nix_compat::nixhash::NixHash;
use tracing::{debug, info, warn};
use crate::runtime::OpStateExt;
use crate::runtime::RuntimeContext;
use crate::store::Store as _;
mod archive;
pub(crate) mod cache;
mod download;
mod git;
mod metadata_cache;
pub use cache::FetcherCache;
pub use download::Downloader;
pub use metadata_cache::MetadataCache;
use crate::nar;
use crate::runtime::NixRuntimeError;
#[derive(ToV8)]
pub struct FetchUrlResult {
pub store_path: String,
pub hash: String,
}
#[derive(ToV8)]
pub struct FetchTarballResult {
pub store_path: String,
pub nar_hash: String,
}
#[derive(ToV8)]
pub struct FetchGitResult {
pub out_path: String,
pub rev: String,
pub short_rev: String,
pub rev_count: u64,
pub last_modified: u64,
pub last_modified_date: String,
pub submodules: bool,
pub nar_hash: Option<String>,
}
#[op2]
pub fn op_fetch_url<Ctx: RuntimeContext>(
state: &mut OpState,
#[string] url: String,
#[string] expected_hash: Option<String>,
#[string] name: Option<String>,
executable: bool,
) -> Result<FetchUrlResult, NixRuntimeError> {
let _span = tracing::info_span!("op_fetch_url", url = %url).entered();
info!("fetchurl started");
let file_name =
name.unwrap_or_else(|| url.rsplit('/').next().unwrap_or("download").to_string());
let metadata_cache =
MetadataCache::new(3600).map_err(|e| NixRuntimeError::from(e.to_string()))?;
let input = serde_json::json!({
"type": "file",
"url": url,
"name": file_name,
"executable": executable,
});
if let Some(cached_entry) = metadata_cache
.lookup(&input)
.map_err(|e| NixRuntimeError::from(e.to_string()))?
{
let cached_hash = cached_entry
.info
.get("hash")
.and_then(|v| v.as_str())
.unwrap_or("");
if let Some(ref expected) = expected_hash {
let normalized_expected = normalize_hash(expected);
if cached_hash != normalized_expected {
warn!("Cached hash mismatch, re-fetching");
} else {
info!("Cache hit");
return Ok(FetchUrlResult {
store_path: cached_entry.store_path.clone(),
hash: cached_hash.to_string(),
});
}
} else {
info!("Cache hit (no hash check)");
return Ok(FetchUrlResult {
store_path: cached_entry.store_path.clone(),
hash: cached_hash.to_string(),
});
}
}
info!("Cache miss, downloading");
let downloader = Downloader::new();
let data = downloader
.download(&url)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!(bytes = data.len(), "Download complete");
let hash = crate::nix_utils::sha256_hex(&data);
if let Some(ref expected) = expected_hash {
let normalized_expected = normalize_hash(expected);
if hash != normalized_expected {
return Err(NixRuntimeError::from(format!(
"hash mismatch for '{}': expected {}, got {}",
url, normalized_expected, hash
)));
}
}
let ctx: &Ctx = state.get_ctx();
let store = ctx.get_store();
let store_path = store
.add_to_store(&file_name, &data, false, vec![])
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!(store_path = %store_path, "Added to store");
#[cfg(unix)]
if executable {
use std::os::unix::fs::PermissionsExt;
if let Ok(metadata) = std::fs::metadata(&store_path) {
let mut perms = metadata.permissions();
perms.set_mode(0o755);
let _ = std::fs::set_permissions(&store_path, perms);
}
}
let info = serde_json::json!({
"hash": hash,
"url": url,
});
metadata_cache
.add(&input, &info, &store_path, true)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
Ok(FetchUrlResult { store_path, hash })
}
#[op2]
pub fn op_fetch_tarball<Ctx: RuntimeContext>(
state: &mut OpState,
#[string] url: String,
#[string] name: Option<String>,
#[string] sha256: Option<String>,
) -> Result<FetchTarballResult, NixRuntimeError> {
let _span = tracing::info_span!("op_fetch_tarball", url = %url).entered();
info!("fetchTarball started");
let dir_name = name.unwrap_or_else(|| "source".to_string());
let metadata_cache =
MetadataCache::new(3600).map_err(|e| NixRuntimeError::from(e.to_string()))?;
let input = serde_json::json!({
"type": "tarball",
"url": url,
"name": dir_name,
});
let expected_sha256 = sha256
.map(
|ref sha256| match NixHash::from_str(sha256, Some(HashAlgo::Sha256)) {
Ok(NixHash::Sha256(digest)) => Ok(digest),
_ => Err(format!("fetchTarball: invalid sha256 '{sha256}'")),
},
)
.transpose()?;
let expected_hex = expected_sha256.map(hex::encode);
if let Some(cached_entry) = metadata_cache
.lookup(&input)
.map_err(|e| NixRuntimeError::from(e.to_string()))?
{
let cached_nar_hash = cached_entry
.info
.get("nar_hash")
.and_then(|v| v.as_str())
.unwrap_or("");
if let Some(ref hex) = expected_hex {
if cached_nar_hash == hex {
info!("Cache hit");
return Ok(FetchTarballResult {
store_path: cached_entry.store_path.clone(),
nar_hash: cached_nar_hash.to_string(),
});
}
} else if !cached_entry.is_expired(3600) {
info!("Cache hit (no hash check)");
return Ok(FetchTarballResult {
store_path: cached_entry.store_path.clone(),
nar_hash: cached_nar_hash.to_string(),
});
}
}
info!("Cache miss, downloading tarball");
let downloader = Downloader::new();
let data = downloader
.download(&url)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!(bytes = data.len(), "Download complete");
info!("Extracting tarball");
let (extracted_path, _temp_dir) = archive::extract_tarball_to_temp(&data)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!("Computing NAR hash");
let nar_hash =
nar::compute_nar_hash(&extracted_path).map_err(|e| NixRuntimeError::from(e.to_string()))?;
let nar_hash_hex = hex::encode(nar_hash);
debug!(
nar_hash = %nar_hash_hex,
"Hash computation complete"
);
if let Some(ref expected) = expected_hex
&& nar_hash_hex != *expected
{
return Err(NixRuntimeError::from(format!(
"Tarball hash mismatch for '{}': expected {}, got {}",
url, expected, nar_hash_hex
)));
}
info!("Adding to store");
let ctx: &Ctx = state.get_ctx();
let store = ctx.get_store();
let store_path = store
.add_to_store_from_path(&dir_name, &extracted_path, vec![])
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!(store_path = %store_path, "Added to store");
let info = serde_json::json!({
"nar_hash": nar_hash_hex,
"url": url,
});
let immutable = expected_sha256.is_some();
metadata_cache
.add(&input, &info, &store_path, immutable)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
Ok(FetchTarballResult {
store_path,
nar_hash: nar_hash_hex,
})
}
#[op2]
pub fn op_fetch_git<Ctx: RuntimeContext>(
state: &mut OpState,
#[string] url: String,
#[string] git_ref: Option<String>,
#[string] rev: Option<String>,
shallow: bool,
submodules: bool,
all_refs: bool,
#[string] name: Option<String>,
) -> Result<FetchGitResult, NixRuntimeError> {
let _span = tracing::info_span!("op_fetch_git", url = %url).entered();
info!("fetchGit started");
let cache = FetcherCache::new().map_err(|e| NixRuntimeError::from(e.to_string()))?;
let dir_name = name.unwrap_or_else(|| "source".to_string());
let ctx: &Ctx = state.get_ctx();
let store = ctx.get_store();
git::fetch_git(
&cache,
store,
&url,
git_ref.as_deref(),
rev.as_deref(),
shallow,
submodules,
all_refs,
&dir_name,
)
.map_err(|e| NixRuntimeError::from(e.to_string()))
}
fn normalize_hash(hash: &str) -> String {
use base64::prelude::*;
if let Some(b64) = hash.strip_prefix("sha256-")
&& let Ok(bytes) = BASE64_STANDARD.decode(b64)
{
return hex::encode(bytes);
}
hash.to_string()
}
pub fn register_ops<Ctx: RuntimeContext>() -> Vec<deno_core::OpDecl> {
vec![
op_fetch_url::<Ctx>(),
op_fetch_tarball::<Ctx>(),
op_fetch_git::<Ctx>(),
]
}


@@ -0,0 +1,173 @@
use std::fs;
use std::io::Cursor;
use std::os::unix::ffi::OsStrExt;
use std::path::{Path, PathBuf};
use flate2::read::GzDecoder;
#[derive(Debug, Clone, Copy)]
pub enum ArchiveFormat {
TarGz,
TarXz,
TarBz2,
Tar,
}
impl ArchiveFormat {
pub fn detect(url: &str, data: &[u8]) -> Self {
if url.ends_with(".tar.gz") || url.ends_with(".tgz") {
return ArchiveFormat::TarGz;
}
if url.ends_with(".tar.xz") || url.ends_with(".txz") {
return ArchiveFormat::TarXz;
}
if url.ends_with(".tar.bz2") || url.ends_with(".tbz2") {
return ArchiveFormat::TarBz2;
}
if url.ends_with(".tar") {
return ArchiveFormat::Tar;
}
if data.len() >= 2 && data[0] == 0x1f && data[1] == 0x8b {
return ArchiveFormat::TarGz;
}
if data.len() >= 6 && &data[0..6] == b"\xfd7zXZ\x00" {
return ArchiveFormat::TarXz;
}
if data.len() >= 3 && &data[0..3] == b"BZh" {
return ArchiveFormat::TarBz2;
}
ArchiveFormat::TarGz
}
}
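When the URL extension gives no hint, `detect` falls back to magic bytes: gzip streams begin with `1f 8b`, xz with `fd '7zXZ' 00`, and bzip2 with `BZh`. A standalone sketch of just that fallback (return values simplified to strings, with the same `TarGz` default):

```rust
// Magic-byte detection mirroring ArchiveFormat::detect's data-based fallback.
fn detect(data: &[u8]) -> &'static str {
    if data.len() >= 2 && data[0] == 0x1f && data[1] == 0x8b {
        return "gz"; // gzip magic: 1f 8b
    }
    if data.len() >= 6 && &data[0..6] == b"\xfd7zXZ\x00" {
        return "xz"; // xz magic: fd 37 7a 58 5a 00
    }
    if data.len() >= 3 && &data[0..3] == b"BZh" {
        return "bz2"; // bzip2 magic: "BZh"
    }
    "gz" // default, matching the TarGz fallback above
}

fn main() {
    assert_eq!(detect(&[0x1f, 0x8b, 0x08]), "gz");
    assert_eq!(detect(b"\xfd7zXZ\x00rest"), "xz");
    assert_eq!(detect(b"BZh91AY"), "bz2");
    assert_eq!(detect(b"????"), "gz"); // unknown data falls back to gzip
    println!("ok");
}
```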
pub fn extract_tarball(data: &[u8], dest: &Path) -> Result<PathBuf, ArchiveError> {
let format = ArchiveFormat::detect("", data);
let temp_dir = dest.join("_extract_temp");
fs::create_dir_all(&temp_dir)?;
match format {
ArchiveFormat::TarGz => extract_tar_gz(data, &temp_dir)?,
ArchiveFormat::TarXz => extract_tar_xz(data, &temp_dir)?,
ArchiveFormat::TarBz2 => extract_tar_bz2(data, &temp_dir)?,
ArchiveFormat::Tar => extract_tar(data, &temp_dir)?,
}
strip_single_toplevel(&temp_dir, dest)
}
fn extract_tar_gz(data: &[u8], dest: &Path) -> Result<(), ArchiveError> {
let decoder = GzDecoder::new(Cursor::new(data));
let mut archive = tar::Archive::new(decoder);
archive.unpack(dest)?;
Ok(())
}
fn extract_tar_xz(data: &[u8], dest: &Path) -> Result<(), ArchiveError> {
let decoder = xz2::read::XzDecoder::new(Cursor::new(data));
let mut archive = tar::Archive::new(decoder);
archive.unpack(dest)?;
Ok(())
}
fn extract_tar_bz2(data: &[u8], dest: &Path) -> Result<(), ArchiveError> {
let decoder = bzip2::read::BzDecoder::new(Cursor::new(data));
let mut archive = tar::Archive::new(decoder);
archive.unpack(dest)?;
Ok(())
}
fn extract_tar(data: &[u8], dest: &Path) -> Result<(), ArchiveError> {
let mut archive = tar::Archive::new(Cursor::new(data));
archive.unpack(dest)?;
Ok(())
}
fn strip_single_toplevel(temp_dir: &Path, dest: &Path) -> Result<PathBuf, ArchiveError> {
let entries: Vec<_> = fs::read_dir(temp_dir)?
.filter_map(|e| e.ok())
.filter(|e| !e.file_name().as_os_str().as_bytes().starts_with(b"."))
.collect();
let source_dir = if entries.len() == 1 && entries[0].file_type()?.is_dir() {
entries[0].path()
} else {
temp_dir.to_path_buf()
};
let final_dest = dest.join("content");
if final_dest.exists() {
fs::remove_dir_all(&final_dest)?;
}
if source_dir == *temp_dir {
fs::rename(temp_dir, &final_dest)?;
} else {
copy_dir_recursive(&source_dir, &final_dest)?;
fs::remove_dir_all(temp_dir)?;
}
Ok(final_dest)
}
fn copy_dir_recursive(src: &Path, dst: &Path) -> Result<(), std::io::Error> {
fs::create_dir_all(dst)?;
for entry in fs::read_dir(src)? {
let entry = entry?;
let path = entry.path();
let dest_path = dst.join(entry.file_name());
let metadata = fs::symlink_metadata(&path)?;
if metadata.is_symlink() {
let target = fs::read_link(&path)?;
#[cfg(unix)]
{
std::os::unix::fs::symlink(&target, &dest_path)?;
}
#[cfg(windows)]
{
if target.is_dir() {
std::os::windows::fs::symlink_dir(&target, &dest_path)?;
} else {
std::os::windows::fs::symlink_file(&target, &dest_path)?;
}
}
} else if metadata.is_dir() {
copy_dir_recursive(&path, &dest_path)?;
} else {
fs::copy(&path, &dest_path)?;
}
}
Ok(())
}
pub fn extract_tarball_to_temp(data: &[u8]) -> Result<(PathBuf, tempfile::TempDir), ArchiveError> {
let temp_dir = tempfile::tempdir()?;
let extracted_path = extract_tarball(data, temp_dir.path())?;
Ok((extracted_path, temp_dir))
}
#[derive(Debug)]
pub enum ArchiveError {
IoError(std::io::Error),
}
impl std::fmt::Display for ArchiveError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
ArchiveError::IoError(e) => write!(f, "I/O error: {}", e),
}
}
}
impl std::error::Error for ArchiveError {}
impl From<std::io::Error> for ArchiveError {
fn from(e: std::io::Error) -> Self {
ArchiveError::IoError(e)
}
}


@@ -0,0 +1,29 @@
use std::fs;
use std::path::PathBuf;
#[derive(Debug)]
pub struct FetcherCache {
base_dir: PathBuf,
}
impl FetcherCache {
pub fn new() -> Result<Self, std::io::Error> {
let base_dir = dirs::cache_dir()
.unwrap_or_else(|| PathBuf::from("/tmp"))
.join("nix-js")
.join("fetchers");
fs::create_dir_all(&base_dir)?;
Ok(Self { base_dir })
}
fn git_cache_dir(&self) -> PathBuf {
self.base_dir.join("git")
}
pub fn get_git_bare(&self, url: &str) -> PathBuf {
let key = crate::nix_utils::sha256_hex(url.as_bytes());
self.git_cache_dir().join(key)
}
}


@@ -0,0 +1,64 @@
use std::time::Duration;
use reqwest::blocking::Client;
pub struct Downloader {
client: Client,
}
impl Downloader {
pub fn new() -> Self {
let client = Client::builder()
.timeout(Duration::from_secs(300))
.user_agent("nix-js/0.1")
.build()
.expect("Failed to create HTTP client");
Self { client }
}
pub fn download(&self, url: &str) -> Result<Vec<u8>, DownloadError> {
let response = self
.client
.get(url)
.send()
.map_err(|e| DownloadError::NetworkError(e.to_string()))?;
if !response.status().is_success() {
return Err(DownloadError::HttpError {
url: url.to_string(),
status: response.status().as_u16(),
});
}
response
.bytes()
.map(|b| b.to_vec())
.map_err(|e| DownloadError::NetworkError(e.to_string()))
}
}
impl Default for Downloader {
fn default() -> Self {
Self::new()
}
}
#[derive(Debug)]
pub enum DownloadError {
NetworkError(String),
HttpError { url: String, status: u16 },
}
impl std::fmt::Display for DownloadError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
DownloadError::NetworkError(msg) => write!(f, "Network error: {}", msg),
DownloadError::HttpError { url, status } => {
write!(f, "HTTP error {} for URL: {}", status, url)
}
}
}
}
impl std::error::Error for DownloadError {}

nix-js/src/fetcher/git.rs Normal file

@@ -0,0 +1,315 @@
use std::fs;
use std::path::PathBuf;
use std::process::Command;
use super::FetchGitResult;
use super::cache::FetcherCache;
use crate::store::Store;
#[allow(clippy::too_many_arguments)]
pub fn fetch_git(
cache: &FetcherCache,
store: &dyn Store,
url: &str,
git_ref: Option<&str>,
rev: Option<&str>,
_shallow: bool,
submodules: bool,
all_refs: bool,
name: &str,
) -> Result<FetchGitResult, GitError> {
let bare_repo = cache.get_git_bare(url);
if !bare_repo.exists() {
clone_bare(url, &bare_repo)?;
} else {
fetch_repo(&bare_repo, all_refs)?;
}
let target_rev = resolve_rev(&bare_repo, git_ref, rev)?;
let temp_dir = tempfile::tempdir()?;
let checkout_dir = checkout_rev_to_temp(&bare_repo, &target_rev, submodules, temp_dir.path())?;
let nar_hash = hex::encode(
crate::nar::compute_nar_hash(&checkout_dir)
.map_err(|e| GitError::NarHashError(e.to_string()))?,
);
let store_path = store
.add_to_store_from_path(name, &checkout_dir, vec![])
.map_err(|e| GitError::StoreError(e.to_string()))?;
let rev_count = get_rev_count(&bare_repo, &target_rev)?;
let last_modified = get_last_modified(&bare_repo, &target_rev)?;
let last_modified_date = format_timestamp(last_modified);
let short_rev = if target_rev.len() >= 7 {
target_rev[..7].to_string()
} else {
target_rev.clone()
};
Ok(FetchGitResult {
out_path: store_path,
rev: target_rev,
short_rev,
rev_count,
last_modified,
last_modified_date,
submodules,
nar_hash: Some(nar_hash),
})
}
fn clone_bare(url: &str, dest: &PathBuf) -> Result<(), GitError> {
fs::create_dir_all(dest.parent().unwrap_or(dest))?;
let output = Command::new("git")
.args(["clone", "--bare", url])
.arg(dest)
.output()?;
if !output.status.success() {
return Err(GitError::CommandFailed {
operation: "clone".to_string(),
message: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
Ok(())
}
fn fetch_repo(repo: &PathBuf, all_refs: bool) -> Result<(), GitError> {
let mut args = vec!["fetch", "--prune"];
if all_refs {
args.push("--all");
}
let output = Command::new("git").args(args).current_dir(repo).output()?;
if !output.status.success() {
return Err(GitError::CommandFailed {
operation: "fetch".to_string(),
message: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
Ok(())
}
fn resolve_rev(
repo: &PathBuf,
git_ref: Option<&str>,
rev: Option<&str>,
) -> Result<String, GitError> {
if let Some(rev) = rev {
return Ok(rev.to_string());
}
let ref_to_resolve = git_ref.unwrap_or("HEAD");
let output = Command::new("git")
.args(["rev-parse", ref_to_resolve])
.current_dir(repo)
.output()?;
if !output.status.success() {
let output = Command::new("git")
.args(["rev-parse", &format!("refs/heads/{}", ref_to_resolve)])
.current_dir(repo)
.output()?;
if !output.status.success() {
let output = Command::new("git")
.args(["rev-parse", &format!("refs/tags/{}", ref_to_resolve)])
.current_dir(repo)
.output()?;
if !output.status.success() {
return Err(GitError::CommandFailed {
operation: "rev-parse".to_string(),
message: format!("Could not resolve ref: {}", ref_to_resolve),
});
}
return Ok(String::from_utf8_lossy(&output.stdout).trim().to_string());
}
return Ok(String::from_utf8_lossy(&output.stdout).trim().to_string());
}
Ok(String::from_utf8_lossy(&output.stdout).trim().to_string())
}
fn checkout_rev_to_temp(
bare_repo: &PathBuf,
rev: &str,
submodules: bool,
temp_path: &std::path::Path,
) -> Result<PathBuf, GitError> {
let checkout_dir = temp_path.join("checkout");
fs::create_dir_all(&checkout_dir)?;
let output = Command::new("git")
.args(["--work-tree", checkout_dir.to_str().unwrap_or(".")])
.arg("checkout")
.arg(rev)
.arg("--")
.arg(".")
.current_dir(bare_repo)
.output()?;
if !output.status.success() {
fs::remove_dir_all(&checkout_dir)?;
return Err(GitError::CommandFailed {
operation: "checkout".to_string(),
message: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
if submodules {
let output = Command::new("git")
.args(["submodule", "update", "--init", "--recursive"])
.current_dir(&checkout_dir)
.output()?;
if !output.status.success() {
tracing::warn!(
"failed to initialize submodules: {}",
String::from_utf8_lossy(&output.stderr)
);
}
}
let git_dir = checkout_dir.join(".git");
if git_dir.exists() {
fs::remove_dir_all(&git_dir)?;
}
Ok(checkout_dir)
}
fn get_rev_count(repo: &PathBuf, rev: &str) -> Result<u64, GitError> {
let output = Command::new("git")
.args(["rev-list", "--count", rev])
.current_dir(repo)
.output()?;
if !output.status.success() {
return Ok(0);
}
let count_str = String::from_utf8_lossy(&output.stdout);
count_str.trim().parse().unwrap_or(0).pipe(Ok)
}
fn get_last_modified(repo: &PathBuf, rev: &str) -> Result<u64, GitError> {
let output = Command::new("git")
.args(["log", "-1", "--format=%ct", rev])
.current_dir(repo)
.output()?;
if !output.status.success() {
return Ok(0);
}
let ts_str = String::from_utf8_lossy(&output.stdout);
ts_str.trim().parse().unwrap_or(0).pipe(Ok)
}
fn format_timestamp(ts: u64) -> String {
let secs = ts;
let days_since_epoch = secs / 86400;
let remaining_secs = secs % 86400;
let hours = remaining_secs / 3600;
let minutes = (remaining_secs % 3600) / 60;
let seconds = remaining_secs % 60;
let (year, month, day) = days_to_ymd(days_since_epoch);
format!(
"{:04}{:02}{:02}{:02}{:02}{:02}",
year, month, day, hours, minutes, seconds
)
}
fn days_to_ymd(days: u64) -> (u64, u64, u64) {
let mut y = 1970;
let mut remaining = days as i64;
loop {
let days_in_year = if is_leap_year(y) { 366 } else { 365 };
if remaining < days_in_year {
break;
}
remaining -= days_in_year;
y += 1;
}
let days_in_months: [i64; 12] = if is_leap_year(y) {
[31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
} else {
[31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
};
let mut m = 1;
for days_in_month in days_in_months.iter() {
if remaining < *days_in_month {
break;
}
remaining -= *days_in_month;
m += 1;
}
(y, m, (remaining + 1) as u64)
}
fn is_leap_year(y: u64) -> bool {
(y.is_multiple_of(4) && !y.is_multiple_of(100)) || y.is_multiple_of(400)
}
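The epoch-to-civil-date conversion above can be spot-checked against known dates; a standalone sketch with the two functions adapted from this file (`%`-based leap test in place of `is_multiple_of`):

```rust
fn is_leap_year(y: u64) -> bool {
    (y % 4 == 0 && y % 100 != 0) || y % 400 == 0
}

// Walk years, then months, subtracting day counts until the remainder
// falls inside the current month; 1-based day comes from remaining + 1.
fn days_to_ymd(days: u64) -> (u64, u64, u64) {
    let mut y: u64 = 1970;
    let mut remaining = days as i64;
    loop {
        let days_in_year = if is_leap_year(y) { 366 } else { 365 };
        if remaining < days_in_year {
            break;
        }
        remaining -= days_in_year;
        y += 1;
    }
    let days_in_months: [i64; 12] = if is_leap_year(y) {
        [31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    } else {
        [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    };
    let mut m: u64 = 1;
    for days_in_month in days_in_months.iter() {
        if remaining < *days_in_month {
            break;
        }
        remaining -= *days_in_month;
        m += 1;
    }
    (y, m, (remaining + 1) as u64)
}

fn main() {
    // Day 0 is 1970-01-01.
    assert_eq!(days_to_ymd(0), (1970, 1, 1));
    // 2000-01-01 is day 10957 (7 leap days in 1970..=1999); Feb 29 is 59 days later.
    assert_eq!(days_to_ymd(11016), (2000, 2, 29));
    // 2024-01-01: 54 years * 365 + 13 leap days = day 19723.
    assert_eq!(days_to_ymd(19723), (2024, 1, 1));
    println!("ok");
}
```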
trait Pipe: Sized {
fn pipe<F, R>(self, f: F) -> R
where
F: FnOnce(Self) -> R,
{
f(self)
}
}
impl<T> Pipe for T {}
#[derive(Debug)]
pub enum GitError {
IoError(std::io::Error),
CommandFailed { operation: String, message: String },
NarHashError(String),
StoreError(String),
}
impl std::fmt::Display for GitError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
GitError::IoError(e) => write!(f, "I/O error: {}", e),
GitError::CommandFailed { operation, message } => {
write!(f, "Git {} failed: {}", operation, message)
}
GitError::NarHashError(e) => write!(f, "NAR hash error: {}", e),
GitError::StoreError(e) => write!(f, "Store error: {}", e),
}
}
}
impl std::error::Error for GitError {}
impl From<std::io::Error> for GitError {
fn from(e: std::io::Error) -> Self {
GitError::IoError(e)
}
}


@@ -0,0 +1,218 @@
#![allow(dead_code)]
use std::path::PathBuf;
use std::time::{SystemTime, UNIX_EPOCH};
use rusqlite::{Connection, OptionalExtension, params};
use serde::{Deserialize, Serialize};
#[derive(Debug)]
pub enum CacheError {
Database(rusqlite::Error),
Json(serde_json::Error),
}
impl std::fmt::Display for CacheError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
CacheError::Database(e) => write!(f, "Database error: {}", e),
CacheError::Json(e) => write!(f, "JSON error: {}", e),
}
}
}
impl std::error::Error for CacheError {}
impl From<rusqlite::Error> for CacheError {
fn from(e: rusqlite::Error) -> Self {
CacheError::Database(e)
}
}
impl From<serde_json::Error> for CacheError {
fn from(e: serde_json::Error) -> Self {
CacheError::Json(e)
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CacheEntry {
pub input: serde_json::Value,
pub info: serde_json::Value,
pub store_path: String,
pub immutable: bool,
pub timestamp: u64,
}
impl CacheEntry {
pub fn is_expired(&self, ttl_seconds: u64) -> bool {
if self.immutable {
return false;
}
if ttl_seconds == 0 {
return false;
}
let now = SystemTime::now()
.duration_since(UNIX_EPOCH)
.expect("Clock may have gone backwards")
.as_secs();
now > self.timestamp + ttl_seconds
}
}
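The expiry rule above has three cases: immutable entries never expire, a TTL of zero disables expiry, and otherwise an entry goes stale once `now > timestamp + ttl`. A minimal sketch of that predicate with the clock passed in explicitly (so the boundary is testable):

```rust
// Mirrors CacheEntry::is_expired, with `now` injected instead of read
// from SystemTime so the exact-TTL boundary can be exercised.
fn is_expired(immutable: bool, timestamp: u64, ttl_seconds: u64, now: u64) -> bool {
    if immutable || ttl_seconds == 0 {
        return false;
    }
    now > timestamp + ttl_seconds
}

fn main() {
    // Immutable entries never expire, regardless of age.
    assert!(!is_expired(true, 0, 3600, 1_000_000));
    // ttl == 0 disables expiry.
    assert!(!is_expired(false, 0, 0, 1_000_000));
    // Exactly at timestamp + ttl the entry is still fresh (strict `>`).
    assert!(!is_expired(false, 100, 3600, 100 + 3600));
    // One second past the TTL it expires.
    assert!(is_expired(false, 100, 3600, 100 + 3601));
    println!("ok");
}
```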
pub struct MetadataCache {
conn: Connection,
ttl_seconds: u64,
}
impl MetadataCache {
pub fn new(ttl_seconds: u64) -> Result<Self, CacheError> {
let cache_dir = dirs::cache_dir()
.unwrap_or_else(|| PathBuf::from("/tmp"))
.join("nix-js");
std::fs::create_dir_all(&cache_dir).map_err(|e| {
CacheError::Database(rusqlite::Error::ToSqlConversionFailure(Box::new(e)))
})?;
let db_path = cache_dir.join("fetcher-cache.sqlite");
let conn = Connection::open(db_path)?;
conn.execute(
"CREATE TABLE IF NOT EXISTS cache (
input TEXT NOT NULL PRIMARY KEY,
info TEXT NOT NULL,
store_path TEXT NOT NULL,
immutable INTEGER NOT NULL,
timestamp INTEGER NOT NULL
)",
[],
)?;
Ok(Self { conn, ttl_seconds })
}
pub fn lookup(&self, input: &serde_json::Value) -> Result<Option<CacheEntry>, CacheError> {
let input_str = serde_json::to_string(input)?;
let entry: Option<(String, String, String, i64, i64)> = self
.conn
.query_row(
"SELECT input, info, store_path, immutable, timestamp FROM cache WHERE input = ?1",
params![input_str],
|row| {
Ok((
row.get(0)?,
row.get(1)?,
row.get(2)?,
row.get(3)?,
row.get(4)?,
))
},
)
.optional()?;
match entry {
Some((input_json, info_json, store_path, immutable, timestamp)) => {
let entry = CacheEntry {
input: serde_json::from_str(&input_json)?,
info: serde_json::from_str(&info_json)?,
store_path,
immutable: immutable != 0,
timestamp: timestamp as u64,
};
if entry.is_expired(self.ttl_seconds) {
Ok(None)
} else {
Ok(Some(entry))
}
}
None => Ok(None),
}
}
pub fn lookup_expired(
&self,
input: &serde_json::Value,
) -> Result<Option<CacheEntry>, CacheError> {
let input_str = serde_json::to_string(input)?;
let entry: Option<(String, String, String, i64, i64)> = self
.conn
.query_row(
"SELECT input, info, store_path, immutable, timestamp FROM cache WHERE input = ?1",
params![input_str],
|row| {
Ok((
row.get(0)?,
row.get(1)?,
row.get(2)?,
row.get(3)?,
row.get(4)?,
))
},
)
.optional()?;
match entry {
Some((input_json, info_json, store_path, immutable, timestamp)) => {
Ok(Some(CacheEntry {
input: serde_json::from_str(&input_json)?,
info: serde_json::from_str(&info_json)?,
store_path,
immutable: immutable != 0,
timestamp: timestamp as u64,
}))
}
None => Ok(None),
}
}
pub fn add(
&self,
input: &serde_json::Value,
info: &serde_json::Value,
store_path: &str,
immutable: bool,
) -> Result<(), CacheError> {
let input_str = serde_json::to_string(input)?;
let info_str = serde_json::to_string(info)?;
let timestamp = SystemTime::now()
.duration_since(UNIX_EPOCH)
.expect("Clock may have gone backwards")
.as_secs();
self.conn.execute(
"INSERT OR REPLACE INTO cache (input, info, store_path, immutable, timestamp)
VALUES (?1, ?2, ?3, ?4, ?5)",
params![
input_str,
info_str,
store_path,
if immutable { 1 } else { 0 },
timestamp as i64
],
)?;
Ok(())
}
pub fn update_timestamp(&self, input: &serde_json::Value) -> Result<(), CacheError> {
let input_str = serde_json::to_string(input)?;
let timestamp = SystemTime::now()
.duration_since(UNIX_EPOCH)
.expect("Clock may have gone backwards")
.as_secs();
self.conn.execute(
"UPDATE cache SET timestamp = ?1 WHERE input = ?2",
params![timestamp as i64, input_str],
)?;
Ok(())
}
}


@@ -1,161 +1,158 @@
use std::{
hash::{Hash, Hasher},
ops::Deref,
};
use derive_more::{IsVariant, TryUnwrap, Unwrap};
use hashbrown::HashSet;
use bumpalo::{Bump, boxed::Box, collections::Vec};
use ghost_cell::{GhostCell, GhostToken};
use rnix::{TextRange, ast};
use string_interner::symbol::SymbolU32;
use crate::context::SccInfo;
use crate::error::{Error, Result};
use crate::value::format_symbol;
use nix_js_macros::ir;
pub type HashMap<'ir, K, V> = hashbrown::HashMap<K, V, hashbrown::DefaultHashBuilder, &'ir Bump>;
mod downgrade;
mod utils;
use utils::*;
#[repr(transparent)]
#[derive(Clone, Copy)]
pub struct IrRef<'id, 'ir>(&'ir GhostCell<'id, Ir<'ir, Self>>);
pub use downgrade::Downgrade;
pub trait DowngradeContext {
fn downgrade(self, expr: rnix::ast::Expr) -> Result<ExprId>;
fn new_expr(&mut self, expr: Ir) -> ExprId;
fn new_arg(&mut self) -> ExprId;
fn new_sym(&mut self, sym: String) -> SymId;
fn get_sym(&self, id: SymId) -> &str;
fn lookup(&mut self, sym: SymId) -> Result<ExprId>;
fn extract_expr(&mut self, id: ExprId) -> Ir;
fn replace_expr(&mut self, id: ExprId, expr: Ir);
fn reserve_slots(&mut self, slots: usize) -> impl Iterator<Item = ExprId> + Clone + use<Self>;
fn with_param_scope<F, R>(&mut self, param: SymId, arg: ExprId, f: F) -> R
where
F: FnOnce(&mut Self) -> R;
fn with_let_scope<F, R>(&mut self, bindings: HashMap<SymId, ExprId>, f: F) -> R
where
F: FnOnce(&mut Self) -> R;
fn with_with_scope<F, R>(&mut self, namespace: ExprId, f: F) -> R
where
F: FnOnce(&mut Self) -> R;
fn get_current_dir(&self) -> std::path::PathBuf;
fn push_dep_tracker(&mut self, slots: &[ExprId]);
fn set_current_binding(&mut self, expr: Option<ExprId>);
fn pop_dep_tracker(&mut self) -> Result<SccInfo>;
}
ir! {
Ir,
Int(i64),
Float(f64),
Str,
AttrSet,
List,
HasAttr,
BinOp,
UnOp,
Select,
If,
Call,
Assert,
ConcatStrings,
Path,
Func,
Let,
Arg(ArgId),
ExprRef(ExprId),
Thunk(ExprId),
Builtins,
Builtin,
}
impl AttrSet {
fn _insert(
&mut self,
mut path: impl Iterator<Item = Attr>,
name: Attr,
value: ExprId,
ctx: &mut impl DowngradeContext,
) -> Result<()> {
if let Some(attr) = path.next() {
// If the path is not yet exhausted, we need to recurse deeper.
match attr {
Attr::Str(ident) => {
// If the next attribute is a static string.
if let Some(&id) = self.stcs.get(&ident) {
// If a sub-attrset already exists, recurse into it.
let mut ir = ctx.extract_expr(id);
let result = ir
.as_mut()
.try_unwrap_attr_set()
.map_err(|_| {
// This path segment exists but is not an attrset, which is an error.
Error::downgrade_error(format!(
"attribute '{}' already defined but is not an attribute set",
format_symbol(ctx.get_sym(ident))
))
})
.and_then(|attrs| attrs._insert(path, name, value, ctx));
ctx.replace_expr(id, ir);
result?;
} else {
// Create a new sub-attrset because this path doesn't exist yet.
let mut attrs = AttrSet::default();
attrs._insert(path, name, value, ctx)?;
let attrs = ctx.new_expr(attrs.to_ir());
self.stcs.insert(ident, attrs);
}
Ok(())
}
Attr::Dynamic(dynamic) => {
// If the next attribute is a dynamic expression, we must create a new sub-attrset.
// We cannot merge with existing dynamic attributes at this stage.
let mut attrs = AttrSet::default();
attrs._insert(path, name, value, ctx)?;
self.dyns.push((dynamic, ctx.new_expr(attrs.to_ir())));
Ok(())
}
}
} else {
// This is the final attribute in the path, so insert the value here.
match name {
Attr::Str(ident) => {
if self.stcs.insert(ident, value).is_some() {
return Err(Error::downgrade_error(format!(
"attribute '{}' already defined",
format_symbol(ctx.get_sym(ident))
)));
}
}
Attr::Dynamic(dynamic) => {
self.dyns.push((dynamic, value));
}
}
Ok(())
}
}
fn insert(
&mut self,
path: Vec<Attr>,
value: ExprId,
ctx: &mut impl DowngradeContext,
) -> Result<()> {
let mut path = path.into_iter();
// The last part of the path is the name of the attribute to be inserted.
let name = path
.next_back()
.expect("empty attrpath passed. this is a bug");
self._insert(path, name, value, ctx)
}
}
impl<'id, 'ir> IrRef<'id, 'ir> {
pub fn new(ir: &'ir GhostCell<'id, Ir<'ir, Self>>) -> Self {
Self(ir)
}
pub fn alloc(bump: &'ir Bump, ir: Ir<'ir, Self>) -> Self {
Self(bump.alloc(GhostCell::new(ir)))
}
/// Freeze a mutable IR reference into a read-only one, consuming the
/// `GhostToken` to prevent any further mutation.
///
/// # Safety
/// The transmute is sound because:
/// - `GhostCell<'id, T>` is `#[repr(transparent)]` over `T`
/// - `IrRef<'id, 'ir>` is `#[repr(transparent)]` over
/// `&'ir GhostCell<'id, Ir<'ir, Self>>`
/// - `RawIrRef<'ir>` is `#[repr(transparent)]` over `&'ir Ir<'ir, Self>`
/// - `Ir<'ir, Ref>` is `#[repr(C)]` and both ref types are pointer-sized
///
/// Consuming the `GhostToken` guarantees no `borrow_mut` calls can occur
/// afterwards, so the shared `&Ir` references from `RawIrRef::Deref` can
/// never alias with mutable references.
pub fn freeze(self, _token: GhostToken<'id>) -> RawIrRef<'ir> {
unsafe { std::mem::transmute(self) }
}
}
impl<'id, 'ir> Deref for IrRef<'id, 'ir> {
type Target = GhostCell<'id, Ir<'ir, IrRef<'id, 'ir>>>;
fn deref(&self) -> &Self::Target {
self.0
}
}
#[repr(transparent)]
#[derive(Clone, Copy)]
pub struct RawIrRef<'ir>(&'ir Ir<'ir, Self>);
impl<'ir> Deref for RawIrRef<'ir> {
type Target = Ir<'ir, RawIrRef<'ir>>;
fn deref(&self) -> &Self::Target {
self.0
}
}
#[repr(C)]
pub enum Ir<'ir, Ref> {
Int(i64),
Float(f64),
Bool(bool),
Null,
Str(Box<'ir, String>),
AttrSet {
stcs: HashMap<'ir, SymId, (Ref, TextRange)>,
dyns: Vec<'ir, (Ref, Ref, TextRange)>,
},
List {
items: Vec<'ir, Ref>,
},
Path(Ref),
ConcatStrings {
parts: Vec<'ir, Ref>,
force_string: bool,
},
// OPs
UnOp {
rhs: Ref,
kind: UnOpKind,
},
BinOp {
lhs: Ref,
rhs: Ref,
kind: BinOpKind,
},
HasAttr {
lhs: Ref,
rhs: Vec<'ir, Attr<Ref>>,
},
Select {
expr: Ref,
attrpath: Vec<'ir, Attr<Ref>>,
default: Option<Ref>,
span: TextRange,
},
// Conditionals
If {
cond: Ref,
consq: Ref,
alter: Ref,
},
Assert {
assertion: Ref,
expr: Ref,
assertion_raw: String,
span: TextRange,
},
With {
namespace: Ref,
body: Ref,
thunks: Vec<'ir, (ThunkId, Ref)>,
},
WithLookup(SymId),
// Function related
Func {
body: Ref,
param: Option<Param<'ir>>,
arg: ArgId,
thunks: Vec<'ir, (ThunkId, Ref)>,
},
Arg(ArgId),
Call {
func: Ref,
arg: Ref,
span: TextRange,
},
// Builtins
Builtins,
Builtin(SymId),
// Misc
TopLevel {
body: Ref,
thunks: Vec<'ir, (ThunkId, Ref)>,
},
Thunk(ThunkId),
CurPos(TextRange),
ReplBinding(SymId),
ScopedImportBinding(SymId),
}
#[repr(transparent)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct ExprId(pub usize);
pub struct ThunkId(pub usize);
pub type SymId = SymbolU32;
@@ -163,52 +160,20 @@ pub type SymId = SymbolU32;
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct ArgId(pub usize);
/// Represents a Nix attribute set.
#[derive(Debug, Default)]
pub struct AttrSet {
/// Statically known attributes (key is a string).
pub stcs: HashMap<SymId, ExprId>,
/// Dynamically computed attributes, where both the key and value are expressions.
pub dyns: Vec<(ExprId, ExprId)>,
}
/// Represents a key in an attribute path.
#[derive(Debug, TryUnwrap)]
pub enum Attr {
/// A dynamic attribute key, which is an expression that must evaluate to a string.
/// Example: `attrs.${key}`
Dynamic(ExprId),
/// A static attribute key.
/// Example: `attrs.key`
Str(SymId),
}
/// Represents a Nix list.
#[derive(Debug)]
pub struct List {
/// The expressions that are elements of the list.
pub items: Vec<ExprId>,
}
/// Represents a "has attribute" check (`?` operator).
#[derive(Debug)]
pub struct HasAttr {
/// The expression to check for the attribute (the left-hand side).
pub lhs: ExprId,
/// The attribute path to look for (the right-hand side).
pub rhs: Vec<Attr>,
}
/// Represents a binary operation.
#[derive(Debug)]
pub struct BinOp {
pub lhs: ExprId,
pub rhs: ExprId,
pub kind: BinOpKind,
}
/// Represents a key in an attribute path.
#[allow(unused)]
#[derive(Debug)]
pub enum Attr<Ref> {
/// A dynamic attribute key, which is an expression that must evaluate to a string.
/// Example: `attrs.${key}`
Dynamic(Ref, TextRange),
/// A static attribute key.
/// Example: `attrs.key`
Str(SymId, TextRange),
}
/// The kinds of binary operations supported in Nix.
#[derive(Clone, Copy, Debug, Hash, PartialEq, Eq)]
pub enum BinOpKind {
// Arithmetic
Add,
@@ -264,15 +229,8 @@ impl From<ast::BinOpKind> for BinOpKind {
}
}
/// Represents a unary operation.
#[derive(Debug)]
pub struct UnOp {
pub rhs: ExprId,
pub kind: UnOpKind,
}
/// The kinds of unary operations.
#[derive(Clone, Copy, Debug, Hash, PartialEq, Eq)]
pub enum UnOpKind {
Neg, // Negation (`-`)
Not, // Logical not (`!`)
@@ -287,97 +245,439 @@ impl From<ast::UnaryOpKind> for UnOpKind {
}
}
/// Represents an attribute selection from an attribute set.
#[derive(Debug)]
pub struct Select {
/// The expression that should evaluate to an attribute set.
pub expr: ExprId,
/// The path of attributes to select.
pub attrpath: Vec<Attr>,
/// An optional default value to return if the selection fails.
pub default: Option<ExprId>,
}
/// Represents an `if-then-else` expression.
#[derive(Debug)]
pub struct If {
pub cond: ExprId,
pub consq: ExprId, // Consequence (then branch)
pub alter: ExprId, // Alternative (else branch)
}
/// Represents a function value (a lambda).
#[derive(Debug)]
pub struct Func {
/// The body of the function
pub body: ExprId,
/// The parameter specification for the function.
pub param: Param,
pub arg: ExprId,
}
/// Represents a `let ... in ...` expression.
#[derive(Debug)]
pub struct Let {
/// The bindings in the `let` expression, grouped into strongly connected components (SCCs)
pub binding_sccs: SccInfo,
/// The body expression evaluated in the scope of the bindings.
pub body: ExprId,
}
/// Describes the parameters of a function.
#[derive(Debug)]
pub struct Param {
/// The set of required parameter names for a pattern-matching function.
pub required: Option<Vec<SymId>>,
/// The set of all allowed parameter names for a non-ellipsis pattern-matching function.
/// If `None`, any attribute is allowed (ellipsis `...` is present).
pub allowed: Option<HashSet<SymId>>,
}
pub struct Param<'ir> {
pub required: Vec<'ir, (SymId, TextRange)>,
pub optional: Vec<'ir, (SymId, TextRange)>,
pub ellipsis: bool,
}
/// Represents a function call.
#[derive(Debug)]
pub struct Call {
/// The expression that evaluates to the function to be called.
pub func: ExprId,
pub arg: ExprId,
}
#[derive(Clone, Copy)]
pub(crate) struct IrKey<'id, 'ir, 'a>(pub IrRef<'id, 'ir>, pub &'a GhostToken<'id>);
impl std::hash::Hash for IrKey<'_, '_, '_> {
fn hash<H: Hasher>(&self, state: &mut H) {
ir_content_hash(self.0, self.1, state);
}
}
/// Represents an `assert` expression.
#[derive(Debug)]
pub struct Assert {
/// The condition to assert.
pub assertion: ExprId,
/// The expression to return if the assertion is true.
pub expr: ExprId,
}
impl PartialEq for IrKey<'_, '_, '_> {
fn eq(&self, other: &Self) -> bool {
ir_content_eq(self.0, other.0, self.1)
}
}
/// Represents the concatenation of multiple string expressions.
/// This is typically the result of downgrading an interpolated string.
#[derive(Debug)]
pub struct ConcatStrings {
pub parts: Vec<ExprId>,
}
impl Eq for IrKey<'_, '_, '_> {}
fn attr_content_hash<'id>(
attr: &Attr<IrRef<'id, '_>>,
token: &GhostToken<'id>,
state: &mut impl Hasher,
) {
core::mem::discriminant(attr).hash(state);
match attr {
Attr::Dynamic(expr, _) => ir_content_hash(*expr, token, state),
Attr::Str(sym, _) => sym.hash(state),
}
}
/// Represents a simple, non-interpolated string literal.
#[derive(Debug)]
pub struct Str {
pub val: String,
}
fn attr_content_eq<'id, 'ir>(
a: &Attr<IrRef<'id, 'ir>>,
b: &Attr<IrRef<'id, 'ir>>,
token: &GhostToken<'id>,
) -> bool {
match (a, b) {
(Attr::Dynamic(ae, _), Attr::Dynamic(be, _)) => ir_content_eq(*ae, *be, token),
(Attr::Str(a, _), Attr::Str(b, _)) => a == b,
_ => false,
}
}
/// Represents a path literal.
#[derive(Debug)]
pub struct Path {
/// The expression that evaluates to the string content of the path.
/// This can be a simple `Str` or a `ConcatStrings` for interpolated paths.
pub expr: ExprId,
}
fn param_content_hash(param: &Param<'_>, state: &mut impl Hasher) {
param.required.len().hash(state);
for (sym, _) in param.required.iter() {
sym.hash(state);
}
param.optional.len().hash(state);
for (sym, _) in param.optional.iter() {
sym.hash(state);
}
param.ellipsis.hash(state);
}
/// Represents the special `builtins` global object.
#[derive(Debug)]
pub struct Builtins;
fn param_content_eq(a: &Param<'_>, b: &Param<'_>) -> bool {
a.ellipsis == b.ellipsis
&& a.required.len() == b.required.len()
&& a.optional.len() == b.optional.len()
&& a.required
.iter()
.zip(b.required.iter())
.all(|((a, _), (b, _))| a == b)
&& a.optional
.iter()
.zip(b.optional.iter())
.all(|((a, _), (b, _))| a == b)
}
/// Represents an attribute in `builtins`.
#[derive(Debug)]
pub struct Builtin(pub SymId);
fn thunks_content_hash<'id>(
thunks: &[(ThunkId, IrRef<'id, '_>)],
token: &GhostToken<'id>,
state: &mut impl Hasher,
) {
thunks.len().hash(state);
for &(id, ir) in thunks {
id.hash(state);
ir_content_hash(ir, token, state);
}
}
fn thunks_content_eq<'id, 'ir>(
a: &[(ThunkId, IrRef<'id, 'ir>)],
b: &[(ThunkId, IrRef<'id, 'ir>)],
token: &GhostToken<'id>,
) -> bool {
a.len() == b.len()
&& a.iter()
.zip(b.iter())
.all(|(&(ai, ae), &(bi, be))| ai == bi && ir_content_eq(ae, be, token))
}
fn ir_content_hash<'id>(ir: IrRef<'id, '_>, token: &GhostToken<'id>, state: &mut impl Hasher) {
let ir = ir.borrow(token);
core::mem::discriminant(ir).hash(state);
match ir {
Ir::Int(x) => x.hash(state),
Ir::Float(x) => x.to_bits().hash(state),
Ir::Bool(x) => x.hash(state),
Ir::Null => {}
Ir::Str(x) => x.hash(state),
Ir::AttrSet { stcs, dyns } => {
stcs.len().hash(state);
let mut combined: u64 = 0;
for (&key, &(val, _)) in stcs.iter() {
let mut h = std::hash::DefaultHasher::new();
key.hash(&mut h);
ir_content_hash(val, token, &mut h);
combined = combined.wrapping_add(h.finish());
}
combined.hash(state);
dyns.len().hash(state);
for &(k, v, _) in dyns.iter() {
ir_content_hash(k, token, state);
ir_content_hash(v, token, state);
}
}
Ir::List { items } => {
items.len().hash(state);
for &item in items.iter() {
ir_content_hash(item, token, state);
}
}
Ir::HasAttr { lhs, rhs } => {
ir_content_hash(*lhs, token, state);
rhs.len().hash(state);
for attr in rhs.iter() {
attr_content_hash(attr, token, state);
}
}
&Ir::BinOp { lhs, rhs, kind } => {
ir_content_hash(lhs, token, state);
ir_content_hash(rhs, token, state);
kind.hash(state);
}
&Ir::UnOp { rhs, kind } => {
ir_content_hash(rhs, token, state);
kind.hash(state);
}
Ir::Select {
expr,
attrpath,
default,
..
} => {
ir_content_hash(*expr, token, state);
attrpath.len().hash(state);
for attr in attrpath.iter() {
attr_content_hash(attr, token, state);
}
default.is_some().hash(state);
if let Some(d) = default {
ir_content_hash(*d, token, state);
}
}
&Ir::If { cond, consq, alter } => {
ir_content_hash(cond, token, state);
ir_content_hash(consq, token, state);
ir_content_hash(alter, token, state);
}
&Ir::Call { func, arg, .. } => {
ir_content_hash(func, token, state);
ir_content_hash(arg, token, state);
}
Ir::Assert {
assertion,
expr,
assertion_raw,
..
} => {
ir_content_hash(*assertion, token, state);
ir_content_hash(*expr, token, state);
assertion_raw.hash(state);
}
Ir::ConcatStrings {
force_string,
parts,
} => {
force_string.hash(state);
parts.len().hash(state);
for &part in parts.iter() {
ir_content_hash(part, token, state);
}
}
&Ir::Path(expr) => ir_content_hash(expr, token, state),
Ir::Func {
body,
arg,
param,
thunks,
} => {
ir_content_hash(*body, token, state);
arg.hash(state);
param.is_some().hash(state);
if let Some(p) = param {
param_content_hash(p, state);
}
thunks_content_hash(thunks, token, state);
}
Ir::TopLevel { body, thunks } => {
ir_content_hash(*body, token, state);
thunks_content_hash(thunks, token, state);
}
Ir::Arg(x) => x.hash(state),
Ir::Thunk(x) => x.hash(state),
Ir::Builtins => {}
Ir::Builtin(x) => x.hash(state),
Ir::CurPos(x) => x.hash(state),
Ir::ReplBinding(x) => x.hash(state),
Ir::ScopedImportBinding(x) => x.hash(state),
&Ir::With {
namespace,
body,
ref thunks,
} => {
ir_content_hash(namespace, token, state);
ir_content_hash(body, token, state);
thunks_content_hash(thunks, token, state);
}
Ir::WithLookup(x) => x.hash(state),
}
}
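The `AttrSet` arm of `ir_content_hash` folds per-entry hashes together with `wrapping_add` so the result is independent of `HashMap` iteration order. A minimal stand-alone sketch of that commutative combining, using toy key/value types rather than the real `SymId`/`IrRef`:

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

/// Hash a map's entries order-independently: hash each entry on its own,
/// then combine the per-entry digests with a commutative operation.
fn order_independent_hash(map: &HashMap<&str, i64>) -> u64 {
    let mut combined: u64 = 0;
    for (k, v) in map {
        let mut h = DefaultHasher::new();
        k.hash(&mut h);
        v.hash(&mut h);
        // wrapping_add is commutative, so iteration order cannot matter.
        combined = combined.wrapping_add(h.finish());
    }
    combined
}
```

Two maps holding the same entries hash identically even though `HashMap` may iterate them in different orders.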
pub(crate) fn ir_content_eq<'id, 'ir>(
a: IrRef<'id, 'ir>,
b: IrRef<'id, 'ir>,
token: &GhostToken<'id>,
) -> bool {
std::ptr::eq(a.0, b.0)
|| match (a.borrow(token), b.borrow(token)) {
(Ir::Int(a), Ir::Int(b)) => a == b,
(Ir::Float(a), Ir::Float(b)) => a.to_bits() == b.to_bits(),
(Ir::Bool(a), Ir::Bool(b)) => a == b,
(Ir::Null, Ir::Null) => true,
(Ir::Str(a), Ir::Str(b)) => **a == **b,
(
Ir::AttrSet {
stcs: a_stcs,
dyns: a_dyns,
},
Ir::AttrSet {
stcs: b_stcs,
dyns: b_dyns,
},
) => {
a_stcs.len() == b_stcs.len()
&& a_dyns.len() == b_dyns.len()
&& a_stcs.iter().all(|(&k, &(av, _))| {
b_stcs
.get(&k)
.is_some_and(|&(bv, _)| ir_content_eq(av, bv, token))
})
&& a_dyns
.iter()
.zip(b_dyns.iter())
.all(|(&(ak, av, _), &(bk, bv, _))| {
ir_content_eq(ak, bk, token) && ir_content_eq(av, bv, token)
})
}
(Ir::List { items: a }, Ir::List { items: b }) => {
a.len() == b.len()
&& a.iter()
.zip(b.iter())
.all(|(&a, &b)| ir_content_eq(a, b, token))
}
(Ir::HasAttr { lhs: al, rhs: ar }, Ir::HasAttr { lhs: bl, rhs: br }) => {
ir_content_eq(*al, *bl, token)
&& ar.len() == br.len()
&& ar
.iter()
.zip(br.iter())
.all(|(a, b)| attr_content_eq(a, b, token))
}
(
&Ir::BinOp {
lhs: al,
rhs: ar,
kind: ak,
},
&Ir::BinOp {
lhs: bl,
rhs: br,
kind: bk,
},
) => ak == bk && ir_content_eq(al, bl, token) && ir_content_eq(ar, br, token),
(&Ir::UnOp { rhs: ar, kind: ak }, &Ir::UnOp { rhs: br, kind: bk }) => {
ak == bk && ir_content_eq(ar, br, token)
}
(
Ir::Select {
expr: ae,
attrpath: aa,
default: ad,
..
},
Ir::Select {
expr: be,
attrpath: ba,
default: bd,
..
},
) => {
ir_content_eq(*ae, *be, token)
&& aa.len() == ba.len()
&& aa
.iter()
.zip(ba.iter())
.all(|(a, b)| attr_content_eq(a, b, token))
&& match (ad, bd) {
(Some(a), Some(b)) => ir_content_eq(*a, *b, token),
(None, None) => true,
_ => false,
}
}
(
&Ir::If {
cond: ac,
consq: acs,
alter: aa,
},
&Ir::If {
cond: bc,
consq: bcs,
alter: ba,
},
) => {
ir_content_eq(ac, bc, token)
&& ir_content_eq(acs, bcs, token)
&& ir_content_eq(aa, ba, token)
}
(
&Ir::Call {
func: af, arg: aa, ..
},
&Ir::Call {
func: bf, arg: ba, ..
},
) => ir_content_eq(af, bf, token) && ir_content_eq(aa, ba, token),
(
Ir::Assert {
assertion: aa,
expr: ae,
assertion_raw: ar,
..
},
Ir::Assert {
assertion: ba,
expr: be,
assertion_raw: br,
..
},
) => ar == br && ir_content_eq(*aa, *ba, token) && ir_content_eq(*ae, *be, token),
(
Ir::ConcatStrings {
force_string: af,
parts: ap,
},
Ir::ConcatStrings {
force_string: bf,
parts: bp,
},
) => {
af == bf
&& ap.len() == bp.len()
&& ap
.iter()
.zip(bp.iter())
.all(|(&a, &b)| ir_content_eq(a, b, token))
}
(&Ir::Path(a), &Ir::Path(b)) => ir_content_eq(a, b, token),
(
Ir::Func {
body: ab,
arg: aa,
param: ap,
thunks: at,
},
Ir::Func {
body: bb,
arg: ba,
param: bp,
thunks: bt,
},
) => {
ir_content_eq(*ab, *bb, token)
&& aa == ba
&& match (ap, bp) {
(Some(a), Some(b)) => param_content_eq(a, b),
(None, None) => true,
_ => false,
}
&& thunks_content_eq(at, bt, token)
}
(
Ir::TopLevel {
body: ab,
thunks: at,
},
Ir::TopLevel {
body: bb,
thunks: bt,
},
) => ir_content_eq(*ab, *bb, token) && thunks_content_eq(at, bt, token),
(Ir::Arg(a), Ir::Arg(b)) => a == b,
(Ir::Thunk(a), Ir::Thunk(b)) => a == b,
(Ir::Builtins, Ir::Builtins) => true,
(Ir::Builtin(a), Ir::Builtin(b)) => a == b,
(Ir::CurPos(a), Ir::CurPos(b)) => a == b,
(Ir::ReplBinding(a), Ir::ReplBinding(b)) => a == b,
(Ir::ScopedImportBinding(a), Ir::ScopedImportBinding(b)) => a == b,
(
Ir::With {
namespace: a_ns,
body: a_body,
thunks: a_thunks,
},
Ir::With {
namespace: b_ns,
body: b_body,
thunks: b_thunks,
},
) => {
ir_content_eq(*a_ns, *b_ns, token)
&& ir_content_eq(*a_body, *b_body, token)
&& thunks_content_eq(a_thunks, b_thunks, token)
}
(Ir::WithLookup(a), Ir::WithLookup(b)) => a == b,
_ => false,
}
}


@@ -1,386 +0,0 @@
// Assume no parse error
#![allow(clippy::unwrap_used)]
use rnix::ast::{self, Expr, HasEntry};
use crate::error::{Error, Result};
use super::*;
pub trait Downgrade<Ctx: DowngradeContext> {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId>;
}
impl<Ctx: DowngradeContext> Downgrade<Ctx> for Expr {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
use Expr::*;
match self {
Apply(apply) => apply.downgrade(ctx),
Assert(assert) => assert.downgrade(ctx),
Error(error) => Err(self::Error::downgrade_error(error.to_string())),
IfElse(ifelse) => ifelse.downgrade(ctx),
Select(select) => select.downgrade(ctx),
Str(str) => str.downgrade(ctx),
Path(path) => path.downgrade(ctx),
Literal(lit) => lit.downgrade(ctx),
Lambda(lambda) => lambda.downgrade(ctx),
LegacyLet(let_) => let_.downgrade(ctx),
LetIn(letin) => letin.downgrade(ctx),
List(list) => list.downgrade(ctx),
BinOp(op) => op.downgrade(ctx),
AttrSet(attrs) => attrs.downgrade(ctx),
UnaryOp(op) => op.downgrade(ctx),
Ident(ident) => ident.downgrade(ctx),
With(with) => with.downgrade(ctx),
HasAttr(has) => has.downgrade(ctx),
Paren(paren) => paren.expr().unwrap().downgrade(ctx),
Root(root) => root.expr().unwrap().downgrade(ctx),
}
}
}
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::Assert {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let assertion = self.condition().unwrap().downgrade(ctx)?;
let expr = self.body().unwrap().downgrade(ctx)?;
Ok(ctx.new_expr(Assert { assertion, expr }.to_ir()))
}
}
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::IfElse {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let cond = self.condition().unwrap().downgrade(ctx)?;
let consq = self.body().unwrap().downgrade(ctx)?;
let alter = self.else_body().unwrap().downgrade(ctx)?;
Ok(ctx.new_expr(If { cond, consq, alter }.to_ir()))
}
}
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::Path {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let parts_ast: Vec<_> = self.parts().collect();
let has_interpolation = parts_ast
.iter()
.any(|part| matches!(part, ast::InterpolPart::Interpolation(_)));
let parts = if !has_interpolation {
// Resolve at compile time
let path_str: String = parts_ast
.into_iter()
.filter_map(|part| match part {
ast::InterpolPart::Literal(lit) => Some(lit.to_string()),
_ => None,
})
.collect();
let resolved_path = if path_str.starts_with('/') {
path_str
} else {
let current_dir = ctx.get_current_dir();
current_dir
.join(&path_str)
.canonicalize()
.map_err(|e| {
crate::error::Error::downgrade_error(format!(
"Failed to resolve path {}: {}",
path_str, e
))
})?
.to_string_lossy()
.to_string()
};
vec![ctx.new_expr(Str { val: resolved_path }.to_ir())]
} else {
// Resolve at runtime
parts_ast
.into_iter()
.map(|part| match part {
ast::InterpolPart::Literal(lit) => Ok(ctx.new_expr(
Str {
val: lit.to_string(),
}
.to_ir(),
)),
ast::InterpolPart::Interpolation(interpol) => {
interpol.expr().unwrap().downgrade(ctx)
}
})
.collect::<Result<Vec<_>>>()?
};
let expr = if parts.len() == 1 {
parts.into_iter().next().unwrap()
} else {
ctx.new_expr(ConcatStrings { parts }.to_ir())
};
Ok(ctx.new_expr(Path { expr }.to_ir()))
}
}
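The compile-time branch above keeps absolute paths as-is and joins relative ones onto the current directory before canonicalizing. A stand-alone sketch of that policy, where `base` stands in for the result of `ctx.get_current_dir()`:

```rust
use std::io;
use std::path::{Path, PathBuf};

/// Resolve a path literal: absolute paths pass through unchanged,
/// relative ones are joined onto `base` and canonicalized (which also
/// requires the target to exist, mirroring the error case above).
fn resolve_path_literal(base: &Path, lit: &str) -> io::Result<PathBuf> {
    if lit.starts_with('/') {
        Ok(PathBuf::from(lit))
    } else {
        base.join(lit).canonicalize()
    }
}
```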
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::Str {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let normalized = self.normalized_parts();
let is_single_literal = normalized.len() == 1
&& matches!(normalized.first(), Some(ast::InterpolPart::Literal(_)));
let parts = normalized
.into_iter()
.map(|part| match part {
ast::InterpolPart::Literal(lit) => Ok(ctx.new_expr(Str { val: lit }.to_ir())),
ast::InterpolPart::Interpolation(interpol) => {
interpol.expr().unwrap().downgrade(ctx)
}
})
.collect::<Result<Vec<_>>>()?;
Ok(if is_single_literal {
parts.into_iter().next().unwrap()
} else {
ctx.new_expr(ConcatStrings { parts }.to_ir())
})
}
}
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::Literal {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
Ok(ctx.new_expr(match self.kind() {
ast::LiteralKind::Integer(int) => Ir::Int(int.value().unwrap()),
ast::LiteralKind::Float(float) => Ir::Float(float.value().unwrap()),
ast::LiteralKind::Uri(uri) => Str {
val: uri.to_string(),
}
.to_ir(),
}))
}
}
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::Ident {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let sym = self.ident_token().unwrap().to_string();
let sym = ctx.new_sym(sym);
ctx.lookup(sym)
}
}
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::AttrSet {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let rec = self.rec_token().is_some();
if !rec {
let attrs = downgrade_attrs(self, ctx)?;
return Ok(ctx.new_expr(attrs.to_ir()));
}
// rec { a = 1; b = a; } => let a = 1; b = a; in { inherit a b; }
let entries: Vec<_> = self.entries().collect();
let (binding_sccs, body) = downgrade_let_bindings(entries, ctx, |ctx, binding_keys| {
// Create plain attrset as body with inherit
let mut attrs = AttrSet {
stcs: HashMap::new(),
dyns: Vec::new(),
};
for sym in binding_keys {
let expr = ctx.lookup(*sym)?;
attrs.stcs.insert(*sym, expr);
}
Ok(ctx.new_expr(attrs.to_ir()))
})?;
Ok(ctx.new_expr(Let { body, binding_sccs }.to_ir()))
}
}
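The `rec` lowering above can be illustrated at the surface-syntax level. This toy helper (hypothetical, string-based, with none of the real SCC analysis) mirrors the example in the comment:

```rust
/// Toy rewrite of a recursive attrset: `rec { a = 1; b = a; }` becomes a
/// `let` over the same bindings whose body re-exports every binding name
/// via `inherit`.
fn desugar_rec(bindings: &[(&str, &str)]) -> String {
    let lets: String = bindings
        .iter()
        .map(|(k, v)| format!("{k} = {v}; "))
        .collect();
    let names: Vec<&str> = bindings.iter().map(|(k, _)| *k).collect();
    format!("let {lets}in {{ inherit {}; }}", names.join(" "))
}
```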
/// Downgrades a list.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::List {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let items = self
.items()
.map(|item| maybe_thunk(item, ctx))
.collect::<Result<_>>()?;
Ok(ctx.new_expr(List { items }.to_ir()))
}
}
/// Downgrades a binary operation.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::BinOp {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let lhs = self.lhs().unwrap().downgrade(ctx)?;
let rhs = self.rhs().unwrap().downgrade(ctx)?;
let kind = self.operator().unwrap().into();
Ok(ctx.new_expr(BinOp { lhs, rhs, kind }.to_ir()))
}
}
/// Downgrades a "has attribute" (`?`) expression.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::HasAttr {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let lhs = self.expr().unwrap().downgrade(ctx)?;
let rhs = downgrade_attrpath(self.attrpath().unwrap(), ctx)?;
Ok(ctx.new_expr(HasAttr { lhs, rhs }.to_ir()))
}
}
/// Downgrades a unary operation.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::UnaryOp {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let rhs = self.expr().unwrap().downgrade(ctx)?;
let kind = self.operator().unwrap().into();
Ok(ctx.new_expr(UnOp { rhs, kind }.to_ir()))
}
}
/// Downgrades an attribute selection (`.`).
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::Select {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let expr = self.expr().unwrap().downgrade(ctx)?;
let attrpath = downgrade_attrpath(self.attrpath().unwrap(), ctx)?;
let default = if let Some(default) = self.default_expr() {
Some(default.downgrade(ctx)?)
} else {
None
};
Ok(ctx.new_expr(
Select {
expr,
attrpath,
default,
}
.to_ir(),
))
}
}
/// Downgrades a `legacy let`, which is essentially a recursive attribute set.
/// The body of the `let` is accessed via `let.body`.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::LegacyLet {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let bindings = downgrade_static_attrs(self, ctx)?;
let binding_keys: Vec<_> = bindings.keys().copied().collect();
let attrset_expr = ctx.with_let_scope(bindings, |ctx| {
let mut attrs = AttrSet {
stcs: HashMap::new(),
dyns: Vec::new(),
};
for sym in binding_keys {
let expr = ctx.lookup(sym)?;
attrs.stcs.insert(sym, expr);
}
Ok(ctx.new_expr(attrs.to_ir()))
})?;
let body_sym = ctx.new_sym("body".to_string());
let select = Select {
expr: attrset_expr,
attrpath: vec![Attr::Str(body_sym)],
default: None,
};
Ok(ctx.new_expr(select.to_ir()))
}
}
/// Downgrades a `let ... in ...` expression.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::LetIn {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let entries: Vec<_> = self.entries().collect();
let body_expr = self.body().unwrap();
let (binding_sccs, body) =
downgrade_let_bindings(entries, ctx, |ctx, _binding_keys| body_expr.downgrade(ctx))?;
Ok(ctx.new_expr(Let { body, binding_sccs }.to_ir()))
}
}
/// Downgrades a `with` expression.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::With {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
// with namespace; expr
let namespace = self.namespace().unwrap().downgrade(ctx)?;
// Downgrade body in With scope
let expr = ctx.with_with_scope(namespace, |ctx| self.body().unwrap().downgrade(ctx))?;
Ok(expr)
}
}
/// Downgrades a lambda (function) expression.
/// This involves desugaring pattern-matching arguments into `let` bindings.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::Lambda {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let arg = ctx.new_arg();
let required;
let allowed;
let body;
match self.param().unwrap() {
ast::Param::IdentParam(id) => {
// Simple case: `x: body`
let param_sym = ctx.new_sym(id.to_string());
required = None;
allowed = None;
// Downgrade body in Param scope
body = ctx
.with_param_scope(param_sym, arg, |ctx| self.body().unwrap().downgrade(ctx))?;
}
ast::Param::Pattern(pattern) => {
let alias = pattern
.pat_bind()
.map(|alias| ctx.new_sym(alias.ident().unwrap().to_string()));
let has_ellipsis = pattern.ellipsis_token().is_some();
let pat_entries = pattern.pat_entries();
let PatternBindings {
body: inner_body,
scc_info,
required_params,
allowed_params,
} = downgrade_pattern_bindings(
pat_entries,
alias,
arg,
has_ellipsis,
ctx,
|ctx, _| self.body().unwrap().downgrade(ctx),
)?;
required = Some(required_params);
allowed = allowed_params;
body = ctx.new_expr(
Let {
body: inner_body,
binding_sccs: scc_info,
}
.to_ir(),
);
}
}
let param = Param { required, allowed };
// The function's body and parameters are now stored directly in the `Func` node.
Ok(ctx.new_expr(Func { body, param, arg }.to_ir()))
}
}
/// Downgrades a function application.
/// In Nix, function application is left-associative, so `f a b` should be parsed as `((f a) b)`.
/// Each Apply node represents a single function call with one argument.
impl<Ctx: DowngradeContext> Downgrade<Ctx> for ast::Apply {
fn downgrade(self, ctx: &mut Ctx) -> Result<ExprId> {
let func = self.lambda().unwrap().downgrade(ctx)?;
let arg = maybe_thunk(self.argument().unwrap(), ctx)?;
Ok(ctx.new_expr(Call { func, arg }.to_ir()))
}
}


@@ -1,474 +0,0 @@
// Assume no parse error
#![allow(clippy::unwrap_used)]
use hashbrown::hash_map::Entry;
use hashbrown::{HashMap, HashSet};
use rnix::ast;
use crate::error::{Error, Result};
use crate::ir::{Attr, AttrSet, ConcatStrings, ExprId, Ir, Select, Str, SymId};
use crate::value::format_symbol;
use super::*;
pub fn maybe_thunk(mut expr: ast::Expr, ctx: &mut impl DowngradeContext) -> Result<ExprId> {
use ast::Expr::*;
let expr = loop {
expr = match expr {
Paren(paren) => paren.expr().unwrap(),
Root(root) => root.expr().unwrap(),
expr => break expr,
}
};
match expr {
Error(error) => return Err(self::Error::downgrade_error(error.to_string())),
Ident(ident) => return ident.downgrade(ctx),
Literal(lit) => return lit.downgrade(ctx),
Str(str) => return str.downgrade(ctx),
Path(path) => return path.downgrade(ctx),
_ => (),
}
let id = match expr {
Apply(apply) => apply.downgrade(ctx),
Assert(assert) => assert.downgrade(ctx),
IfElse(ifelse) => ifelse.downgrade(ctx),
Select(select) => select.downgrade(ctx),
Lambda(lambda) => lambda.downgrade(ctx),
LegacyLet(let_) => let_.downgrade(ctx),
LetIn(letin) => letin.downgrade(ctx),
List(list) => list.downgrade(ctx),
BinOp(op) => op.downgrade(ctx),
AttrSet(attrs) => attrs.downgrade(ctx),
UnaryOp(op) => op.downgrade(ctx),
With(with) => with.downgrade(ctx),
HasAttr(has) => has.downgrade(ctx),
_ => unreachable!(),
}?;
Ok(ctx.new_expr(Ir::Thunk(id)))
}
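`maybe_thunk` only defers expressions that are worth deferring: identifiers, literals, strings, and paths are downgraded eagerly, while everything else is wrapped in a `Thunk`. A toy classifier sketching that policy (the `Expr` enum here is a stand-in, not `rnix::ast::Expr`):

```rust
/// Stand-in expression shapes for illustration only.
#[allow(dead_code)]
#[derive(Debug)]
enum Expr {
    Int(i64),
    Ident(&'static str),
    Call,
    List,
}

/// Cheap expressions are evaluated eagerly; anything potentially
/// expensive or effectful gets deferred behind a thunk.
fn needs_thunk(e: &Expr) -> bool {
    !matches!(e, Expr::Int(_) | Expr::Ident(_))
}
```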
/// Downgrades the entries of an attribute set.
/// This handles `inherit` and `attrpath = value;` entries.
pub fn downgrade_attrs(
attrs: impl ast::HasEntry,
ctx: &mut impl DowngradeContext,
) -> Result<AttrSet> {
let entries = attrs.entries();
let mut attrs = AttrSet {
stcs: HashMap::new(),
dyns: Vec::new(),
};
for entry in entries {
match entry {
ast::Entry::Inherit(inherit) => downgrade_inherit(inherit, &mut attrs.stcs, ctx)?,
ast::Entry::AttrpathValue(value) => downgrade_attrpathvalue(value, &mut attrs, ctx)?,
}
}
Ok(attrs)
}
/// Downgrades attribute set entries for a `let...in` expression.
/// This is a stricter version of `downgrade_attrs` that disallows dynamic attributes,
/// as `let` bindings must be statically known.
pub fn downgrade_static_attrs(
attrs: impl ast::HasEntry,
ctx: &mut impl DowngradeContext,
) -> Result<HashMap<SymId, ExprId>> {
let entries = attrs.entries();
let mut attrs = AttrSet {
stcs: HashMap::new(),
dyns: Vec::new(),
};
for entry in entries {
match entry {
ast::Entry::Inherit(inherit) => downgrade_inherit(inherit, &mut attrs.stcs, ctx)?,
ast::Entry::AttrpathValue(value) => {
downgrade_static_attrpathvalue(value, &mut attrs, ctx)?
}
}
}
Ok(attrs.stcs)
}
/// Downgrades an `inherit` statement.
/// `inherit (from) a b;` is translated into `a = from.a; b = from.b;`.
/// `inherit a b;` is translated into `a = a; b = b;` (i.e., bringing variables into scope).
pub fn downgrade_inherit(
inherit: ast::Inherit,
stcs: &mut HashMap<SymId, ExprId>,
ctx: &mut impl DowngradeContext,
) -> Result<()> {
// Downgrade the `from` expression if it exists.
let from = if let Some(from) = inherit.from() {
Some(from.expr().unwrap().downgrade(ctx)?)
} else {
None
};
for attr in inherit.attrs() {
let ident = match downgrade_attr(attr, ctx)? {
Attr::Str(ident) => ident,
_ => {
// `inherit` does not allow dynamic attributes.
return Err(Error::downgrade_error(
"dynamic attributes not allowed in inherit".to_string(),
));
}
};
let expr = if let Some(expr) = from {
ctx.new_expr(
Select {
expr,
attrpath: vec![Attr::Str(ident)],
default: None,
}
.to_ir(),
)
} else {
ctx.lookup(ident)?
};
match stcs.entry(ident) {
Entry::Occupied(occupied) => {
return Err(Error::eval_error(format!(
"attribute '{}' already defined",
format_symbol(ctx.get_sym(*occupied.key()))
)));
}
Entry::Vacant(vacant) => vacant.insert(expr),
};
}
Ok(())
}
/// Downgrades a single attribute key (part of an attribute path).
/// An attribute can be a static identifier, an interpolated string, or a dynamic expression.
pub fn downgrade_attr(attr: ast::Attr, ctx: &mut impl DowngradeContext) -> Result<Attr> {
use ast::Attr::*;
use ast::InterpolPart::*;
match attr {
Ident(ident) => Ok(Attr::Str(ctx.new_sym(ident.to_string()))),
Str(string) => {
let parts = string.normalized_parts();
if parts.is_empty() {
Ok(Attr::Str(ctx.new_sym("".to_string())))
} else if parts.len() == 1 {
// If the string has only one part, it's either a literal or a single interpolation.
match parts.into_iter().next().unwrap() {
Literal(ident) => Ok(Attr::Str(ctx.new_sym(ident))),
Interpolation(interpol) => {
Ok(Attr::Dynamic(interpol.expr().unwrap().downgrade(ctx)?))
}
}
} else {
// If the string has multiple parts, it's an interpolated string that must be concatenated.
let parts = parts
.into_iter()
.map(|part| match part {
Literal(lit) => Ok(ctx.new_expr(self::Str { val: lit }.to_ir())),
Interpolation(interpol) => interpol.expr().unwrap().downgrade(ctx),
})
.collect::<Result<Vec<_>>>()?;
Ok(Attr::Dynamic(ctx.new_expr(ConcatStrings { parts }.to_ir())))
}
}
Dynamic(dynamic) => Ok(Attr::Dynamic(dynamic.expr().unwrap().downgrade(ctx)?)),
}
}
/// Downgrades an attribute path (e.g., `a.b."${c}".d`) into a `Vec<Attr>`.
pub fn downgrade_attrpath(
attrpath: ast::Attrpath,
ctx: &mut impl DowngradeContext,
) -> Result<Vec<Attr>> {
attrpath
.attrs()
.map(|attr| downgrade_attr(attr, ctx))
.collect::<Result<Vec<_>>>()
}
/// Downgrades an `attrpath = value;` expression and inserts it into an `AttrSet`.
pub fn downgrade_attrpathvalue(
value: ast::AttrpathValue,
attrs: &mut AttrSet,
ctx: &mut impl DowngradeContext,
) -> Result<()> {
let path = downgrade_attrpath(value.attrpath().unwrap(), ctx)?;
let value = maybe_thunk(value.value().unwrap(), ctx)?;
attrs.insert(path, value, ctx)
}
/// A stricter version of `downgrade_attrpathvalue` for `let...in` bindings.
/// It ensures that the attribute path contains no dynamic parts.
pub fn downgrade_static_attrpathvalue(
value: ast::AttrpathValue,
attrs: &mut AttrSet,
ctx: &mut impl DowngradeContext,
) -> Result<()> {
let path = downgrade_attrpath(value.attrpath().unwrap(), ctx)?;
if path.iter().any(|attr| matches!(attr, Attr::Dynamic(_))) {
return Err(Error::downgrade_error(
"dynamic attributes not allowed in let bindings".to_string(),
));
}
let value = value.value().unwrap().downgrade(ctx)?;
attrs.insert(path, value, ctx)
}
pub struct PatternBindings {
pub body: ExprId,
pub scc_info: SccInfo,
pub required_params: Vec<SymId>,
pub allowed_params: Option<HashSet<SymId>>,
}
/// Helper function for lambda pattern parameters with SCC analysis.
/// Processes pattern entries like `{ a, b ? 2, ... }@alias` and creates optimized bindings.
///
/// # Parameters
/// - `pat_entries`: Iterator over pattern entries from the AST
/// - `alias`: Optional alias symbol (from the `@alias` syntax)
/// - `arg`: The argument expression to extract from
/// - `has_ellipsis`: Whether the pattern ends with `...`
/// - `body_fn`: Called in the binding scope to compute the body expression
///
/// Returns a [`PatternBindings`] holding the body, SCC info, required parameters,
/// and (unless the pattern has `...`) the set of allowed parameters.
pub fn downgrade_pattern_bindings<Ctx>(
pat_entries: impl Iterator<Item = ast::PatEntry>,
alias: Option<SymId>,
arg: ExprId,
has_ellipsis: bool,
ctx: &mut Ctx,
body_fn: impl FnOnce(&mut Ctx, &[SymId]) -> Result<ExprId>,
) -> Result<PatternBindings>
where
Ctx: DowngradeContext,
{
let mut param_syms = Vec::new();
let mut param_defaults = Vec::new();
let mut seen_params = HashSet::new();
for entry in pat_entries {
let sym = ctx.new_sym(entry.ident().unwrap().to_string());
if !seen_params.insert(sym) {
return Err(Error::downgrade_error(format!(
"duplicate parameter '{}'",
format_symbol(ctx.get_sym(sym))
)));
}
let default_ast = entry.default();
param_syms.push(sym);
param_defaults.push(default_ast);
}
let mut binding_keys: Vec<SymId> = param_syms.clone();
if let Some(alias_sym) = alias {
binding_keys.push(alias_sym);
}
let required: Vec<SymId> = param_syms
.iter()
.zip(param_defaults.iter())
.filter_map(|(&sym, default)| if default.is_none() { Some(sym) } else { None })
.collect();
let allowed: Option<HashSet<SymId>> = if has_ellipsis {
None
} else {
Some(param_syms.iter().copied().collect())
};
let (scc_info, body) = downgrade_bindings_generic(
ctx,
binding_keys,
|ctx, sym_to_slot| {
let mut bindings = HashMap::new();
for (sym, default_ast) in param_syms.iter().zip(param_defaults.iter()) {
let slot = *sym_to_slot.get(sym).unwrap();
ctx.set_current_binding(Some(slot));
let default = if let Some(default_expr) = default_ast {
Some(default_expr.clone().downgrade(ctx)?)
} else {
None
};
let select_expr = ctx.new_expr(
Select {
expr: arg,
attrpath: vec![Attr::Str(*sym)],
default,
}
.to_ir(),
);
bindings.insert(*sym, select_expr);
ctx.set_current_binding(None);
}
if let Some(alias_sym) = alias {
bindings.insert(alias_sym, arg);
}
Ok(bindings)
},
body_fn,
)?;
Ok(PatternBindings {
body,
scc_info,
required_params: required,
allowed_params: allowed,
})
}
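Downstream, `required_params` and `allowed_params` support the usual call-site check for pattern lambdas: every non-defaulted name must be supplied, and without `...` no extra names may appear. A minimal std-only sketch (`check_args` is hypothetical, with `&str` in place of `SymId`):

```rust
use std::collections::HashSet;

// Hypothetical call-site validation for `{ a, b ? 2, ... }`-style patterns.
fn check_args(
    args: &HashSet<&str>,            // attribute names the caller passed
    required: &[&str],               // params without a default
    allowed: Option<&HashSet<&str>>, // None when the pattern has `...`
) -> Result<(), String> {
    for r in required {
        if !args.contains(r) {
            return Err(format!("missing required argument '{r}'"));
        }
    }
    if let Some(allowed) = allowed {
        for a in args {
            if !allowed.contains(a) {
                return Err(format!("unexpected argument '{a}'"));
            }
        }
    }
    Ok(())
}
```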
/// Generic helper function to downgrade bindings with SCC analysis.
/// This is the core logic for let bindings, extracted for reuse.
///
/// # Parameters
/// - `binding_keys`: The symbols for all bindings
/// - `compute_bindings_fn`: Called in let scope with sym_to_slot mapping to compute binding values
/// - `body_fn`: Called in let scope to compute the body expression
///
/// Returns the SCC info together with the body expression produced by `body_fn`.
pub fn downgrade_bindings_generic<Ctx, B, F>(
ctx: &mut Ctx,
binding_keys: Vec<SymId>,
compute_bindings_fn: B,
body_fn: F,
) -> Result<(SccInfo, ExprId)>
where
Ctx: DowngradeContext,
B: FnOnce(&mut Ctx, &HashMap<SymId, ExprId>) -> Result<HashMap<SymId, ExprId>>,
F: FnOnce(&mut Ctx, &[SymId]) -> Result<ExprId>,
{
let slots: Vec<_> = ctx.reserve_slots(binding_keys.len()).collect();
let let_bindings: HashMap<_, _> = binding_keys
.iter()
.copied()
.zip(slots.iter().copied())
.collect();
ctx.push_dep_tracker(&slots);
ctx.with_let_scope(let_bindings.clone(), |ctx| {
let bindings = compute_bindings_fn(ctx, &let_bindings)?;
let scc_info = ctx.pop_dep_tracker()?;
for (sym, slot) in binding_keys.iter().copied().zip(slots.iter()) {
if let Some(&expr) = bindings.get(&sym) {
ctx.replace_expr(*slot, Ir::Thunk(expr));
} else {
return Err(Error::downgrade_error(format!(
"binding '{}' not found",
format_symbol(ctx.get_sym(sym))
)));
}
}
let body = body_fn(ctx, &binding_keys)?;
Ok((scc_info, body))
})
}
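The reserve-then-patch shape of `downgrade_bindings_generic`, which allocates a slot per binding so mutually recursive names resolve, then overwrites each slot with its thunk, can be sketched over a plain `Vec` arena (all types here are simplified stand-ins, not the crate's real `Ir`):

```rust
// Simplified stand-in for the expression arena: a slot is an index, and a
// binding body may reference any slot, including a later one.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Ir {
    Pending,      // reserved but not yet filled
    Thunk(usize), // thunk whose body refers to another slot
}

struct Arena(Vec<Ir>);

impl Arena {
    fn reserve_slots(&mut self, n: usize) -> Vec<usize> {
        let start = self.0.len();
        self.0.extend(std::iter::repeat(Ir::Pending).take(n));
        (start..start + n).collect()
    }
    fn replace_expr(&mut self, slot: usize, ir: Ir) {
        self.0[slot] = ir;
    }
}
```

Mutual recursion (`let x = y; y = x; in ...`) works because both slots exist before either body is computed.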
/// Helper function to downgrade entries with let bindings semantics.
/// This extracts common logic for both `rec` attribute sets and `let...in` expressions.
///
/// Returns a tuple of (SCC info, body result) where:
/// - SCC info: strongly connected components information for optimization
/// - body result: the result of calling `body_fn` in the let scope
pub fn downgrade_let_bindings<Ctx, F>(
entries: Vec<ast::Entry>,
ctx: &mut Ctx,
body_fn: F,
) -> Result<(SccInfo, ExprId)>
where
Ctx: DowngradeContext,
F: FnOnce(&mut Ctx, &[SymId]) -> Result<ExprId>,
{
let mut binding_syms = HashSet::new();
for entry in &entries {
match entry {
ast::Entry::Inherit(inherit) => {
for attr in inherit.attrs() {
if let ast::Attr::Ident(ident) = attr {
let sym = ctx.new_sym(ident.to_string());
if !binding_syms.insert(sym) {
return Err(Error::downgrade_error(format!(
"attribute '{}' already defined",
format_symbol(ctx.get_sym(sym))
)));
}
}
}
}
ast::Entry::AttrpathValue(value) => {
let attrpath = value.attrpath().unwrap();
if let Some(first_attr) = attrpath.attrs().next()
&& let ast::Attr::Ident(ident) = first_attr
{
let sym = ctx.new_sym(ident.to_string());
if !binding_syms.insert(sym) {
return Err(Error::downgrade_error(format!(
"attribute '{}' already defined",
format_symbol(ctx.get_sym(sym))
)));
}
}
}
}
}
let binding_keys: Vec<_> = binding_syms.into_iter().collect();
downgrade_bindings_generic(
ctx,
binding_keys,
|ctx, sym_to_slot| {
let mut temp_attrs = AttrSet {
stcs: HashMap::new(),
dyns: Vec::new(),
};
for entry in entries {
match entry {
ast::Entry::Inherit(inherit) => {
for attr in inherit.attrs() {
if let ast::Attr::Ident(ident) = attr {
let sym = ctx.new_sym(ident.to_string());
let slot = *sym_to_slot.get(&sym).unwrap();
ctx.set_current_binding(Some(slot));
}
}
downgrade_inherit(inherit, &mut temp_attrs.stcs, ctx)?;
ctx.set_current_binding(None);
}
ast::Entry::AttrpathValue(value) => {
let attrpath = value.attrpath().unwrap();
if let Some(first_attr) = attrpath.attrs().next()
&& let ast::Attr::Ident(ident) = first_attr
{
let sym = ctx.new_sym(ident.to_string());
let slot = *sym_to_slot.get(&sym).unwrap();
ctx.set_current_binding(Some(slot));
}
downgrade_static_attrpathvalue(value, &mut temp_attrs, ctx)?;
ctx.set_current_binding(None);
}
}
}
Ok(temp_attrs.stcs)
},
body_fn,
)
}

@@ -1,12 +1,22 @@
#![warn(clippy::unwrap_used)]
mod codegen;
pub mod context;
pub mod error;
mod ir;
mod nix_hash;
mod runtime;
pub mod logging;
pub mod value;
mod bytecode;
mod codegen;
mod derivation;
mod disassembler;
mod downgrade;
mod fetcher;
mod ir;
mod nar;
mod nix_utils;
mod runtime;
mod store;
mod string_context;
#[global_allocator]
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;

nix-js/src/logging.rs Normal file

@@ -0,0 +1,48 @@
use std::env;
use std::io::IsTerminal;
use tracing_subscriber::{EnvFilter, Layer, fmt, layer::SubscriberExt, util::SubscriberInitExt};
pub fn init_logging() {
let is_terminal = std::io::stderr().is_terminal();
let show_time = env::var("NIX_JS_LOG_TIME")
.map(|v| v == "1" || v.to_lowercase() == "true")
.unwrap_or(false);
let filter = EnvFilter::from_default_env();
let fmt_layer = fmt::layer()
.with_target(true)
.with_thread_ids(false)
.with_thread_names(false)
.with_file(false)
.with_line_number(false)
.with_ansi(is_terminal)
.with_level(true);
let fmt_layer = if show_time {
fmt_layer.with_timer(fmt::time::uptime()).boxed()
} else {
fmt_layer.without_time().boxed()
};
tracing_subscriber::registry()
.with(filter)
.with(fmt_layer)
.init();
init_miette_handler();
}
fn init_miette_handler() {
let is_terminal = std::io::stderr().is_terminal();
miette::set_hook(Box::new(move |_| {
Box::new(
miette::MietteHandlerOpts::new()
.terminal_links(is_terminal)
.unicode(is_terminal)
.color(is_terminal)
.build(),
)
}))
.ok();
}

nix-js/src/main.rs Normal file

@@ -0,0 +1,187 @@
use std::path::PathBuf;
use std::process::exit;
use anyhow::Result;
use clap::{Args, Parser, Subcommand};
use hashbrown::HashSet;
use nix_js::context::Context;
use nix_js::error::Source;
use rustyline::DefaultEditor;
use rustyline::error::ReadlineError;
#[derive(Parser)]
#[command(name = "nix-js", about = "Nix expression evaluator")]
struct Cli {
#[cfg(feature = "inspector")]
#[arg(long, value_name = "HOST:PORT", num_args = 0..=1, default_missing_value = "127.0.0.1:9229")]
inspect: Option<String>,
#[cfg(feature = "inspector")]
#[arg(long, value_name = "HOST:PORT", num_args = 0..=1, default_missing_value = "127.0.0.1:9229")]
inspect_brk: Option<String>,
#[command(subcommand)]
command: Command,
}
#[derive(Subcommand)]
enum Command {
Compile {
#[clap(flatten)]
source: ExprSource,
#[arg(long)]
silent: bool,
},
Eval {
#[clap(flatten)]
source: ExprSource,
},
Repl,
}
#[derive(Args)]
#[group(required = true, multiple = false)]
struct ExprSource {
#[clap(short, long)]
expr: Option<String>,
#[clap(short, long)]
file: Option<PathBuf>,
}
fn create_context(#[cfg(feature = "inspector")] cli: &Cli) -> Result<Context> {
#[cfg(feature = "inspector")]
{
let (addr_str, wait) = if let Some(ref addr) = cli.inspect_brk {
(Some(addr.as_str()), true)
} else if let Some(ref addr) = cli.inspect {
(Some(addr.as_str()), false)
} else {
(None, false)
};
if let Some(addr_str) = addr_str {
let addr: std::net::SocketAddr = addr_str
.parse()
.map_err(|e| anyhow::anyhow!("invalid inspector address '{}': {}", addr_str, e))?;
return Ok(Context::new_with_inspector(addr, wait)?);
}
}
Ok(Context::new()?)
}
fn run_compile(context: &mut Context, src: ExprSource, silent: bool) -> Result<()> {
let src = if let Some(expr) = src.expr {
Source::new_eval(expr)?
} else if let Some(file) = src.file {
Source::new_file(file)?
} else {
unreachable!()
};
match context.compile_bytecode(src) {
Ok(compiled) => {
if !silent {
println!("{}", context.disassemble_colored(&compiled));
}
}
Err(err) => {
eprintln!("{:?}", miette::Report::new(*err));
exit(1);
}
};
#[cfg(feature = "inspector")]
context.wait_for_inspector_disconnect();
Ok(())
}
fn run_eval(context: &mut Context, src: ExprSource) -> Result<()> {
let src = if let Some(expr) = src.expr {
Source::new_eval(expr)?
} else if let Some(file) = src.file {
Source::new_file(file)?
} else {
unreachable!()
};
match context.eval_deep(src) {
Ok(value) => {
println!("{}", value.display_compat());
}
Err(err) => {
eprintln!("{:?}", miette::Report::new(*err));
exit(1);
}
};
#[cfg(feature = "inspector")]
context.wait_for_inspector_disconnect();
Ok(())
}
fn run_repl(context: &mut Context) -> Result<()> {
let mut rl = DefaultEditor::new()?;
let mut scope = HashSet::new();
const RE: ere::Regex<3> = ere::compile_regex!("^[ \t]*([a-zA-Z_][a-zA-Z0-9_'-]*)[ \t]*(.*)$");
loop {
let readline = rl.readline("nix-js-repl> ");
match readline {
Ok(line) => {
if line.trim().is_empty() {
continue;
}
let _ = rl.add_history_entry(line.as_str());
if let Some([Some(_), Some(ident), Some(rest)]) = RE.exec(&line) {
if let Some(expr) = rest.strip_prefix('=') {
let expr = expr.trim_start();
if expr.is_empty() {
eprintln!("Error: missing expression after '='");
continue;
}
match context.add_binding(ident, expr, &mut scope) {
Ok(value) => println!("{} = {}", ident, value),
Err(err) => eprintln!("{:?}", miette::Report::new(*err)),
}
} else {
let src = Source::new_repl(line)?;
match context.eval_repl(src, &scope) {
Ok(value) => println!("{value}"),
Err(err) => eprintln!("{:?}", miette::Report::new(*err)),
}
}
} else {
let src = Source::new_repl(line)?;
match context.eval_shallow(src) {
Ok(value) => println!("{value}"),
Err(err) => eprintln!("{:?}", miette::Report::new(*err)),
}
}
}
Err(ReadlineError::Interrupted) => {
println!();
}
Err(ReadlineError::Eof) => {
println!("CTRL-D");
break;
}
Err(err) => {
eprintln!("Error: {err:?}");
break;
}
}
}
Ok(())
}
fn main() -> Result<()> {
nix_js::logging::init_logging();
let cli = Cli::parse();
let mut context = create_context(
#[cfg(feature = "inspector")]
&cli,
)?;
match cli.command {
Command::Compile { source, silent } => run_compile(&mut context, source, silent),
Command::Eval { source } => run_eval(&mut context, source),
Command::Repl => run_repl(&mut context),
}
}
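The regex dispatch in `run_repl` boils down to: a line shaped like `ident = expr` becomes a binding, everything else is evaluated as an expression. A std-only sketch of that split (`classify` is hypothetical; like the regex, the first `=` decides):

```rust
// Classify a REPL line: Some((ident, expr)) for `ident = expr`, None otherwise.
fn classify(line: &str) -> Option<(&str, &str)> {
    let (ident, rest) = line.trim().split_once('=')?;
    let ident = ident.trim();
    let mut chars = ident.chars();
    // identifier shape: [a-zA-Z_][a-zA-Z0-9_'-]*
    let first = chars.next()?;
    if !(first.is_ascii_alphabetic() || first == '_') {
        return None;
    }
    if !chars.all(|c| c.is_ascii_alphanumeric() || matches!(c, '_' | '\'' | '-')) {
        return None;
    }
    let expr = rest.trim_start();
    // an empty right-hand side is the "missing expression after '='" case
    if expr.is_empty() { None } else { Some((ident, expr)) }
}
```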

nix-js/src/nar.rs Normal file

@@ -0,0 +1,66 @@
use std::io::Read;
use std::path::Path;
use nix_nar::Encoder;
use sha2::{Digest, Sha256};
use crate::error::{Error, Result};
pub fn compute_nar_hash(path: &Path) -> Result<[u8; 32]> {
let mut hasher = Sha256::new();
std::io::copy(
&mut Encoder::new(path).map_err(|err| Error::internal(err.to_string()))?,
&mut hasher,
)
.map_err(|err| Error::internal(err.to_string()))?;
Ok(hasher.finalize().into())
}
pub fn pack_nar(path: &Path) -> Result<Vec<u8>> {
let mut buffer = Vec::new();
Encoder::new(path)
.map_err(|err| Error::internal(err.to_string()))?
.read_to_end(&mut buffer)
.map_err(|err| Error::internal(err.to_string()))?;
Ok(buffer)
}
#[cfg(test)]
#[allow(clippy::unwrap_used)]
mod tests {
use std::fs;
use tempfile::TempDir;
use super::*;
#[test_log::test]
fn test_simple_file() {
let temp = TempDir::new().unwrap();
let file_path = temp.path().join("test.txt");
fs::write(&file_path, "hello").unwrap();
let hash = hex::encode(compute_nar_hash(&file_path).unwrap());
assert_eq!(
hash,
"0a430879c266f8b57f4092a0f935cf3facd48bbccde5760d4748ca405171e969"
);
assert!(!hash.is_empty());
assert_eq!(hash.len(), 64);
}
#[test_log::test]
fn test_directory() {
let temp = TempDir::new().unwrap();
fs::write(temp.path().join("a.txt"), "aaa").unwrap();
fs::write(temp.path().join("b.txt"), "bbb").unwrap();
let hash = hex::encode(compute_nar_hash(temp.path()).unwrap());
assert_eq!(
hash,
"0036c14209749bc9b9631e2077b108b701c322ab53853cd26f2746268a86fc0f"
);
assert!(!hash.is_empty());
assert_eq!(hash.len(), 64);
}
}
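For context on what `nix_nar::Encoder` streams into the hasher: a NAR archive is built from length-prefixed, 8-byte-aligned fields. A std-only sketch of that framing primitive (`nar_string` is a hypothetical helper, not part of this crate):

```rust
// One NAR field: little-endian u64 length, the raw bytes, then zero padding
// up to the next 8-byte boundary.
fn nar_string(s: &[u8]) -> Vec<u8> {
    let mut out = (s.len() as u64).to_le_bytes().to_vec();
    out.extend_from_slice(s);
    let pad = (8 - s.len() % 8) % 8;
    out.extend(std::iter::repeat(0u8).take(pad));
    out
}
```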


@@ -1,123 +0,0 @@
use sha2::{Digest, Sha256};
const NIX_BASE32_CHARS: &[u8; 32] = b"0123456789abcdfghijklmnpqrsvwxyz";
const STORE_DIR: &str = "/nix/store";
pub fn sha256_hex(data: &str) -> String {
let mut hasher = Sha256::new();
hasher.update(data.as_bytes());
hex::encode(hasher.finalize())
}
pub fn compress_hash(hash: &[u8; 32], new_size: usize) -> Vec<u8> {
let mut result = vec![0u8; new_size];
for i in 0..32 {
result[i % new_size] ^= hash[i];
}
result
}
pub fn nix_base32_encode(bytes: &[u8]) -> String {
let len = (bytes.len() * 8 - 1) / 5 + 1;
let mut result = String::with_capacity(len);
for n in (0..len).rev() {
let b = n * 5;
let i = b / 8;
let j = b % 8;
let c = if i >= bytes.len() {
0
} else {
let mut c = (bytes[i] as u16) >> j;
if j > 3 && i + 1 < bytes.len() {
c |= (bytes[i + 1] as u16) << (8 - j);
}
c
};
result.push(NIX_BASE32_CHARS[(c & 0x1f) as usize] as char);
}
result
}
pub fn make_store_path(ty: &str, hash_hex: &str, name: &str) -> String {
let s = format!("{}:sha256:{}:{}:{}", ty, hash_hex, STORE_DIR, name);
let mut hasher = Sha256::new();
hasher.update(s.as_bytes());
let hash: [u8; 32] = hasher.finalize().into();
let compressed = compress_hash(&hash, 20);
let encoded = nix_base32_encode(&compressed);
format!("{}/{}-{}", STORE_DIR, encoded, name)
}
pub fn output_path_name(drv_name: &str, output_name: &str) -> String {
if output_name == "out" {
drv_name.to_string()
} else {
format!("{}-{}", drv_name, output_name)
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_nix_base32_encode() {
let bytes = [0xFF, 0xFF, 0xFF, 0xFF, 0xFF];
let encoded = nix_base32_encode(&bytes);
assert_eq!(encoded.len(), 8);
let bytes_zero = [0u8; 20];
let encoded_zero = nix_base32_encode(&bytes_zero);
assert_eq!(encoded_zero.len(), 32);
assert!(encoded_zero.chars().all(|c| c == '0'));
}
#[test]
fn test_compress_hash() {
let hash = [0u8; 32];
let compressed = compress_hash(&hash, 20);
assert_eq!(compressed.len(), 20);
assert!(compressed.iter().all(|&b| b == 0));
let hash_ones = [0xFF; 32];
let compressed_ones = compress_hash(&hash_ones, 20);
assert_eq!(compressed_ones.len(), 20);
}
#[test]
fn test_sha256_hex() {
let data = "hello world";
let hash = sha256_hex(data);
assert_eq!(hash.len(), 64);
assert_eq!(
hash,
"b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9"
);
}
#[test]
fn test_output_path_name() {
assert_eq!(output_path_name("hello", "out"), "hello");
assert_eq!(output_path_name("hello", "dev"), "hello-dev");
assert_eq!(output_path_name("hello", "doc"), "hello-doc");
}
#[test]
fn test_make_store_path() {
let path = make_store_path("output:out", "abc123", "hello");
assert!(path.starts_with("/nix/store/"));
assert!(path.ends_with("-hello"));
let hash_parts: Vec<&str> = path.split('/').collect();
assert_eq!(hash_parts.len(), 4);
let name_part = hash_parts[3];
assert!(name_part.contains('-'));
}
}

nix-js/src/nix_utils.rs Normal file

@@ -0,0 +1,21 @@
use nix_compat::store_path::compress_hash;
use sha2::{Digest as _, Sha256};
pub fn sha256_hex(data: &[u8]) -> String {
let mut hasher = Sha256::new();
hasher.update(data);
hex::encode(hasher.finalize())
}
pub fn make_store_path(store_dir: &str, ty: &str, hash_hex: &str, name: &str) -> String {
let s = format!("{}:sha256:{}:{}:{}", ty, hash_hex, store_dir, name);
let mut hasher = Sha256::new();
hasher.update(s.as_bytes());
let hash: [u8; 32] = hasher.finalize().into();
let compressed = compress_hash::<20>(&hash);
let encoded = nix_compat::nixbase32::encode(&compressed);
format!("{}/{}-{}", store_dir, encoded, name)
}
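The `nix_compat` helpers used above replace the hand-rolled versions from the deleted `nix_hash.rs`. For reference, the XOR-folding hash compression and the nixbase32 bit layout look like this (a std-only port of the removed code; `div_ceil` also avoids the underflow the old length formula had on empty input):

```rust
// Nix's base32 alphabet omits e, o, u, t to avoid accidental words.
const NIX_BASE32_CHARS: &[u8; 32] = b"0123456789abcdfghijklmnpqrsvwxyz";

// Fold a 32-byte digest down to N bytes by XOR-ing positions modulo N.
fn compress_hash<const N: usize>(hash: &[u8; 32]) -> [u8; N] {
    let mut out = [0u8; N];
    for (i, b) in hash.iter().enumerate() {
        out[i % N] ^= b;
    }
    out
}

fn nix_base32_encode(bytes: &[u8]) -> String {
    // Nix emits 5-bit digits from the most significant group downward.
    let len = (bytes.len() * 8).div_ceil(5);
    let mut result = String::with_capacity(len);
    for n in (0..len).rev() {
        let b = n * 5;
        let (i, j) = (b / 8, b % 8);
        let mut c = if i < bytes.len() { (bytes[i] as u16) >> j } else { 0 };
        if j > 3 && i + 1 < bytes.len() {
            c |= (bytes[i + 1] as u16) << (8 - j); // bits spilling from the next byte
        }
        result.push(NIX_BASE32_CHARS[(c & 0x1f) as usize] as char);
    }
    result
}
```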


@@ -1,38 +1,86 @@
use std::borrow::Cow;
use std::marker::PhantomData;
use std::ops::DerefMut;
use std::path::PathBuf;
use std::sync::Once;
use std::path::Path;
#[cfg(feature = "inspector")]
use deno_core::PollEventLoopOptions;
use deno_core::{Extension, ExtensionFileSource, JsRuntime, OpState, RuntimeOptions, v8};
use deno_error::JsErrorClass;
use crate::error::{Error, Result};
use crate::bytecode::{Bytecode, Constant};
use crate::error::{Error, Result, Source};
use crate::store::DaemonStore;
use crate::value::{AttrSet, List, Symbol, Value};
#[cfg(feature = "inspector")]
pub(crate) mod inspector;
mod ops;
use ops::*;
type ScopeRef<'p, 's> = v8::PinnedRef<'p, v8::HandleScope<'s>>;
type LocalValue<'a> = v8::Local<'a, v8::Value>;
type LocalSymbol<'a> = v8::Local<'a, v8::Symbol>;
pub(crate) trait RuntimeCtx: 'static {
fn get_current_dir(&self) -> PathBuf;
fn push_path_stack(&mut self, path: PathBuf) -> impl DerefMut<Target = Self>;
fn compile_code(&mut self, code: &str) -> Result<String>;
pub(crate) trait RuntimeContext: 'static {
fn get_current_dir(&self) -> &Path;
fn add_source(&mut self, path: Source);
fn compile(&mut self, source: Source) -> Result<String>;
fn compile_scoped(&mut self, source: Source, scope: Vec<String>) -> Result<String>;
fn compile_bytecode(&mut self, source: Source) -> Result<Bytecode>;
fn compile_bytecode_scoped(&mut self, source: Source, scope: Vec<String>) -> Result<Bytecode>;
fn get_source(&self, id: usize) -> Source;
fn get_store(&self) -> &DaemonStore;
fn get_span(&self, id: usize) -> (usize, rnix::TextRange);
fn get_unsynced(&mut self) -> (&[String], &[Constant], usize, usize);
}
fn runtime_extension<Ctx: RuntimeCtx>() -> Extension {
pub(crate) trait OpStateExt<Ctx: RuntimeContext> {
fn get_ctx(&self) -> &Ctx;
fn get_ctx_mut(&mut self) -> &mut Ctx;
}
impl<Ctx: RuntimeContext> OpStateExt<Ctx> for OpState {
fn get_ctx(&self) -> &Ctx {
self.try_borrow::<&'static mut Ctx>()
.expect("RuntimeContext not set")
}
fn get_ctx_mut(&mut self) -> &mut Ctx {
self.try_borrow_mut::<&'static mut Ctx>()
.expect("RuntimeContext not set")
}
}
fn runtime_extension<Ctx: RuntimeContext>() -> Extension {
const ESM: &[ExtensionFileSource] =
&deno_core::include_js_files!(nix_runtime dir "runtime-ts/dist", "runtime.js");
let ops = vec![
let mut ops = vec![
op_import::<Ctx>(),
op_scoped_import::<Ctx>(),
op_resolve_path(),
op_read_file(),
op_read_file_type(),
op_read_dir(),
op_path_exists(),
op_resolve_path::<Ctx>(),
op_sha256_hex(),
op_make_store_path(),
op_output_path_name(),
op_make_fixed_output_path(),
op_walk_dir(),
op_make_placeholder(),
op_store_path::<Ctx>(),
op_convert_hash(),
op_hash_string(),
op_hash_file(),
op_parse_hash(),
op_add_path::<Ctx>(),
op_add_filtered_path::<Ctx>(),
op_decode_span::<Ctx>(),
op_to_file::<Ctx>(),
op_copy_path_to_store::<Ctx>(),
op_get_env(),
op_match(),
op_split(),
op_from_json(),
op_from_toml(),
op_finalize_derivation::<Ctx>(),
op_to_xml(),
];
ops.extend(crate::fetcher::register_ops::<Ctx>());
Extension {
name: "nix_runtime",
@@ -46,9 +94,8 @@ fn runtime_extension<Ctx: RuntimeCtx>() -> Extension {
mod private {
use deno_error::js_error_wrapper;
#[allow(dead_code)]
#[derive(Debug)]
pub struct SimpleErrorWrapper(pub(crate) String);
pub struct SimpleErrorWrapper(String);
impl std::fmt::Display for SimpleErrorWrapper {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
std::fmt::Display::fmt(&self.0, f)
@@ -56,196 +103,220 @@ mod private {
}
impl std::error::Error for SimpleErrorWrapper {}
js_error_wrapper!(SimpleErrorWrapper, NixError, "Error");
js_error_wrapper!(SimpleErrorWrapper, NixRuntimeError, "Error");
impl From<String> for NixError {
impl From<String> for NixRuntimeError {
fn from(value: String) -> Self {
NixError(SimpleErrorWrapper(value))
NixRuntimeError(SimpleErrorWrapper(value))
}
}
impl From<&str> for NixError {
impl From<&str> for NixRuntimeError {
fn from(value: &str) -> Self {
NixError(SimpleErrorWrapper(value.to_string()))
NixRuntimeError(SimpleErrorWrapper(value.to_string()))
}
}
}
use private::NixError;
pub(crate) use private::NixRuntimeError;
#[deno_core::op2]
#[string]
fn op_import<Ctx: RuntimeCtx>(
state: &mut OpState,
#[string] path: String,
) -> std::result::Result<String, NixError> {
let ctx = state.borrow_mut::<Ctx>();
let current_dir = ctx.get_current_dir();
let mut absolute_path = current_dir
.join(&path)
.canonicalize()
.map_err(|e| format!("Failed to resolve path {}: {}", path, e))?;
if absolute_path.is_dir() {
absolute_path.push("default.nix")
}
let content = std::fs::read_to_string(&absolute_path)
.map_err(|e| format!("Failed to read {}: {}", absolute_path.display(), e))?;
let mut guard = ctx.push_path_stack(absolute_path);
let ctx = guard.deref_mut();
Ok(ctx.compile_code(&content).map_err(|err| err.to_string())?)
}
#[deno_core::op2]
#[string]
fn op_read_file(#[string] path: String) -> std::result::Result<String, NixError> {
Ok(std::fs::read_to_string(&path).map_err(|e| format!("Failed to read {}: {}", path, e))?)
}
#[deno_core::op2(fast)]
fn op_path_exists(#[string] path: String) -> bool {
std::path::Path::new(&path).exists()
}
#[deno_core::op2]
#[string]
fn op_resolve_path<Ctx: RuntimeCtx>(
state: &mut OpState,
#[string] path: String,
) -> std::result::Result<String, NixError> {
let ctx = state.borrow::<Ctx>();
// If already absolute, return as-is
if path.starts_with('/') {
return Ok(path);
}
// Resolve relative path against current file directory (or CWD)
let current_dir = ctx.get_current_dir();
Ok(current_dir
.join(&path)
.canonicalize()
.map(|p| p.to_string_lossy().to_string())
.map_err(|e| format!("Failed to resolve path {}: {}", path, e))?)
}
#[deno_core::op2]
#[string]
fn op_sha256_hex(#[string] data: String) -> String {
crate::nix_hash::sha256_hex(&data)
}
#[deno_core::op2]
#[string]
fn op_make_store_path(
#[string] ty: String,
#[string] hash_hex: String,
#[string] name: String,
) -> String {
crate::nix_hash::make_store_path(&ty, &hash_hex, &name)
}
#[deno_core::op2]
#[string]
fn op_output_path_name(#[string] drv_name: String, #[string] output_name: String) -> String {
crate::nix_hash::output_path_name(&drv_name, &output_name)
}
#[deno_core::op2]
#[string]
fn op_make_fixed_output_path(
#[string] hash_algo: String,
#[string] hash: String,
#[string] hash_mode: String,
#[string] name: String,
) -> String {
use sha2::{Digest, Sha256};
if hash_algo == "sha256" && hash_mode == "recursive" {
crate::nix_hash::make_store_path("source", &hash, &name)
} else {
let prefix = if hash_mode == "recursive" { "r:" } else { "" };
let inner_input = format!("fixed:out:{}{}:{}:", prefix, hash_algo, hash);
let mut hasher = Sha256::new();
hasher.update(inner_input.as_bytes());
let inner_hash = hex::encode(hasher.finalize());
crate::nix_hash::make_store_path("output:out", &inner_hash, &name)
}
}
pub(crate) struct Runtime<Ctx: RuntimeCtx> {
pub(crate) struct Runtime<Ctx: RuntimeContext> {
js_runtime: JsRuntime,
is_thunk_symbol: v8::Global<v8::Symbol>,
primop_metadata_symbol: v8::Global<v8::Symbol>,
has_context_symbol: v8::Global<v8::Symbol>,
#[cfg(feature = "inspector")]
rt: tokio::runtime::Runtime,
#[cfg(feature = "inspector")]
wait_for_inspector: bool,
symbols: GlobalSymbols,
cached_fns: CachedFunctions,
_marker: PhantomData<Ctx>,
}
impl<Ctx: RuntimeCtx> Runtime<Ctx> {
pub(crate) fn new() -> Result<Self> {
// Initialize V8 once
#[cfg(feature = "inspector")]
#[derive(Debug, Clone, Copy, Default)]
pub(crate) struct InspectorOptions {
pub(crate) enable: bool,
pub(crate) wait: bool,
}
impl<Ctx: RuntimeContext> Runtime<Ctx> {
pub(crate) fn new(
#[cfg(feature = "inspector")] inspector_options: InspectorOptions,
) -> Result<Self> {
use std::sync::Once;
static INIT: Once = Once::new();
INIT.call_once(|| {
JsRuntime::init_platform(
Some(v8::new_default_platform(0, false).make_shared()),
false,
assert_eq!(
deno_core::v8_set_flags(vec![
"".into(),
format!("--stack-size={}", 8 * 1024),
#[cfg(feature = "prof")]
("--prof".into())
]),
[""]
);
JsRuntime::init_platform(Some(v8::new_default_platform(0, false).make_shared()));
});
let mut js_runtime = JsRuntime::new(RuntimeOptions {
extensions: vec![runtime_extension::<Ctx>()],
#[cfg(feature = "inspector")]
inspector: inspector_options.enable,
is_main: true,
..Default::default()
});
let (is_thunk_symbol, primop_metadata_symbol, has_context_symbol) = {
js_runtime.op_state().borrow_mut().put(RegexCache::new());
js_runtime.op_state().borrow_mut().put(DrvHashCache::new());
let (symbols, cached_fns) = {
deno_core::scope!(scope, &mut js_runtime);
Self::get_symbols(scope)?
let symbols = Self::get_symbols(scope)?;
let cached_fns = Self::get_cached_functions(scope)?;
(symbols, cached_fns)
};
Ok(Self {
js_runtime,
is_thunk_symbol,
primop_metadata_symbol,
has_context_symbol,
#[cfg(feature = "inspector")]
rt: tokio::runtime::Builder::new_current_thread()
.enable_all()
.build()
.expect("failed to build tokio runtime"),
#[cfg(feature = "inspector")]
wait_for_inspector: inspector_options.wait,
symbols,
cached_fns,
_marker: PhantomData,
})
}
pub(crate) fn eval(&mut self, script: String, ctx: Ctx) -> Result<Value> {
#[cfg(feature = "inspector")]
pub(crate) fn inspector(&self) -> std::rc::Rc<deno_core::JsRuntimeInspector> {
self.js_runtime.inspector()
}
#[cfg(feature = "inspector")]
pub(crate) fn wait_for_inspector_disconnect(&mut self) {
let _ = self
.rt
.block_on(self.js_runtime.run_event_loop(PollEventLoopOptions {
wait_for_inspector: true,
..Default::default()
}));
}
pub(crate) fn eval(&mut self, script: String, ctx: &mut Ctx) -> Result<Value> {
let ctx: &'static mut Ctx = unsafe { &mut *(ctx as *mut Ctx) };
self.js_runtime.op_state().borrow_mut().put(ctx);
#[cfg(feature = "inspector")]
if self.wait_for_inspector {
self.js_runtime
.inspector()
.wait_for_session_and_break_on_next_statement();
} else {
self.js_runtime.inspector().wait_for_session();
}
let global_value = self
.js_runtime
.execute_script("<eval>", script)
.map_err(|e| Error::eval_error(format!("{}", e.get_message())))?;
.map_err(|error| {
let op_state = self.js_runtime.op_state();
let op_state_borrow = op_state.borrow();
let ctx: &Ctx = op_state_borrow.get_ctx();
crate::error::parse_js_error(error, ctx)
})?;
// Retrieve scope from JsRuntime
deno_core::scope!(scope, self.js_runtime);
let local_value = v8::Local::new(scope, &global_value);
let is_thunk_symbol = v8::Local::new(scope, &self.is_thunk_symbol);
let primop_metadata_symbol = v8::Local::new(scope, &self.primop_metadata_symbol);
let has_context_symbol = v8::Local::new(scope, &self.has_context_symbol);
let symbols = &self.symbols.local(scope);
Ok(to_value(
local_value,
scope,
is_thunk_symbol,
primop_metadata_symbol,
has_context_symbol,
))
Ok(to_value(local_value, scope, symbols))
}
/// get (IS_THUNK, PRIMOP_METADATA, HAS_CONTEXT)
fn get_symbols(
scope: &ScopeRef,
) -> Result<(
v8::Global<v8::Symbol>,
v8::Global<v8::Symbol>,
v8::Global<v8::Symbol>,
)> {
pub(crate) fn eval_bytecode(
&mut self,
result: Bytecode,
ctx: &mut Ctx,
force_mode: ForceMode,
) -> Result<Value> {
let ctx: &'static mut Ctx = unsafe { &mut *(ctx as *mut Ctx) };
{
deno_core::scope!(scope, self.js_runtime);
sync_global_tables(scope, &self.cached_fns, ctx);
}
let op_state = self.js_runtime.op_state();
op_state.borrow_mut().put(ctx);
#[cfg(feature = "inspector")]
if self.wait_for_inspector {
self.js_runtime
.inspector()
.wait_for_session_and_break_on_next_statement();
} else {
self.js_runtime.inspector().wait_for_session();
}
deno_core::scope!(scope, self.js_runtime);
let store = v8::ArrayBuffer::new_backing_store_from_boxed_slice(result.code);
let ab = v8::ArrayBuffer::with_backing_store(scope, &store.make_shared());
let u8a = v8::Uint8Array::new(scope, ab, 0, ab.byte_length())
.ok_or_else(|| Error::internal("failed to create Uint8Array".into()))?;
let dir = v8::String::new(scope, &result.current_dir)
.ok_or_else(|| Error::internal("failed to create dir string".into()))?;
let undef = v8::undefined(scope);
let tc = std::pin::pin!(v8::TryCatch::new(scope));
let scope = &mut tc.init();
let exec_bytecode = v8::Local::new(scope, &self.cached_fns.exec_bytecode);
let raw_result = exec_bytecode
.call(scope, undef.into(), &[u8a.into(), dir.into()])
.ok_or_else(|| {
scope
.exception()
.map(|e| {
let op_state_borrow = op_state.borrow();
let ctx: &Ctx = op_state_borrow.get_ctx();
Box::new(crate::error::parse_js_error(
deno_core::error::JsError::from_v8_exception(scope, e),
ctx,
))
})
.unwrap_or_else(|| Error::internal("bytecode execution failed".into()))
})?;
let force_fn = match force_mode {
ForceMode::Force => &self.cached_fns.force_fn,
ForceMode::ForceShallow => &self.cached_fns.force_shallow_fn,
ForceMode::ForceDeep => &self.cached_fns.force_deep_fn,
};
let force_fn = v8::Local::new(scope, force_fn);
let forced = force_fn
.call(scope, undef.into(), &[raw_result])
.ok_or_else(|| {
scope
.exception()
.map(|e| {
let op_state_borrow = op_state.borrow();
let ctx: &Ctx = op_state_borrow.get_ctx();
Box::new(crate::error::parse_js_error(
deno_core::error::JsError::from_v8_exception(scope, e),
ctx,
))
})
.unwrap_or_else(|| Error::internal("force failed".into()))
})?;
let symbols = &self.symbols.local(scope);
Ok(to_value(forced, scope, symbols))
}
fn get_symbols(scope: &ScopeRef) -> Result<GlobalSymbols> {
let global = scope.get_current_context().global(scope);
let nix_key = v8::String::new(scope, "Nix")
.ok_or_else(|| Error::internal("failed to create V8 String".into()))?;
@@ -257,54 +328,168 @@ impl<Ctx: RuntimeCtx> Runtime<Ctx> {
Error::internal("failed to convert global Nix Value to object".into())
})?;
let get_symbol = |symbol| {
    let key = v8::String::new(scope, symbol)
        .ok_or_else(|| Error::internal("failed to create V8 String".into()))?;
    let val = nix_obj
        .get(scope, key.into())
        .ok_or_else(|| Error::internal(format!("failed to get {symbol} Symbol")))?;
    let sym = val.try_cast::<v8::Symbol>().map_err(|err| {
        Error::internal(format!(
            "failed to convert {symbol} Value to Symbol ({err})"
        ))
    })?;
    Result::Ok(v8::Global::new(scope, sym))
};
let is_thunk = get_symbol("IS_THUNK")?;
let primop_metadata = get_symbol("PRIMOP_METADATA")?;
let has_context = get_symbol("HAS_CONTEXT")?;
let is_path = get_symbol("IS_PATH")?;
let is_cycle = get_symbol("IS_CYCLE")?;
Ok(GlobalSymbols {
    is_thunk,
    primop_metadata,
    has_context,
    is_path,
    is_cycle,
})
}
fn get_cached_functions(scope: &ScopeRef) -> Result<CachedFunctions> {
let global = scope.get_current_context().global(scope);
let nix_key = v8::String::new(scope, "Nix")
    .ok_or_else(|| Error::internal("failed to create V8 String".into()))?;
let nix_obj = global
    .get(scope, nix_key.into())
    .ok_or_else(|| Error::internal("failed to get global Nix object".into()))?
    .to_object(scope)
    .ok_or_else(|| {
        Error::internal("failed to convert global Nix Value to object".into())
    })?;
let get_fn = |name: &str| -> Result<v8::Global<v8::Function>> {
let key = v8::String::new(scope, name)
.ok_or_else(|| Error::internal("failed to create V8 String".into()))?;
let val = nix_obj
.get(scope, key.into())
.ok_or_else(|| Error::internal(format!("failed to get Nix.{name}")))?;
let func = val
.try_cast::<v8::Function>()
.map_err(|err| Error::internal(format!("Nix.{name} is not a function ({err})")))?;
Ok(v8::Global::new(scope, func))
};
let exec_bytecode = get_fn("execBytecode")?;
let force_fn = get_fn("force")?;
let force_shallow_fn = get_fn("forceShallow")?;
let force_deep_fn = get_fn("forceDeep")?;
let strings_key = v8::String::new(scope, "strings")
.ok_or_else(|| Error::internal("failed to create V8 String".into()))?;
let strings_array = nix_obj
.get(scope, strings_key.into())
.ok_or_else(|| Error::internal("failed to get Nix.strings".into()))?
.try_cast::<v8::Array>()
.map_err(|err| Error::internal(format!("Nix.strings is not an array ({err})")))?;
let constants_key = v8::String::new(scope, "constants")
.ok_or_else(|| Error::internal("failed to create V8 String".into()))?;
let constants_array = nix_obj
.get(scope, constants_key.into())
.ok_or_else(|| Error::internal("failed to get Nix.constants".into()))?
.try_cast::<v8::Array>()
.map_err(|err| Error::internal(format!("Nix.constants is not an array ({err})")))?;
Ok(CachedFunctions {
exec_bytecode,
force_fn,
force_shallow_fn,
force_deep_fn,
strings_array: v8::Global::new(scope, strings_array),
constants_array: v8::Global::new(scope, constants_array),
})
}
}
struct GlobalSymbols {
is_thunk: v8::Global<v8::Symbol>,
primop_metadata: v8::Global<v8::Symbol>,
has_context: v8::Global<v8::Symbol>,
is_path: v8::Global<v8::Symbol>,
is_cycle: v8::Global<v8::Symbol>,
}
impl GlobalSymbols {
fn local<'a>(&self, scope: &ScopeRef<'a, '_>) -> LocalSymbols<'a> {
LocalSymbols {
is_thunk: v8::Local::new(scope, &self.is_thunk),
primop_metadata: v8::Local::new(scope, &self.primop_metadata),
has_context: v8::Local::new(scope, &self.has_context),
is_path: v8::Local::new(scope, &self.is_path),
is_cycle: v8::Local::new(scope, &self.is_cycle),
}
}
}
struct LocalSymbols<'a> {
is_thunk: v8::Local<'a, v8::Symbol>,
primop_metadata: v8::Local<'a, v8::Symbol>,
has_context: v8::Local<'a, v8::Symbol>,
is_path: v8::Local<'a, v8::Symbol>,
is_cycle: v8::Local<'a, v8::Symbol>,
}
struct CachedFunctions {
exec_bytecode: v8::Global<v8::Function>,
force_fn: v8::Global<v8::Function>,
force_shallow_fn: v8::Global<v8::Function>,
force_deep_fn: v8::Global<v8::Function>,
strings_array: v8::Global<v8::Array>,
constants_array: v8::Global<v8::Array>,
}
pub(crate) enum ForceMode {
Force,
ForceShallow,
ForceDeep,
}
fn sync_global_tables<Ctx: RuntimeCtx>(
scope: &ScopeRef,
cached: &CachedFunctions,
ctx: &mut Ctx,
) {
let (new_strings, new_constants, strings_base, constants_base) = ctx.get_unsynced();
if !new_strings.is_empty() {
let s_array = v8::Local::new(scope, &cached.strings_array);
for (i, s) in new_strings.iter().enumerate() {
let idx = (strings_base + i) as u32;
#[allow(clippy::unwrap_used)]
let val = v8::String::new(scope, s).unwrap();
s_array.set_index(scope, idx, val.into());
}
}
if !new_constants.is_empty() {
let k_array = v8::Local::new(scope, &cached.constants_array);
for (i, c) in new_constants.iter().enumerate() {
let idx = (constants_base + i) as u32;
let val: v8::Local<v8::Value> = match c {
Constant::Int(n) => v8::BigInt::new_from_i64(scope, *n).into(),
Constant::Float(bits) => v8::Number::new(scope, f64::from_bits(*bits)).into(),
};
k_array.set_index(scope, idx, val);
}
}
}
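`sync_global_tables` ships newly interned strings and constants to the JS-side tables; floats travel as raw IEEE-754 bit patterns and 64-bit integers as `BigInt`, since a plain JS `Number` only represents integers exactly up to 2^53 - 1. A minimal std-only sketch of that encoding choice, independent of V8:

```rust
fn main() {
    // Floats are shipped as u64 bit patterns (`f64::to_bits`) so values
    // like -0.0 survive the Rust -> JS hand-off bit-for-bit.
    let f = -0.0_f64;
    let bits = f.to_bits();
    assert!(f64::from_bits(bits).is_sign_negative());

    // i64 constants become BigInt on the JS side: i64::MAX exceeds the
    // largest integer a JS Number can represent exactly (2^53 - 1).
    let max_safe_integer: i64 = (1_i64 << 53) - 1;
    assert!(i64::MAX > max_safe_integer);
    println!("ok");
}
```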
fn to_value<'a>(
val: LocalValue<'a>,
scope: &ScopeRef<'a, '_>,
symbols: &LocalSymbols<'a>,
) -> Value {
match () {
_ if val.is_big_int() => {
@@ -334,30 +519,51 @@ fn to_value<'a>(
let list = (0..len)
.map(|i| {
let val = val.get_index(scope, i).expect("infallible index operation");
to_value(val, scope, symbols)
})
.collect();
Value::List(List::new(list))
}
_ if val.is_function() => {
if let Some(primop) = to_primop(val, scope, symbols.primop_metadata) {
primop
} else {
Value::Func
}
}
_ if val.is_map() => {
let val = val.try_cast::<v8::Map>().expect("infallible conversion");
let size = val.size() as u32;
let array = val.as_array(scope);
let attrs = (0..size)
.map(|i| {
let key = array
.get_index(scope, i * 2)
.expect("infallible index operation");
let key = key.to_rust_string_lossy(scope);
let val = array
.get_index(scope, i * 2 + 1)
.expect("infallible index operation");
let val = to_value(val, scope, symbols);
(Symbol::new(Cow::Owned(key)), val)
})
.collect();
Value::AttrSet(AttrSet::new(attrs))
}
_ if val.is_object() => {
if is_thunk(val, scope, symbols.is_thunk) {
return Value::Thunk;
}
if is_cycle(val, scope, symbols.is_cycle) {
return Value::Repeated;
}
if let Some(path_val) = extract_path(val, scope, symbols.is_path) {
return Value::Path(path_val);
}
if let Some(string_val) = extract_string_with_context(val, scope, symbols.has_context) {
return Value::String(string_val);
}
@@ -373,16 +579,7 @@ fn to_value<'a>(
.expect("infallible index operation");
let val = val.get(scope, key).expect("infallible operation");
let key = key.to_rust_string_lossy(scope);
(Symbol::from(key), to_value(val, scope, symbols))
})
.collect();
Value::AttrSet(AttrSet::new(attrs))
@@ -400,6 +597,15 @@ fn is_thunk<'a>(val: LocalValue<'a>, scope: &ScopeRef<'a, '_>, symbol: LocalSymb
matches!(obj.get(scope, symbol.into()), Some(v) if v.is_true())
}
fn is_cycle<'a>(val: LocalValue<'a>, scope: &ScopeRef<'a, '_>, symbol: LocalSymbol<'a>) -> bool {
if !val.is_object() {
return false;
}
let obj = val.to_object(scope).expect("infallible conversion");
matches!(obj.get(scope, symbol.into()), Some(v) if v.is_true())
}
fn extract_string_with_context<'a>(
val: LocalValue<'a>,
scope: &ScopeRef<'a, '_>,
@@ -426,6 +632,32 @@ fn extract_string_with_context<'a>(
}
}
fn extract_path<'a>(
val: LocalValue<'a>,
scope: &ScopeRef<'a, '_>,
symbol: LocalSymbol<'a>,
) -> Option<String> {
if !val.is_object() {
return None;
}
let obj = val.to_object(scope).expect("infallible conversion");
let is_path = obj.get(scope, symbol.into())?;
if !is_path.is_true() {
return None;
}
let value_key = v8::String::new(scope, "value")?;
let value = obj.get(scope, value_key.into())?;
if value.is_string() {
Some(value.to_rust_string_lossy(scope))
} else {
None
}
}
fn to_primop<'a>(
val: LocalValue<'a>,
scope: &ScopeRef<'a, '_>,
@@ -453,34 +685,3 @@ fn to_primop<'a>(
Some(Value::PrimOpApp(name))
}
}
#[cfg(test)]
#[allow(clippy::unwrap_used)]
mod test {
use super::*;
use crate::context::Context;
#[test]
fn to_value_working() {
let mut ctx = Context::new().unwrap();
assert_eq!(
ctx.eval_js(
"({
test: [1., 9223372036854775807n, true, false, 'hello world!']
})"
.into(),
)
.unwrap(),
Value::AttrSet(AttrSet::new(std::collections::BTreeMap::from([(
Symbol::from("test"),
Value::List(List::new(vec![
Value::Float(1.),
Value::Int(9223372036854775807),
Value::Bool(true),
Value::Bool(false),
Value::String("hello world!".to_string())
]))
)])))
);
}
}
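In `to_value`, map entries arrive via `v8::Map::as_array` as one flat `[k0, v0, k1, v1, …]` array, which is why the loop reads index `2 * i` for the key and `2 * i + 1` for the value. The same indexing, sketched on a plain slice without V8 (hypothetical stand-in):

```rust
use std::collections::BTreeMap;

fn main() {
    // Flattened key/value layout, as produced by v8::Map::as_array.
    let flat = ["a", "1", "b", "2"];
    let attrs: BTreeMap<&str, &str> = (0..flat.len() / 2)
        .map(|i| (flat[i * 2], flat[i * 2 + 1]))
        .collect();
    assert_eq!(attrs.get("a"), Some(&"1"));
    assert_eq!(attrs.get("b"), Some(&"2"));
    println!("ok");
}
```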


@@ -0,0 +1,31 @@
drvAttrs@{
outputs ? [ "out" ],
...
}:
let
strict = derivationStrict drvAttrs;
commonAttrs =
drvAttrs
// (builtins.listToAttrs outputsList)
// {
all = map (x: x.value) outputsList;
inherit drvAttrs;
};
outputToAttrListElement = outputName: {
name = outputName;
value = commonAttrs // {
outPath = builtins.getAttr outputName strict;
drvPath = strict.drvPath;
type = "derivation";
inherit outputName;
};
};
outputsList = map outputToAttrListElement outputs;
in
(builtins.head outputsList).value
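Each output name becomes a `{ name, value }` pair whose value merges the shared attributes with its own `outPath`; `listToAttrs` folds those pairs back into the result so `drv.out`, `drv.dev`, etc. all resolve, and the expression returns the first output's value. A hypothetical std-only sketch of that fold (store paths are illustrative):

```rust
use std::collections::BTreeMap;

fn main() {
    // strict: output name -> store path, as produced by derivationStrict.
    let strict: BTreeMap<&str, &str> = BTreeMap::from([
        ("out", "/nix/store/aaaa-hello"),
        ("dev", "/nix/store/bbbb-hello-dev"),
    ]);
    let outputs = ["out", "dev"];

    // outputToAttrListElement: one (name, outPath) pair per output.
    let outputs_list: Vec<(&str, &str)> =
        outputs.iter().map(|&name| (name, strict[name])).collect();

    // builtins.listToAttrs: fold the pairs into an attribute set.
    let as_attrs: BTreeMap<&str, &str> = outputs_list.iter().copied().collect();

    // (builtins.head outputsList).value corresponds to the first output.
    assert_eq!(outputs_list[0].0, "out");
    assert_eq!(as_attrs["dev"], "/nix/store/bbbb-hello-dev");
    println!("ok");
}
```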


@@ -0,0 +1,76 @@
{
system ? "", # obsolete
url,
hash ? "", # an SRI hash
# Legacy hash specification
md5 ? "",
sha1 ? "",
sha256 ? "",
sha512 ? "",
outputHash ?
if hash != "" then
hash
else if sha512 != "" then
sha512
else if sha1 != "" then
sha1
else if md5 != "" then
md5
else
sha256,
outputHashAlgo ?
if hash != "" then
""
else if sha512 != "" then
"sha512"
else if sha1 != "" then
"sha1"
else if md5 != "" then
"md5"
else
"sha256",
executable ? false,
unpack ? false,
name ? baseNameOf (toString url),
# `impure` still translates to __impure to trigger derivationStrict error checks.
impure ? false,
}:
derivation (
{
builder = "builtin:fetchurl";
# New-style output content requirements.
outputHashMode = if unpack || executable then "recursive" else "flat";
inherit
name
url
executable
unpack
;
system = "builtin";
# No need to double the amount of network traffic
preferLocalBuild = true;
impureEnvVars = [
# We borrow these environment variables from the caller to allow
# easy proxy configuration. This is impure, but a fixed-output
# derivation like fetchurl is allowed to do so since its result is
# by definition pure.
"http_proxy"
"https_proxy"
"ftp_proxy"
"all_proxy"
"no_proxy"
];
# To make "nix-prefetch-url" work.
urls = [ url ];
}
// (if impure then { __impure = true; } else { inherit outputHashAlgo outputHash; })
)


@@ -0,0 +1,493 @@
// Copyright 2018-2025 the Deno authors. MIT license.
// Alias for the future `!` type.
use core::convert::Infallible as Never;
use std::cell::RefCell;
use std::net::SocketAddr;
use std::pin::pin;
use std::process;
use std::rc::Rc;
use std::task::Poll;
use std::thread;
use deno_core::InspectorMsg;
use deno_core::InspectorSessionChannels;
use deno_core::InspectorSessionKind;
use deno_core::InspectorSessionProxy;
use deno_core::JsRuntimeInspector;
use deno_core::anyhow::Context;
use deno_core::futures::channel::mpsc;
use deno_core::futures::channel::mpsc::UnboundedReceiver;
use deno_core::futures::channel::mpsc::UnboundedSender;
use deno_core::futures::channel::oneshot;
use deno_core::futures::prelude::*;
use deno_core::futures::stream::StreamExt;
use deno_core::serde_json::Value;
use deno_core::serde_json::json;
use deno_core::unsync::spawn;
use deno_core::url::Url;
use fastwebsockets::Frame;
use fastwebsockets::OpCode;
use fastwebsockets::WebSocket;
use hashbrown::HashMap;
use hyper::body::Bytes;
use hyper_util::rt::TokioIo;
use tokio::net::TcpListener;
use tokio::sync::broadcast;
use uuid::Uuid;
/// Websocket server that is used to proxy connections from
/// devtools to the inspector.
pub struct InspectorServer {
pub host: SocketAddr,
register_inspector_tx: UnboundedSender<InspectorInfo>,
shutdown_server_tx: Option<broadcast::Sender<()>>,
thread_handle: Option<thread::JoinHandle<()>>,
}
impl InspectorServer {
pub fn new(host: SocketAddr, name: &'static str) -> Result<Self, anyhow::Error> {
let (register_inspector_tx, register_inspector_rx) = mpsc::unbounded::<InspectorInfo>();
let (shutdown_server_tx, shutdown_server_rx) = broadcast::channel(1);
let tcp_listener = std::net::TcpListener::bind(host)
.with_context(|| format!("Failed to bind inspector server socket at {}", host))?;
tcp_listener.set_nonblocking(true)?;
let thread_handle = thread::spawn(move || {
let rt = tokio::runtime::Builder::new_current_thread()
.enable_all()
.build()
.expect("failed to build tokio runtime");
let local = tokio::task::LocalSet::new();
local.block_on(
&rt,
server(
tcp_listener,
register_inspector_rx,
shutdown_server_rx,
name,
),
)
});
Ok(Self {
host,
register_inspector_tx,
shutdown_server_tx: Some(shutdown_server_tx),
thread_handle: Some(thread_handle),
})
}
pub fn register_inspector(
&self,
module_url: String,
inspector: Rc<JsRuntimeInspector>,
wait_for_session: bool,
) {
let session_sender = inspector.get_session_sender();
let deregister_rx = inspector.add_deregister_handler();
let info = InspectorInfo::new(
self.host,
session_sender,
deregister_rx,
module_url,
wait_for_session,
);
self.register_inspector_tx
.unbounded_send(info)
.expect("unreachable");
}
}
impl Drop for InspectorServer {
fn drop(&mut self) {
if let Some(shutdown_server_tx) = self.shutdown_server_tx.take() {
shutdown_server_tx
.send(())
.expect("unable to send shutdown signal");
}
if let Some(thread_handle) = self.thread_handle.take() {
thread_handle.join().expect("unable to join thread");
}
}
}
fn handle_ws_request(
req: http::Request<hyper::body::Incoming>,
inspector_map_rc: Rc<RefCell<HashMap<Uuid, InspectorInfo>>>,
) -> http::Result<http::Response<Box<http_body_util::Full<Bytes>>>> {
let (parts, body) = req.into_parts();
let req = http::Request::from_parts(parts, ());
let maybe_uuid = req
.uri()
.path()
.strip_prefix("/ws/")
.and_then(|s| Uuid::parse_str(s).ok());
let Some(uuid) = maybe_uuid else {
return http::Response::builder()
.status(http::StatusCode::BAD_REQUEST)
.body(Box::new(Bytes::from("Malformed inspector UUID").into()));
};
// run in a block to not hold borrow to `inspector_map` for too long
let new_session_tx = {
let inspector_map = inspector_map_rc.borrow();
let maybe_inspector_info = inspector_map.get(&uuid);
let Some(info) = maybe_inspector_info else {
return http::Response::builder()
.status(http::StatusCode::NOT_FOUND)
.body(Box::new(Bytes::from("Invalid inspector UUID").into()));
};
info.new_session_tx.clone()
};
let (parts, _) = req.into_parts();
let mut req = http::Request::from_parts(parts, body);
let Ok((resp, upgrade_fut)) = fastwebsockets::upgrade::upgrade(&mut req) else {
return http::Response::builder()
.status(http::StatusCode::BAD_REQUEST)
.body(Box::new(
Bytes::from("Not a valid Websocket Request").into(),
));
};
// spawn a task that will wait for websocket connection and then pump messages between
// the socket and inspector proxy
spawn(async move {
let websocket = match upgrade_fut.await {
Ok(w) => w,
Err(err) => {
eprintln!(
"Inspector server failed to upgrade to WS connection: {:?}",
err
);
return;
}
};
// The 'outbound' channel carries messages sent to the websocket.
let (outbound_tx, outbound_rx) = mpsc::unbounded();
// The 'inbound' channel carries messages received from the websocket.
let (inbound_tx, inbound_rx) = mpsc::unbounded();
let inspector_session_proxy = InspectorSessionProxy {
channels: InspectorSessionChannels::Regular {
tx: outbound_tx,
rx: inbound_rx,
},
kind: InspectorSessionKind::NonBlocking {
wait_for_disconnect: true,
},
};
eprintln!("Debugger session started.");
let _ = new_session_tx.unbounded_send(inspector_session_proxy);
pump_websocket_messages(websocket, inbound_tx, outbound_rx).await;
});
let (parts, _body) = resp.into_parts();
let resp = http::Response::from_parts(parts, Box::new(http_body_util::Full::new(Bytes::new())));
Ok(resp)
}
fn handle_json_request(
inspector_map: Rc<RefCell<HashMap<Uuid, InspectorInfo>>>,
host: Option<String>,
) -> http::Result<http::Response<Box<http_body_util::Full<Bytes>>>> {
let data = inspector_map
.borrow()
.values()
.map(move |info| info.get_json_metadata(&host))
.collect::<Vec<_>>();
let body: http_body_util::Full<Bytes> =
Bytes::from(serde_json::to_string(&data).expect("unreachable")).into();
http::Response::builder()
.status(http::StatusCode::OK)
.header(http::header::CONTENT_TYPE, "application/json")
.body(Box::new(body))
}
fn handle_json_version_request(
version_response: Value,
) -> http::Result<http::Response<Box<http_body_util::Full<Bytes>>>> {
let body = Box::new(http_body_util::Full::from(
serde_json::to_string(&version_response).expect("unreachable"),
));
http::Response::builder()
.status(http::StatusCode::OK)
.header(http::header::CONTENT_TYPE, "application/json")
.body(body)
}
async fn server(
listener: std::net::TcpListener,
register_inspector_rx: UnboundedReceiver<InspectorInfo>,
shutdown_server_rx: broadcast::Receiver<()>,
name: &str,
) {
let inspector_map_ = Rc::new(RefCell::new(HashMap::<Uuid, InspectorInfo>::new()));
let inspector_map = Rc::clone(&inspector_map_);
let register_inspector_handler =
listen_for_new_inspectors(register_inspector_rx, inspector_map.clone()).boxed_local();
let inspector_map = Rc::clone(&inspector_map_);
let deregister_inspector_handler = future::poll_fn(|cx| {
inspector_map
.borrow_mut()
.retain(|_, info| info.deregister_rx.poll_unpin(cx) == Poll::Pending);
Poll::<Never>::Pending
})
.boxed_local();
let json_version_response = json!({
"Browser": name,
"Protocol-Version": "1.3",
"V8-Version": deno_core::v8::VERSION_STRING,
});
// Create the server manually so it can use the Local Executor
let listener = match TcpListener::from_std(listener) {
Ok(l) => l,
Err(err) => {
eprintln!("Cannot create async listener from std listener: {:?}", err);
return;
}
};
let server_handler = async move {
loop {
let mut rx = shutdown_server_rx.resubscribe();
let mut shutdown_rx = pin!(rx.recv());
let mut accept = pin!(listener.accept());
let stream = tokio::select! {
accept_result =
&mut accept => {
match accept_result {
Ok((s, _)) => s,
Err(err) => {
eprintln!("Failed to accept inspector connection: {:?}", err);
continue;
}
}
},
_ = &mut shutdown_rx => {
break;
}
};
let io = TokioIo::new(stream);
let inspector_map = Rc::clone(&inspector_map_);
let json_version_response = json_version_response.clone();
let mut shutdown_server_rx = shutdown_server_rx.resubscribe();
let service =
hyper::service::service_fn(move |req: http::Request<hyper::body::Incoming>| {
future::ready({
// If the host header can make a valid URL, use it
let host = req
.headers()
.get("host")
.and_then(|host| host.to_str().ok())
.and_then(|host| Url::parse(&format!("http://{host}")).ok())
.and_then(|url| match (url.host(), url.port()) {
(Some(host), Some(port)) => Some(format!("{host}:{port}")),
(Some(host), None) => Some(format!("{host}")),
_ => None,
});
match (req.method(), req.uri().path()) {
(&http::Method::GET, path) if path.starts_with("/ws/") => {
handle_ws_request(req, Rc::clone(&inspector_map))
}
(&http::Method::GET, "/json/version") => {
handle_json_version_request(json_version_response.clone())
}
(&http::Method::GET, "/json") => {
handle_json_request(Rc::clone(&inspector_map), host)
}
(&http::Method::GET, "/json/list") => {
handle_json_request(Rc::clone(&inspector_map), host)
}
_ => http::Response::builder()
.status(http::StatusCode::NOT_FOUND)
.body(Box::new(http_body_util::Full::new(Bytes::from(
"Not Found",
)))),
}
})
});
deno_core::unsync::spawn(async move {
let server = hyper::server::conn::http1::Builder::new();
let mut conn = pin!(server.serve_connection(io, service).with_upgrades());
let mut shutdown_rx = pin!(shutdown_server_rx.recv());
tokio::select! {
result = conn.as_mut() => {
if let Err(err) = result {
eprintln!("Failed to serve connection: {:?}", err);
}
},
_ = &mut shutdown_rx => {
conn.as_mut().graceful_shutdown();
let _ = conn.await;
}
}
});
}
}
.boxed_local();
tokio::select! {
_ = register_inspector_handler => {},
_ = deregister_inspector_handler => unreachable!(),
_ = server_handler => {},
}
}
async fn listen_for_new_inspectors(
mut register_inspector_rx: UnboundedReceiver<InspectorInfo>,
inspector_map: Rc<RefCell<HashMap<Uuid, InspectorInfo>>>,
) {
while let Some(info) = register_inspector_rx.next().await {
eprintln!(
"Debugger listening on {}",
info.get_websocket_debugger_url(&info.host.to_string())
);
eprintln!("Visit chrome://inspect to connect to the debugger.");
if info.wait_for_session {
eprintln!("nix-js is waiting for debugger to connect.");
}
if inspector_map.borrow_mut().insert(info.uuid, info).is_some() {
panic!("Inspector UUID already in map");
}
}
}
/// The pump future takes care of forwarding messages between the websocket
/// and channels. It resolves when either side disconnects, ignoring any
/// errors.
///
/// The future proxies messages sent and received on a WebSocket
/// to an UnboundedSender/UnboundedReceiver pair. We need these "unbounded" channel ends to sidestep
/// Tokio's task budget, which causes issues when JsRuntimeInspector::poll_sessions()
/// needs to block the thread because JavaScript execution is paused.
///
/// This works because UnboundedSender/UnboundedReceiver are implemented in the
/// 'futures' crate, therefore they can't participate in Tokio's cooperative
/// task yielding.
async fn pump_websocket_messages(
mut websocket: WebSocket<TokioIo<hyper::upgrade::Upgraded>>,
inbound_tx: UnboundedSender<String>,
mut outbound_rx: UnboundedReceiver<InspectorMsg>,
) {
'pump: loop {
tokio::select! {
Some(msg) = outbound_rx.next() => {
let msg = Frame::text(msg.content.into_bytes().into());
let _ = websocket.write_frame(msg).await;
}
Ok(msg) = websocket.read_frame() => {
match msg.opcode {
OpCode::Text => {
if let Ok(s) = String::from_utf8(msg.payload.to_vec()) {
let _ = inbound_tx.unbounded_send(s);
}
}
OpCode::Close => {
// Users don't care if there was an error coming from debugger,
// just about the fact that debugger did disconnect.
eprintln!("Debugger session ended");
break 'pump;
}
_ => {
// Ignore other messages.
}
}
}
else => {
break 'pump;
}
}
}
}
/// Inspector information that is sent from the isolate thread to the server
/// thread when a new inspector is created.
pub struct InspectorInfo {
pub host: SocketAddr,
pub uuid: Uuid,
pub thread_name: Option<String>,
pub new_session_tx: UnboundedSender<InspectorSessionProxy>,
pub deregister_rx: oneshot::Receiver<()>,
pub url: String,
pub wait_for_session: bool,
}
impl InspectorInfo {
pub fn new(
host: SocketAddr,
new_session_tx: mpsc::UnboundedSender<InspectorSessionProxy>,
deregister_rx: oneshot::Receiver<()>,
url: String,
wait_for_session: bool,
) -> Self {
Self {
host,
uuid: Uuid::new_v4(),
thread_name: thread::current().name().map(|n| n.to_owned()),
new_session_tx,
deregister_rx,
url,
wait_for_session,
}
}
fn get_json_metadata(&self, host: &Option<String>) -> Value {
let host_listen = format!("{}", self.host);
let host = host.as_ref().unwrap_or(&host_listen);
json!({
"description": "nix-js",
"devtoolsFrontendUrl": self.get_frontend_url(host),
"faviconUrl": "https://deno.land/favicon.ico",
"id": self.uuid.to_string(),
"title": self.get_title(),
"type": "node",
"url": self.url.to_string(),
"webSocketDebuggerUrl": self.get_websocket_debugger_url(host),
})
}
pub fn get_websocket_debugger_url(&self, host: &str) -> String {
format!("ws://{}/ws/{}", host, &self.uuid)
}
fn get_frontend_url(&self, host: &str) -> String {
format!(
"devtools://devtools/bundled/js_app.html?ws={}/ws/{}&experiments=true&v8only=true",
host, &self.uuid
)
}
fn get_title(&self) -> String {
format!(
"nix-js{} [pid: {}]",
self.thread_name
.as_ref()
.map(|n| format!(" - {n}"))
.unwrap_or_default(),
process::id(),
)
}
}
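The `/json` discovery endpoint advertises one `ws://<host>/ws/<uuid>` URL per registered inspector, which is what DevTools dials and what `handle_ws_request` parses back out of the request path. The URL shape from `get_websocket_debugger_url`, as a self-contained sketch:

```rust
fn main() {
    // Mirrors InspectorInfo::get_websocket_debugger_url.
    fn ws_url(host: &str, uuid: &str) -> String {
        format!("ws://{}/ws/{}", host, uuid)
    }
    let url = ws_url("127.0.0.1:9229", "123e4567-e89b-12d3-a456-426614174000");
    assert_eq!(
        url,
        "ws://127.0.0.1:9229/ws/123e4567-e89b-12d3-a456-426614174000"
    );
    // handle_ws_request recovers the UUID by stripping the "/ws/" prefix.
    assert!(url.contains("/ws/"));
    println!("ok");
}
```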

nix-js/src/runtime/ops.rs (1896 lines): diff suppressed because it is too large.

nix-js/src/store.rs (40 lines)

@@ -0,0 +1,40 @@
use crate::error::Result;
mod config;
mod daemon;
mod error;
mod validation;
pub use config::StoreConfig;
pub use daemon::DaemonStore;
pub use validation::validate_store_path;
pub trait Store: Send + Sync {
fn get_store_dir(&self) -> &str;
fn is_valid_path(&self, path: &str) -> Result<bool>;
fn ensure_path(&self, path: &str) -> Result<()>;
fn add_to_store(
&self,
name: &str,
content: &[u8],
recursive: bool,
references: Vec<String>,
) -> Result<String>;
fn add_to_store_from_path(
&self,
name: &str,
source_path: &std::path::Path,
references: Vec<String>,
) -> Result<String>;
fn add_text_to_store(
&self,
name: &str,
content: &str,
references: Vec<String>,
) -> Result<String>;
}
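A toy in-memory implementation makes the trait's contract concrete (hypothetical, not part of the crate; `Result` simplified to `std::io::Result` and the trait trimmed to two methods):

```rust
use std::collections::HashMap;

// Simplified stand-in for crate::error::Result.
type Result<T> = std::io::Result<T>;

trait Store {
    fn get_store_dir(&self) -> &str;
    fn is_valid_path(&self, path: &str) -> Result<bool>;
}

struct MemStore {
    paths: HashMap<String, Vec<u8>>,
}

impl Store for MemStore {
    fn get_store_dir(&self) -> &str {
        "/nix/store"
    }
    // A path is "valid" once its contents have been registered.
    fn is_valid_path(&self, path: &str) -> Result<bool> {
        Ok(self.paths.contains_key(path))
    }
}

fn main() {
    let mut store = MemStore { paths: HashMap::new() };
    store.paths.insert("/nix/store/abc-hello".into(), b"hi".to_vec());
    assert_eq!(store.get_store_dir(), "/nix/store");
    assert!(store.is_valid_path("/nix/store/abc-hello").unwrap());
    assert!(!store.is_valid_path("/nix/store/missing").unwrap());
    println!("ok");
}
```

This mirrors how `DaemonStore::add_to_store` uses `is_valid_path` as a cheap dedup check before uploading a NAR.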


@@ -0,0 +1,22 @@
use std::path::PathBuf;
#[derive(Debug, Clone)]
pub struct StoreConfig {
pub daemon_socket: PathBuf,
}
impl StoreConfig {
pub fn from_env() -> Self {
let daemon_socket = std::env::var("NIX_DAEMON_SOCKET")
.map(PathBuf::from)
.unwrap_or_else(|_| PathBuf::from("/nix/var/nix/daemon-socket/socket"));
Self { daemon_socket }
}
}
impl Default for StoreConfig {
fn default() -> Self {
Self::from_env()
}
}

nix-js/src/store/daemon.rs (773 lines)

@@ -0,0 +1,773 @@
use std::io::{Error as IoError, ErrorKind as IoErrorKind, Result as IoResult};
use std::path::Path;
use nix_compat::nix_daemon::types::{AddToStoreNarRequest, UnkeyedValidPathInfo};
use nix_compat::nix_daemon::worker_protocol::{ClientSettings, Operation};
use nix_compat::store_path::StorePath;
use nix_compat::wire::ProtocolVersion;
use nix_compat::wire::de::{NixRead, NixReader};
use nix_compat::wire::ser::{NixSerialize, NixWrite, NixWriter, NixWriterBuilder};
use num_enum::{IntoPrimitive, TryFromPrimitive};
use thiserror::Error;
use tokio::io::{AsyncReadExt, AsyncWriteExt, ReadHalf, WriteHalf, split};
use tokio::net::UnixStream;
use tokio::sync::Mutex;
use super::Store;
use crate::error::{Error, Result};
pub struct DaemonStore {
runtime: tokio::runtime::Runtime,
connection: NixDaemonConnection,
}
impl DaemonStore {
pub fn connect(socket_path: &Path) -> Result<Self> {
let runtime = tokio::runtime::Runtime::new()
.map_err(|e| Error::internal(format!("Failed to create tokio runtime: {}", e)))?;
let connection = runtime.block_on(async {
NixDaemonConnection::connect(socket_path)
.await
.map_err(|e| {
Error::internal(format!(
"Failed to connect to nix-daemon at {}: {}",
socket_path.display(),
e
))
})
})?;
Ok(Self {
runtime,
connection,
})
}
fn block_on<F>(&self, future: F) -> F::Output
where
F: std::future::Future,
{
self.runtime.block_on(future)
}
}
impl Store for DaemonStore {
fn get_store_dir(&self) -> &str {
"/nix/store"
}
fn is_valid_path(&self, path: &str) -> Result<bool> {
self.block_on(async {
self.connection
.is_valid_path(path)
.await
.map_err(|e| Error::internal(format!("Daemon error in is_valid_path: {}", e)))
})
}
fn ensure_path(&self, path: &str) -> Result<()> {
self.block_on(async {
self.connection.ensure_path(path).await.map_err(|e| {
Error::eval_error(
format!(
"builtins.storePath: path '{}' is not valid in nix store: {}",
path, e
),
None,
)
})
})
}
fn add_to_store(
&self,
name: &str,
content: &[u8],
recursive: bool,
references: Vec<String>,
) -> Result<String> {
use std::fs;
use nix_compat::nix_daemon::types::AddToStoreNarRequest;
use nix_compat::nixhash::{CAHash, NixHash};
use nix_compat::store_path::{StorePath, build_ca_path};
use sha2::{Digest, Sha256};
use tempfile::NamedTempFile;
let temp_file = NamedTempFile::new()
.map_err(|e| Error::internal(format!("Failed to create temp file: {}", e)))?;
fs::write(temp_file.path(), content)
.map_err(|e| Error::internal(format!("Failed to write temp file: {}", e)))?;
let nar_data = crate::nar::pack_nar(temp_file.path())?;
let nar_hash_hex = {
let mut hasher = Sha256::new();
hasher.update(&nar_data);
hex::encode(hasher.finalize())
};
let nar_hash_bytes = hex::decode(&nar_hash_hex)
.map_err(|e| Error::internal(format!("Invalid nar hash: {}", e)))?;
let mut nar_hash_arr = [0u8; 32];
nar_hash_arr.copy_from_slice(&nar_hash_bytes);
let ca_hash = if recursive {
CAHash::Nar(NixHash::Sha256(nar_hash_arr))
} else {
let mut content_hasher = Sha256::new();
content_hasher.update(content);
let content_hash = content_hasher.finalize();
let mut content_hash_arr = [0u8; 32];
content_hash_arr.copy_from_slice(&content_hash);
CAHash::Flat(NixHash::Sha256(content_hash_arr))
};
let ref_store_paths: std::result::Result<Vec<StorePath<String>>, _> = references
.iter()
.map(|r| StorePath::<String>::from_absolute_path(r.as_bytes()))
.collect();
let ref_store_paths = ref_store_paths
.map_err(|e| Error::internal(format!("Invalid reference path: {}", e)))?;
let store_path: StorePath<String> =
build_ca_path(name, &ca_hash, references.clone(), false)
.map_err(|e| Error::internal(format!("Failed to build store path: {}", e)))?;
let store_path_str = store_path.to_absolute_path();
if self.is_valid_path(&store_path_str)? {
return Ok(store_path_str);
}
let request = AddToStoreNarRequest {
path: store_path,
deriver: None,
nar_hash: unsafe {
std::mem::transmute::<[u8; 32], nix_compat::nix_daemon::types::NarHash>(
nar_hash_arr,
)
},
references: ref_store_paths,
registration_time: 0,
nar_size: nar_data.len() as u64,
ultimate: false,
signatures: vec![],
ca: Some(ca_hash),
repair: false,
dont_check_sigs: false,
};
self.block_on(async {
self.connection
.add_to_store_nar(request, &nar_data)
.await
.map_err(|e| Error::internal(format!("Failed to add to store: {}", e)))
})?;
Ok(store_path_str)
}
fn add_to_store_from_path(
&self,
name: &str,
source_path: &std::path::Path,
references: Vec<String>,
) -> Result<String> {
use nix_compat::nix_daemon::types::AddToStoreNarRequest;
use nix_compat::nixhash::{CAHash, NixHash};
use nix_compat::store_path::{StorePath, build_ca_path};
use sha2::{Digest, Sha256};
let nar_data = crate::nar::pack_nar(source_path)?;
let nar_hash: [u8; 32] = {
let mut hasher = Sha256::new();
hasher.update(&nar_data);
hasher.finalize().into()
};
let ca_hash = CAHash::Nar(NixHash::Sha256(nar_hash));
let ref_store_paths: std::result::Result<Vec<StorePath<String>>, _> = references
.iter()
.map(|r| StorePath::<String>::from_absolute_path(r.as_bytes()))
.collect();
let ref_store_paths = ref_store_paths
.map_err(|e| Error::internal(format!("Invalid reference path: {}", e)))?;
let store_path: StorePath<String> =
build_ca_path(name, &ca_hash, references.clone(), false)
.map_err(|e| Error::internal(format!("Failed to build store path: {}", e)))?;
let store_path_str = store_path.to_absolute_path();
if self.is_valid_path(&store_path_str)? {
return Ok(store_path_str);
}
let request = AddToStoreNarRequest {
path: store_path,
deriver: None,
nar_hash: unsafe {
std::mem::transmute::<[u8; 32], nix_compat::nix_daemon::types::NarHash>(nar_hash)
},
references: ref_store_paths,
registration_time: 0,
nar_size: nar_data.len() as u64,
ultimate: false,
signatures: vec![],
ca: Some(ca_hash),
repair: false,
dont_check_sigs: false,
};
self.block_on(async {
self.connection
.add_to_store_nar(request, &nar_data)
.await
.map_err(|e| Error::internal(format!("Failed to add to store: {}", e)))
})?;
Ok(store_path_str)
}
fn add_text_to_store(
&self,
name: &str,
content: &str,
references: Vec<String>,
) -> Result<String> {
use std::fs;
use nix_compat::nix_daemon::types::AddToStoreNarRequest;
use nix_compat::nixhash::CAHash;
use nix_compat::store_path::{StorePath, build_text_path};
use sha2::{Digest, Sha256};
use tempfile::NamedTempFile;
let temp_file = NamedTempFile::new()
.map_err(|e| Error::internal(format!("Failed to create temp file: {}", e)))?;
fs::write(temp_file.path(), content.as_bytes())
.map_err(|e| Error::internal(format!("Failed to write temp file: {}", e)))?;
let nar_data = crate::nar::pack_nar(temp_file.path())?;
let nar_hash: [u8; 32] = {
let mut hasher = Sha256::new();
hasher.update(&nar_data);
hasher.finalize().into()
};
let content_hash = {
let mut hasher = Sha256::new();
hasher.update(content.as_bytes());
hasher.finalize().into()
};
let ref_store_paths: std::result::Result<Vec<StorePath<String>>, _> = references
.iter()
.map(|r| StorePath::<String>::from_absolute_path(r.as_bytes()))
.collect();
let ref_store_paths = ref_store_paths
.map_err(|e| Error::internal(format!("Invalid reference path: {}", e)))?;
let store_path: StorePath<String> = build_text_path(name, content, references.clone())
.map_err(|e| Error::internal(format!("Failed to build text store path: {}", e)))?;
let store_path_str = store_path.to_absolute_path();
if self.is_valid_path(&store_path_str)? {
return Ok(store_path_str);
}
let request = AddToStoreNarRequest {
path: store_path,
deriver: None,
// SAFETY: assumes NarHash is layout-compatible with [u8; 32]
nar_hash: unsafe {
std::mem::transmute::<[u8; 32], nix_compat::nix_daemon::types::NarHash>(nar_hash)
},
references: ref_store_paths,
registration_time: 0,
nar_size: nar_data.len() as u64,
ultimate: false,
signatures: vec![],
ca: Some(CAHash::Text(content_hash)),
repair: false,
dont_check_sigs: false,
};
self.block_on(async {
self.connection
.add_to_store_nar(request, &nar_data)
.await
.map_err(|e| Error::internal(format!("Failed to add text to store: {}", e)))
})?;
Ok(store_path_str)
}
}
const PROTOCOL_VERSION: ProtocolVersion = ProtocolVersion::from_parts(1, 37);
// Protocol magic numbers (from nix-compat worker_protocol.rs)
const WORKER_MAGIC_1: u64 = 0x6e697863; // "nixc"
const WORKER_MAGIC_2: u64 = 0x6478696f; // "dxio"
const STDERR_LAST: u64 = 0x616c7473; // "alts"
const STDERR_ERROR: u64 = 0x63787470; // "cxtp"
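The ASCII tags in the comments can be checked mechanically: the low four bytes of each constant, read big-endian, spell the tag. A standalone sanity check (constant values taken from the code above):

```rust
fn main() {
    // Low 4 bytes of each magic, read big-endian, spell the ASCII tag.
    assert_eq!(&0x6e697863u64.to_be_bytes()[4..], b"nixc"); // WORKER_MAGIC_1
    assert_eq!(&0x6478696fu64.to_be_bytes()[4..], b"dxio"); // WORKER_MAGIC_2
    assert_eq!(&0x616c7473u64.to_be_bytes()[4..], b"alts"); // STDERR_LAST
    assert_eq!(&0x63787470u64.to_be_bytes()[4..], b"cxtp"); // STDERR_ERROR
}
```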
/// Performs the client handshake with a nix-daemon server
///
/// This is the client-side counterpart to `server_handshake_client`.
/// It exchanges magic numbers, negotiates protocol version, and sends client settings.
async fn client_handshake<RW>(
conn: &mut RW,
client_settings: &ClientSettings,
) -> IoResult<ProtocolVersion>
where
RW: AsyncReadExt + AsyncWriteExt + Unpin,
{
// 1. Send magic number 1
conn.write_u64_le(WORKER_MAGIC_1).await?;
// 2. Receive magic number 2
let magic2 = conn.read_u64_le().await?;
if magic2 != WORKER_MAGIC_2 {
return Err(IoError::new(
IoErrorKind::InvalidData,
format!("Invalid magic number from server: {}", magic2),
));
}
// 3. Receive server protocol version
let server_version_raw = conn.read_u64_le().await?;
let server_version: ProtocolVersion = server_version_raw.try_into().map_err(|e| {
IoError::new(
IoErrorKind::InvalidData,
format!("Invalid protocol version: {}", e),
)
})?;
// 4. Send our protocol version
conn.write_u64_le(PROTOCOL_VERSION.into()).await?;
// Pick the minimum version
let protocol_version = std::cmp::min(PROTOCOL_VERSION, server_version);
// 5. Send obsolete fields based on protocol version
if protocol_version.minor() >= 14 {
// CPU affinity (obsolete, send 0)
conn.write_u64_le(0).await?;
}
if protocol_version.minor() >= 11 {
// Reserve space (obsolete, send 0)
conn.write_u64_le(0).await?;
}
if protocol_version.minor() >= 33 {
// Read Nix version string
let version_len = conn.read_u64_le().await? as usize;
let mut version_bytes = vec![0u8; version_len];
conn.read_exact(&mut version_bytes).await?;
// Padding
let padding = (8 - (version_len % 8)) % 8;
if padding > 0 {
let mut pad = vec![0u8; padding];
conn.read_exact(&mut pad).await?;
}
}
if protocol_version.minor() >= 35 {
// Read trust level
let _trust = conn.read_u64_le().await?;
}
// 6. Read STDERR_LAST
let stderr_last = conn.read_u64_le().await?;
if stderr_last != STDERR_LAST {
return Err(IoError::new(
IoErrorKind::InvalidData,
format!("Expected STDERR_LAST, got: {}", stderr_last),
));
}
// 7. Send SetOptions operation with client settings
conn.write_u64_le(Operation::SetOptions.into()).await?;
conn.flush().await?;
// Serialize client settings
let mut settings_buf = Vec::new();
{
let mut writer = NixWriterBuilder::default()
.set_version(protocol_version)
.build(&mut settings_buf);
writer.write_value(client_settings).await?;
writer.flush().await?;
}
conn.write_all(&settings_buf).await?;
conn.flush().await?;
// 8. Read response to SetOptions
let response = conn.read_u64_le().await?;
if response != STDERR_LAST {
return Err(IoError::new(
IoErrorKind::InvalidData,
format!("Expected STDERR_LAST after SetOptions, got: {}", response),
));
}
Ok(protocol_version)
}
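Step 5 reads the daemon's version string and then skips padding up to the next 8-byte boundary. The padding rule used there, pulled out as a standalone sketch:

```rust
// Nix daemon wire format: string payloads are padded to 8-byte alignment.
fn wire_padding(len: usize) -> usize {
    (8 - (len % 8)) % 8
}

fn main() {
    assert_eq!(wire_padding(0), 0); // already aligned
    assert_eq!(wire_padding(5), 3);
    assert_eq!(wire_padding(8), 0); // exact multiple needs no pad
    assert_eq!(wire_padding(9), 7);
}
```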
/// Low-level Nix Daemon client
///
/// This struct manages communication with a nix-daemon using the wire protocol.
/// It is NOT thread-safe and should be wrapped in a Mutex for concurrent access.
pub struct NixDaemonClient {
protocol_version: ProtocolVersion,
reader: NixReader<ReadHalf<UnixStream>>,
writer: NixWriter<WriteHalf<UnixStream>>,
_marker: std::marker::PhantomData<std::cell::Cell<()>>,
}
impl NixDaemonClient {
/// Connect to a nix-daemon at the given Unix socket path
pub async fn connect(socket_path: &Path) -> IoResult<Self> {
let stream = UnixStream::connect(socket_path).await?;
Self::from_stream(stream).await
}
/// Create a client from an existing Unix stream
pub async fn from_stream(mut stream: UnixStream) -> IoResult<Self> {
let client_settings = ClientSettings::default();
// Perform handshake
let protocol_version = client_handshake(&mut stream, &client_settings).await?;
// Split stream into reader and writer
let (read_half, write_half) = split(stream);
let reader = NixReader::builder()
.set_version(protocol_version)
.build(read_half);
let writer = NixWriterBuilder::default()
.set_version(protocol_version)
.build(write_half);
Ok(Self {
protocol_version,
reader,
writer,
_marker: Default::default(),
})
}
/// Execute an operation with a single parameter
async fn execute_with<P, T>(&mut self, operation: Operation, param: &P) -> IoResult<T>
where
P: NixSerialize + Send,
T: nix_compat::wire::de::NixDeserialize,
{
// Send operation
self.writer.write_value(&operation).await?;
// Send parameter
self.writer.write_value(param).await?;
self.writer.flush().await?;
self.read_response().await
}
/// Read a response from the daemon
///
/// The daemon sends either:
/// - STDERR_LAST followed by the result
/// - STDERR_ERROR followed by a structured error
async fn read_response<T>(&mut self) -> IoResult<T>
where
T: nix_compat::wire::de::NixDeserialize,
{
loop {
let msg = self.reader.read_number().await?;
if msg == STDERR_LAST {
let result: T = self.reader.read_value().await?;
return Ok(result);
} else if msg == STDERR_ERROR {
let error_msg = self.read_daemon_error().await?;
return Err(IoError::other(error_msg));
} else {
let _data: String = self.reader.read_value().await?;
continue;
}
}
}
async fn read_daemon_error(&mut self) -> IoResult<NixDaemonError> {
let type_marker: String = self.reader.read_value().await?;
assert_eq!(type_marker, "Error");
let level = NixDaemonErrorLevel::try_from_primitive(
self.reader
.read_number()
.await?
.try_into()
.map_err(|_| IoError::other("invalid nix-daemon error level"))?,
)
.map_err(|_| IoError::other("invalid nix-daemon error level"))?;
// error "name" field: unused by modern Nix but still present on the wire
let _name: String = self.reader.read_value().await?;
let msg: String = self.reader.read_value().await?;
let have_pos: u64 = self.reader.read_number().await?;
assert_eq!(have_pos, 0);
let nr_traces: u64 = self.reader.read_number().await?;
let mut traces = Vec::new();
for _ in 0..nr_traces {
let _trace_pos: u64 = self.reader.read_number().await?;
let trace_hint: String = self.reader.read_value().await?;
traces.push(trace_hint);
}
Ok(NixDaemonError { level, msg, traces })
}
/// Check if a path is valid in the store
pub async fn is_valid_path(&mut self, path: &str) -> IoResult<bool> {
let store_path = StorePath::<String>::from_absolute_path(path.as_bytes())
.map_err(|e| IoError::new(IoErrorKind::InvalidInput, e.to_string()))?;
self.execute_with(Operation::IsValidPath, &store_path).await
}
/// Query information about a store path
#[allow(dead_code)]
pub async fn query_path_info(&mut self, path: &str) -> IoResult<Option<UnkeyedValidPathInfo>> {
let store_path = StorePath::<String>::from_absolute_path(path.as_bytes())
.map_err(|e| IoError::new(IoErrorKind::InvalidInput, e.to_string()))?;
self.writer.write_value(&Operation::QueryPathInfo).await?;
self.writer.write_value(&store_path).await?;
self.writer.flush().await?;
loop {
let msg = self.reader.read_number().await?;
if msg == STDERR_LAST {
let has_value: bool = self.reader.read_value().await?;
if has_value {
use nix_compat::narinfo::Signature;
use nix_compat::nixhash::CAHash;
let deriver = self.reader.read_value().await?;
let nar_hash: String = self.reader.read_value().await?;
let references = self.reader.read_value().await?;
let registration_time = self.reader.read_value().await?;
let nar_size = self.reader.read_value().await?;
let ultimate = self.reader.read_value().await?;
let signatures: Vec<Signature<String>> = self.reader.read_value().await?;
let ca: Option<CAHash> = self.reader.read_value().await?;
let value = UnkeyedValidPathInfo {
deriver,
nar_hash,
references,
registration_time,
nar_size,
ultimate,
signatures,
ca,
};
return Ok(Some(value));
} else {
return Ok(None);
}
} else if msg == STDERR_ERROR {
let error_msg = self.read_daemon_error().await?;
return Err(IoError::other(error_msg));
} else {
let _data: String = self.reader.read_value().await?;
continue;
}
}
}
/// Ensure a path is available in the store
pub async fn ensure_path(&mut self, path: &str) -> IoResult<()> {
let store_path = StorePath::<String>::from_absolute_path(path.as_bytes())
.map_err(|e| IoError::new(IoErrorKind::InvalidInput, e.to_string()))?;
self.writer.write_value(&Operation::EnsurePath).await?;
self.writer.write_value(&store_path).await?;
self.writer.flush().await?;
loop {
let msg = self.reader.read_number().await?;
if msg == STDERR_LAST {
return Ok(());
} else if msg == STDERR_ERROR {
let error_msg = self.read_daemon_error().await?;
return Err(IoError::other(error_msg));
} else {
let _data: String = self.reader.read_value().await?;
continue;
}
}
}
/// Query which paths are valid
#[allow(dead_code)]
pub async fn query_valid_paths(&mut self, paths: Vec<String>) -> IoResult<Vec<String>> {
let store_paths: IoResult<Vec<StorePath<String>>> = paths
.iter()
.map(|p| {
StorePath::<String>::from_absolute_path(p.as_bytes())
.map_err(|e| IoError::new(IoErrorKind::InvalidInput, e.to_string()))
})
.collect();
let store_paths = store_paths?;
// Send operation
self.writer.write_value(&Operation::QueryValidPaths).await?;
// Manually serialize the request since QueryValidPaths doesn't impl NixSerialize
// QueryValidPaths = { paths: Vec<StorePath>, substitute: bool }
self.writer.write_value(&store_paths).await?;
// For protocol >= 1.27, send substitute flag
if self.protocol_version.minor() >= 27 {
self.writer.write_value(&false).await?;
}
self.writer.flush().await?;
let result: Vec<StorePath<String>> = self.read_response().await?;
Ok(result.into_iter().map(|p| p.to_absolute_path()).collect())
}
/// Add a NAR to the store
pub async fn add_to_store_nar(
&mut self,
request: AddToStoreNarRequest,
nar_data: &[u8],
) -> IoResult<()> {
tracing::debug!(
"add_to_store_nar: path={}, nar_size={}",
request.path.to_absolute_path(),
request.nar_size,
);
self.writer.write_value(&Operation::AddToStoreNar).await?;
self.writer.write_value(&request.path).await?;
self.writer.write_value(&request.deriver).await?;
let nar_hash_hex = hex::encode(request.nar_hash.as_ref());
self.writer.write_value(&nar_hash_hex).await?;
self.writer.write_value(&request.references).await?;
self.writer.write_value(&request.registration_time).await?;
self.writer.write_value(&request.nar_size).await?;
self.writer.write_value(&request.ultimate).await?;
self.writer.write_value(&request.signatures).await?;
self.writer.write_value(&request.ca).await?;
self.writer.write_value(&request.repair).await?;
self.writer.write_value(&request.dont_check_sigs).await?;
if self.protocol_version.minor() >= 23 {
self.writer.write_number(nar_data.len() as u64).await?;
self.writer.write_all(nar_data).await?;
self.writer.write_number(0u64).await?;
} else {
self.writer.write_slice(nar_data).await?;
}
self.writer.flush().await?;
loop {
let msg = self.reader.read_number().await?;
if msg == STDERR_LAST {
return Ok(());
} else if msg == STDERR_ERROR {
let error_msg = self.read_daemon_error().await?;
return Err(IoError::other(error_msg));
} else {
let _data: String = self.reader.read_value().await?;
continue;
}
}
}
}
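For protocol >= 1.23, `add_to_store_nar` sends the NAR as a framed stream: a u64 length, the payload, then a zero-length frame as terminator. A sketch of the single-frame encoding the code above emits (byte order assumed little-endian, matching the rest of the handshake):

```rust
// Framed-stream encoding for protocol >= 1.23: length, payload, zero terminator.
fn frame(payload: &[u8]) -> Vec<u8> {
    let mut out = Vec::new();
    out.extend_from_slice(&(payload.len() as u64).to_le_bytes());
    out.extend_from_slice(payload);
    out.extend_from_slice(&0u64.to_le_bytes()); // zero-length frame ends the stream
    out
}

fn main() {
    let f = frame(b"nar");
    assert_eq!(&f[..8], &3u64.to_le_bytes());
    assert_eq!(&f[8..11], b"nar");
    assert_eq!(&f[11..], &0u64.to_le_bytes());
}
```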
/// Thread-safe wrapper around NixDaemonClient
pub struct NixDaemonConnection {
client: Mutex<NixDaemonClient>,
}
impl NixDaemonConnection {
/// Connect to a nix-daemon at the given socket path
pub async fn connect(socket_path: &Path) -> IoResult<Self> {
let client = NixDaemonClient::connect(socket_path).await?;
Ok(Self {
client: Mutex::new(client),
})
}
/// Check if a path is valid in the store
pub async fn is_valid_path(&self, path: &str) -> IoResult<bool> {
let mut client = self.client.lock().await;
client.is_valid_path(path).await
}
/// Query information about a store path
#[allow(dead_code)]
pub async fn query_path_info(&self, path: &str) -> IoResult<Option<UnkeyedValidPathInfo>> {
let mut client = self.client.lock().await;
client.query_path_info(path).await
}
/// Ensure a path is available in the store
pub async fn ensure_path(&self, path: &str) -> IoResult<()> {
let mut client = self.client.lock().await;
client.ensure_path(path).await
}
/// Query which paths are valid
#[allow(dead_code)]
pub async fn query_valid_paths(&self, paths: Vec<String>) -> IoResult<Vec<String>> {
let mut client = self.client.lock().await;
client.query_valid_paths(paths).await
}
/// Add a NAR to the store
pub async fn add_to_store_nar(
&self,
request: AddToStoreNarRequest,
nar_data: &[u8],
) -> IoResult<()> {
let mut client = self.client.lock().await;
client.add_to_store_nar(request, nar_data).await
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, IntoPrimitive, TryFromPrimitive)]
#[repr(u8)]
pub enum NixDaemonErrorLevel {
Error = 0,
Warn,
Notice,
Info,
Talkative,
Chatty,
Debug,
Vomit,
}
#[derive(Debug, Error)]
#[error("{msg}")]
pub struct NixDaemonError {
level: NixDaemonErrorLevel,
msg: String,
traces: Vec<String>,
}

nix-js/src/store/error.rs

@@ -0,0 +1,34 @@
#![allow(dead_code)]
use std::fmt;
#[derive(Debug)]
pub enum StoreError {
DaemonConnectionFailed(String),
OperationFailed(String),
InvalidPath(String),
PathNotFound(String),
Io(std::io::Error),
}
impl fmt::Display for StoreError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
StoreError::DaemonConnectionFailed(msg) => {
write!(f, "Failed to connect to nix-daemon: {}", msg)
}
StoreError::OperationFailed(msg) => write!(f, "Store operation failed: {}", msg),
StoreError::InvalidPath(msg) => write!(f, "Invalid store path: {}", msg),
StoreError::PathNotFound(path) => write!(f, "Path not found in store: {}", path),
StoreError::Io(e) => write!(f, "I/O error: {}", e),
}
}
}
impl std::error::Error for StoreError {}
impl From<std::io::Error> for StoreError {
fn from(e: std::io::Error) -> Self {
StoreError::Io(e)
}
}


@@ -0,0 +1,153 @@
use crate::error::{Error, Result};
pub fn validate_store_path(store_dir: &str, path: &str) -> Result<()> {
if !path.starts_with(store_dir) {
return Err(Error::eval_error(
format!("path '{}' is not in the Nix store", path),
None,
));
}
let relative = path
.strip_prefix(store_dir)
.and_then(|s| s.strip_prefix('/'))
.ok_or_else(|| Error::eval_error(format!("invalid store path format: {}", path), None))?;
if relative.is_empty() {
return Err(Error::eval_error(
format!("store path cannot be store directory itself: {}", path),
None,
));
}
let parts: Vec<&str> = relative.splitn(2, '-').collect();
if parts.len() != 2 {
return Err(Error::eval_error(
format!("invalid store path format (missing name): {}", path),
None,
));
}
let hash = parts[0];
let name = parts[1];
if hash.len() != 32 {
return Err(Error::eval_error(
format!(
"invalid store path hash length (expected 32, got {}): {}",
hash.len(),
hash
),
None,
));
}
for ch in hash.chars() {
if !matches!(ch, '0'..='9' | 'a'..='d' | 'f'..='n' | 'p'..='s' | 'v'..='z') {
return Err(Error::eval_error(
format!("invalid character '{}' in store path hash: {}", ch, hash),
None,
));
}
}
if name.is_empty() {
return Err(Error::eval_error(
format!("store path has empty name: {}", path),
None,
));
}
if name.starts_with('.') {
return Err(Error::eval_error(
format!("store path name cannot start with '.': {}", name),
None,
));
}
for ch in name.chars() {
if !matches!(ch, '0'..='9' | 'a'..='z' | 'A'..='Z' | '+' | '-' | '.' | '_' | '?' | '=') {
return Err(Error::eval_error(
format!("invalid character '{}' in store path name: {}", ch, name),
None,
));
}
}
Ok(())
}
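The two hash character classes above together admit exactly the Nix base32 alphabet, which deliberately omits `e`, `o`, `u`, and `t` (to avoid spelling accidental words in store paths). As a standalone check:

```rust
// The Nix base32 alphabet, matching the character ranges accepted above.
const NIX_BASE32: &str = "0123456789abcdfghijklmnpqrsvwxyz";

fn main() {
    assert_eq!(NIX_BASE32.len(), 32);
    // The letters e, o, u, t are intentionally excluded.
    for c in ['e', 'o', 'u', 't'] {
        assert!(!NIX_BASE32.contains(c));
    }
}
```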
#[cfg(test)]
mod tests {
use super::*;
#[test_log::test]
fn test_valid_store_paths() {
let store_dir = "/nix/store";
let valid_paths = vec![
"/nix/store/0123456789abcdfghijklmnpqrsvwxyz-hello",
"/nix/store/abcdfghijklmnpqrsvwxyz0123456789-hello-1.0",
"/nix/store/00000000000000000000000000000000-test_+-.?=",
];
for path in valid_paths {
assert!(
validate_store_path(store_dir, path).is_ok(),
"Expected {} to be valid, got {:?}",
path,
validate_store_path(store_dir, path)
);
}
}
#[test_log::test]
fn test_invalid_store_paths() {
let store_dir = "/nix/store";
let invalid_paths = vec![
("/tmp/foo", "not in store"),
("/nix/store", "empty relative"),
("/nix/store/tooshort-name", "hash too short"),
(
"/nix/store/abc123defghijklmnopqrstuvwxyz123-name",
"hash too long",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd123e-name",
"e in hash",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd123o-name",
"o in hash",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd123u-name",
"u in hash",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd123t-name",
"t in hash",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd1234-.name",
"name starts with dot",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd1234-na/me",
"slash in name",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd1234",
"missing name",
),
];
for (path, reason) in invalid_paths {
assert!(
validate_store_path(store_dir, path).is_err(),
"Expected {} to be invalid ({})",
path,
reason
);
}
}
}


@@ -0,0 +1,209 @@
use std::collections::{BTreeMap, BTreeSet, VecDeque};
pub enum StringContextElem {
Opaque { path: String },
DrvDeep { drv_path: String },
Built { drv_path: String, output: String },
}
impl StringContextElem {
pub fn decode(encoded: &str) -> Self {
if let Some(drv_path) = encoded.strip_prefix('=') {
StringContextElem::DrvDeep {
drv_path: drv_path.to_string(),
}
} else if let Some(rest) = encoded.strip_prefix('!') {
if let Some(second_bang) = rest.find('!') {
let output = rest[..second_bang].to_string();
let drv_path = rest[second_bang + 1..].to_string();
StringContextElem::Built { drv_path, output }
} else {
StringContextElem::Opaque {
path: encoded.to_string(),
}
}
} else {
StringContextElem::Opaque {
path: encoded.to_string(),
}
}
}
}
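The decoder recognizes three context-string shapes: `=<drv>` for a deep derivation reference, `!<output>!<drv>` for a built output, and anything else (including a lone `!` with no second bang) as an opaque path. A minimal classifier mirroring that logic, with made-up store paths:

```rust
// Mirrors StringContextElem::decode's prefix handling.
fn classify(encoded: &str) -> &'static str {
    if encoded.starts_with('=') {
        "DrvDeep"
    } else if let Some(rest) = encoded.strip_prefix('!') {
        // A second '!' separates the output name from the drv path.
        if rest.contains('!') { "Built" } else { "Opaque" }
    } else {
        "Opaque"
    }
}

fn main() {
    assert_eq!(classify("/nix/store/abc-src"), "Opaque");
    assert_eq!(classify("=/nix/store/abc-hello.drv"), "DrvDeep");
    assert_eq!(classify("!out!/nix/store/abc-hello.drv"), "Built");
}
```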
pub type InputDrvs = BTreeMap<String, BTreeSet<String>>;
pub type Srcs = BTreeSet<String>;
pub fn extract_input_drvs_and_srcs(context: &[String]) -> Result<(InputDrvs, Srcs), String> {
let mut input_drvs: BTreeMap<String, BTreeSet<String>> = BTreeMap::new();
let mut input_srcs: BTreeSet<String> = BTreeSet::new();
for encoded in context {
match StringContextElem::decode(encoded) {
StringContextElem::Opaque { path } => {
input_srcs.insert(path);
}
StringContextElem::DrvDeep { drv_path } => {
compute_fs_closure(&drv_path, &mut input_drvs, &mut input_srcs)?;
}
StringContextElem::Built { drv_path, output } => {
input_drvs.entry(drv_path).or_default().insert(output);
}
}
}
Ok((input_drvs, input_srcs))
}
fn compute_fs_closure(
drv_path: &str,
input_drvs: &mut BTreeMap<String, BTreeSet<String>>,
input_srcs: &mut BTreeSet<String>,
) -> Result<(), String> {
let mut queue: VecDeque<String> = VecDeque::new();
let mut visited: BTreeSet<String> = BTreeSet::new();
queue.push_back(drv_path.to_string());
while let Some(current_path) = queue.pop_front() {
if visited.contains(&current_path) {
continue;
}
visited.insert(current_path.clone());
input_srcs.insert(current_path.clone());
if !current_path.ends_with(".drv") {
continue;
}
let content = std::fs::read_to_string(&current_path)
.map_err(|e| format!("failed to read derivation {}: {}", current_path, e))?;
let inputs = parse_derivation_inputs(&content)
.ok_or_else(|| format!("failed to parse derivation {}", current_path))?;
for src in inputs.input_srcs {
input_srcs.insert(src.clone());
if !visited.contains(&src) {
queue.push_back(src);
}
}
for (dep_drv, outputs) in inputs.input_drvs {
input_srcs.insert(dep_drv.clone());
let entry = input_drvs.entry(dep_drv.clone()).or_default();
for output in outputs {
entry.insert(output);
}
if !visited.contains(&dep_drv) {
queue.push_back(dep_drv);
}
}
}
Ok(())
}
struct DerivationInputs {
input_drvs: Vec<(String, Vec<String>)>,
input_srcs: Vec<String>,
}
fn parse_derivation_inputs(aterm: &str) -> Option<DerivationInputs> {
let aterm = aterm.strip_prefix("Derive([")?;
let mut bracket_count: i32 = 1;
let mut pos = 0;
let bytes = aterm.as_bytes();
while pos < bytes.len() && bracket_count > 0 {
match bytes[pos] {
b'[' => bracket_count += 1,
b']' => bracket_count -= 1,
_ => {}
}
pos += 1;
}
if bracket_count != 0 {
return None;
}
let rest = &aterm[pos..];
let rest = rest.strip_prefix(",[")?;
let mut input_drvs = Vec::new();
let mut bracket_count: i32 = 1;
let mut start = 0;
pos = 0;
let bytes = rest.as_bytes();
while pos < bytes.len() && bracket_count > 0 {
match bytes[pos] {
b'[' => bracket_count += 1,
b']' => bracket_count -= 1,
b'(' if bracket_count == 1 => {
start = pos;
}
b')' if bracket_count == 1 => {
let entry = &rest[start + 1..pos];
if let Some((drv_path, outputs)) = parse_input_drv_entry(entry) {
input_drvs.push((drv_path, outputs));
}
}
_ => {}
}
pos += 1;
}
let rest = &rest[pos..];
let rest = rest.strip_prefix(",[")?;
let mut input_srcs = Vec::new();
bracket_count = 1;
pos = 0;
let bytes = rest.as_bytes();
while pos < bytes.len() && bracket_count > 0 {
match bytes[pos] {
b'[' => bracket_count += 1,
b']' => bracket_count -= 1,
b'"' if bracket_count == 1 => {
pos += 1;
let src_start = pos;
while pos < bytes.len() && bytes[pos] != b'"' {
if bytes[pos] == b'\\' && pos + 1 < bytes.len() {
pos += 2;
} else {
pos += 1;
}
}
let src = std::str::from_utf8(&bytes[src_start..pos]).ok()?;
input_srcs.push(src.to_string());
}
_ => {}
}
pos += 1;
}
Some(DerivationInputs {
input_drvs,
input_srcs,
})
}
fn parse_input_drv_entry(entry: &str) -> Option<(String, Vec<String>)> {
let entry = entry.strip_prefix('"')?;
let quote_end = entry.find('"')?;
let drv_path = entry[..quote_end].to_string();
let rest = &entry[quote_end + 1..];
let rest = rest.strip_prefix(",[")?;
let rest = rest.strip_suffix(']')?;
let mut outputs = Vec::new();
for part in rest.split(',') {
let part = part.trim();
if let Some(name) = part.strip_prefix('"').and_then(|s| s.strip_suffix('"')) {
outputs.push(name.to_string());
}
}
Some((drv_path, outputs))
}
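A self-contained check of the entry parser above, applied to a hypothetical ATerm fragment of the form `"<drv-path>",["<out>",...]`:

```rust
// Same logic as parse_input_drv_entry above, reproduced for a runnable check.
fn parse_input_drv_entry(entry: &str) -> Option<(String, Vec<String>)> {
    let entry = entry.strip_prefix('"')?;
    let quote_end = entry.find('"')?;
    let drv_path = entry[..quote_end].to_string();
    let rest = &entry[quote_end + 1..];
    let rest = rest.strip_prefix(",[")?;
    let rest = rest.strip_suffix(']')?;
    let mut outputs = Vec::new();
    for part in rest.split(',') {
        if let Some(name) = part.trim().strip_prefix('"').and_then(|s| s.strip_suffix('"')) {
            outputs.push(name.to_string());
        }
    }
    Some((drv_path, outputs))
}

fn main() {
    let (drv, outs) =
        parse_input_drv_entry(r#""/nix/store/abc-hello.drv",["out","dev"]"#).unwrap();
    assert_eq!(drv, "/nix/store/abc-hello.drv");
    assert_eq!(outs, vec!["out", "dev"]);
}
```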


@@ -1,94 +1,85 @@
use core::fmt::{Debug, Display, Formatter, Result as FmtResult};
use core::hash::Hash;
use core::ops::Deref;
+use std::borrow::Cow;
use std::collections::BTreeMap;
use std::ops::DerefMut;
-use std::sync::LazyLock;
use derive_more::{Constructor, IsVariant, Unwrap};
-use regex::Regex;
/// Represents a Nix symbol, which is used as a key in attribute sets.
#[derive(Debug, Clone, Hash, PartialEq, Eq, PartialOrd, Ord, Constructor)]
-pub struct Symbol(String);
+pub struct Symbol<'a>(Cow<'a, str>);
-impl<T: Into<String>> From<T> for Symbol {
-   fn from(value: T) -> Self {
-       Symbol(value.into())
+pub type StaticSymbol = Symbol<'static>;
+impl From<String> for Symbol<'_> {
+   fn from(value: String) -> Self {
+       Symbol(Cow::Owned(value))
    }
}
+impl<'a> From<&'a str> for Symbol<'a> {
+   fn from(value: &'a str) -> Self {
+       Symbol(Cow::Borrowed(value))
+   }
+}
/// Formats a string slice as a Nix symbol, quoting it if necessary.
pub fn format_symbol<'a>(sym: impl Into<Cow<'a, str>>) -> Cow<'a, str> {
    let sym = sym.into();
-   if REGEX.is_match(&sym) {
+   if Symbol::NORMAL_REGEX.test(&sym) {
        sym
    } else {
-       Cow::Owned(format!(r#""{sym}""#))
+       Cow::Owned(escape_quote_string(&sym))
    }
}
-impl Display for Symbol {
+impl Display for Symbol<'_> {
    fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
        if self.normal() {
            write!(f, "{}", self.0)
        } else {
-           write!(f, r#""{}""#, self.0)
+           write!(f, "{}", escape_quote_string(&self.0))
        }
    }
}
-static REGEX: LazyLock<Regex> = LazyLock::new(|| {
-   Regex::new(r"^[a-zA-Z_][a-zA-Z0-9_'-]*$").expect("hardcoded regex is always valid")
-});
-impl Symbol {
+impl Symbol<'_> {
+   const NORMAL_REGEX: ere::Regex<1> = ere::compile_regex!("^[a-zA-Z_][a-zA-Z0-9_'-]*$");
    /// Checks if the symbol is a "normal" identifier that doesn't require quotes.
    fn normal(&self) -> bool {
-       REGEX.is_match(self)
+       Self::NORMAL_REGEX.test(self)
    }
}
-impl Deref for Symbol {
+impl Deref for Symbol<'_> {
    type Target = str;
    fn deref(&self) -> &Self::Target {
        &self.0
    }
}
-impl Symbol {
-   /// Consumes the `Symbol`, returning its inner `String`.
-   pub fn into_inner(self) -> String {
-       self.0
-   }
-   /// Returns a reference to the inner `String`.
-   pub fn as_inner(&self) -> &String {
-       &self.0
-   }
-}
/// Represents a Nix attribute set, which is a map from symbols to values.
-#[derive(Constructor, Clone, PartialEq)]
+#[derive(Constructor, Default, Clone, PartialEq)]
pub struct AttrSet {
-   data: BTreeMap<Symbol, Value>,
+   data: BTreeMap<StaticSymbol, Value>,
}
impl AttrSet {
    /// Gets a value by key (string or Symbol).
-   pub fn get(&self, key: impl Into<Symbol>) -> Option<&Value> {
+   pub fn get<'a, 'sym: 'a>(&'a self, key: impl Into<Symbol<'sym>>) -> Option<&'a Value> {
        self.data.get(&key.into())
    }
    /// Checks if a key exists in the attribute set.
-   pub fn contains_key(&self, key: impl Into<Symbol>) -> bool {
+   pub fn contains_key<'a, 'sym: 'a>(&'a self, key: impl Into<Symbol<'sym>>) -> bool {
        self.data.contains_key(&key.into())
    }
}
impl Deref for AttrSet {
-   type Target = BTreeMap<Symbol, Value>;
+   type Target = BTreeMap<StaticSymbol, Value>;
    fn deref(&self) -> &Self::Target {
        &self.data
    }
@@ -118,26 +109,52 @@ impl Debug for AttrSet {
impl Display for AttrSet {
    fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
        use Value::*;
-       write!(f, "{{ ")?;
-       let mut first = true;
-       for (k, v) in self.data.iter() {
-           if !first {
-               write!(f, "; ")?;
+       if self.data.len() > 1 {
+           writeln!(f, "{{")?;
+           for (k, v) in self.data.iter() {
+               write!(f, " {k} = ")?;
+               match v {
+                   List(_) => writeln!(f, "[ ... ];")?,
+                   AttrSet(_) => writeln!(f, "{{ ... }};")?,
+                   v => writeln!(f, "{v};")?,
+               }
            }
-           write!(f, "{k} = ")?;
-           match v {
-               AttrSet(_) => write!(f, "{{ ... }}"),
-               List(_) => write!(f, "[ ... ]"),
-               v => write!(f, "{v}"),
-           }?;
-           first = false;
+           write!(f, "}}")
+       } else {
+           write!(f, "{{")?;
+           for (k, v) in self.data.iter() {
+               write!(f, " {k} = ")?;
+               match v {
+                   List(_) => write!(f, "[ ... ];")?,
+                   AttrSet(_) => write!(f, "{{ ... }};")?,
+                   v => write!(f, "{v};")?,
+               }
+           }
+           write!(f, " }}")
+       }
    }
}
impl AttrSet {
pub fn display_compat(&self) -> AttrSetCompatDisplay<'_> {
AttrSetCompatDisplay(self)
}
}
pub struct AttrSetCompatDisplay<'a>(&'a AttrSet);
impl Display for AttrSetCompatDisplay<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
write!(f, "{{")?;
for (k, v) in self.0.data.iter() {
write!(f, " {k} = {};", v.display_compat())?;
}
write!(f, " }}")
}
}
/// Represents a Nix list, which is a vector of values.
-#[derive(Constructor, Clone, Debug, PartialEq)]
+#[derive(Constructor, Default, Clone, Debug, PartialEq)]
pub struct List {
    data: Vec<Value>,
}
@@ -155,10 +172,45 @@ impl DerefMut for List {
}
impl Display for List {
    fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
+       use Value::*;
+       if self.data.len() > 1 {
+           writeln!(f, "[")?;
+           for v in self.data.iter() {
+               match v {
+                   List(_) => writeln!(f, " [ ... ]")?,
+                   AttrSet(_) => writeln!(f, " {{ ... }}")?,
+                   v => writeln!(f, " {v}")?,
+               }
+           }
+           write!(f, "]")
+       } else {
+           write!(f, "[ ")?;
+           for v in self.data.iter() {
+               match v {
+                   List(_) => write!(f, "[ ... ] ")?,
+                   AttrSet(_) => write!(f, "{{ ... }} ")?,
+                   v => write!(f, "{v} ")?,
+               }
+           }
+           write!(f, "]")
+       }
    }
}
+impl List {
+   pub fn display_compat(&self) -> ListCompatDisplay<'_> {
+       ListCompatDisplay(self)
+   }
+}
+pub struct ListCompatDisplay<'a>(&'a List);
+impl Display for ListCompatDisplay<'_> {
+   fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
        write!(f, "[ ")?;
-       for v in self.data.iter() {
-           write!(f, "{v} ")?;
+       for v in self.0.data.iter() {
+           write!(f, "{} ", v.display_compat())?;
        }
        write!(f, "]")
    }
@@ -177,6 +229,8 @@ pub enum Value {
Null,
/// A string value.
String(String),
/// A path value (absolute path string).
Path(String),
/// An attribute set.
AttrSet(AttrSet),
/// A list.
@@ -194,22 +248,125 @@ pub enum Value {
Repeated,
}
fn escape_quote_string(s: &str) -> String {
let mut ret = String::with_capacity(s.len() + 2);
ret.push('"');
let mut iter = s.chars().peekable();
while let Some(c) = iter.next() {
match c {
'\\' => ret.push_str("\\\\"),
'"' => ret.push_str("\\\""),
'\n' => ret.push_str("\\n"),
'\r' => ret.push_str("\\r"),
'\t' => ret.push_str("\\t"),
'$' if iter.peek() == Some(&'{') => ret.push_str("\\$"),
c => ret.push(c),
}
}
ret.push('"');
ret
}
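The escaper above handles Nix string syntax, including escaping `$` only when it starts an interpolation (`${`). A runnable check of that behavior (same logic, illustrative inputs):

```rust
// Mirrors escape_quote_string above: quote a string for Nix output,
// escaping backslash, quote, newline, and a '$' that begins "${".
fn escape_quote_string(s: &str) -> String {
    let mut ret = String::with_capacity(s.len() + 2);
    ret.push('"');
    let mut iter = s.chars().peekable();
    while let Some(c) = iter.next() {
        match c {
            '\\' => ret.push_str("\\\\"),
            '"' => ret.push_str("\\\""),
            '\n' => ret.push_str("\\n"),
            '\r' => ret.push_str("\\r"),
            '\t' => ret.push_str("\\t"),
            '$' if iter.peek() == Some(&'{') => ret.push_str("\\$"),
            c => ret.push(c),
        }
    }
    ret.push('"');
    ret
}

fn main() {
    assert_eq!(escape_quote_string("a\"b"), "\"a\\\"b\"");
    assert_eq!(escape_quote_string("${x}"), "\"\\${x}\""); // interpolation escaped
    assert_eq!(escape_quote_string("$x"), "\"$x\""); // bare '$' left alone
}
```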
/// Format a float matching C's `printf("%g", x)` with default precision 6.
fn fmt_nix_float(f: &mut Formatter<'_>, x: f64) -> FmtResult {
if !x.is_finite() {
return write!(f, "{x}");
}
if x == 0.0 {
return if x.is_sign_negative() {
write!(f, "-0")
} else {
write!(f, "0")
};
}
let precision: i32 = 6;
let exp = x.abs().log10().floor() as i32;
let formatted = if exp >= -4 && exp < precision {
let decimal_places = (precision - 1 - exp) as usize;
format!("{x:.decimal_places$}")
} else {
let sig_digits = (precision - 1) as usize;
let s = format!("{x:.sig_digits$e}");
let (mantissa, exp_part) = s
.split_once('e')
.expect("scientific notation must contain 'e'");
let (sign, digits) = if let Some(d) = exp_part.strip_prefix('-') {
("-", d)
} else if let Some(d) = exp_part.strip_prefix('+') {
("+", d)
} else {
("+", exp_part)
};
if digits.len() < 2 {
format!("{mantissa}e{sign}0{digits}")
} else {
format!("{mantissa}e{sign}{digits}")
}
};
if formatted.contains('.') {
if let Some(e_pos) = formatted.find('e') {
let trimmed = formatted[..e_pos]
.trim_end_matches('0')
.trim_end_matches('.');
write!(f, "{}{}", trimmed, &formatted[e_pos..])
} else {
let trimmed = formatted.trim_end_matches('0').trim_end_matches('.');
write!(f, "{trimmed}")
}
} else {
write!(f, "{formatted}")
}
}
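The final branch of `fmt_nix_float` implements `%g`'s trailing-zero trimming: zeros after the decimal point are dropped, then a bare trailing point, and a mantissa without a point is left untouched. That step in isolation:

```rust
// %g-style trimming, guarded on the presence of a decimal point
// exactly as in fmt_nix_float above.
fn trim_g(s: &str) -> &str {
    if s.contains('.') {
        s.trim_end_matches('0').trim_end_matches('.')
    } else {
        s
    }
}

fn main() {
    assert_eq!(trim_g("0.500000"), "0.5");
    assert_eq!(trim_g("3.000000"), "3"); // the bare '.' goes too
    assert_eq!(trim_g("100"), "100"); // no '.', left untouched
}
```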
impl Display for Value {
    fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
        use Value::*;
        match self {
            &Int(x) => write!(f, "{x}"),
-           &Float(x) => write!(f, "{x}"),
+           &Float(x) => fmt_nix_float(f, x),
            &Bool(x) => write!(f, "{x}"),
            Null => write!(f, "null"),
-           String(x) => write!(f, r#""{x}""#),
+           String(x) => write!(f, "{}", escape_quote_string(x)),
+           Path(x) => write!(f, "{x}"),
            AttrSet(x) => write!(f, "{x}"),
            List(x) => write!(f, "{x}"),
            Thunk => write!(f, "«code»"),
            Func => write!(f, "«lambda»"),
            PrimOp(name) => write!(f, "«primop {name}»"),
            PrimOpApp(name) => write!(f, "«partially applied primop {name}»"),
-           Repeated => write!(f, "<REPEATED>"),
+           Repeated => write!(f, "«repeated»"),
        }
    }
}
impl Value {
pub fn display_compat(&self) -> ValueCompatDisplay<'_> {
ValueCompatDisplay(self)
}
}
pub struct ValueCompatDisplay<'a>(&'a Value);
impl Display for ValueCompatDisplay<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
use Value::*;
match self.0 {
&Int(x) => write!(f, "{x}"),
&Float(x) => fmt_nix_float(f, x),
&Bool(x) => write!(f, "{x}"),
Null => write!(f, "null"),
String(x) => write!(f, "{}", escape_quote_string(x)),
Path(x) => write!(f, "{x}"),
AttrSet(x) => write!(f, "{}", x.display_compat()),
List(x) => write!(f, "{}", x.display_compat()),
Thunk => write!(f, "«thunk»"),
Func => write!(f, "<LAMBDA>"),
PrimOp(_) => write!(f, "<PRIMOP>"),
PrimOpApp(_) => write!(f, "<PRIMOP-APP>"),
Repeated => write!(f, "«repeated»"),
}
}
}
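
fmt_nix_float above mirrors printf-style %g formatting: fixed-point notation when the decimal exponent falls in [-4, precision), scientific notation otherwise, with trailing zeros trimmed afterwards. A minimal standalone sketch of the fixed-point side, assuming a precision of 6 and adding a zero guard (log10(0) is -inf) that the excerpt does not show; `fmt_g` is a hypothetical helper, not the crate's API:

```rust
/// %g-style formatting sketch: fixed notation for exponents in [-4, 6),
/// otherwise scientific; trailing zeros and a bare '.' are trimmed.
fn fmt_g(x: f64) -> String {
    const PRECISION: i32 = 6;
    // Guard zero: log10(0) is -inf and would poison the exponent (assumption;
    // the excerpt above does not show how zero is handled).
    let exp = if x == 0.0 { 0 } else { x.abs().log10().floor() as i32 };
    if (-4..PRECISION).contains(&exp) {
        let dp = (PRECISION - 1 - exp) as usize;
        let s = format!("{x:.dp$}");
        if s.contains('.') {
            s.trim_end_matches('0').trim_end_matches('.').to_string()
        } else {
            s
        }
    } else {
        // Scientific branch: Rust prints e.g. "1e7", so real code must
        // normalize the exponent sign and padding as fmt_nix_float does.
        let sd = (PRECISION - 1) as usize;
        format!("{x:.sd$e}")
    }
}

fn main() {
    assert_eq!(fmt_g(0.25), "0.25");
    assert_eq!(fmt_g(100.0), "100");
    assert_eq!(fmt_g(0.0), "0");
    println!("ok");
}
```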

View File

@@ -1,149 +0,0 @@
mod utils;
use nix_js::value::{List, Value};
use utils::eval;
#[test]
fn builtins_accessible() {
let result = eval("builtins");
assert!(matches!(result, Value::AttrSet(_)));
}
#[test]
fn builtins_self_reference() {
let result = eval("builtins.builtins");
assert!(matches!(result, Value::AttrSet(_)));
}
#[test]
fn builtins_add() {
assert_eq!(eval("builtins.add 1 2"), Value::Int(3));
}
#[test]
fn builtins_length() {
assert_eq!(eval("builtins.length [1 2 3]"), Value::Int(3));
}
#[test]
fn builtins_map() {
assert_eq!(
eval("builtins.map (x: x * 2) [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(4), Value::Int(6)]))
);
}
#[test]
fn builtins_filter() {
assert_eq!(
eval("builtins.filter (x: x > 1) [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(3)]))
);
}
#[test]
fn builtins_attrnames() {
let result = eval("builtins.attrNames { a = 1; b = 2; }");
assert!(matches!(result, Value::List(_)));
if let Value::List(list) = result {
assert_eq!(format!("{:?}", list).matches(',').count() + 1, 2);
}
}
#[test]
fn builtins_head() {
assert_eq!(eval("builtins.head [1 2 3]"), Value::Int(1));
}
#[test]
fn builtins_tail() {
assert_eq!(
eval("builtins.tail [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(3)]))
);
}
#[test]
fn builtins_in_let() {
assert_eq!(eval("let b = builtins; in b.add 5 3"), Value::Int(8));
}
#[test]
fn builtins_in_with() {
assert_eq!(eval("with builtins; add 10 20"), Value::Int(30));
}
#[test]
fn builtins_nested_calls() {
assert_eq!(
eval("builtins.add (builtins.mul 2 3) (builtins.sub 10 5)"),
Value::Int(11)
);
}
#[test]
fn builtins_is_list() {
assert_eq!(eval("builtins.isList [1 2 3]"), Value::Bool(true));
}
#[test]
fn builtins_is_attrs() {
assert_eq!(eval("builtins.isAttrs { a = 1; }"), Value::Bool(true));
}
#[test]
fn builtins_is_function() {
assert_eq!(eval("builtins.isFunction (x: x)"), Value::Bool(true));
}
#[test]
fn builtins_is_null() {
assert_eq!(eval("builtins.isNull null"), Value::Bool(true));
}
#[test]
fn builtins_is_bool() {
assert_eq!(eval("builtins.isBool true"), Value::Bool(true));
}
#[test]
fn builtins_shadowing() {
assert_eq!(
eval("let builtins = { add = x: y: x - y; }; in builtins.add 5 3"),
Value::Int(2)
);
}
#[test]
fn builtins_lazy_evaluation() {
let result = eval("builtins.builtins.builtins.add 1 1");
assert_eq!(result, Value::Int(2));
}
#[test]
fn builtins_foldl() {
assert_eq!(
eval("builtins.foldl' (acc: x: acc + x) 0 [1 2 3 4 5]"),
Value::Int(15)
);
}
#[test]
fn builtins_elem() {
assert_eq!(eval("builtins.elem 2 [1 2 3]"), Value::Bool(true));
assert_eq!(eval("builtins.elem 5 [1 2 3]"), Value::Bool(false));
}
#[test]
fn builtins_concat_lists() {
assert_eq!(
eval("builtins.concatLists [[1 2] [3 4] [5]]"),
Value::List(List::new(vec![
Value::Int(1),
Value::Int(2),
Value::Int(3),
Value::Int(4),
Value::Int(5)
]))
);
}

View File

@@ -1,102 +0,0 @@
mod utils;
use nix_js::context::Context;
use nix_js::value::Value;
#[test]
fn import_absolute_path() {
let mut ctx = Context::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let lib_path = temp_dir.path().join("nix_test_lib.nix");
std::fs::write(&lib_path, "{ add = a: b: a + b; }").unwrap();
let expr = format!(r#"(import "{}").add 3 5"#, lib_path.display());
assert_eq!(ctx.eval_code(&expr).unwrap(), Value::Int(8));
}
#[test]
fn import_nested() {
let mut ctx = Context::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let lib_path = temp_dir.path().join("lib.nix");
std::fs::write(&lib_path, "{ add = a: b: a + b; }").unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = format!(
r#"let lib = import {}; in {{ result = lib.add 10 20; }}"#,
lib_path.display()
);
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"(import "{}").result"#, main_path.display());
assert_eq!(ctx.eval_code(&expr).unwrap(), Value::Int(30));
}
#[test]
fn import_relative_path() {
let mut ctx = Context::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let subdir = temp_dir.path().join("subdir");
std::fs::create_dir_all(&subdir).unwrap();
let lib_path = temp_dir.path().join("lib.nix");
std::fs::write(&lib_path, "{ multiply = a: b: a * b; }").unwrap();
let helper_path = subdir.join("helper.nix");
std::fs::write(&helper_path, "{ subtract = a: b: a - b; }").unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = r#"
let
lib = import ./lib.nix;
helper = import ./subdir/helper.nix;
in {
result1 = lib.multiply 3 4;
result2 = helper.subtract 10 3;
}
"#;
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"let x = import "{}"; in x.result1"#, main_path.display());
assert_eq!(ctx.eval_code(&expr).unwrap(), Value::Int(12));
let expr = format!(r#"let x = import "{}"; in x.result2"#, main_path.display());
assert_eq!(ctx.eval_code(&expr).unwrap(), Value::Int(7));
}
#[test]
fn import_returns_function() {
let mut ctx = Context::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let func_path = temp_dir.path().join("nix_test_func.nix");
std::fs::write(&func_path, "x: x * 2").unwrap();
let expr = format!(r#"(import "{}") 5"#, func_path.display());
assert_eq!(ctx.eval_code(&expr).unwrap(), Value::Int(10));
}
#[test]
fn import_with_complex_dependency_graph() {
let mut ctx = Context::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let utils_path = temp_dir.path().join("utils.nix");
std::fs::write(&utils_path, "{ double = x: x * 2; }").unwrap();
let math_path = temp_dir.path().join("math.nix");
let math_content = r#"let utils = import ./utils.nix; in { triple = x: x + utils.double x; }"#;
std::fs::write(&math_path, math_content).unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = r#"let math = import ./math.nix; in math.triple 5"#;
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"import "{}""#, main_path.display());
assert_eq!(ctx.eval_code(&expr).unwrap(), Value::Int(15));
}
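
The relative-import tests above depend on one rule: `import ./lib.nix` resolves against the directory of the importing file, not the process working directory. A minimal sketch of that rule, under the assumption that no canonicalization is needed; `resolve_import` is a hypothetical helper, not the crate's actual API:

```rust
use std::path::{Path, PathBuf};

/// Resolve an imported path literal against the file that contains it:
/// absolute paths pass through, relative paths are joined onto the
/// importer's parent directory.
fn resolve_import(importer: &Path, target: &str) -> PathBuf {
    let t = Path::new(target);
    if t.is_absolute() {
        return t.to_path_buf();
    }
    // Drop a leading "./" so the joined path stays clean.
    let t = t.strip_prefix("./").unwrap_or(t);
    importer.parent().unwrap_or(Path::new("")).join(t)
}

fn main() {
    let main = Path::new("/tmp/x/main.nix");
    assert_eq!(resolve_import(main, "./lib.nix"), Path::new("/tmp/x/lib.nix"));
    assert_eq!(
        resolve_import(main, "./subdir/helper.nix"),
        Path::new("/tmp/x/subdir/helper.nix")
    );
    println!("ok");
}
```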

View File

@@ -1,290 +0,0 @@
use nix_js::context::Context;
use nix_js::value::Value;
fn eval(expr: &str) -> Value {
let mut ctx = Context::new().unwrap();
ctx.eval_code(expr).unwrap_or_else(|e| panic!("{}", e))
}
#[test]
fn hascontext_plain_string() {
let result = eval(r#"builtins.hasContext "hello""#);
assert_eq!(result, Value::Bool(false));
}
#[test]
fn hascontext_derivation_output() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
in builtins.hasContext (builtins.toString drv)
"#,
);
assert_eq!(result, Value::Bool(true));
}
#[test]
fn getcontext_plain_string() {
let result = eval(r#"builtins.getContext "hello""#);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.is_empty(), "Plain string should have empty context");
}
_ => panic!("Expected AttrSet, got {:?}", result),
}
}
#[test]
fn getcontext_derivation_output() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
str = builtins.toString drv;
ctx = builtins.getContext str;
in builtins.attrNames ctx
"#,
);
match result {
Value::List(list) => {
assert_eq!(list.len(), 1, "Should have exactly one context entry");
match list.first().unwrap() {
Value::String(s) => {
assert!(s.ends_with(".drv"), "Context key should be a .drv path");
}
other => panic!("Expected String, got {:?}", other),
}
}
_ => panic!("Expected List, got {:?}", result),
}
}
#[test]
fn unsafediscardstringcontext() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
strWithContext = builtins.toString drv;
strWithoutContext = builtins.unsafeDiscardStringContext strWithContext;
in builtins.hasContext strWithoutContext
"#,
);
assert_eq!(result, Value::Bool(false));
}
#[test]
fn unsafediscardstringcontext_preserves_value() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
strWithContext = builtins.toString drv;
strWithoutContext = builtins.unsafeDiscardStringContext strWithContext;
in strWithContext == strWithoutContext
"#,
);
assert_eq!(result, Value::Bool(true));
}
#[test]
fn appendcontext_basic() {
let result = eval(
r#"
let
str = builtins.appendContext "hello" {
"/nix/store/0000000000000000000000000000000-test.drv" = { outputs = ["out"]; };
};
in builtins.hasContext str
"#,
);
assert_eq!(result, Value::Bool(true));
}
#[test]
fn appendcontext_preserves_value() {
let result = eval(
r#"
let
str = builtins.appendContext "hello" {
"/nix/store/0000000000000000000000000000000-test.drv" = { outputs = ["out"]; };
};
in str == "hello"
"#,
);
assert_eq!(result, Value::Bool(true));
}
#[test]
fn string_concat_merges_context() {
let result = eval(
r#"
let
drv1 = derivation { name = "test1"; builder = "/bin/sh"; system = "x86_64-linux"; };
drv2 = derivation { name = "test2"; builder = "/bin/sh"; system = "x86_64-linux"; };
str1 = builtins.toString drv1;
str2 = builtins.toString drv2;
combined = str1 + " " + str2;
ctx = builtins.getContext combined;
in builtins.length (builtins.attrNames ctx)
"#,
);
assert_eq!(result, Value::Int(2));
}
#[test]
fn string_add_merges_context() {
let result = eval(
r#"
let
drv1 = derivation { name = "test1"; builder = "/bin/sh"; system = "x86_64-linux"; };
drv2 = derivation { name = "test2"; builder = "/bin/sh"; system = "x86_64-linux"; };
str1 = builtins.toString drv1;
str2 = builtins.toString drv2;
combined = str1 + " " + str2;
ctx = builtins.getContext combined;
in builtins.length (builtins.attrNames ctx)
"#,
);
assert_eq!(result, Value::Int(2));
}
#[test]
fn context_in_derivation_args() {
let result = eval(
r#"
let
dep = derivation { name = "dep"; builder = "/bin/sh"; system = "x86_64-linux"; };
drv = derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
args = [ ((builtins.toString dep) + "/bin/run") ];
};
in drv.drvPath
"#,
);
match result {
Value::String(s) => {
assert!(s.starts_with("/nix/store/"), "Should be a store path");
assert!(s.ends_with(".drv"), "Should be a .drv file");
}
_ => panic!("Expected String, got {:?}", result),
}
}
#[test]
fn context_in_derivation_env() {
let result = eval(
r#"
let
dep = derivation { name = "dep"; builder = "/bin/sh"; system = "x86_64-linux"; };
drv = derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
myDep = builtins.toString dep;
};
in drv.drvPath
"#,
);
match result {
Value::String(s) => {
assert!(s.starts_with("/nix/store/"), "Should be a store path");
assert!(s.ends_with(".drv"), "Should be a .drv file");
}
_ => panic!("Expected String, got {:?}", result),
}
}
#[test]
fn tostring_preserves_context() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
str = builtins.toString drv;
in builtins.hasContext str
"#,
);
assert_eq!(result, Value::Bool(true));
}
#[test]
fn interpolation_derivation_returns_outpath() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
in "${drv}"
"#,
);
match result {
Value::String(s) => {
assert!(s.starts_with("/nix/store/"), "Should be a store path");
assert!(s.ends_with("-test"), "Should end with derivation name");
}
_ => panic!("Expected String, got {:?}", result),
}
}
#[test]
fn interpolation_derivation_has_context() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
in builtins.hasContext "${drv}"
"#,
);
assert_eq!(result, Value::Bool(true));
}
#[test]
fn interpolation_derivation_context_correct() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
ctx = builtins.getContext "${drv}";
keys = builtins.attrNames ctx;
drvPath = builtins.head keys;
in ctx.${drvPath}.outputs
"#,
);
match result {
Value::List(list) => {
assert_eq!(list.len(), 1);
assert_eq!(list.first().unwrap(), &Value::String("out".to_string()));
}
_ => panic!("Expected List with ['out'], got {:?}", result),
}
}
#[test]
fn interpolation_multiple_derivations() {
let result = eval(
r#"
let
drv1 = derivation { name = "test1"; builder = "/bin/sh"; system = "x86_64-linux"; };
drv2 = derivation { name = "test2"; builder = "/bin/sh"; system = "x86_64-linux"; };
combined = "prefix-${drv1}-middle-${drv2}-suffix";
ctx = builtins.getContext combined;
in builtins.length (builtins.attrNames ctx)
"#,
);
assert_eq!(result, Value::Int(2));
}
#[test]
fn interpolation_derivation_equals_tostring() {
let result = eval(
r#"
let
drv = derivation { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; };
in "${drv}" == builtins.toString drv
"#,
);
assert_eq!(result, Value::Bool(true));
}

View File

@@ -1,34 +1,33 @@
mod utils;
use nix_js::value::Value;
use utils::eval;
#[test]
use crate::utils::{eval, eval_result};
#[test_log::test]
fn arithmetic() {
assert_eq!(eval("1 + 1"), Value::Int(2));
}
#[test]
#[test_log::test]
fn simple_function_application() {
assert_eq!(eval("(x: x) 1"), Value::Int(1));
}
#[test]
#[test_log::test]
fn curried_function() {
assert_eq!(eval("(x: y: x - y) 2 1"), Value::Int(1));
}
#[test]
#[test_log::test]
fn rec_attrset() {
assert_eq!(eval("rec { b = a; a = 1; }.b"), Value::Int(1));
}
#[test]
#[test_log::test]
fn let_binding() {
assert_eq!(eval("let b = a; a = 1; in b"), Value::Int(1));
}
#[test]
#[test_log::test]
fn fibonacci() {
assert_eq!(
eval(
@@ -38,7 +37,7 @@ fn fibonacci() {
);
}
#[test]
#[test_log::test]
fn fixed_point_combinator() {
assert_eq!(
eval("((f: let x = f x; in x)(self: { x = 1; y = self.x + 1; })).y"),
@@ -46,20 +45,25 @@ fn fixed_point_combinator() {
);
}
#[test]
#[test_log::test]
fn conditional_true() {
assert_eq!(eval("if true then 1 else 0"), Value::Int(1));
}
#[test]
#[test_log::test]
fn conditional_false() {
assert_eq!(eval("if false then 1 else 0"), Value::Int(0));
}
#[test]
#[test_log::test]
fn nested_let() {
assert_eq!(
eval("let x = 1; in let y = x + 1; z = y + 1; in z"),
Value::Int(3)
);
}
#[test_log::test]
fn rec_inherit_fails() {
assert!(eval_result("{ inherit x; }").is_err());
}

View File

@@ -0,0 +1,326 @@
use std::collections::BTreeMap;
use nix_js::value::{AttrSet, List, Value};
use crate::utils::eval;
#[test_log::test]
fn builtins_accessible() {
let result = eval("builtins");
assert!(matches!(result, Value::AttrSet(_)));
}
#[test_log::test]
fn builtins_self_reference() {
let result = eval("builtins.builtins");
assert!(matches!(result, Value::AttrSet(_)));
}
#[test_log::test]
fn builtins_add() {
assert_eq!(eval("builtins.add 1 2"), Value::Int(3));
}
#[test_log::test]
fn builtins_length() {
assert_eq!(eval("builtins.length [1 2 3]"), Value::Int(3));
}
#[test_log::test]
fn builtins_map() {
assert_eq!(
eval("builtins.map (x: x * 2) [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(4), Value::Int(6)]))
);
}
#[test_log::test]
fn builtins_filter() {
assert_eq!(
eval("builtins.filter (x: x > 1) [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(3)]))
);
}
#[test_log::test]
fn builtins_attrnames() {
let result = eval("builtins.attrNames { a = 1; b = 2; }");
assert!(matches!(result, Value::List(_)));
if let Value::List(list) = result {
assert_eq!(format!("{:?}", list).matches(',').count() + 1, 2);
}
}
#[test_log::test]
fn builtins_head() {
assert_eq!(eval("builtins.head [1 2 3]"), Value::Int(1));
}
#[test_log::test]
fn builtins_tail() {
assert_eq!(
eval("builtins.tail [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(3)]))
);
}
#[test_log::test]
fn builtins_in_let() {
assert_eq!(eval("let b = builtins; in b.add 5 3"), Value::Int(8));
}
#[test_log::test]
fn builtins_in_with() {
assert_eq!(eval("with builtins; add 10 20"), Value::Int(30));
}
#[test_log::test]
fn builtins_nested_calls() {
assert_eq!(
eval("builtins.add (builtins.mul 2 3) (builtins.sub 10 5)"),
Value::Int(11)
);
}
#[test_log::test]
fn builtins_is_list() {
assert_eq!(eval("builtins.isList [1 2 3]"), Value::Bool(true));
}
#[test_log::test]
fn builtins_is_attrs() {
assert_eq!(eval("builtins.isAttrs { a = 1; }"), Value::Bool(true));
}
#[test_log::test]
fn builtins_is_function() {
assert_eq!(eval("builtins.isFunction (x: x)"), Value::Bool(true));
}
#[test_log::test]
fn builtins_is_null() {
assert_eq!(eval("builtins.isNull null"), Value::Bool(true));
}
#[test_log::test]
fn builtins_is_bool() {
assert_eq!(eval("builtins.isBool true"), Value::Bool(true));
}
#[test_log::test]
fn builtins_shadowing() {
assert_eq!(
eval("let builtins = { add = x: y: x - y; }; in builtins.add 5 3"),
Value::Int(2)
);
}
#[test_log::test]
fn builtins_lazy_evaluation() {
let result = eval("builtins.builtins.builtins.add 1 1");
assert_eq!(result, Value::Int(2));
}
#[test_log::test]
fn builtins_foldl() {
assert_eq!(
eval("builtins.foldl' (acc: x: acc + x) 0 [1 2 3 4 5]"),
Value::Int(15)
);
}
#[test_log::test]
fn builtins_elem() {
assert_eq!(eval("builtins.elem 2 [1 2 3]"), Value::Bool(true));
assert_eq!(eval("builtins.elem 5 [1 2 3]"), Value::Bool(false));
}
#[test_log::test]
fn builtins_concat_lists() {
assert_eq!(
eval("builtins.concatLists [[1 2] [3 4] [5]]"),
Value::List(List::new(vec![
Value::Int(1),
Value::Int(2),
Value::Int(3),
Value::Int(4),
Value::Int(5)
]))
);
}
#[test_log::test]
fn builtins_compare_versions_basic() {
assert_eq!(
eval("builtins.compareVersions \"1.0\" \"2.3\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.1\" \"2.3\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3\" \"2.3\""),
Value::Int(0)
);
assert_eq!(
eval("builtins.compareVersions \"2.5\" \"2.3\""),
Value::Int(1)
);
assert_eq!(
eval("builtins.compareVersions \"3.1\" \"2.3\""),
Value::Int(1)
);
}
#[test_log::test]
fn builtins_compare_versions_components() {
assert_eq!(
eval("builtins.compareVersions \"2.3.1\" \"2.3\""),
Value::Int(1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3\" \"2.3.1\""),
Value::Int(-1)
);
}
#[test_log::test]
fn builtins_compare_versions_numeric_vs_alpha() {
// A numeric component compares greater than an alphabetic one
assert_eq!(
eval("builtins.compareVersions \"2.3.1\" \"2.3a\""),
Value::Int(1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3a\" \"2.3.1\""),
Value::Int(-1)
);
}
#[test_log::test]
fn builtins_compare_versions_pre() {
// "pre" is special: it sorts before every other component; two "pre"s fall through to the next component
assert_eq!(
eval("builtins.compareVersions \"2.3pre1\" \"2.3\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3pre3\" \"2.3pre12\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3pre1\" \"2.3c\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3pre1\" \"2.3q\""),
Value::Int(-1)
);
}
#[test_log::test]
fn builtins_compare_versions_alpha() {
// Alphabetic comparison
assert_eq!(
eval("builtins.compareVersions \"2.3a\" \"2.3c\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3c\" \"2.3a\""),
Value::Int(1)
);
}
#[test_log::test]
fn builtins_compare_versions_symmetry() {
// Test symmetry: compareVersions(a, b) == -compareVersions(b, a)
assert_eq!(
eval("builtins.compareVersions \"1.0\" \"2.3\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3\" \"1.0\""),
Value::Int(1)
);
}
#[test_log::test]
fn builtins_compare_versions_complex() {
// Complex version strings with multiple components
assert_eq!(
eval("builtins.compareVersions \"1.2.3.4\" \"1.2.3.5\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"1.2.10\" \"1.2.9\""),
Value::Int(1)
);
assert_eq!(
eval("builtins.compareVersions \"1.2a3\" \"1.2a10\""),
Value::Int(-1)
);
}
#[test_log::test]
fn builtins_generic_closure() {
assert_eq!(
eval(
"with builtins; length (genericClosure { startSet = [ { key = 1; } ]; operator = { key }: [ { key = key / 1.; } ]; a = 1; })"
),
Value::Int(1),
);
assert_eq!(
eval(
"with builtins; (elemAt (genericClosure { startSet = [ { key = 1; } ]; operator = { key }: [ { key = key / 1.; } ]; a = 1; }) 0).key"
),
Value::Int(1),
);
}
#[test_log::test]
fn builtins_function_args() {
assert_eq!(
eval("builtins.functionArgs (x: 1)"),
Value::AttrSet(AttrSet::default())
);
assert_eq!(
eval("builtins.functionArgs ({}: 1)"),
Value::AttrSet(AttrSet::default())
);
assert_eq!(
eval("builtins.functionArgs ({...}: 1)"),
Value::AttrSet(AttrSet::default())
);
assert_eq!(
eval("builtins.functionArgs ({a}: 1)"),
Value::AttrSet(AttrSet::new(BTreeMap::from([(
"a".into(),
Value::Bool(false)
)])))
);
assert_eq!(
eval("builtins.functionArgs ({a, b ? 1}: 1)"),
Value::AttrSet(AttrSet::new(BTreeMap::from([
("a".into(), Value::Bool(false)),
("b".into(), Value::Bool(true))
])))
);
assert_eq!(
eval("builtins.functionArgs ({a, b ? 1, ...}: 1)"),
Value::AttrSet(AttrSet::new(BTreeMap::from([
("a".into(), Value::Bool(false)),
("b".into(), Value::Bool(true))
])))
);
}
#[test_log::test]
fn builtins_parse_drv_name() {
let result = eval(r#"builtins.parseDrvName "nix-js-0.1.0pre""#).unwrap_attr_set();
assert_eq!(result.get("name"), Some(&Value::String("nix-js".into())));
assert_eq!(
result.get("version"),
Some(&Value::String("0.1.0pre".into()))
);
}
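
The compareVersions tests above encode an ordering over version components: split at dots and digit/letter boundaries, then "pre" sorts lowest, a missing component next, other alphabetic components lexicographically, and numeric components highest (compared as integers). A sketch of that ordering as the tests describe it, not CppNix's exact implementation; all helper names are hypothetical:

```rust
use std::cmp::Ordering;

/// Split a version into components at '.', '-' and digit/letter boundaries.
fn components(v: &str) -> Vec<String> {
    let mut out = Vec::new();
    let mut cur = String::new();
    for c in v.chars() {
        if c == '.' || c == '-' {
            if !cur.is_empty() { out.push(std::mem::take(&mut cur)); }
        } else {
            let switch = cur.chars().next()
                .map_or(false, |p| p.is_ascii_digit() != c.is_ascii_digit());
            if switch { out.push(std::mem::take(&mut cur)); }
            cur.push(c);
        }
    }
    if !cur.is_empty() { out.push(cur); }
    out
}

/// Component ordering: "pre" < missing ("") < alphabetic (lexicographic)
/// < numeric (compared as integers).
fn cmp_component(a: &str, b: &str) -> Ordering {
    let rank = |c: &str| match (c.parse::<u64>().is_ok(), c) {
        (true, _) => 3,
        (_, "") => 1,
        (_, "pre") => 0,
        _ => 2,
    };
    match rank(a).cmp(&rank(b)) {
        Ordering::Equal if rank(a) == 3 =>
            a.parse::<u64>().unwrap().cmp(&b.parse::<u64>().unwrap()),
        Ordering::Equal => a.cmp(b),
        o => o,
    }
}

fn compare_versions(v1: &str, v2: &str) -> i32 {
    let (a, b) = (components(v1), components(v2));
    for i in 0..a.len().max(b.len()) {
        let x = a.get(i).map_or("", |s| s.as_str());
        let y = b.get(i).map_or("", |s| s.as_str());
        match cmp_component(x, y) {
            Ordering::Less => return -1,
            Ordering::Greater => return 1,
            Ordering::Equal => {}
        }
    }
    0
}

fn main() {
    assert_eq!(compare_versions("2.3pre1", "2.3"), -1); // "pre" < missing
    assert_eq!(compare_versions("2.3.1", "2.3a"), 1);   // numeric > alpha
    assert_eq!(compare_versions("1.2.10", "1.2.9"), 1); // numeric, not lexical
    assert_eq!(compare_versions("2.3", "2.3"), 0);
    println!("ok");
}
```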

View File

@@ -0,0 +1,193 @@
use nix_js::value::Value;
use crate::utils::eval_result;
#[test_log::test]
fn to_file_simple() {
let result =
eval_result(r#"builtins.toFile "hello.txt" "Hello, World!""#).expect("Failed to evaluate");
match result {
Value::String(path) => {
assert!(path.contains("-hello.txt"));
assert!(std::path::Path::new(&path).exists());
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert_eq!(contents, "Hello, World!");
}
_ => panic!("Expected string, got {:?}", result),
}
}
#[test_log::test]
fn to_file_with_references() {
let result = eval_result(
r#"
let
dep = builtins.toFile "dep.txt" "dependency";
in
builtins.toFile "main.txt" "Reference: ${dep}"
"#,
)
.expect("Failed to evaluate");
match result {
Value::String(path) => {
assert!(path.contains("-main.txt"));
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert!(contents.contains("Reference: "));
assert!(contents.contains("-dep.txt"));
}
_ => panic!("Expected string"),
}
}
#[test_log::test]
fn to_file_invalid_name_with_slash() {
let result = eval_result(r#"builtins.toFile "foo/bar.txt" "content""#);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.to_string()
.contains("name cannot contain '/'")
);
}
#[test_log::test]
fn to_file_invalid_name_dot() {
let result = eval_result(r#"builtins.toFile "." "content""#);
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("invalid name"));
}
#[test_log::test]
fn to_file_invalid_name_dotdot() {
let result = eval_result(r#"builtins.toFile ".." "content""#);
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("invalid name"));
}
#[test_log::test]
fn store_path_validation_not_in_store() {
let result = eval_result(r#"builtins.storePath "/tmp/foo""#);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.to_string()
.contains("not in the Nix store")
);
}
#[test_log::test]
fn store_path_validation_malformed_hash() {
let dummy_file_result = eval_result(r#"builtins.toFile "dummy.txt" "content""#)
.expect("Failed to create dummy file");
let dummy_path = match dummy_file_result {
Value::String(ref p) => p.clone(),
_ => panic!("Expected string"),
};
let store_dir = std::path::Path::new(&dummy_path)
.parent()
.expect("Failed to get parent dir")
.to_str()
.expect("Failed to convert to string");
let test_path = format!("{}/invalid-hash-hello", store_dir);
let result = eval_result(&format!(r#"builtins.storePath "{}""#, test_path));
assert!(result.is_err());
let err_str = result.unwrap_err().to_string();
assert!(
err_str.contains("invalid") || err_str.contains("hash"),
"Expected hash validation error, got: {}",
err_str
);
}
#[test_log::test]
fn store_path_validation_missing_name() {
let dummy_file_result = eval_result(r#"builtins.toFile "dummy.txt" "content""#)
.expect("Failed to create dummy file");
let dummy_path = match dummy_file_result {
Value::String(ref p) => p.clone(),
_ => panic!("Expected string"),
};
let store_dir = std::path::Path::new(&dummy_path)
.parent()
.expect("Failed to get parent dir")
.to_str()
.expect("Failed to convert to string");
let test_path = format!("{}/abcd1234abcd1234abcd1234abcd1234", store_dir);
let result = eval_result(&format!(r#"builtins.storePath "{}""#, test_path));
assert!(result.is_err());
let err_str = result.unwrap_err().to_string();
assert!(
err_str.contains("missing name") || err_str.contains("format"),
"Expected missing name error, got: {}",
err_str
);
}
#[test_log::test]
fn to_file_curried_application() {
let result = eval_result(
r#"
let
makeFile = builtins.toFile "test.txt";
in
makeFile "test content"
"#,
)
.expect("Failed to evaluate");
match result {
Value::String(path) => {
assert!(path.contains("-test.txt"));
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert_eq!(contents, "test content");
}
_ => panic!("Expected string"),
}
}
#[test_log::test]
fn to_file_number_conversion() {
let result = eval_result(r#"builtins.toFile "number.txt" (builtins.toString 42)"#)
.expect("Failed to evaluate");
match result {
Value::String(path) => {
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert_eq!(contents, "42");
}
_ => panic!("Expected string"),
}
}
#[test_log::test]
fn to_file_list_conversion() {
let result = eval_result(
r#"builtins.toFile "list.txt" (builtins.concatStringsSep "\n" ["line1" "line2" "line3"])"#,
)
.expect("Failed to evaluate");
match result {
Value::String(path) => {
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert_eq!(contents, "line1\nline2\nline3");
}
_ => panic!("Expected string"),
}
}
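
The error cases above check three name rules for toFile: no '/' anywhere, and "." or ".." rejected outright. A minimal validator sketch covering just those rules; `check_store_name` is a hypothetical helper, and real store-name validation restricts more characters than this:

```rust
/// Name validation mirroring the errors the toFile tests expect:
/// "." / ".." are invalid, and '/' is forbidden anywhere in the name.
fn check_store_name(name: &str) -> Result<(), String> {
    if name.is_empty() || name == "." || name == ".." {
        return Err(format!("invalid name: '{name}'"));
    }
    if name.contains('/') {
        return Err("name cannot contain '/'".into());
    }
    Ok(())
}

fn main() {
    assert!(check_store_name("hello.txt").is_ok());
    assert!(check_store_name("foo/bar.txt")
        .unwrap_err()
        .contains("cannot contain '/'"));
    assert!(check_store_name("..").unwrap_err().contains("invalid name"));
    println!("ok");
}
```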

View File

@@ -1,14 +1,48 @@
use nix_js::context::Context;
use nix_js::value::Value;
#[test]
use crate::utils::{eval_deep, eval_deep_result};
#[test_log::test]
fn add_operator_preserves_derivation_context() {
let result = eval_deep(
r#"
let
dep = derivation { name = "dep"; builder = "/bin/sh"; system = "x86_64-linux"; outputs = ["out" "dev"]; };
getOutput = output: pkg: pkg.${output} or pkg.out or pkg;
user = derivation {
name = "user";
builder = "/bin/sh";
system = "x86_64-linux";
libPath = (getOutput "lib" dep) + "/lib";
devPath = dep.dev + "/include";
};
in user.drvPath
"#,
);
let nix_result = eval_deep(
r#"
let
dep = derivation { name = "dep"; builder = "/bin/sh"; system = "x86_64-linux"; outputs = ["out" "dev"]; };
user = derivation {
name = "user";
builder = "/bin/sh";
system = "x86_64-linux";
libPath = "${dep.out}/lib";
devPath = "${dep.dev}/include";
};
in user.drvPath
"#,
);
assert_eq!(result, nix_result);
}
#[test_log::test]
fn derivation_minimal() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation { name = "hello"; builder = "/bin/sh"; system = "x86_64-linux"; }"#,
)
.unwrap();
let result = eval_deep(
r#"derivation { name = "hello"; builder = "/bin/sh"; system = "x86_64-linux"; }"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -42,19 +76,16 @@ fn derivation_minimal() {
}
}
#[test]
#[test_log::test]
fn derivation_with_args() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
args = ["-c" "echo hello"];
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
args = ["-c" "echo hello"];
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -67,14 +98,11 @@ fn derivation_with_args() {
}
}
#[test]
#[test_log::test]
fn derivation_to_string() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"toString (derivation { name = "foo"; builder = "/bin/sh"; system = "x86_64-linux"; })"#,
)
.unwrap();
let result = eval_deep(
r#"toString (derivation { name = "foo"; builder = "/bin/sh"; system = "x86_64-linux"; })"#,
);
match result {
Value::String(s) => assert_eq!(s, "/nix/store/xpcvxsx5sw4rbq666blz6sxqlmsqphmr-foo"),
@@ -82,20 +110,19 @@ fn derivation_to_string() {
}
}
#[test]
#[test_log::test]
fn derivation_missing_name() {
let mut ctx = Context::new().unwrap();
let result = ctx.eval_code(r#"derivation { builder = "/bin/sh"; system = "x86_64-linux"; }"#);
let result =
eval_deep_result(r#"derivation { builder = "/bin/sh"; system = "x86_64-linux"; }"#);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("missing required attribute 'name'"));
}
#[test]
#[test_log::test]
fn derivation_invalid_name_with_drv_suffix() {
let mut ctx = Context::new().unwrap();
let result = ctx.eval_code(
let result = eval_deep_result(
r#"derivation { name = "foo.drv"; builder = "/bin/sh"; system = "x86_64-linux"; }"#,
);
@@ -104,40 +131,35 @@ fn derivation_invalid_name_with_drv_suffix() {
assert!(err_msg.contains("cannot end with .drv"));
}
#[test]
#[test_log::test]
fn derivation_missing_builder() {
let mut ctx = Context::new().unwrap();
let result = ctx.eval_code(r#"derivation { name = "test"; system = "x86_64-linux"; }"#);
let result = eval_deep_result(r#"derivation { name = "test"; system = "x86_64-linux"; }"#);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("missing required attribute 'builder'"));
}
#[test]
#[test_log::test]
fn derivation_missing_system() {
let mut ctx = Context::new().unwrap();
let result = ctx.eval_code(r#"derivation { name = "test"; builder = "/bin/sh"; }"#);
let result = eval_deep_result(r#"derivation { name = "test"; builder = "/bin/sh"; }"#);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("missing required attribute 'system'"));
}
#[test]
#[test_log::test]
fn derivation_with_env_vars() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
MY_VAR = "hello";
ANOTHER = "world";
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
MY_VAR = "hello";
ANOTHER = "world";
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -148,33 +170,29 @@ fn derivation_with_env_vars() {
}
}
#[test]
#[test_log::test]
fn derivation_strict() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"builtins.derivationStrict { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; }"#,
)
.unwrap();
let result = eval_deep(
r#"builtins.derivationStrict { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; }"#,
);
match result {
Value::AttrSet(attrs) => {
assert_eq!(attrs.get("type"), Some(&Value::String("derivation".into())));
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("out"));
assert!(!attrs.contains_key("type"));
assert!(!attrs.contains_key("outPath"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test]
#[test_log::test]
fn derivation_deterministic_paths() {
let mut ctx = Context::new().unwrap();
let expr = r#"derivation { name = "hello"; builder = "/bin/sh"; system = "x86_64-linux"; }"#;
let result1 = ctx.eval_code(expr).unwrap();
let result2 = ctx.eval_code(expr).unwrap();
let result1 = eval_deep(expr);
let result2 = eval_deep(expr);
match (result1, result2) {
(Value::AttrSet(attrs1), Value::AttrSet(attrs2)) => {
@@ -185,19 +203,16 @@ fn derivation_deterministic_paths() {
}
}
#[test]
#[test_log::test]
fn derivation_escaping_in_aterm() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
args = ["-c" "echo \"hello\nworld\""];
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
args = ["-c" "echo \"hello\nworld\""];
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -208,53 +223,34 @@ fn derivation_escaping_in_aterm() {
}
}
#[test]
#[test_log::test]
fn multi_output_two_outputs() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "multi";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
)
.unwrap();
let drv = eval_deep(
r#"derivation {
name = "multi";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
);
match result {
match drv {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("out"));
assert!(attrs.contains_key("dev"));
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("drvPath"));
// Verify exact paths match CppNix
if let Some(Value::String(drv_path)) = attrs.get("drvPath") {
assert_eq!(
drv_path,
"/nix/store/vmyjryfipkn9ss3ya23hk8p3m58l6dsl-multi.drv"
);
} else {
panic!("drvPath should be a string");
}
if let Some(Value::String(out_path)) = attrs.get("out") {
assert_eq!(
out_path,
"/nix/store/a3d95yg9d215c54n0ybr4npmpnj29229-multi"
panic!(
"drvPath should be a string, got: {:?}",
attrs.get("drvPath")
);
} else {
panic!("out should be a string");
}
if let Some(Value::String(dev_path)) = attrs.get("dev") {
assert_eq!(
dev_path,
"/nix/store/hq3b99lz71gwfq6x8lqwg14hf929q0d2-multi-dev"
);
} else {
panic!("dev should be a string");
}
if let Some(Value::String(out_path)) = attrs.get("outPath") {
@@ -262,26 +258,24 @@ fn multi_output_two_outputs() {
out_path,
"/nix/store/a3d95yg9d215c54n0ybr4npmpnj29229-multi"
);
assert_eq!(attrs.get("out"), Some(&Value::String(out_path.clone())));
} else {
panic!("outPath should be a string");
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test]
#[test_log::test]
fn multi_output_three_outputs() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "three";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev" "doc"];
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "three";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev" "doc"];
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -320,19 +314,16 @@ fn multi_output_three_outputs() {
}
}
#[test]
#[test_log::test]
fn multi_output_backward_compat() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "compat";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out"];
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "compat";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out"];
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -349,49 +340,41 @@ fn multi_output_backward_compat() {
}
}
#[test]
#[test_log::test]
fn multi_output_deterministic() {
let mut ctx = Context::new().unwrap();
let result1 = ctx
.eval_code(
r#"derivation {
name = "determ";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
)
.unwrap();
let result1 = eval_deep(
r#"derivation {
name = "determ";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
);
let result2 = ctx
.eval_code(
r#"derivation {
name = "determ";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
)
.unwrap();
let result2 = eval_deep(
r#"derivation {
name = "determ";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
);
assert_eq!(result1, result2);
}
#[test]
#[test_log::test]
fn fixed_output_sha256_flat() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "fixed";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
outputHashAlgo = "sha256";
outputHashMode = "flat";
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "fixed";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
outputHashAlgo = "sha256";
outputHashMode = "flat";
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -417,47 +400,33 @@ fn fixed_output_sha256_flat() {
}
}
#[test]
fn fixed_output_default_algo() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
#[test_log::test]
fn fixed_output_missing_hashalgo() {
assert!(
eval_deep_result(
r#"derivation {
name = "default";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
}"#,
name = "default";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
}"#,
)
.unwrap();
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("outPath"));
// Verify it defaults to sha256 (same as explicitly specifying it)
if let Some(Value::String(out_path)) = attrs.get("outPath") {
assert!(out_path.contains("/nix/store/"));
}
}
_ => panic!("Expected AttrSet"),
}
.is_err()
);
}
#[test]
#[test_log::test]
fn fixed_output_recursive_mode() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "recursive";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "1111111111111111111111111111111111111111111111111111111111111111";
outputHashAlgo = "sha256";
outputHashMode = "recursive";
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "recursive";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "1111111111111111111111111111111111111111111111111111111111111111";
outputHashAlgo = "sha256";
outputHashMode = "recursive";
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -476,15 +445,15 @@ fn fixed_output_recursive_mode() {
}
}
#[test]
#[test_log::test]
fn fixed_output_rejects_multi_output() {
let mut ctx = Context::new().unwrap();
let result = ctx.eval_code(
let result = eval_deep_result(
r#"derivation {
name = "invalid";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
outputHashAlgo = "sha256";
outputs = ["out" "dev"];
}"#,
);
@@ -494,10 +463,9 @@ fn fixed_output_rejects_multi_output() {
assert!(err_msg.contains("fixed-output") && err_msg.contains("one"));
}
#[test]
#[test_log::test]
fn fixed_output_invalid_hash_mode() {
let mut ctx = Context::new().unwrap();
let result = ctx.eval_code(
let result = eval_deep_result(
r#"derivation {
name = "invalid";
builder = "/bin/sh";
@@ -512,62 +480,55 @@ fn fixed_output_invalid_hash_mode() {
assert!(err_msg.contains("outputHashMode") && err_msg.contains("invalid"));
}
#[test]
#[test_log::test]
fn structured_attrs_basic() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "struct";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
foo = "bar";
count = 42;
enabled = true;
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "struct";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
foo = "bar";
count = 42;
enabled = true;
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("outPath"));
assert!(!attrs.contains_key("foo"));
assert!(!attrs.contains_key("count"));
assert!(attrs.contains_key("foo"));
assert!(attrs.contains_key("count"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test]
#[test_log::test]
fn structured_attrs_nested() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "nested";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
data = { x = 1; y = [2 3]; };
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "nested";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
data = { x = 1; y = [2 3]; };
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(!attrs.contains_key("data"));
assert!(attrs.contains_key("data"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test]
#[test_log::test]
fn structured_attrs_rejects_functions() {
let mut ctx = Context::new().unwrap();
let result = ctx.eval_code(
let result = eval_deep_result(
r#"derivation {
name = "invalid";
builder = "/bin/sh";
@@ -579,23 +540,20 @@ fn structured_attrs_rejects_functions() {
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("function") && err_msg.contains("serialize"));
assert!(err_msg.contains("cannot convert lambda to JSON"));
}
#[test]
#[test_log::test]
fn structured_attrs_false() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "normal";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = false;
foo = "bar";
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "normal";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = false;
foo = "bar";
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -608,45 +566,39 @@ fn structured_attrs_false() {
}
}
#[test]
#[test_log::test]
fn ignore_nulls_true() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "ignore";
builder = "/bin/sh";
system = "x86_64-linux";
__ignoreNulls = true;
foo = "bar";
nullValue = null;
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "ignore";
builder = "/bin/sh";
system = "x86_64-linux";
__ignoreNulls = true;
foo = "bar";
nullValue = null;
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("foo"));
assert!(!attrs.contains_key("nullValue"));
assert!(attrs.contains_key("nullValue"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test]
#[test_log::test]
fn ignore_nulls_false() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "keep";
builder = "/bin/sh";
system = "x86_64-linux";
__ignoreNulls = false;
nullValue = null;
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "keep";
builder = "/bin/sh";
system = "x86_64-linux";
__ignoreNulls = false;
nullValue = null;
}"#,
);
match result {
Value::AttrSet(attrs) => {
@@ -659,84 +611,76 @@ fn ignore_nulls_false() {
}
}
#[test]
#[test_log::test]
fn ignore_nulls_with_structured_attrs() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "combined";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
__ignoreNulls = true;
foo = "bar";
nullValue = null;
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "combined";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
__ignoreNulls = true;
foo = "bar";
nullValue = null;
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(!attrs.contains_key("foo"));
assert!(!attrs.contains_key("nullValue"));
assert!(attrs.contains_key("foo"));
assert!(attrs.contains_key("nullValue"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test]
#[test_log::test]
fn all_features_combined() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "all";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
__structuredAttrs = true;
__ignoreNulls = true;
data = { x = 1; };
nullValue = null;
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "all";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
__structuredAttrs = true;
__ignoreNulls = true;
data = { x = 1; };
nullValue = null;
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("out"));
assert!(attrs.contains_key("dev"));
assert!(attrs.contains_key("outPath"));
assert!(!attrs.contains_key("data"));
assert!(!attrs.contains_key("nullValue"));
assert!(attrs.contains_key("data"));
assert!(attrs.contains_key("nullValue"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test]
#[test_log::test]
fn fixed_output_with_structured_attrs() {
let mut ctx = Context::new().unwrap();
let result = ctx
.eval_code(
r#"derivation {
name = "fixstruct";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "abc123";
__structuredAttrs = true;
data = { key = "value"; };
}"#,
)
.unwrap();
let result = eval_deep(
r#"derivation {
name = "fixstruct";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
outputHashAlgo = "sha256";
__structuredAttrs = true;
data = { key = "value"; };
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("drvPath"));
assert!(!attrs.contains_key("data"));
assert!(attrs.contains_key("data"));
}
_ => panic!("Expected AttrSet"),
}
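The recurring change across these hunks replaces per-test `Context::new()` + `eval_code` boilerplate with shared `eval_deep` and `eval_deep_result` helpers. A minimal, self-contained sketch of that pattern, with a toy integer "evaluator" standing in for the real `nix_js` context (all names and behavior here are illustrative, not the crate's actual API):

```rust
// Toy stand-ins for the shared test helpers this diff migrates to.
// The real eval_deep_result would construct a nix_js Context and deeply
// evaluate a Nix expression; here we only "evaluate" integer literals.
#[derive(Debug, PartialEq)]
enum Value {
    Int(i64),
}

// Fallible variant: tests that expect evaluation errors call this directly
// and assert on the Err case.
fn eval_deep_result(src: &str) -> Result<Value, String> {
    src.trim()
        .parse::<i64>()
        .map(Value::Int)
        .map_err(|e| e.to_string())
}

// Infallible convenience wrapper: panics on error, matching how the
// happy-path tests above use it.
fn eval_deep(src: &str) -> Value {
    eval_deep_result(src).expect("evaluation failed")
}

fn main() {
    assert_eq!(eval_deep("42"), Value::Int(42));
    assert!(eval_deep_result("not a number").is_err());
}
```

Consolidating context construction into one helper keeps each test focused on the expression under test and makes cross-cutting changes (like switching to deep evaluation) a one-line edit per test.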


@@ -0,0 +1,36 @@
use crate::utils::eval;
#[test_log::test]
fn test_find_file_corepkg_fetchurl() {
let result = eval(
r#"
let
searchPath = [];
lookupPath = "nix/fetchurl.nix";
in
builtins.findFile searchPath lookupPath
"#,
);
assert!(result.to_string().contains("fetchurl.nix"));
}
#[test_log::test]
fn test_lookup_path_syntax() {
let result = eval(r#"<nix/fetchurl.nix>"#);
assert!(result.to_string().contains("fetchurl.nix"));
}
#[test_log::test]
fn test_import_corepkg() {
let result = eval(
r#"
let
fetchurl = import <nix/fetchurl.nix>;
in
builtins.typeOf fetchurl
"#,
);
assert_eq!(result.to_string(), "\"lambda\"");
}


@@ -1,24 +1,23 @@
mod utils;
use nix_js::value::{List, Value};
use utils::eval;
#[test]
use crate::utils::{eval, eval_result};
#[test_log::test]
fn true_literal() {
assert_eq!(eval("true"), Value::Bool(true));
}
#[test]
#[test_log::test]
fn false_literal() {
assert_eq!(eval("false"), Value::Bool(false));
}
#[test]
#[test_log::test]
fn null_literal() {
assert_eq!(eval("null"), Value::Null);
}
#[test]
#[test_log::test]
fn map_function() {
assert_eq!(
eval("map (x: x * 2) [1 2 3]"),
@@ -26,23 +25,23 @@ fn map_function() {
);
}
#[test]
#[test_log::test]
fn is_null_function() {
assert_eq!(eval("isNull null"), Value::Bool(true));
assert_eq!(eval("isNull 5"), Value::Bool(false));
}
#[test]
#[test_log::test]
fn shadow_true() {
assert_eq!(eval("let true = false; in true"), Value::Bool(false));
}
#[test]
#[test_log::test]
fn shadow_map() {
assert_eq!(eval("let map = x: y: x; in map 1 2"), Value::Int(1));
}
#[test]
#[test_log::test]
fn mixed_usage() {
assert_eq!(
eval("if true then map (x: x + 1) [1 2] else []"),
@@ -50,7 +49,7 @@ fn mixed_usage() {
);
}
#[test]
#[test_log::test]
fn in_let_bindings() {
assert_eq!(
eval("let x = true; y = false; in x && y"),
@@ -58,18 +57,18 @@ fn in_let_bindings() {
);
}
#[test]
#[test_log::test]
fn shadow_in_function() {
assert_eq!(eval("(true: true) false"), Value::Bool(false));
}
#[test]
#[test_log::test]
fn throw_function() {
let result = utils::eval_result("throw \"error message\"");
let result = eval_result("throw \"error message\"");
assert!(result.is_err());
}
#[test]
#[test_log::test]
fn to_string_function() {
assert_eq!(eval("toString 42"), Value::String("42".to_string()));
}


@@ -1,20 +1,19 @@
mod utils;
use nix_js::value::Value;
use utils::{eval, eval_result};
#[test]
use crate::utils::{eval, eval_result};
#[test_log::test]
fn required_parameters() {
assert_eq!(eval("({ a, b }: a + b) { a = 1; b = 2; }"), Value::Int(3));
}
#[test]
#[test_log::test]
fn missing_required_parameter() {
let result = eval_result("({ a, b }: a + b) { a = 1; }");
assert!(result.is_err());
}
#[test]
#[test_log::test]
fn all_required_parameters_present() {
assert_eq!(
eval("({ x, y, z }: x + y + z) { x = 1; y = 2; z = 3; }"),
@@ -22,13 +21,13 @@ fn all_required_parameters_present() {
);
}
#[test]
#[test_log::test]
fn reject_unexpected_arguments() {
let result = eval_result("({ a, b }: a + b) { a = 1; b = 2; c = 3; }");
assert!(result.is_err());
}
#[test]
#[test_log::test]
fn ellipsis_accepts_extra_arguments() {
assert_eq!(
eval("({ a, b, ... }: a + b) { a = 1; b = 2; c = 3; }"),
@@ -36,12 +35,12 @@ fn ellipsis_accepts_extra_arguments() {
);
}
#[test]
#[test_log::test]
fn default_parameters() {
assert_eq!(eval("({ a, b ? 5 }: a + b) { a = 1; }"), Value::Int(6));
}
#[test]
#[test_log::test]
fn override_default_parameter() {
assert_eq!(
eval("({ a, b ? 5 }: a + b) { a = 1; b = 10; }"),
@@ -49,7 +48,7 @@ fn override_default_parameter() {
);
}
#[test]
#[test_log::test]
fn at_pattern_alias() {
assert_eq!(
eval("(args@{ a, b }: args.a + args.b) { a = 1; b = 2; }"),
@@ -57,17 +56,17 @@ fn at_pattern_alias() {
);
}
#[test]
#[test_log::test]
fn simple_parameter_no_validation() {
assert_eq!(eval("(x: x.a + x.b) { a = 1; b = 2; }"), Value::Int(3));
}
#[test]
#[test_log::test]
fn simple_parameter_accepts_any_argument() {
assert_eq!(eval("(x: x) 42"), Value::Int(42));
}
#[test]
#[test_log::test]
fn nested_function_parameters() {
assert_eq!(
eval("({ a }: { b }: a + b) { a = 5; } { b = 3; }"),
@@ -75,12 +74,12 @@ fn nested_function_parameters() {
);
}
#[test]
#[test_log::test]
fn pattern_param_simple_reference_in_default() {
assert_eq!(eval("({ a, b ? a }: b) { a = 10; }"), Value::Int(10));
}
#[test]
#[test_log::test]
fn pattern_param_multiple_references_in_default() {
assert_eq!(
eval("({ a, b ? a + 5, c ? 1 }: b + c) { a = 10; }"),
@@ -88,7 +87,7 @@ fn pattern_param_multiple_references_in_default() {
);
}
#[test]
#[test_log::test]
fn pattern_param_mutual_reference() {
assert_eq!(
eval("({ a, b ? c + 1, c ? 5 }: b) { a = 1; }"),
@@ -96,7 +95,7 @@ fn pattern_param_mutual_reference() {
);
}
#[test]
#[test_log::test]
fn pattern_param_override_mutual_reference() {
assert_eq!(
eval("({ a, b ? c + 1, c ? 5 }: b) { a = 1; c = 10; }"),
@@ -104,7 +103,7 @@ fn pattern_param_override_mutual_reference() {
);
}
#[test]
#[test_log::test]
fn pattern_param_reference_list() {
assert_eq!(
eval("({ a, b ? [ a 2 ] }: builtins.elemAt b 0) { a = 42; }"),
@@ -112,7 +111,7 @@ fn pattern_param_reference_list() {
);
}
#[test]
#[test_log::test]
fn pattern_param_alias_in_default() {
assert_eq!(
eval("(args@{ a, b ? args.a + 10 }: b) { a = 5; }"),


@@ -0,0 +1,368 @@
use nix_js::context::Context;
use nix_js::error::Source;
use nix_js::value::Value;
use crate::utils::{eval, eval_result};
#[test_log::test]
fn import_absolute_path() {
let temp_dir = tempfile::tempdir().unwrap();
let lib_path = temp_dir.path().join("nix_test_lib.nix");
std::fs::write(&lib_path, "{ add = a: b: a + b; }").unwrap();
let expr = format!(r#"(import "{}").add 3 5"#, lib_path.display());
assert_eq!(eval(&expr), Value::Int(8));
}
#[test_log::test]
fn import_nested() {
let temp_dir = tempfile::tempdir().unwrap();
let lib_path = temp_dir.path().join("lib.nix");
std::fs::write(&lib_path, "{ add = a: b: a + b; }").unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = format!(
r#"let lib = import {}; in {{ result = lib.add 10 20; }}"#,
lib_path.display()
);
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"(import "{}").result"#, main_path.display());
assert_eq!(eval(&expr), Value::Int(30));
}
#[test_log::test]
fn import_relative_path() {
let temp_dir = tempfile::tempdir().unwrap();
let subdir = temp_dir.path().join("subdir");
std::fs::create_dir_all(&subdir).unwrap();
let lib_path = temp_dir.path().join("lib.nix");
std::fs::write(&lib_path, "{ multiply = a: b: a * b; }").unwrap();
let helper_path = subdir.join("helper.nix");
std::fs::write(&helper_path, "{ subtract = a: b: a - b; }").unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = r#"
let
lib = import ./lib.nix;
helper = import ./subdir/helper.nix;
in {
result1 = lib.multiply 3 4;
result2 = helper.subtract 10 3;
}
"#;
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"let x = import "{}"; in x.result1"#, main_path.display());
assert_eq!(eval(&expr), Value::Int(12));
let expr = format!(r#"let x = import "{}"; in x.result2"#, main_path.display());
assert_eq!(eval(&expr), Value::Int(7));
}
#[test_log::test]
fn import_returns_function() {
let temp_dir = tempfile::tempdir().unwrap();
let func_path = temp_dir.path().join("nix_test_func.nix");
std::fs::write(&func_path, "x: x * 2").unwrap();
let expr = format!(r#"(import "{}") 5"#, func_path.display());
assert_eq!(eval(&expr), Value::Int(10));
}
#[test_log::test]
fn import_with_complex_dependency_graph() {
let temp_dir = tempfile::tempdir().unwrap();
let utils_path = temp_dir.path().join("utils.nix");
std::fs::write(&utils_path, "{ double = x: x * 2; }").unwrap();
let math_path = temp_dir.path().join("math.nix");
let math_content = r#"let utils = import ./utils.nix; in { triple = x: x + utils.double x; }"#;
std::fs::write(&math_path, math_content).unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = r#"let math = import ./math.nix; in math.triple 5"#;
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"import "{}""#, main_path.display());
assert_eq!(eval(&expr), Value::Int(15));
}
// Tests for builtins.path
#[test_log::test]
fn path_with_file() {
let mut ctx = Context::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("test.txt");
std::fs::write(&test_file, "Hello, World!").unwrap();
let expr = format!(r#"builtins.path {{ path = {}; }}"#, test_file.display());
let result = ctx.eval(Source::new_eval(expr).unwrap()).unwrap();
// Should return a store path string
if let Value::String(store_path) = result {
assert!(store_path.starts_with(ctx.get_store_dir()));
assert!(store_path.contains("test.txt"));
} else {
panic!("Expected string, got {:?}", result);
}
}
#[test_log::test]
fn path_with_custom_name() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("original.txt");
std::fs::write(&test_file, "Content").unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; name = "custom-name"; }}"#,
test_file.display()
);
let result = eval(&expr);
if let Value::String(store_path) = result {
assert!(store_path.contains("custom-name"));
assert!(!store_path.contains("original.txt"));
} else {
panic!("Expected string, got {:?}", result);
}
}
#[test_log::test]
fn path_with_directory_recursive() {
let mut ctx = Context::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("mydir");
std::fs::create_dir_all(&test_dir).unwrap();
std::fs::write(test_dir.join("file1.txt"), "Content 1").unwrap();
std::fs::write(test_dir.join("file2.txt"), "Content 2").unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; recursive = true; }}"#,
test_dir.display()
);
let result = ctx.eval(Source::new_eval(expr).unwrap()).unwrap();
if let Value::String(store_path) = result {
assert!(store_path.starts_with(ctx.get_store_dir()));
assert!(store_path.contains("mydir"));
} else {
panic!("Expected string, got {:?}", result);
}
}
#[test_log::test]
fn path_flat_with_file() {
let mut ctx = Context::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("flat.txt");
std::fs::write(&test_file, "Flat content").unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; recursive = false; }}"#,
test_file.display()
);
let result = ctx.eval(Source::new_eval(expr).unwrap()).unwrap();
if let Value::String(store_path) = result {
assert!(store_path.starts_with(ctx.get_store_dir()));
} else {
panic!("Expected string, got {:?}", result);
}
}
#[test_log::test]
fn path_flat_with_directory_fails() {
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("mydir");
std::fs::create_dir_all(&test_dir).unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; recursive = false; }}"#,
test_dir.display()
);
let result = eval_result(&expr);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("recursive") || err_msg.contains("regular file"));
}
#[test_log::test]
fn path_nonexistent_fails() {
let expr = r#"builtins.path { path = "/nonexistent/path/that/should/not/exist"; }"#;
let result = eval_result(expr);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("does not exist"));
}
#[test_log::test]
fn path_missing_path_param() {
let expr = r#"builtins.path { name = "test"; }"#;
let result = eval_result(expr);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("path") && err_msg.contains("required"));
}
#[test_log::test]
fn path_with_sha256() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("hash_test.txt");
std::fs::write(&test_file, "Test content for hashing").unwrap();
// First, get the hash by calling without sha256
let expr1 = format!(r#"builtins.path {{ path = {}; }}"#, test_file.display());
let result1 = eval(&expr1);
let store_path1 = match result1 {
Value::String(s) => s,
_ => panic!("Expected string"),
};
// Compute the actual hash (for testing, we'll just verify the same path is returned)
// In real usage, the user would know the hash beforehand
let expr2 = format!(r#"builtins.path {{ path = {}; }}"#, test_file.display());
let result2 = eval(&expr2);
let store_path2 = match result2 {
Value::String(s) => s,
_ => panic!("Expected string"),
};
// Same input should produce same output
assert_eq!(store_path1, store_path2);
}
#[test_log::test]
fn path_deterministic() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("deterministic.txt");
std::fs::write(&test_file, "Same content").unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; name = "myfile"; }}"#,
test_file.display()
);
let result1 = eval(&expr);
let result2 = eval(&expr);
// Same inputs should produce same store path
assert_eq!(result1, result2);
}
#[test_log::test]
fn read_file_type_regular_file() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("test.txt");
std::fs::write(&test_file, "Test content").unwrap();
let expr = format!(r#"builtins.readFileType {}"#, test_file.display());
assert_eq!(eval(&expr), Value::String("regular".to_string()));
}
#[test_log::test]
fn read_file_type_directory() {
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("testdir");
std::fs::create_dir(&test_dir).unwrap();
let expr = format!(r#"builtins.readFileType {}"#, test_dir.display());
assert_eq!(eval(&expr), Value::String("directory".to_string()));
}
#[test_log::test]
fn read_file_type_symlink() {
let temp_dir = tempfile::tempdir().unwrap();
let target = temp_dir.path().join("target.txt");
let symlink = temp_dir.path().join("link.txt");
std::fs::write(&target, "Target content").unwrap();
#[cfg(unix)]
std::os::unix::fs::symlink(&target, &symlink).unwrap();
#[cfg(unix)]
{
let expr = format!(r#"builtins.readFileType {}"#, symlink.display());
assert_eq!(eval(&expr), Value::String("symlink".to_string()));
}
}
#[test_log::test]
fn read_dir_basic() {
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("readdir_test");
std::fs::create_dir(&test_dir).unwrap();
std::fs::write(test_dir.join("file1.txt"), "Content 1").unwrap();
std::fs::write(test_dir.join("file2.txt"), "Content 2").unwrap();
std::fs::create_dir(test_dir.join("subdir")).unwrap();
let expr = format!(r#"builtins.readDir {}"#, test_dir.display());
let result = eval(&expr);
if let Value::AttrSet(attrs) = result {
assert_eq!(
attrs.get("file1.txt"),
Some(&Value::String("regular".to_string()))
);
assert_eq!(
attrs.get("file2.txt"),
Some(&Value::String("regular".to_string()))
);
assert_eq!(
attrs.get("subdir"),
Some(&Value::String("directory".to_string()))
);
assert_eq!(attrs.len(), 3);
} else {
panic!("Expected AttrSet, got {:?}", result);
}
}
#[test_log::test]
fn read_dir_empty() {
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("empty_dir");
std::fs::create_dir(&test_dir).unwrap();
let expr = format!(r#"builtins.readDir {}"#, test_dir.display());
let result = eval(&expr);
if let Value::AttrSet(attrs) = result {
assert_eq!(attrs.len(), 0);
} else {
panic!("Expected AttrSet, got {:?}", result);
}
}
#[test_log::test]
fn read_dir_nonexistent_fails() {
let expr = r#"builtins.readDir "/nonexistent/directory""#;
let result = eval_result(expr);
assert!(result.is_err());
}
#[test_log::test]
fn read_dir_on_file_fails() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("test.txt");
std::fs::write(&test_file, "Test content").unwrap();
let expr = format!(r#"builtins.readDir {}"#, test_file.display());
let result = eval_result(&expr);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("not a directory"));
}

nix-js/tests/tests/lang.rs

@@ -0,0 +1,319 @@
#![allow(non_snake_case)]
use std::path::PathBuf;
use nix_js::context::Context;
use nix_js::error::Source;
use nix_js::value::Value;
fn get_lang_dir() -> PathBuf {
PathBuf::from(env!("CARGO_MANIFEST_DIR")).join("tests/tests/lang")
}
fn eval_file(name: &str) -> Result<(Value, Source), String> {
let lang_dir = get_lang_dir();
let nix_path = lang_dir.join(format!("{name}.nix"));
let expr = format!(r#"import "{}""#, nix_path.display());
let mut ctx = Context::new().map_err(|e| e.to_string())?;
let source = Source {
ty: nix_js::error::SourceType::File(nix_path.into()),
src: expr.into(),
};
ctx.eval_deep(source.clone())
.map(|val| (val, source))
.map_err(|e| e.to_string())
}
fn read_expected(name: &str) -> String {
let lang_dir = get_lang_dir();
let exp_path = lang_dir.join(format!("{name}.exp"));
std::fs::read_to_string(exp_path)
.expect("expected file should exist")
.trim_end()
.to_string()
}
fn format_value(value: &Value) -> String {
value.display_compat().to_string()
}
macro_rules! eval_okay_test {
($(#[$attr:meta])* $name:ident$(, $pre:expr)?) => {
$(#[$attr])*
#[test_log::test]
fn $name() {
$(($pre)();)?
let test_name = concat!("eval-okay-", stringify!($name))
.replace("_", "-")
.replace("r#", "");
let result = eval_file(&test_name);
match result {
Ok((value, source)) => {
let actual = format_value(&value);
let actual = actual.replace(
source
.get_dir()
.parent()
.unwrap()
.to_string_lossy()
.as_ref(),
"/pwd",
);
let expected = read_expected(&test_name);
assert_eq!(actual, expected, "Output mismatch for {}", test_name);
}
Err(e) => {
panic!("Test {} failed to evaluate: {}", test_name, e);
}
}
}
};
}
macro_rules! eval_fail_test {
($name:ident) => {
#[test_log::test]
fn $name() {
let test_name = concat!("eval-fail-", stringify!($name))
.replace("_", "-")
.replace("r#", "");
let result = eval_file(&test_name);
assert!(
result.is_err(),
"Test {} should have failed but succeeded with: {:?}",
test_name,
result
);
}
};
}
eval_okay_test!(any_all);
eval_okay_test!(arithmetic);
eval_okay_test!(attrnames);
eval_okay_test!(attrs);
eval_okay_test!(attrs2);
eval_okay_test!(attrs3);
eval_okay_test!(attrs4);
eval_okay_test!(attrs5);
eval_okay_test!(
#[ignore = "__overrides is not supported"]
attrs6
);
eval_okay_test!(
#[ignore = "requires --arg/--argstr CLI flags"]
autoargs
);
eval_okay_test!(backslash_newline_1);
eval_okay_test!(backslash_newline_2);
eval_okay_test!(baseNameOf);
eval_okay_test!(builtins);
eval_okay_test!(builtins_add);
eval_okay_test!(callable_attrs);
eval_okay_test!(catattrs);
eval_okay_test!(closure);
eval_okay_test!(comments);
eval_okay_test!(concat);
eval_okay_test!(concatmap);
eval_okay_test!(concatstringssep);
eval_okay_test!(context);
eval_okay_test!(context_introspection);
eval_okay_test!(convertHash);
eval_okay_test!(curpos);
eval_okay_test!(deepseq);
eval_okay_test!(delayed_with);
eval_okay_test!(delayed_with_inherit);
eval_okay_test!(deprecate_cursed_or);
eval_okay_test!(derivation_legacy);
eval_okay_test!(dynamic_attrs);
eval_okay_test!(dynamic_attrs_2);
eval_okay_test!(dynamic_attrs_bare);
eval_okay_test!(elem);
eval_okay_test!(empty_args);
eval_okay_test!(eq);
eval_okay_test!(eq_derivations);
eval_okay_test!(filter);
eval_okay_test!(
#[ignore = "not implemented: flakeRefToString"]
flake_ref_to_string
);
eval_okay_test!(flatten);
eval_okay_test!(float);
eval_okay_test!(floor_ceil);
eval_okay_test!(foldlStrict);
eval_okay_test!(foldlStrict_lazy_elements);
eval_okay_test!(foldlStrict_lazy_initial_accumulator);
eval_okay_test!(fromjson);
eval_okay_test!(fromjson_escapes);
eval_okay_test!(fromTOML);
eval_okay_test!(
#[ignore = "timestamps are not supported"]
fromTOML_timestamps
);
eval_okay_test!(functionargs);
eval_okay_test!(hashfile);
eval_okay_test!(hashstring);
eval_okay_test!(getattrpos);
eval_okay_test!(getattrpos_functionargs);
eval_okay_test!(getattrpos_undefined);
eval_okay_test!(getenv, || {
unsafe { std::env::set_var("TEST_VAR", "foo") };
});
eval_okay_test!(groupBy);
eval_okay_test!(r#if);
eval_okay_test!(ind_string);
eval_okay_test!(import);
eval_okay_test!(inherit_attr_pos);
eval_okay_test!(
#[ignore = "__overrides is not supported"]
inherit_from
);
eval_okay_test!(intersectAttrs);
eval_okay_test!(r#let);
eval_okay_test!(list);
eval_okay_test!(listtoattrs);
eval_okay_test!(logic);
eval_okay_test!(map);
eval_okay_test!(mapattrs);
eval_okay_test!(merge_dynamic_attrs);
eval_okay_test!(nested_with);
eval_okay_test!(new_let);
eval_okay_test!(null_dynamic_attrs);
eval_okay_test!(
#[ignore = "__overrides is not supported"]
overrides
);
eval_okay_test!(
#[ignore = "not implemented: parseFlakeRef"]
parse_flake_ref
);
eval_okay_test!(partition);
eval_okay_test!(path);
eval_okay_test!(pathexists);
eval_okay_test!(path_string_interpolation, || {
unsafe {
std::env::set_var("HOME", "/fake-home");
}
});
eval_okay_test!(patterns);
eval_okay_test!(print);
eval_okay_test!(readDir);
eval_okay_test!(readfile);
eval_okay_test!(readFileType);
eval_okay_test!(redefine_builtin);
eval_okay_test!(regex_match);
eval_okay_test!(regex_split);
eval_okay_test!(regression_20220122);
eval_okay_test!(regression_20220125);
eval_okay_test!(regrettable_rec_attrset_merge);
eval_okay_test!(remove);
eval_okay_test!(repeated_empty_attrs);
eval_okay_test!(repeated_empty_list);
eval_okay_test!(replacestrings);
eval_okay_test!(
#[ignore = "requires -I CLI flags"]
search_path
);
eval_okay_test!(scope_1);
eval_okay_test!(scope_2);
eval_okay_test!(scope_3);
eval_okay_test!(scope_4);
eval_okay_test!(scope_6);
eval_okay_test!(scope_7);
eval_okay_test!(seq);
eval_okay_test!(sort);
eval_okay_test!(splitversion);
eval_okay_test!(string);
eval_okay_test!(strings_as_attrs_names);
eval_okay_test!(substring);
eval_okay_test!(substring_context);
eval_okay_test!(symlink_resolution);
eval_okay_test!(
#[ignore = "TCO not implemented, also disabled in CppNix"]
tail_call_1
);
eval_okay_test!(tojson);
eval_okay_test!(toxml);
eval_okay_test!(toxml2);
eval_okay_test!(tryeval);
eval_okay_test!(types);
eval_okay_test!(versions);
eval_okay_test!(with);
eval_okay_test!(zipAttrsWith);
eval_fail_test!(fail_abort);
eval_fail_test!(fail_addDrvOutputDependencies_empty_context);
eval_fail_test!(fail_addDrvOutputDependencies_multi_elem_context);
eval_fail_test!(fail_addDrvOutputDependencies_wrong_element_kind);
eval_fail_test!(fail_addErrorContext_example);
eval_fail_test!(fail_assert);
eval_fail_test!(fail_assert_equal_attrs_names);
eval_fail_test!(fail_assert_equal_attrs_names_2);
eval_fail_test!(fail_assert_equal_derivations);
eval_fail_test!(fail_assert_equal_derivations_extra);
eval_fail_test!(fail_assert_equal_floats);
eval_fail_test!(fail_assert_equal_function_direct);
eval_fail_test!(fail_assert_equal_int_float);
eval_fail_test!(fail_assert_equal_ints);
eval_fail_test!(fail_assert_equal_list_length);
eval_fail_test!(fail_assert_equal_paths);
eval_fail_test!(fail_assert_equal_type);
eval_fail_test!(fail_assert_equal_type_nested);
eval_fail_test!(fail_assert_nested_bool);
eval_fail_test!(fail_attr_name_type);
eval_fail_test!(fail_attrset_merge_drops_later_rec);
eval_fail_test!(fail_bad_string_interpolation_1);
eval_fail_test!(fail_bad_string_interpolation_2);
eval_fail_test!(fail_bad_string_interpolation_3);
eval_fail_test!(fail_bad_string_interpolation_4);
eval_fail_test!(fail_blackhole);
eval_fail_test!(fail_call_primop);
eval_fail_test!(fail_deepseq);
eval_fail_test!(fail_derivation_name);
eval_fail_test!(fail_dup_dynamic_attrs);
eval_fail_test!(fail_duplicate_traces);
eval_fail_test!(fail_eol_1);
eval_fail_test!(fail_eol_2);
eval_fail_test!(fail_eol_3);
eval_fail_test!(fail_fetchTree_negative);
eval_fail_test!(fail_fetchurl_baseName);
eval_fail_test!(fail_fetchurl_baseName_attrs);
eval_fail_test!(fail_fetchurl_baseName_attrs_name);
eval_fail_test!(fail_flake_ref_to_string_negative_integer);
eval_fail_test!(fail_foldlStrict_strict_op_application);
eval_fail_test!(fail_fromJSON_keyWithNullByte);
eval_fail_test!(fail_fromJSON_overflowing);
eval_fail_test!(fail_fromJSON_valueWithNullByte);
eval_fail_test!(fail_fromTOML_keyWithNullByte);
eval_fail_test!(fail_fromTOML_timestamps);
eval_fail_test!(fail_fromTOML_valueWithNullByte);
eval_fail_test!(fail_hashfile_missing);
eval_fail_test!(fail_infinite_recursion_lambda);
eval_fail_test!(fail_list);
eval_fail_test!(fail_missing_arg);
eval_fail_test!(fail_mutual_recursion);
eval_fail_test!(fail_nested_list_items);
eval_fail_test!(fail_nonexist_path);
eval_fail_test!(fail_not_throws);
eval_fail_test!(fail_overflowing_add);
eval_fail_test!(fail_overflowing_div);
eval_fail_test!(fail_overflowing_mul);
eval_fail_test!(fail_overflowing_sub);
eval_fail_test!(fail_path_slash);
eval_fail_test!(fail_pipe_operators);
eval_fail_test!(fail_recursion);
eval_fail_test!(fail_remove);
eval_fail_test!(fail_scope_5);
eval_fail_test!(fail_seq);
eval_fail_test!(fail_set);
eval_fail_test!(fail_set_override);
eval_fail_test!(fail_string_nul_1);
eval_fail_test!(fail_string_nul_2);
eval_fail_test!(fail_substring);
eval_fail_test!(fail_toJSON);
eval_fail_test!(fail_toJSON_non_utf_8);
eval_fail_test!(fail_to_path);
eval_fail_test!(fail_undeclared_arg);
eval_fail_test!(fail_using_set_as_attr_name);
