Compare commits

...

161 Commits

Author SHA1 Message Date
198d847151 feat: init Rust VM 2026-03-15 17:53:19 +08:00
40d00a6c47 feat: value 2026-03-14 10:28:43 +08:00
0c9a391618 clean up 2026-03-14 10:28:43 +08:00
7a7229d70e fmt: group_imports = "StdExternalCrate" 2026-03-08 17:41:06 +08:00
e4004ccb6d feat: bytecode 2026-03-08 17:41:06 +08:00
843ae6cfb4 refactor: use GhostCell to provide interior mutability in Ir 2026-03-08 16:57:34 +08:00
c24d6a8bb3 chore: merge codegen::compile and codegen::compile_scoped 2026-02-25 21:14:35 +08:00
d7351e907b feat: thunk caching (WIP) 2026-02-25 21:14:27 +08:00
550223a1d7 refactor: tidy 2026-02-21 22:30:13 +08:00
53dbee3514 refactor(downgrade): use bumpalo 2026-02-21 21:54:40 +08:00
e1517c338e chore: tidy 2026-02-20 16:12:54 +08:00
45096f5254 fix: fetchGit 2026-02-19 22:34:41 +08:00
b57fea3104 optimize: short-circuit update (//) 2026-02-19 21:54:55 +08:00
4380fa85c4 optimize: compact 2026-02-19 21:14:02 +08:00
99045aa76c chore: fmt 2026-02-19 20:14:06 +08:00
7eb3acf26f optimize: use v8::Script::compile to run script directly in op_import 2026-02-19 19:58:02 +08:00
b424f60f9f optimize: avoid using #[serde] in ops 2026-02-19 19:11:56 +08:00
42031edac1 optimize: generate shorter code 2026-02-19 17:16:35 +08:00
04dcadfd61 optimize 2026-02-18 20:45:50 +08:00
c3c39bda0c feat: v8 profiling 2026-02-18 20:45:50 +08:00
782092b91e optimize: type check 2026-02-18 20:45:50 +08:00
ae5febd5dd feat(cli): compile 2026-02-18 20:45:42 +08:00
3cc7c7be75 chore: update deps; restructure tests; use Map over Record 2026-02-17 12:02:54 +08:00
f49634ccc0 optimize: use Map to represent NixAttrs 2026-02-17 10:35:37 +08:00
4a885c18b8 chore: add eslint 2026-02-17 10:35:37 +08:00
37e395c0e3 optimize: builtins.intersectAttrs 2026-02-16 23:07:52 +08:00
16a8480d29 feat(cli): support eval file 2026-02-16 21:52:08 +08:00
f0a0593d4c feat: implement builtins.toXML 2026-02-16 19:52:33 +08:00
ce64a82da3 feat: inspector 2026-02-16 19:52:33 +08:00
5c48e5cfdd feat: implement hash related primops 2026-02-15 19:55:29 +08:00
7836f8c869 refactor: handle derivation generation on Rust side 2026-02-15 19:38:11 +08:00
e357678d70 feat: implement realisePath 2026-02-15 18:26:24 +08:00
2f2c690023 chore(runtime-ts): fix linter errors 2026-02-15 12:20:31 +08:00
cf4dd6c379 fix: preserve string context in builtins.{path,fetch*} 2026-02-14 20:06:38 +08:00
31c7a62311 chore(typos): ignore lang tests 2026-02-14 19:02:26 +08:00
ad5d047c01 chore: eliminate Result::unwrap 2026-02-14 19:01:06 +08:00
795742e3d8 deps: upgrade dependencies 2026-02-14 19:01:00 +08:00
60cd61d771 feat: implement fromTOML; fix fromJSON implementation 2026-02-14 13:30:31 +08:00
d95a6e509c feat: tidy fetcher (partial)
* shouldn't have used LLM to implement this...
2026-02-13 21:57:51 +08:00
48a43bed55 fix: fetchTarball 2026-02-13 21:19:34 +08:00
6d8f1e79e6 fix: handle NixPath in nixValueToJson 2026-02-13 21:19:34 +08:00
df4edaf5bb fix: preserve string context in + operator 2026-02-13 20:14:46 +08:00
3aee3c67b9 optimization: with scope lookup 2026-02-13 18:50:42 +08:00
a8f1c81b60 fix: derivation (WIP) 2026-02-12 00:18:12 +08:00
249eaf3c11 refactor 2026-02-12 00:18:12 +08:00
a79e20c417 fix: drvDeep 2026-02-12 00:18:12 +08:00
bd9eb638af fix: deepSeq 2026-02-08 15:35:59 +08:00
d09b84676c refactor: with_thunk_scope 2026-02-08 12:48:01 +08:00
1346ae5405 refactor 2026-02-08 12:21:48 +08:00
6b46e466c2 feat: implement scopedImport & REPL scoping 2026-02-08 03:53:53 +08:00
f154010120 chore: update flake.lock 2026-02-08 02:50:48 +08:00
f5364ded1e feat: handle regex operations on rust side 2026-02-08 01:20:38 +08:00
26e7b74585 refactor: remove unused span_utils.rs 2026-02-08 01:20:38 +08:00
e8a28a6d2f feat: use ere (compile-time regex compilation) 2026-02-07 18:42:16 +08:00
216930027d refactor: move ops to runtime/ops.rs 2026-02-07 16:28:19 +08:00
4d6fd6d614 feat: eval_shallow & eval_deep 2026-01-31 22:09:12 +08:00
b7f4ece472 fix: force the first argument of builtins.trace 2026-01-31 20:26:53 +08:00
a68681b4f5 fix: derivation references 2026-01-31 20:19:31 +08:00
ba3e2ae3de feat: set v8 stack size to 8 MiB 2026-01-31 19:29:09 +08:00
c5aee21514 fix: remove incorrect dynamic attr check 2026-01-31 18:59:11 +08:00
8f01ce2eb4 fix: handle __functor in forceFunction 2026-01-31 18:58:44 +08:00
a08f0e78a3 feat: builtins.placeholder 2026-01-31 18:54:13 +08:00
547f8f3828 fix: disable TCO test 2026-01-31 17:51:01 +08:00
aa368cb12e fix: toJSON test 2026-01-31 17:29:57 +08:00
0360bbe4aa fix: canonicalize paths 2026-01-31 17:23:28 +08:00
db64763d77 fix: escape "${" 2026-01-31 17:23:07 +08:00
1aba28d97b fix: symlink resolution 2026-01-31 16:55:44 +08:00
6838f9a0cf fix: unsafeGetAttrPos call on functionArgs 2026-01-31 16:46:16 +08:00
cb539c52c3 chore: fmt 2026-01-31 16:23:23 +08:00
b8f8b5764d fix: print test 2026-01-31 15:36:03 +08:00
1cfa8223c6 fix: path related tests 2026-01-31 15:03:30 +08:00
13874ca6ca fix: structuredAttrs 2026-01-31 13:53:02 +08:00
5703329850 fix: copy path to store in concatStringsWithContext 2026-01-31 12:47:32 +08:00
f0812c9063 refactor: recursive attrset; fix attrset merging 2026-01-31 12:07:13 +08:00
97854afafa fix: getenv test 2026-01-30 22:52:03 +08:00
9545b0fcae fix: recursive attrs 2026-01-30 22:51:42 +08:00
aee46b0b49 feat: use CppNix's test suite 2026-01-30 22:47:47 +08:00
1f835e7b06 fix: derivation semantic 2026-01-30 22:45:37 +08:00
9ee2dd5c08 fix: inherit in recursive attribute sets 2026-01-29 18:40:13 +08:00
084968c08a fix: implication operator (->) 2026-01-29 17:35:19 +08:00
058ef44259 refactor(codegen): less allocation 2026-01-29 11:42:40 +08:00
86953dd9d3 refactor: thunk scope 2026-01-27 10:45:40 +08:00
d1f87260a6 fix: infinite recursion on perl (WIP) 2026-01-27 10:45:40 +08:00
3186cfe6e4 feat(error): stack trace 2026-01-25 03:06:13 +08:00
4d68fb26d9 fix: derivation & derivationStrict 2026-01-25 03:06:13 +08:00
51f7f4079b feat: builtins.parseDrvName 2026-01-25 03:06:13 +08:00
7136f57c12 feat: builtins.unsafeGetAttrPos & __curPos 2026-01-25 03:06:13 +08:00
05b66070a3 feat: builtins.readType & builtins.readDir 2026-01-25 03:06:13 +08:00
13a7d761f4 feat: lookup path (builtins.findFile) 2026-01-25 03:06:13 +08:00
3d315cd050 feat: debug thunk location 2026-01-25 03:06:13 +08:00
62ec37f3ad feat: mkAttrs 2026-01-25 03:06:13 +08:00
a6aded7bea fix: return defaultVal when lhs is not attrset in selectWithDefault 2026-01-25 03:06:13 +08:00
296c0398a4 fix(builtins.zipAttrsWith): laziness 2026-01-25 03:06:13 +08:00
10430e2006 refactor: RuntimeContext 2026-01-25 03:06:13 +08:00
f46ee9d48f chore: tidy 2026-01-24 11:20:12 +08:00
e58ebbe408 fix: maybe_thunk 2026-01-24 02:20:33 +08:00
2f5f84c6c1 fix: removeAttrs 2026-01-24 01:58:08 +08:00
33775092ee fix: preserve string context 2026-01-24 00:57:42 +08:00
ef5d8c3b29 feat: builtins.functionArgs 2026-01-24 00:57:42 +08:00
56a8ba9475 feat: builtins.genericClosure; refactor type check 2026-01-24 00:57:42 +08:00
58c3e67409 fix: force the attrset before checking its type in Nix.hasAttr 2026-01-24 00:54:30 +08:00
7679a3b67f fix: list related primops 2026-01-24 00:54:30 +08:00
95faa7b35f fix: make attrValues' order consistent 2026-01-24 00:54:30 +08:00
43b8959842 fix: use forceBool in codegen 2026-01-24 00:54:30 +08:00
041d7b7dd2 feat: builtins.getEnv 2026-01-24 00:54:30 +08:00
15c4159dcc refactor: error handling 2026-01-24 00:54:30 +08:00
2cb85529c9 feat: NamedSource 2026-01-20 20:12:57 +08:00
e310133421 feat: better error handling 2026-01-20 08:55:08 +08:00
208b996627 fix: error message of Nix.select 2026-01-18 16:53:12 +08:00
9aee36a0e2 fix: relative path resolution
comma operator...
2026-01-18 16:42:54 +08:00
dcb853ea0a feat: logging; copy to store; fix daemon: write_slice -> write_all 2026-01-18 16:42:54 +08:00
2441e10607 feat: use snix nix-compat; implement metadata cache 2026-01-18 16:42:21 +08:00
611255d42c feat: nix_nar 2026-01-18 01:10:49 +08:00
2ad662c765 feat: initial nix-daemon implementation 2026-01-18 01:04:25 +08:00
52bf46407a feat: string context 2026-01-18 00:52:31 +08:00
513b43965c feat: do not use Object.defineProperty? 2026-01-18 00:52:11 +08:00
09bfbca64a refactor: tidy; fix runtime path resolution 2026-01-17 16:42:10 +08:00
f2fc12026f feat: initial path implementation 2026-01-17 12:20:18 +08:00
97e5e7b995 feat: regex related builtins 2026-01-16 21:50:32 +08:00
e620f39a4a fix: use coerceToString 2026-01-16 21:30:56 +08:00
5341ad6c27 feat: builtins.compareVersions 2026-01-16 21:07:15 +08:00
4f8edab795 fix: fetchTree & fetchTarball 2026-01-16 21:05:44 +08:00
e676d2f9f4 fix: unwrap non-recursive let bindings 2026-01-16 21:01:46 +08:00
b6a6630a93 feat: always resolve path at runtime 2026-01-16 21:01:14 +08:00
62abfff439 fix: make update operator lazy 2026-01-16 20:38:41 +08:00
55825788b8 chore: directly map nix boolean/null to javascript boolean/null 2026-01-16 20:38:15 +08:00
b4e0b53cde fix: select 2026-01-14 17:38:15 +08:00
6cd87aa653 chore: tidy 2026-01-14 17:38:12 +08:00
a8683e720b fix(codegen): string escape 2026-01-12 17:44:19 +08:00
3b6804dde6 feat: toJSON 2026-01-11 18:57:52 +08:00
4c505edef5 fix: let 2026-01-11 18:57:52 +08:00
75cb3bfaf1 fix: SCC interscope reference 2026-01-11 18:57:52 +08:00
7d04d8262f fix: duplicate definition check in let-in 2026-01-11 18:57:52 +08:00
c8e617fe24 fix: escape attr keys 2026-01-11 18:57:52 +08:00
158784cbe8 fix: lazy select_with_default 2026-01-11 18:57:52 +08:00
5b1750b1ba feat: thunk loop debugging 2026-01-11 18:57:52 +08:00
160b59b8bf feat: __functor 2026-01-11 18:57:52 +08:00
0538463bf0 fix: Path::canonicalize -> normalize_path
* Nix doesn't require path to exist
2026-01-11 18:57:52 +08:00
621d4ea5c0 fix: lazy select_with_default 2026-01-11 18:57:52 +08:00
3f7fd02263 feat: initial fetcher implementation 2026-01-11 18:57:14 +08:00
c5240385ea feat: initial string context implementation 2026-01-11 11:18:14 +08:00
95088103c8 feat: initial derivation implementation 2026-01-11 00:49:44 +08:00
e33770c1bf chore: tidy 2026-01-10 22:04:23 +08:00
fbf35ba4cd feat: implement coerceToString 2026-01-10 22:03:05 +08:00
1adb7a24a9 feat: implement SCC analysis; refactor test; rename js helper functions 2026-01-10 22:03:05 +08:00
36ccc735f9 refactor: avoid Pin hack 2026-01-10 15:16:48 +08:00
fdda1ae682 fix: throw error on duplicated let entry 2026-01-10 11:55:51 +08:00
e29e432328 feat: runtime error 2026-01-10 10:28:48 +08:00
cc53963df0 refactor: less unwrap 2026-01-10 10:28:48 +08:00
0376621982 refactor: flatten Ir::Const & Value::Const; add Ir::Builtin to represent
globally available builtins
2026-01-10 10:27:55 +08:00
9cfffc440f refactor: tidy 2026-01-09 17:57:22 +08:00
23950da6ea refactor: Runtime 2026-01-07 18:59:10 +08:00
9d1d4a3763 feat: add missing primops 2026-01-06 22:00:18 +08:00
c9455bd0a8 feat: persist JsRuntime 2026-01-06 18:02:35 +08:00
c43d796dc0 refactor: avoid global state 2026-01-05 18:24:59 +08:00
89b68d5fe9 refactor: rename IS_PRIMOP -> PRIMOP_METADATA 2026-01-04 17:32:58 +08:00
45d777a157 feat: implement assert 2026-01-03 18:45:09 +08:00
159267c70b fmt: tidy 2026-01-03 17:17:36 +08:00
40884c21ad feat: builtins.import 2026-01-03 17:14:41 +08:00
c79eb0951e fix(runtime::to_value): numbers are always NixFloat 2026-01-03 12:20:12 +08:00
add715f560 fmt: biome & editorconfig 2026-01-02 23:21:46 +08:00
526 changed files with 22054 additions and 5890 deletions

11
.editorconfig Normal file

@@ -0,0 +1,11 @@
root = true

[*]
indent_style = space
indent_size = 2
end_of_line = lf
charset = utf-8
insert_final_newline = true

[*.{rs,toml}]
indent_size = 4

1
.envrc

@@ -1 +1,2 @@
 use flake
+dotenv

9
.gitignore vendored

@@ -1,3 +1,12 @@
 target/
 /.direnv/
+
+# Profiling
+flamegraph*.svg
+perf.data*
+profile.json.gz
+prof.json
+*.cpuprofile
+*.cpuprofile.gz
+*v8.log*

24
.lazy.lua Normal file

@@ -0,0 +1,24 @@
vim.lsp.config("biome", {
  root_dir = function(_bufnr, on_dir)
    on_dir(vim.fn.getcwd())
  end,
})

vim.lsp.config("eslint", {
  settings = {
    eslint = {
      options = {
        configFile = "./nix-js/runtime-ts/eslint.config.mts",
      },
    },
  },
})

vim.lsp.config("rust_analyzer", {
  settings = {
    ["rust-analyzer"] = {
      cargo = {},
    },
  },
})

return {}

3144
Cargo.lock generated

File diff suppressed because it is too large

Cargo.toml

@@ -1,6 +1,10 @@
@@ -1,6 +1,10 @@
 [workspace]
 resolver = "3"
 members = [
-    "nix-js",
-    "nix-js-macros"
+    "fix",
+    "boxing",
 ]
+
+[profile.profiling]
+inherits = "release"
+debug = true

31
Justfile Normal file

@@ -0,0 +1,31 @@
[no-exit-message]
@repl:
    cargo run -- repl

[no-exit-message]
@eval expr:
    cargo run -- eval --expr '{{expr}}'

[no-exit-message]
@replr:
    cargo run --release -- repl

[no-exit-message]
@evalr expr:
    cargo run --release -- eval --expr '{{expr}}'

[no-exit-message]
@repli:
    cargo run --release --features inspector -- --inspect-brk 127.0.0.1:9229 repl

[no-exit-message]
@evali expr:
    cargo run --release --features inspector -- --inspect-brk 127.0.0.1:9229 eval --expr '{{expr}}'

[no-exit-message]
@replp:
    cargo run --release --features prof -- repl

[no-exit-message]
@evalp expr:
    cargo run --release --features prof -- eval --expr '{{expr}}'

70
biome.json Normal file

@@ -0,0 +1,70 @@
{
  "$schema": "https://biomejs.dev/schemas/2.3.14/schema.json",
  "vcs": {
    "enabled": true,
    "clientKind": "git",
    "useIgnoreFile": true
  },
  "files": {
    "includes": ["**", "!!**/dist"]
  },
  "formatter": {
    "enabled": true,
    "formatWithErrors": true,
    "attributePosition": "auto",
    "indentStyle": "space",
    "indentWidth": 2,
    "lineWidth": 110,
    "lineEnding": "lf"
  },
  "linter": {
    "rules": {
      "style": {
        "useNamingConvention": {
          "level": "warn",
          "options": {
            "strictCase": false,
            "conventions": [
              {
                "selector": { "kind": "objectLiteralProperty" },
                "formats": ["camelCase", "PascalCase", "CONSTANT_CASE"]
              },
              {
                "selector": { "kind": "typeProperty" },
                "formats": ["camelCase", "snake_case"]
              }
            ]
          }
        }
      }
    }
  },
  "overrides": [
    {
      "includes": ["**/global.d.ts"],
      "linter": {
        "rules": {
          "style": {
            "useNamingConvention": "off"
          }
        }
      }
    }
  ],
  "javascript": {
    "formatter": {
      "arrowParentheses": "always",
      "bracketSameLine": false,
      "bracketSpacing": true,
      "jsxQuoteStyle": "double",
      "quoteProperties": "asNeeded",
      "semicolons": "always",
      "trailingCommas": "all"
    }
  },
  "json": {
    "formatter": {
      "trailingCommas": "none"
    }
  }
}

8
boxing/Cargo.toml Normal file

@@ -0,0 +1,8 @@
[package]
name = "boxing"
version = "0.1.3"
edition = "2021"
description = "NaN-boxing primitives (local fork with bool fix)"
[dependencies]
sptr = "0.3"

2
boxing/src/lib.rs Normal file

@@ -0,0 +1,2 @@
pub mod nan;
mod utils;

7
boxing/src/nan.rs Normal file

@@ -0,0 +1,7 @@
pub mod raw;
pub use raw::RawBox;
const SIGN_MASK: u64 = 0x7FFF_FFFF_FFFF_FFFF;
const QUIET_NAN: u64 = 0x7FF8_0000_0000_0000;
const NEG_QUIET_NAN: u64 = 0xFFF8_0000_0000_0000;

471
boxing/src/nan/raw.rs Normal file

@@ -0,0 +1,471 @@
use super::{NEG_QUIET_NAN, QUIET_NAN, SIGN_MASK};
use crate::utils::ArrayExt;
use sptr::Strict;
use std::fmt;
use std::mem::ManuallyDrop;
use std::num::NonZeroU8;
pub trait RawStore: Sized {
fn to_val(self, value: &mut Value);
fn from_val(value: &Value) -> Self;
}
impl RawStore for [u8; 6] {
#[inline]
fn to_val(self, value: &mut Value) {
value.set_data(self);
}
#[inline]
fn from_val(value: &Value) -> Self {
*value.data()
}
}
impl RawStore for bool {
#[inline]
fn to_val(self, value: &mut Value) {
value.set_data([u8::from(self)].truncate_to());
}
#[inline]
fn from_val(value: &Value) -> Self {
value.data()[0] == 1
}
}
macro_rules! int_store {
($ty:ty) => {
impl RawStore for $ty {
#[inline]
fn to_val(self, value: &mut Value) {
let bytes = self.to_ne_bytes();
value.set_data(bytes.truncate_to());
}
#[inline]
fn from_val(value: &Value) -> Self {
<$ty>::from_ne_bytes(value.data().truncate_to())
}
}
};
}
int_store!(u8);
int_store!(u16);
int_store!(u32);
int_store!(i8);
int_store!(i16);
int_store!(i32);
fn store_ptr<P: Strict + Copy>(value: &mut Value, ptr: P) {
#[cfg(target_pointer_width = "64")]
{
assert!(
ptr.addr() <= 0x0000_FFFF_FFFF_FFFF,
"Pointer too large to store in NaN box"
);
let val = (unsafe { value.whole_mut() } as *mut [u8; 8]).cast::<P>();
let ptr = Strict::map_addr(ptr, |addr| {
addr | (usize::from(value.header().into_raw()) << 48)
});
unsafe { val.write(ptr) };
}
#[cfg(target_pointer_width = "32")]
{
let _ = (value, ptr);
unimplemented!("32-bit pointer storage not supported");
}
}
fn load_ptr<P: Strict>(value: &Value) -> P {
#[cfg(target_pointer_width = "64")]
{
let val = (unsafe { value.whole() } as *const [u8; 8]).cast::<P>();
let ptr = unsafe { val.read() };
Strict::map_addr(ptr, |addr| addr & 0x0000_FFFF_FFFF_FFFF)
}
#[cfg(target_pointer_width = "32")]
{
let _ = value;
unimplemented!("32-bit pointer storage not supported");
}
}
impl<T> RawStore for *const T {
fn to_val(self, value: &mut Value) {
store_ptr::<*const T>(value, self);
}
fn from_val(value: &Value) -> Self {
load_ptr::<*const T>(value)
}
}
impl<T> RawStore for *mut T {
fn to_val(self, value: &mut Value) {
store_ptr::<*mut T>(value, self);
}
fn from_val(value: &Value) -> Self {
load_ptr::<*mut T>(value)
}
}
#[derive(Copy, Clone, PartialEq, Eq)]
enum TagVal {
_P1,
_P2,
_P3,
_P4,
_P5,
_P6,
_P7,
_N1,
_N2,
_N3,
_N4,
_N5,
_N6,
_N7,
}
#[derive(Copy, Clone, PartialEq, Eq)]
pub struct RawTag(TagVal);
impl RawTag {
#[inline]
#[must_use]
pub fn new(neg: bool, val: NonZeroU8) -> RawTag {
// Note: `val & 0x07` is 0 when val is a multiple of 8, which would be UB in
// new_unchecked; route through the checked constructor so that case panics instead.
Self::new_checked(neg, val.get() & 0x07).expect("tag value must be in 1..8 after masking")
}
#[inline]
#[must_use]
pub fn new_checked(neg: bool, val: u8) -> Option<RawTag> {
Some(RawTag(match (neg, val) {
(false, 1) => TagVal::_P1,
(false, 2) => TagVal::_P2,
(false, 3) => TagVal::_P3,
(false, 4) => TagVal::_P4,
(false, 5) => TagVal::_P5,
(false, 6) => TagVal::_P6,
(false, 7) => TagVal::_P7,
(true, 1) => TagVal::_N1,
(true, 2) => TagVal::_N2,
(true, 3) => TagVal::_N3,
(true, 4) => TagVal::_N4,
(true, 5) => TagVal::_N5,
(true, 6) => TagVal::_N6,
(true, 7) => TagVal::_N7,
_ => return None,
}))
}
/// # Safety
///
/// `val` must be in the range `1..8`
#[inline]
#[must_use]
pub unsafe fn new_unchecked(neg: bool, val: u8) -> RawTag {
RawTag(match (neg, val) {
(false, 1) => TagVal::_P1,
(false, 2) => TagVal::_P2,
(false, 3) => TagVal::_P3,
(false, 4) => TagVal::_P4,
(false, 5) => TagVal::_P5,
(false, 6) => TagVal::_P6,
(false, 7) => TagVal::_P7,
(true, 1) => TagVal::_N1,
(true, 2) => TagVal::_N2,
(true, 3) => TagVal::_N3,
(true, 4) => TagVal::_N4,
(true, 5) => TagVal::_N5,
(true, 6) => TagVal::_N6,
(true, 7) => TagVal::_N7,
_ => unsafe { core::hint::unreachable_unchecked() },
})
}
#[inline]
#[must_use]
pub fn is_neg(self) -> bool {
matches!(self.0, TagVal::_N1 | TagVal::_N2 | TagVal::_N3 | TagVal::_N4 | TagVal::_N5 | TagVal::_N6 | TagVal::_N7)
}
#[inline]
#[must_use]
pub fn val(self) -> NonZeroU8 {
match self.0 {
TagVal::_P1 | TagVal::_N1 => NonZeroU8::MIN,
TagVal::_P2 | TagVal::_N2 => NonZeroU8::MIN.saturating_add(1),
TagVal::_P3 | TagVal::_N3 => NonZeroU8::MIN.saturating_add(2),
TagVal::_P4 | TagVal::_N4 => NonZeroU8::MIN.saturating_add(3),
TagVal::_P5 | TagVal::_N5 => NonZeroU8::MIN.saturating_add(4),
TagVal::_P6 | TagVal::_N6 => NonZeroU8::MIN.saturating_add(5),
TagVal::_P7 | TagVal::_N7 => NonZeroU8::MIN.saturating_add(6),
}
}
#[inline]
#[must_use]
pub fn neg_val(self) -> (bool, u8) {
match self.0 {
TagVal::_P1 => (false, 1),
TagVal::_P2 => (false, 2),
TagVal::_P3 => (false, 3),
TagVal::_P4 => (false, 4),
TagVal::_P5 => (false, 5),
TagVal::_P6 => (false, 6),
TagVal::_P7 => (false, 7),
TagVal::_N1 => (true, 1),
TagVal::_N2 => (true, 2),
TagVal::_N3 => (true, 3),
TagVal::_N4 => (true, 4),
TagVal::_N5 => (true, 5),
TagVal::_N6 => (true, 6),
TagVal::_N7 => (true, 7),
}
}
}
impl fmt::Debug for RawTag {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_struct("RawTag")
.field("neg", &self.is_neg())
.field("val", &self.val())
.finish()
}
}
#[derive(Copy, Clone, Debug, PartialEq)]
#[repr(transparent)]
struct Header(u16);
impl Header {
#[inline]
fn new(tag: RawTag) -> Header {
let (neg, val) = tag.neg_val();
Header(0x7FF8 | (u16::from(neg) << 15) | u16::from(val))
}
#[inline]
fn tag(self) -> RawTag {
unsafe { RawTag::new_unchecked(self.get_sign(), self.get_tag()) }
}
#[inline]
fn get_sign(self) -> bool {
self.0 & 0x8000 != 0
}
#[inline]
fn get_tag(self) -> u8 {
(self.0 & 0x0007) as u8
}
#[inline]
fn into_raw(self) -> u16 {
self.0
}
}
#[derive(Clone, Debug, PartialEq)]
#[repr(C, align(8))]
pub struct Value {
#[cfg(target_endian = "big")]
header: Header,
data: [u8; 6],
#[cfg(target_endian = "little")]
header: Header,
}
impl Value {
#[inline]
pub fn new(tag: RawTag, data: [u8; 6]) -> Value {
Value {
header: Header::new(tag),
data,
}
}
#[inline]
pub fn empty(tag: RawTag) -> Value {
Value::new(tag, [0; 6])
}
pub fn store<T: RawStore>(tag: RawTag, val: T) -> Value {
let mut v = Value::new(tag, [0; 6]);
T::to_val(val, &mut v);
v
}
pub fn load<T: RawStore>(self) -> T {
T::from_val(&self)
}
#[inline]
#[must_use]
pub fn tag(&self) -> RawTag {
self.header.tag()
}
#[inline]
fn header(&self) -> &Header {
&self.header
}
#[inline]
pub fn set_data(&mut self, val: [u8; 6]) {
self.data = val;
}
#[inline]
#[must_use]
pub fn data(&self) -> &[u8; 6] {
&self.data
}
#[inline]
#[must_use]
pub fn data_mut(&mut self) -> &mut [u8; 6] {
&mut self.data
}
#[inline]
#[must_use]
pub unsafe fn whole(&self) -> &[u8; 8] {
let ptr = (self as *const Value).cast::<[u8; 8]>();
unsafe { &*ptr }
}
#[inline]
#[must_use]
pub unsafe fn whole_mut(&mut self) -> &mut [u8; 8] {
let ptr = (self as *mut Value).cast::<[u8; 8]>();
unsafe { &mut *ptr }
}
}
#[repr(C)]
pub union RawBox {
float: f64,
value: ManuallyDrop<Value>,
bits: u64,
#[cfg(target_pointer_width = "64")]
ptr: *const (),
#[cfg(target_pointer_width = "32")]
ptr: (u32, *const ()),
}
impl RawBox {
#[inline]
#[must_use]
pub fn from_float(val: f64) -> RawBox {
match (val.is_nan(), val.is_sign_positive()) {
(true, true) => RawBox {
float: f64::from_bits(QUIET_NAN),
},
(true, false) => RawBox {
float: f64::from_bits(NEG_QUIET_NAN),
},
(false, _) => RawBox { float: val },
}
}
#[inline]
#[must_use]
pub fn from_value(value: Value) -> RawBox {
RawBox {
value: ManuallyDrop::new(value),
}
}
#[inline]
#[must_use]
pub fn tag(&self) -> Option<RawTag> {
if self.is_value() {
Some(unsafe { self.value.tag() })
} else {
None
}
}
#[inline]
#[must_use]
pub fn is_float(&self) -> bool {
(unsafe { !self.float.is_nan() } || unsafe { self.bits & SIGN_MASK == QUIET_NAN })
}
#[inline]
#[must_use]
pub fn is_value(&self) -> bool {
(unsafe { self.float.is_nan() } && unsafe { self.bits & SIGN_MASK != QUIET_NAN })
}
#[inline]
#[must_use]
pub fn float(&self) -> Option<&f64> {
if self.is_float() {
Some(unsafe { &self.float })
} else {
None
}
}
#[inline]
#[must_use]
pub fn value(&self) -> Option<&Value> {
if self.is_value() {
Some(unsafe { &self.value })
} else {
None
}
}
#[inline]
pub fn into_float_unchecked(self) -> f64 {
unsafe { self.float }
}
}
impl Clone for RawBox {
#[inline]
fn clone(&self) -> Self {
RawBox {
ptr: unsafe { self.ptr },
}
}
}
impl fmt::Debug for RawBox {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.float() {
Some(val) => f.debug_tuple("RawBox::Float").field(val).finish(),
None => {
let val = self.value().expect("RawBox is neither float nor value");
f.debug_struct("RawBox::Data")
.field("tag", &val.tag())
.field("data", val.data())
.finish()
}
}
}
}
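The bit layout `RawBox` relies on above can be illustrated standalone: the top 16 bits carry sign, the 0x7FF exponent, the quiet bit, and a 3-bit tag, while the low 48 bits carry the payload. The sketch below uses only `std` and hypothetical helper names (`box_payload`, `unbox`, `is_float`), not the crate's actual API:

```rust
// Standalone NaN-boxing sketch. A quiet-NaN bit pattern with a non-zero
// 3-bit tag signals "boxed value"; everything else is an ordinary f64.
const QUIET_NAN: u64 = 0x7FF8_0000_0000_0000;
const ABS_MASK: u64 = 0x7FFF_FFFF_FFFF_FFFF; // clears the sign bit
const PAYLOAD_MASK: u64 = 0x0000_FFFF_FFFF_FFFF;

fn box_payload(tag: u8, payload: u64) -> u64 {
    assert!((1..=7).contains(&tag), "tag must be a non-zero 3-bit value");
    assert!(payload <= PAYLOAD_MASK, "payload must fit in 48 bits");
    QUIET_NAN | (u64::from(tag) << 48) | payload
}

fn is_float(bits: u64) -> bool {
    // A plain f64 is anything that is not NaN, plus the canonical quiet NaN
    // itself (either sign); every other NaN bit pattern is a boxed value.
    !f64::from_bits(bits).is_nan() || bits & ABS_MASK == QUIET_NAN
}

fn unbox(bits: u64) -> Option<(u8, u64)> {
    if is_float(bits) {
        None
    } else {
        Some((((bits >> 48) & 0x7) as u8, bits & PAYLOAD_MASK))
    }
}

fn main() {
    let boxed = box_payload(3, 0xDEAD_BEEF);
    assert!(!is_float(boxed));
    assert_eq!(unbox(boxed), Some((3, 0xDEAD_BEEF)));
    // Ordinary floats, including the canonical quiet NaN, pass through.
    assert_eq!(unbox(1.5f64.to_bits()), None);
    assert!(is_float(QUIET_NAN));
}
```

Tag 0 is deliberately unused: a zero tag on a positive quiet NaN would be bit-identical to the canonical quiet NaN that real float arithmetic produces, which is why `RawTag` only admits values 1 through 7.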

16
boxing/src/utils.rs Normal file

@@ -0,0 +1,16 @@
pub trait ArrayExt<const LEN: usize> {
    type Elem;

    fn truncate_to<const M: usize>(self) -> [Self::Elem; M];
}

impl<T: Default + Copy, const N: usize> ArrayExt<N> for [T; N] {
    type Elem = T;

    fn truncate_to<const M: usize>(self) -> [Self::Elem; M] {
        let copy_len = usize::min(N, M);
        let mut out = [T::default(); M];
        out[0..copy_len].copy_from_slice(&self[0..copy_len]);
        out
    }
}
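Despite its name, `truncate_to` above is really a resize: it copies `min(N, M)` elements and pads with `T::default()` when the target array is longer, which is what lets a one-byte `bool` or a four-byte `i32` be stored into a six-byte NaN-box payload. A standalone sketch of the same behaviour (hypothetical `resize_array`, `u8`-only for brevity):

```rust
// Copies min(N, M) elements and zero-fills the remainder, mirroring
// ArrayExt::truncate_to above (illustrative sketch, not the crate's code).
fn resize_array<const N: usize, const M: usize>(src: [u8; N]) -> [u8; M] {
    let copy_len = usize::min(N, M);
    let mut out = [0u8; M];
    out[..copy_len].copy_from_slice(&src[..copy_len]);
    out
}

fn main() {
    // Truncating keeps the leading bytes.
    assert_eq!(resize_array::<4, 2>([1, 2, 3, 4]), [1, 2]);
    // Extending pads with zeros, e.g. a single bool byte into a 6-byte payload.
    assert_eq!(resize_array::<1, 6>([0xFF]), [0xFF, 0, 0, 0, 0, 0]);
}
```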

16
default.nix Normal file

@@ -0,0 +1,16 @@
let
  lockFile = builtins.fromJSON (builtins.readFile ./flake.lock);
  flake-compat-node = lockFile.nodes.${lockFile.nodes.root.inputs.flake-compat};
  flake-compat = builtins.fetchTarball {
    inherit (flake-compat-node.locked) url;
    sha256 = flake-compat-node.locked.narHash;
  };
  flake = (
    import flake-compat {
      src = ./.;
      copySourceTreeToStore = false;
    }
  );
in
flake.defaultNix

106
fix/Cargo.toml Normal file

@@ -0,0 +1,106 @@
[package]
name = "fix"
version = "0.1.0"
edition = "2024"
[dependencies]
mimalloc = "0.1"
tokio = { version = "1.41", features = [
"rt-multi-thread",
"sync",
"net",
"io-util",
] }
nix-compat = { git = "https://git.snix.dev/snix/snix.git", version = "0.1.0", features = [
"wire",
"async",
] }
# REPL
anyhow = "1.0"
rustyline = "17.0"
# CLI
clap = { version = "4", features = ["derive"] }
# Logging
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
derive_more = { version = "2", features = ["full"] }
thiserror = "2"
miette = { version = "7.4", features = ["fancy"] }
hashbrown = "0.16"
string-interner = "0.19"
bumpalo = { version = "3.20", features = [
"allocator-api2",
"boxed",
"collections",
] }
rust-embed = "8.11"
itertools = "0.14"
regex = "1.11"
nix-nar = "0.3"
sha2 = "0.10"
sha1 = "0.10"
md5 = "0.8"
hex = "0.4"
base64 = "0.22"
reqwest = { version = "0.13", features = [
"blocking",
"rustls",
], default-features = false }
tar = "0.4"
flate2 = "1.0"
xz2 = "0.1"
bzip2 = "0.6"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
# spec 1.0.0
toml = "=0.9.9"
dirs = "6.0"
tempfile = "3.24"
rusqlite = { version = "0.38", features = ["bundled"] }
rnix = "0.14"
rowan = "0.16"
ere = "0.2.4"
num_enum = "0.7.5"
tap = "1.0.1"
ghost-cell = "0.2"
colored = "3.1"
boxing = { path = "../boxing" }
sealed = "0.6"
small-map = "0.1"
smallvec = "1.15"
[dependencies.gc-arena]
git = "https://github.com/kyren/gc-arena"
rev = "75671ae03f53718357b741ed4027560f14e90836"
features = ["allocator-api2", "hashbrown", "smallvec"]
[dev-dependencies]
criterion = { version = "0.8", features = ["html_reports"] }
test-log = { version = "0.2", features = ["trace"] }
[[bench]]
name = "basic_ops"
harness = false
[[bench]]
name = "builtins"
harness = false
[[bench]]
name = "thunk_scope"
harness = false

113
fix/benches/basic_ops.rs Normal file

@@ -0,0 +1,113 @@
mod utils;
use std::hint::black_box;
use criterion::{Criterion, criterion_group, criterion_main};
use utils::eval;
fn bench_arithmetic(c: &mut Criterion) {
let mut group = c.benchmark_group("arithmetic");
group.bench_function("addition", |b| b.iter(|| eval(black_box("1 + 1"))));
group.bench_function("subtraction", |b| b.iter(|| eval(black_box("10 - 5"))));
group.bench_function("multiplication", |b| b.iter(|| eval(black_box("6 * 7"))));
group.bench_function("division", |b| b.iter(|| eval(black_box("100 / 5"))));
group.bench_function("complex_expression", |b| {
b.iter(|| eval(black_box("(5 + 3) * (10 - 2) / 4")))
});
group.finish();
}
fn bench_comparison(c: &mut Criterion) {
let mut group = c.benchmark_group("comparison");
group.bench_function("equality", |b| b.iter(|| eval(black_box("42 == 42"))));
group.bench_function("less_than", |b| b.iter(|| eval(black_box("5 < 10"))));
group.bench_function("logical_and", |b| {
b.iter(|| eval(black_box("true && false")))
});
group.bench_function("logical_or", |b| {
b.iter(|| eval(black_box("true || false")))
});
group.finish();
}
fn bench_function_application(c: &mut Criterion) {
let mut group = c.benchmark_group("function_application");
group.bench_function("simple_identity", |b| {
b.iter(|| eval(black_box("(x: x) 42")))
});
group.bench_function("curried_function", |b| {
b.iter(|| eval(black_box("(x: y: x + y) 10 20")))
});
group.bench_function("nested_application", |b| {
b.iter(|| eval(black_box("((x: y: z: x + y + z) 1) 2 3")))
});
group.finish();
}
fn bench_let_bindings(c: &mut Criterion) {
let mut group = c.benchmark_group("let_bindings");
group.bench_function("simple_let", |b| {
b.iter(|| eval(black_box("let x = 5; in x + 10")))
});
group.bench_function("multiple_bindings", |b| {
b.iter(|| eval(black_box("let a = 1; b = 2; c = 3; in a + b + c")))
});
group.bench_function("dependent_bindings", |b| {
b.iter(|| eval(black_box("let x = 5; y = x * 2; z = y + 3; in z")))
});
group.finish();
}
fn bench_attrsets(c: &mut Criterion) {
let mut group = c.benchmark_group("attrsets");
group.bench_function("simple_attrset", |b| {
b.iter(|| eval(black_box("{ a = 1; b = 2; }.a")))
});
group.bench_function("nested_attrset", |b| {
b.iter(|| eval(black_box("{ a.b.c = 42; }.a.b.c")))
});
group.bench_function("rec_attrset", |b| {
b.iter(|| eval(black_box("rec { a = 1; b = a + 1; }.b")))
});
group.bench_function("attrset_update", |b| {
b.iter(|| eval(black_box("{ a = 1; } // { b = 2; }")))
});
group.finish();
}
fn bench_lists(c: &mut Criterion) {
let mut group = c.benchmark_group("lists");
group.bench_function("simple_list", |b| {
b.iter(|| eval(black_box("[ 1 2 3 4 5 ]")))
});
group.bench_function("list_concatenation", |b| {
b.iter(|| eval(black_box("[ 1 2 3 ] ++ [ 4 5 6 ]")))
});
group.bench_function("nested_list", |b| {
b.iter(|| eval(black_box("[ [ 1 2 ] [ 3 4 ] [ 5 6 ] ]")))
});
group.finish();
}
criterion_group!(
benches,
bench_arithmetic,
bench_comparison,
bench_function_application,
bench_let_bindings,
bench_attrsets,
bench_lists
);
criterion_main!(benches);

145
fix/benches/builtins.rs Normal file

@@ -0,0 +1,145 @@
mod utils;
use std::hint::black_box;
use criterion::{Criterion, criterion_group, criterion_main};
use utils::eval;
fn bench_builtin_math(c: &mut Criterion) {
let mut group = c.benchmark_group("builtin_math");
group.bench_function("add", |b| b.iter(|| eval(black_box("builtins.add 10 20"))));
group.bench_function("sub", |b| b.iter(|| eval(black_box("builtins.sub 20 10"))));
group.bench_function("mul", |b| b.iter(|| eval(black_box("builtins.mul 6 7"))));
group.bench_function("div", |b| b.iter(|| eval(black_box("builtins.div 100 5"))));
group.finish();
}
fn bench_builtin_list(c: &mut Criterion) {
let mut group = c.benchmark_group("builtin_list");
group.bench_function("length_small", |b| {
b.iter(|| eval(black_box("builtins.length [1 2 3 4 5]")))
});
group.bench_function("length_large", |b| {
b.iter(|| {
eval(black_box(
"builtins.length [1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20]",
))
})
});
group.bench_function("head", |b| {
b.iter(|| eval(black_box("builtins.head [1 2 3 4 5]")))
});
group.bench_function("tail", |b| {
b.iter(|| eval(black_box("builtins.tail [1 2 3 4 5]")))
});
group.bench_function("elem_found", |b| {
b.iter(|| eval(black_box("builtins.elem 3 [1 2 3 4 5]")))
});
group.bench_function("elem_not_found", |b| {
b.iter(|| eval(black_box("builtins.elem 10 [1 2 3 4 5]")))
});
group.bench_function("concat_lists", |b| {
b.iter(|| eval(black_box("builtins.concatLists [[1 2] [3 4] [5 6]]")))
});
group.finish();
}
fn bench_builtin_map_filter(c: &mut Criterion) {
let mut group = c.benchmark_group("builtin_map_filter");
group.bench_function("map_small", |b| {
b.iter(|| eval(black_box("builtins.map (x: x * 2) [1 2 3 4 5]")))
});
group.bench_function("map_large", |b| {
b.iter(|| {
eval(black_box(
"builtins.map (x: x * 2) [1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20]",
))
})
});
group.bench_function("filter_small", |b| {
b.iter(|| eval(black_box("builtins.filter (x: x > 2) [1 2 3 4 5]")))
});
group.bench_function("filter_large", |b| {
b.iter(|| {
eval(black_box(
"builtins.filter (x: x > 10) [1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20]",
))
})
});
group.bench_function("foldl", |b| {
b.iter(|| {
eval(black_box(
"builtins.foldl' (acc: x: acc + x) 0 [1 2 3 4 5 6 7 8 9 10]",
))
})
});
group.finish();
}
fn bench_builtin_attrset(c: &mut Criterion) {
let mut group = c.benchmark_group("builtin_attrset");
group.bench_function("attrNames", |b| {
b.iter(|| eval(black_box("builtins.attrNames { a = 1; b = 2; c = 3; }")))
});
group.bench_function("attrValues", |b| {
b.iter(|| eval(black_box("builtins.attrValues { a = 1; b = 2; c = 3; }")))
});
group.bench_function("hasAttr", |b| {
b.iter(|| eval(black_box("builtins.hasAttr \"a\" { a = 1; b = 2; c = 3; }")))
});
group.bench_function("getAttr", |b| {
b.iter(|| eval(black_box("builtins.getAttr \"b\" { a = 1; b = 2; c = 3; }")))
});
group.finish();
}
fn bench_builtin_type_checks(c: &mut Criterion) {
let mut group = c.benchmark_group("builtin_type_checks");
group.bench_function("isInt", |b| b.iter(|| eval(black_box("builtins.isInt 42"))));
group.bench_function("isList", |b| {
b.iter(|| eval(black_box("builtins.isList [1 2 3]")))
});
group.bench_function("isAttrs", |b| {
b.iter(|| eval(black_box("builtins.isAttrs { a = 1; }")))
});
group.bench_function("isFunction", |b| {
b.iter(|| eval(black_box("builtins.isFunction (x: x)")))
});
group.bench_function("typeOf", |b| {
b.iter(|| eval(black_box("builtins.typeOf 42")))
});
group.finish();
}
fn bench_free_globals(c: &mut Criterion) {
let mut group = c.benchmark_group("free_globals");
group.bench_function("map", |b| {
b.iter(|| eval(black_box("map (x: x * 2) [1 2 3 4 5]")))
});
group.bench_function("isNull", |b| b.iter(|| eval(black_box("isNull null"))));
group.bench_function("toString", |b| b.iter(|| eval(black_box("toString 42"))));
group.finish();
}
criterion_group!(
benches,
bench_builtin_math,
bench_builtin_list,
bench_builtin_map_filter,
bench_builtin_attrset,
bench_builtin_type_checks,
bench_free_globals
);
criterion_main!(benches);

fix/benches/thunk_scope.rs Normal file

@@ -0,0 +1,174 @@
mod utils;
use std::hint::black_box;
use criterion::{Criterion, criterion_group, criterion_main};
use utils::eval;
fn bench_non_recursive(c: &mut Criterion) {
let mut group = c.benchmark_group("non_recursive");
group.bench_function("simple_bindings", |b| {
b.iter(|| eval(black_box("let x = 1; y = 2; z = x + y; in z")))
});
group.bench_function("many_bindings", |b| {
b.iter(|| {
eval(black_box(
"let a = 1; b = 2; c = 3; d = 4; e = 5; f = 6; g = 7; h = 8; i = 9; j = 10; in a + b + c + d + e + f + g + h + i + j",
))
})
});
group.bench_function("dependent_chain", |b| {
b.iter(|| {
eval(black_box(
"let a = 1; b = a + 1; c = b + 1; d = c + 1; e = d + 1; in e",
))
})
});
group.bench_function("rec_attrset_non_recursive", |b| {
b.iter(|| eval(black_box("rec { a = 1; b = 2; c = 3; d = a + b + c; }.d")))
});
group.finish();
}
fn bench_recursive(c: &mut Criterion) {
let mut group = c.benchmark_group("recursive");
group.bench_function("simple_recursion", |b| {
b.iter(|| {
eval(black_box(
"let f = n: if n == 0 then 0 else f (n - 1); in f 10",
))
})
});
group.bench_function("fibonacci", |b| {
b.iter(|| {
eval(black_box(
"let fib = n: if n <= 1 then 1 else fib (n - 1) + fib (n - 2); in fib 10",
))
})
});
group.bench_function("factorial", |b| {
b.iter(|| {
eval(black_box(
"let factorial = n: if n == 0 then 1 else n * factorial (n - 1); in factorial 10",
))
})
});
group.bench_function("rec_attrset_recursive", |b| {
b.iter(|| {
eval(black_box(
"rec { f = n: if n == 0 then 1 else f (n - 1); }.f 10",
))
})
});
group.finish();
}
fn bench_mutual_recursion(c: &mut Criterion) {
let mut group = c.benchmark_group("mutual_recursion");
group.bench_function("two_way", |b| {
b.iter(|| {
eval(black_box(
"let f = n: if n == 0 then 0 else g (n - 1); g = n: if n == 0 then 1 else f (n - 1); in f 10",
))
})
});
group.bench_function("even_odd", |b| {
b.iter(|| {
eval(black_box(
"let even = n: if n == 0 then true else odd (n - 1); odd = n: if n == 0 then false else even (n - 1); in even 20",
))
})
});
group.bench_function("three_way", |b| {
b.iter(|| {
eval(black_box(
"let a = n: if n == 0 then 1 else b (n - 1); b = n: if n == 0 then 2 else c (n - 1); c = n: if n == 0 then 3 else a (n - 1); in a 15",
))
})
});
group.finish();
}
fn bench_mixed(c: &mut Criterion) {
let mut group = c.benchmark_group("mixed");
group.bench_function("recursive_with_constants", |b| {
b.iter(|| {
eval(black_box(
"let x = 10; fib = n: if n <= 1 then 1 else fib (n - 1) + fib (n - 2); in fib x",
))
})
});
group.bench_function("multiple_recursive_functions", |b| {
b.iter(|| {
eval(black_box(
"let fib = n: if n <= 1 then 1 else fib (n - 1) + fib (n - 2); fact = n: if n == 0 then 1 else n * fact (n - 1); in (fib 8) + (fact 5)",
))
})
});
group.bench_function("complex_dependency_graph", |b| {
b.iter(|| {
eval(black_box(
"let a = 1; b = 2; c = a + b; fib = n: if n <= 1 then 1 else fib (n - 1) + fib (n - 2); result = c + fib 8; in result",
))
})
});
group.finish();
}
fn bench_nested_scopes(c: &mut Criterion) {
let mut group = c.benchmark_group("nested_scopes");
group.bench_function("nested_let_non_recursive", |b| {
b.iter(|| {
eval(black_box(
"let x = 1; in let y = x + 1; in let z = y + 1; in z",
))
})
});
group.bench_function("nested_let_with_recursive", |b| {
b.iter(|| {
eval(black_box(
"let f = n: if n == 0 then 0 else f (n - 1); in let g = m: f m; in g 10",
))
})
});
group.bench_function("deeply_nested", |b| {
b.iter(|| {
eval(black_box(
"let a = 1; in let b = a + 1; in let c = b + 1; in let d = c + 1; in let e = d + 1; in e",
))
})
});
group.finish();
}
criterion_group!(
benches,
bench_non_recursive,
bench_recursive,
bench_mutual_recursion,
bench_mixed,
bench_nested_scopes
);
criterion_main!(benches);

fix/benches/utils.rs Normal file

@@ -0,0 +1,18 @@
#![allow(dead_code)]
use fix::error::{Result, Source};
use fix::runtime::Runtime;
use fix::value::Value;
pub fn eval(expr: &str) -> Value {
Runtime::new()
.unwrap()
.eval(Source::new_eval(expr.into()).unwrap())
.unwrap()
}
pub fn eval_result(expr: &str) -> Result<Value> {
Runtime::new()
.unwrap()
.eval(Source::new_eval(expr.into()).unwrap())
}

fix/src/codegen.rs Normal file

@@ -0,0 +1,867 @@
use std::ops::Deref;
use gc_arena::Collect;
use hashbrown::HashMap;
use num_enum::TryFromPrimitive;
use rnix::TextRange;
use string_interner::Symbol as _;
use crate::ir::{ArgId, Attr, BinOpKind, Ir, Param, RawIrRef, StringId, ThunkId, UnOpKind};
pub(crate) struct InstructionPtr(pub usize);
#[derive(Collect)]
#[collect(require_static)]
pub struct Bytecode {
pub code: Box<[u8]>,
pub current_dir: String,
}
pub(crate) trait BytecodeContext {
fn intern_string(&mut self, s: &str) -> StringId;
fn register_span(&mut self, range: TextRange) -> u32;
fn get_code(&self) -> &[u8];
fn get_code_mut(&mut self) -> &mut Vec<u8>;
}
#[repr(u8)]
#[derive(Clone, Copy, TryFromPrimitive)]
#[allow(clippy::enum_variant_names)]
pub enum Op {
PushSmi,
PushBigInt,
PushFloat,
PushString,
PushNull,
PushTrue,
PushFalse,
LoadLocal,
LoadOuter,
StoreLocal,
AllocLocals,
MakeThunk,
MakeClosure,
MakePatternClosure,
Call,
CallNoSpan,
MakeAttrs,
MakeAttrsDyn,
MakeEmptyAttrs,
Select,
SelectDefault,
HasAttr,
MakeList,
OpAdd,
OpSub,
OpMul,
OpDiv,
OpEq,
OpNeq,
OpLt,
OpGt,
OpLeq,
OpGeq,
OpConcat,
OpUpdate,
OpNeg,
OpNot,
ForceBool,
JumpIfFalse,
JumpIfTrue,
Jump,
ConcatStrings,
ResolvePath,
Assert,
PushWith,
PopWith,
WithLookup,
LoadBuiltins,
LoadBuiltin,
MkPos,
LoadReplBinding,
LoadScopedBinding,
Return,
}
struct ScopeInfo {
depth: u16,
arg_id: Option<ArgId>,
thunk_map: HashMap<ThunkId, u32>,
}
struct BytecodeEmitter<'a, Ctx: BytecodeContext> {
ctx: &'a mut Ctx,
scope_stack: Vec<ScopeInfo>,
}
pub(crate) fn compile_bytecode(ir: RawIrRef<'_>, ctx: &mut impl BytecodeContext) -> InstructionPtr {
let ip = ctx.get_code().len();
let mut emitter = BytecodeEmitter::new(ctx);
emitter.emit_toplevel(ir);
InstructionPtr(ip)
}
impl<'a, Ctx: BytecodeContext> BytecodeEmitter<'a, Ctx> {
fn new(ctx: &'a mut Ctx) -> Self {
Self {
ctx,
scope_stack: Vec::with_capacity(32),
}
}
#[inline]
fn emit_op(&mut self, op: Op) {
self.ctx.get_code_mut().push(op as u8);
}
#[inline]
fn emit_u8(&mut self, val: u8) {
self.ctx.get_code_mut().push(val);
}
#[inline]
fn emit_u16(&mut self, val: u16) {
self.ctx
.get_code_mut()
.extend_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_u32(&mut self, val: u32) {
self.ctx
.get_code_mut()
.extend_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_i32(&mut self, val: i32) {
self.ctx
.get_code_mut()
.extend_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_i64(&mut self, val: i64) {
self.ctx
.get_code_mut()
.extend_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_f64(&mut self, val: f64) {
self.ctx
.get_code_mut()
.extend_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_i32_placeholder(&mut self) -> usize {
let offset = self.ctx.get_code_mut().len();
self.ctx.get_code_mut().extend_from_slice(&[0u8; 4]);
offset
}
#[inline]
fn patch_i32(&mut self, offset: usize, val: i32) {
self.ctx.get_code_mut()[offset..offset + 4].copy_from_slice(&val.to_le_bytes());
}
#[inline]
fn emit_jump_placeholder(&mut self) -> usize {
self.emit_op(Op::Jump);
self.emit_i32_placeholder()
}
#[inline]
fn patch_jump_target(&mut self, placeholder_offset: usize) {
let current_pos = self.ctx.get_code_mut().len();
let relative_offset = (current_pos as i32) - (placeholder_offset as i32) - 4;
self.patch_i32(placeholder_offset, relative_offset);
}
#[inline]
fn emit_str_id(&mut self, id: StringId) {
self.ctx
.get_code_mut()
.extend_from_slice(&(id.0.to_usize() as u32).to_le_bytes());
}
fn current_depth(&self) -> u16 {
self.scope_stack.last().map_or(0, |s| s.depth)
}
fn resolve_thunk(&self, id: ThunkId) -> (u16, u32) {
for scope in self.scope_stack.iter().rev() {
if let Some(&local_idx) = scope.thunk_map.get(&id) {
let layer = self.current_depth() - scope.depth;
return (layer, local_idx);
}
}
panic!("ThunkId {:?} not found in any scope", id);
}
fn resolve_arg(&self, id: ArgId) -> (u16, u32) {
for scope in self.scope_stack.iter().rev() {
if scope.arg_id == Some(id) {
let layer = self.current_depth() - scope.depth;
return (layer, 0);
}
}
panic!("ArgId {:?} not found in any scope", id);
}
fn emit_load(&mut self, layer: u16, local: u32) {
if layer == 0 {
self.emit_op(Op::LoadLocal);
self.emit_u32(local);
} else {
self.emit_op(Op::LoadOuter);
self.emit_u8(layer as u8);
self.emit_u32(local);
}
}
fn count_with_thunks(&self, ir: RawIrRef<'_>) -> usize {
match ir.deref() {
Ir::With { thunks, body, .. } => thunks.len() + self.count_with_thunks(*body),
Ir::TopLevel { thunks, body } => thunks.len() + self.count_with_thunks(*body),
Ir::If { cond, consq, alter } => {
self.count_with_thunks(*cond)
+ self.count_with_thunks(*consq)
+ self.count_with_thunks(*alter)
}
Ir::BinOp { lhs, rhs, .. } => {
self.count_with_thunks(*lhs) + self.count_with_thunks(*rhs)
}
Ir::UnOp { rhs, .. } => self.count_with_thunks(*rhs),
Ir::Call { func, arg, .. } => {
self.count_with_thunks(*func) + self.count_with_thunks(*arg)
}
Ir::Assert {
assertion, expr, ..
} => self.count_with_thunks(*assertion) + self.count_with_thunks(*expr),
Ir::Select { expr, default, .. } => {
self.count_with_thunks(*expr) + default.map_or(0, |d| self.count_with_thunks(d))
}
Ir::HasAttr { lhs, .. } => self.count_with_thunks(*lhs),
Ir::ConcatStrings { parts, .. } => {
parts.iter().map(|p| self.count_with_thunks(*p)).sum()
}
Ir::Path(p) => self.count_with_thunks(*p),
Ir::List { items } => items.iter().map(|item| self.count_with_thunks(*item)).sum(),
Ir::AttrSet { stcs, dyns } => {
stcs.iter()
.map(|(_, &(val, _))| self.count_with_thunks(val))
.sum::<usize>()
+ dyns
.iter()
.map(|&(k, v, _)| self.count_with_thunks(k) + self.count_with_thunks(v))
.sum::<usize>()
}
_ => 0,
}
}
fn collect_all_thunks<'ir>(
&self,
own_thunks: &[(ThunkId, RawIrRef<'ir>)],
body: RawIrRef<'ir>,
) -> Vec<(ThunkId, RawIrRef<'ir>)> {
let mut all = Vec::from(own_thunks);
self.collect_with_thunks_recursive(body, &mut all);
let mut i = 0;
while i < all.len() {
let thunk_body = all[i].1;
self.collect_with_thunks_recursive(thunk_body, &mut all);
i += 1;
}
all
}
fn collect_with_thunks_recursive<'ir>(
&self,
ir: RawIrRef<'ir>,
out: &mut Vec<(ThunkId, RawIrRef<'ir>)>,
) {
match ir.deref() {
Ir::With { thunks, body, .. } => {
for &(id, inner) in thunks.iter() {
out.push((id, inner));
}
self.collect_with_thunks_recursive(*body, out);
}
Ir::TopLevel { thunks, body } => {
for &(id, inner) in thunks.iter() {
out.push((id, inner));
}
self.collect_with_thunks_recursive(*body, out);
}
Ir::If { cond, consq, alter } => {
self.collect_with_thunks_recursive(*cond, out);
self.collect_with_thunks_recursive(*consq, out);
self.collect_with_thunks_recursive(*alter, out);
}
Ir::BinOp { lhs, rhs, .. } => {
self.collect_with_thunks_recursive(*lhs, out);
self.collect_with_thunks_recursive(*rhs, out);
}
Ir::UnOp { rhs, .. } => self.collect_with_thunks_recursive(*rhs, out),
Ir::Call { func, arg, .. } => {
self.collect_with_thunks_recursive(*func, out);
self.collect_with_thunks_recursive(*arg, out);
}
Ir::Assert {
assertion, expr, ..
} => {
self.collect_with_thunks_recursive(*assertion, out);
self.collect_with_thunks_recursive(*expr, out);
}
Ir::Select { expr, default, .. } => {
self.collect_with_thunks_recursive(*expr, out);
if let Some(d) = default {
self.collect_with_thunks_recursive(*d, out);
}
}
Ir::HasAttr { lhs, .. } => self.collect_with_thunks_recursive(*lhs, out),
Ir::ConcatStrings { parts, .. } => {
for p in parts.iter() {
self.collect_with_thunks_recursive(*p, out);
}
}
Ir::Path(p) => self.collect_with_thunks_recursive(*p, out),
Ir::List { items } => {
for item in items.iter() {
self.collect_with_thunks_recursive(*item, out);
}
}
Ir::AttrSet { stcs, dyns } => {
for (_, &(val, _)) in stcs.iter() {
self.collect_with_thunks_recursive(val, out);
}
for &(key, val, _) in dyns.iter() {
self.collect_with_thunks_recursive(key, out);
self.collect_with_thunks_recursive(val, out);
}
}
_ => {}
}
}
fn push_scope(&mut self, has_arg: bool, arg_id: Option<ArgId>, thunk_ids: &[ThunkId]) {
let depth = self.scope_stack.len() as u16;
let thunk_base = if has_arg { 1u32 } else { 0u32 };
let thunk_map = thunk_ids
.iter()
.enumerate()
.map(|(i, &id)| (id, thunk_base + i as u32))
.collect();
self.scope_stack.push(ScopeInfo {
depth,
arg_id,
thunk_map,
});
}
fn pop_scope(&mut self) {
self.scope_stack.pop();
}
fn emit_toplevel(&mut self, ir: RawIrRef<'_>) {
match ir.deref() {
Ir::TopLevel { body, thunks } => {
let with_thunk_count = self.count_with_thunks(*body);
let total_slots = thunks.len() + with_thunk_count;
let all_thunks = self.collect_all_thunks(thunks, *body);
let thunk_ids: Vec<ThunkId> = all_thunks.iter().map(|&(id, _)| id).collect();
self.push_scope(false, None, &thunk_ids);
if total_slots > 0 {
self.emit_op(Op::AllocLocals);
self.emit_u32(total_slots as u32);
}
self.emit_scope_thunks(thunks);
self.emit_expr(*body);
self.emit_op(Op::Return);
self.pop_scope();
}
_ => {
self.push_scope(false, None, &[]);
self.emit_expr(ir);
self.emit_op(Op::Return);
self.pop_scope();
}
}
}
fn emit_scope_thunks(&mut self, thunks: &[(ThunkId, RawIrRef<'_>)]) {
for &(id, inner) in thunks {
let label = format!("e{}", id.0);
let label_idx = self.ctx.intern_string(&label);
let skip_patch = self.emit_jump_placeholder();
let entry_point = self.ctx.get_code_mut().len() as u32;
self.emit_expr(inner);
self.emit_op(Op::Return);
self.patch_jump_target(skip_patch);
self.emit_op(Op::MakeThunk);
self.emit_u32(entry_point);
self.emit_str_id(label_idx);
let (_, local_idx) = self.resolve_thunk(id);
self.emit_op(Op::StoreLocal);
self.emit_u32(local_idx);
}
}
fn emit_expr(&mut self, ir: RawIrRef<'_>) {
match ir.deref() {
&Ir::Int(x) => {
// i32::try_from also rejects values below i32::MIN, which a plain
// upper-bound check would silently truncate to a wrong Smi.
if let Ok(smi) = i32::try_from(x) {
self.emit_op(Op::PushSmi);
self.emit_i32(smi);
} else {
self.emit_op(Op::PushBigInt);
self.emit_i64(x);
}
}
&Ir::Float(x) => {
self.emit_op(Op::PushFloat);
self.emit_f64(x);
}
&Ir::Bool(true) => self.emit_op(Op::PushTrue),
&Ir::Bool(false) => self.emit_op(Op::PushFalse),
Ir::Null => self.emit_op(Op::PushNull),
Ir::Str(s) => {
let idx = self.ctx.intern_string(s.deref());
self.emit_op(Op::PushString);
self.emit_str_id(idx);
}
&Ir::Path(p) => {
self.emit_expr(p);
self.emit_op(Op::ResolvePath);
}
&Ir::If { cond, consq, alter } => {
self.emit_expr(cond);
self.emit_op(Op::ForceBool);
self.emit_op(Op::JumpIfFalse);
let else_placeholder = self.emit_i32_placeholder();
let after_jif = self.ctx.get_code_mut().len();
self.emit_expr(consq);
self.emit_op(Op::Jump);
let end_placeholder = self.emit_i32_placeholder();
let after_jump = self.ctx.get_code_mut().len();
let else_offset = (after_jump as i32) - (after_jif as i32);
self.patch_i32(else_placeholder, else_offset);
self.emit_expr(alter);
let end_offset = (self.ctx.get_code_mut().len() as i32) - (after_jump as i32);
self.patch_i32(end_placeholder, end_offset);
}
&Ir::BinOp { lhs, rhs, kind } => {
self.emit_binop(lhs, rhs, kind);
}
&Ir::UnOp { rhs, kind } => match kind {
UnOpKind::Neg => {
self.emit_expr(rhs);
self.emit_op(Op::OpNeg);
}
UnOpKind::Not => {
self.emit_expr(rhs);
self.emit_op(Op::OpNot);
}
},
&Ir::Func {
body,
ref param,
arg,
ref thunks,
} => {
self.emit_func(arg, thunks, param, body);
}
Ir::AttrSet { stcs, dyns } => {
self.emit_attrset(stcs, dyns);
}
Ir::List { items } => {
for &item in items.iter() {
self.emit_expr(item);
}
self.emit_op(Op::MakeList);
self.emit_u32(items.len() as u32);
}
&Ir::Call { func, arg, span } => {
self.emit_expr(func);
self.emit_expr(arg);
let span_id = self.ctx.register_span(span);
self.emit_op(Op::Call);
self.emit_u32(span_id);
}
&Ir::Arg(id) => {
let (layer, local) = self.resolve_arg(id);
self.emit_load(layer, local);
}
&Ir::TopLevel { body, ref thunks } => {
self.emit_toplevel_inner(body, thunks);
}
&Ir::Select {
expr,
ref attrpath,
default,
span,
} => {
self.emit_select(expr, attrpath, default, span);
}
&Ir::Thunk(id) => {
let (layer, local) = self.resolve_thunk(id);
self.emit_load(layer, local);
}
Ir::Builtins => {
self.emit_op(Op::LoadBuiltins);
}
&Ir::Builtin(name) => {
self.emit_op(Op::LoadBuiltin);
self.emit_u32(name.0.to_usize() as u32);
}
&Ir::ConcatStrings {
ref parts,
force_string,
} => {
for &part in parts.iter() {
self.emit_expr(part);
}
self.emit_op(Op::ConcatStrings);
self.emit_u16(parts.len() as u16);
self.emit_u8(if force_string { 1 } else { 0 });
}
&Ir::HasAttr { lhs, ref rhs } => {
self.emit_has_attr(lhs, rhs);
}
Ir::Assert {
assertion,
expr,
assertion_raw,
span,
} => {
let raw_idx = self.ctx.intern_string(assertion_raw);
let span_id = self.ctx.register_span(*span);
self.emit_expr(*assertion);
self.emit_expr(*expr);
self.emit_op(Op::Assert);
self.emit_str_id(raw_idx);
self.emit_u32(span_id);
}
&Ir::CurPos(span) => {
let span_id = self.ctx.register_span(span);
self.emit_op(Op::MkPos);
self.emit_u32(span_id);
}
&Ir::ReplBinding(name) => {
self.emit_op(Op::LoadReplBinding);
self.emit_str_id(name);
}
&Ir::ScopedImportBinding(name) => {
self.emit_op(Op::LoadScopedBinding);
self.emit_str_id(name);
}
&Ir::With {
namespace,
body,
ref thunks,
} => {
self.emit_with(namespace, body, thunks);
}
&Ir::WithLookup(name) => {
self.emit_op(Op::WithLookup);
self.emit_str_id(name);
}
}
}
fn emit_binop(&mut self, lhs: RawIrRef<'_>, rhs: RawIrRef<'_>, kind: BinOpKind) {
use BinOpKind::*;
match kind {
And => {
self.emit_expr(lhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::JumpIfFalse);
let skip_placeholder = self.emit_i32_placeholder();
let after_jif = self.ctx.get_code_mut().len();
self.emit_expr(rhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::Jump);
let end_placeholder = self.emit_i32_placeholder();
let after_jump = self.ctx.get_code_mut().len();
let false_offset = (after_jump as i32) - (after_jif as i32);
self.patch_i32(skip_placeholder, false_offset);
self.emit_op(Op::PushFalse);
let end_offset = (self.ctx.get_code_mut().len() as i32) - (after_jump as i32);
self.patch_i32(end_placeholder, end_offset);
}
Or => {
self.emit_expr(lhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::JumpIfTrue);
let skip_placeholder = self.emit_i32_placeholder();
let after_jit = self.ctx.get_code_mut().len();
self.emit_expr(rhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::Jump);
let end_placeholder = self.emit_i32_placeholder();
let after_jump = self.ctx.get_code_mut().len();
let true_offset = (after_jump as i32) - (after_jit as i32);
self.patch_i32(skip_placeholder, true_offset);
self.emit_op(Op::PushTrue);
let end_offset = (self.ctx.get_code_mut().len() as i32) - (after_jump as i32);
self.patch_i32(end_placeholder, end_offset);
}
Impl => {
self.emit_expr(lhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::JumpIfFalse);
let skip_placeholder = self.emit_i32_placeholder();
let after_jif = self.ctx.get_code_mut().len();
self.emit_expr(rhs);
self.emit_op(Op::ForceBool);
self.emit_op(Op::Jump);
let end_placeholder = self.emit_i32_placeholder();
let after_jump = self.ctx.get_code_mut().len();
let true_offset = (after_jump as i32) - (after_jif as i32);
self.patch_i32(skip_placeholder, true_offset);
self.emit_op(Op::PushTrue);
let end_offset = (self.ctx.get_code_mut().len() as i32) - (after_jump as i32);
self.patch_i32(end_placeholder, end_offset);
}
PipeL => {
self.emit_expr(rhs);
self.emit_expr(lhs);
self.emit_op(Op::CallNoSpan);
}
PipeR => {
self.emit_expr(lhs);
self.emit_expr(rhs);
self.emit_op(Op::CallNoSpan);
}
_ => {
self.emit_expr(lhs);
self.emit_expr(rhs);
self.emit_op(match kind {
Add => Op::OpAdd,
Sub => Op::OpSub,
Mul => Op::OpMul,
Div => Op::OpDiv,
Eq => Op::OpEq,
Neq => Op::OpNeq,
Lt => Op::OpLt,
Gt => Op::OpGt,
Leq => Op::OpLeq,
Geq => Op::OpGeq,
Con => Op::OpConcat,
Upd => Op::OpUpdate,
_ => unreachable!(),
});
}
}
}
fn emit_func(
&mut self,
arg: ArgId,
thunks: &[(ThunkId, RawIrRef<'_>)],
param: &Option<Param<'_>>,
body: RawIrRef<'_>,
) {
let with_thunk_count = self.count_with_thunks(body);
let total_slots = thunks.len() + with_thunk_count;
let all_thunks = self.collect_all_thunks(thunks, body);
let thunk_ids: Vec<ThunkId> = all_thunks.iter().map(|&(id, _)| id).collect();
let skip_patch = self.emit_jump_placeholder();
let entry_point = self.ctx.get_code().len() as u32;
self.push_scope(true, Some(arg), &thunk_ids);
self.emit_scope_thunks(thunks);
self.emit_expr(body);
self.emit_op(Op::Return);
self.pop_scope();
self.patch_jump_target(skip_patch);
if let Some(Param {
required,
optional,
ellipsis,
}) = param
{
self.emit_op(Op::MakePatternClosure);
self.emit_u32(entry_point);
self.emit_u32(total_slots as u32);
self.emit_u16(required.len() as u16);
self.emit_u16(optional.len() as u16);
self.emit_u8(if *ellipsis { 1 } else { 0 });
for &(sym, _) in required.iter() {
self.emit_str_id(sym);
}
for &(sym, _) in optional.iter() {
self.emit_str_id(sym);
}
for &(sym, span) in required.iter().chain(optional.iter()) {
let span_id = self.ctx.register_span(span);
self.emit_str_id(sym);
self.emit_u32(span_id);
}
} else {
self.emit_op(Op::MakeClosure);
self.emit_u32(entry_point);
self.emit_u32(total_slots as u32);
}
}
fn emit_attrset(
&mut self,
stcs: &crate::ir::HashMap<'_, StringId, (RawIrRef<'_>, TextRange)>,
dyns: &[(RawIrRef<'_>, RawIrRef<'_>, TextRange)],
) {
if stcs.is_empty() && dyns.is_empty() {
self.emit_op(Op::MakeEmptyAttrs);
return;
}
if !dyns.is_empty() {
for (&sym, &(val, _)) in stcs.iter() {
self.emit_op(Op::PushString);
self.emit_str_id(sym);
self.emit_expr(val);
}
for (_, &(_, span)) in stcs.iter() {
let span_id = self.ctx.register_span(span);
self.emit_op(Op::PushSmi);
self.emit_u32(span_id);
}
for &(key, val, span) in dyns.iter() {
self.emit_expr(key);
self.emit_expr(val);
let span_id = self.ctx.register_span(span);
self.emit_op(Op::PushSmi);
self.emit_u32(span_id);
}
self.emit_op(Op::MakeAttrsDyn);
self.emit_u32(stcs.len() as u32);
self.emit_u32(dyns.len() as u32);
} else {
for (&sym, &(val, _)) in stcs.iter() {
self.emit_op(Op::PushString);
self.emit_str_id(sym);
self.emit_expr(val);
}
for (_, &(_, span)) in stcs.iter() {
let span_id = self.ctx.register_span(span);
self.emit_op(Op::PushSmi);
self.emit_u32(span_id);
}
self.emit_op(Op::MakeAttrs);
self.emit_u32(stcs.len() as u32);
}
}
fn emit_select(
&mut self,
expr: RawIrRef<'_>,
attrpath: &[Attr<RawIrRef<'_>>],
default: Option<RawIrRef<'_>>,
span: TextRange,
) {
self.emit_expr(expr);
for attr in attrpath.iter() {
match *attr {
Attr::Str(sym, _) => {
self.emit_op(Op::PushString);
self.emit_str_id(sym);
}
Attr::Dynamic(expr, _) => {
self.emit_expr(expr);
}
}
}
if let Some(default) = default {
self.emit_expr(default);
let span_id = self.ctx.register_span(span);
self.emit_op(Op::SelectDefault);
self.emit_u16(attrpath.len() as u16);
self.emit_u32(span_id);
} else {
let span_id = self.ctx.register_span(span);
self.emit_op(Op::Select);
self.emit_u16(attrpath.len() as u16);
self.emit_u32(span_id);
}
}
fn emit_has_attr(&mut self, lhs: RawIrRef<'_>, rhs: &[Attr<RawIrRef<'_>>]) {
self.emit_expr(lhs);
for attr in rhs.iter() {
match *attr {
Attr::Str(sym, _) => {
self.emit_op(Op::PushString);
self.emit_str_id(sym);
}
Attr::Dynamic(expr, _) => {
self.emit_expr(expr);
}
}
}
self.emit_op(Op::HasAttr);
self.emit_u16(rhs.len() as u16);
}
fn emit_with(
&mut self,
namespace: RawIrRef<'_>,
body: RawIrRef<'_>,
thunks: &[(ThunkId, RawIrRef<'_>)],
) {
self.emit_expr(namespace);
self.emit_op(Op::PushWith);
self.emit_scope_thunks(thunks);
self.emit_expr(body);
self.emit_op(Op::PopWith);
}
fn emit_toplevel_inner(&mut self, body: RawIrRef<'_>, thunks: &[(ThunkId, RawIrRef<'_>)]) {
self.emit_scope_thunks(thunks);
self.emit_expr(body);
}
}

fix/src/derivation.rs Normal file

@@ -0,0 +1,142 @@
use std::collections::{BTreeMap, BTreeSet};
pub struct OutputInfo {
pub path: String,
pub hash_algo: String,
pub hash: String,
}
pub struct DerivationData {
pub name: String,
pub outputs: BTreeMap<String, OutputInfo>,
pub input_drvs: BTreeMap<String, BTreeSet<String>>,
pub input_srcs: BTreeSet<String>,
pub platform: String,
pub builder: String,
pub args: Vec<String>,
pub env: BTreeMap<String, String>,
}
fn escape_string(s: &str) -> String {
let mut result = String::with_capacity(s.len() + 2);
result.push('"');
for c in s.chars() {
match c {
'"' => result.push_str("\\\""),
'\\' => result.push_str("\\\\"),
'\n' => result.push_str("\\n"),
'\r' => result.push_str("\\r"),
'\t' => result.push_str("\\t"),
_ => result.push(c),
}
}
result.push('"');
result
}
fn quote_string(s: &str) -> String {
format!("\"{}\"", s)
}
impl DerivationData {
pub fn generate_aterm(&self) -> String {
let mut output_entries = Vec::new();
for (name, info) in &self.outputs {
output_entries.push(format!(
"({},{},{},{})",
quote_string(name),
quote_string(&info.path),
quote_string(&info.hash_algo),
quote_string(&info.hash),
));
}
let outputs = output_entries.join(",");
let mut input_drv_entries = Vec::new();
for (drv_path, output_names) in &self.input_drvs {
let sorted_outs: Vec<String> = output_names.iter().map(|s| quote_string(s)).collect();
let out_list = format!("[{}]", sorted_outs.join(","));
input_drv_entries.push(format!("({},{})", quote_string(drv_path), out_list));
}
let input_drvs = input_drv_entries.join(",");
let input_srcs: Vec<String> = self.input_srcs.iter().map(|s| quote_string(s)).collect();
let input_srcs = input_srcs.join(",");
let args: Vec<String> = self.args.iter().map(|s| escape_string(s)).collect();
let args = args.join(",");
let mut env_entries: Vec<String> = Vec::new();
for (k, v) in &self.env {
env_entries.push(format!("({},{})", escape_string(k), escape_string(v)));
}
format!(
"Derive([{}],[{}],[{}],{},{},[{}],[{}])",
outputs,
input_drvs,
input_srcs,
quote_string(&self.platform),
escape_string(&self.builder),
args,
env_entries.join(","),
)
}
pub fn generate_aterm_modulo(&self, input_drv_hashes: &BTreeMap<String, String>) -> String {
let mut output_entries = Vec::new();
for (name, info) in &self.outputs {
output_entries.push(format!(
"({},{},{},{})",
quote_string(name),
quote_string(&info.path),
quote_string(&info.hash_algo),
quote_string(&info.hash),
));
}
let outputs = output_entries.join(",");
let mut input_drv_entries = Vec::new();
for (drv_hash, outputs_csv) in input_drv_hashes {
let mut sorted_outs: Vec<&str> = outputs_csv.split(',').collect();
sorted_outs.sort();
let out_list: Vec<String> = sorted_outs.iter().map(|s| quote_string(s)).collect();
let out_list = format!("[{}]", out_list.join(","));
input_drv_entries.push(format!("({},{})", quote_string(drv_hash), out_list));
}
let input_drvs = input_drv_entries.join(",");
let input_srcs: Vec<String> = self.input_srcs.iter().map(|s| quote_string(s)).collect();
let input_srcs = input_srcs.join(",");
let args: Vec<String> = self.args.iter().map(|s| escape_string(s)).collect();
let args = args.join(",");
let mut env_entries: Vec<String> = Vec::new();
for (k, v) in &self.env {
env_entries.push(format!("({},{})", escape_string(k), escape_string(v)));
}
format!(
"Derive([{}],[{}],[{}],{},{},[{}],[{}])",
outputs,
input_drvs,
input_srcs,
quote_string(&self.platform),
escape_string(&self.builder),
args,
env_entries.join(","),
)
}
pub fn collect_references(&self) -> Vec<String> {
let mut refs = BTreeSet::new();
for src in &self.input_srcs {
refs.insert(src.clone());
}
for drv_path in self.input_drvs.keys() {
refs.insert(drv_path.clone());
}
refs.into_iter().collect()
}
}

fix/src/disassembler.rs Normal file

@@ -0,0 +1,372 @@
use std::fmt::Write;
use colored::Colorize;
use num_enum::TryFromPrimitive;
use crate::codegen::{Bytecode, Op};
pub(crate) trait DisassemblerContext {
fn lookup_string(&self, id: u32) -> &str;
}
pub(crate) struct Disassembler<'a, Ctx> {
code: &'a [u8],
ctx: &'a Ctx,
pos: usize,
}
impl<'a, Ctx: DisassemblerContext> Disassembler<'a, Ctx> {
pub fn new(bytecode: &'a Bytecode, ctx: &'a Ctx) -> Self {
Self {
code: &bytecode.code,
ctx,
pos: 0,
}
}
fn read_u8(&mut self) -> u8 {
let b = self.code[self.pos];
self.pos += 1;
b
}
fn read_u16(&mut self) -> u16 {
let bytes = self.code[self.pos..self.pos + 2]
.try_into()
.expect("not enough bytes");
self.pos += 2;
u16::from_le_bytes(bytes)
}
fn read_u32(&mut self) -> u32 {
let bytes = self.code[self.pos..self.pos + 4]
.try_into()
.expect("not enough bytes");
self.pos += 4;
u32::from_le_bytes(bytes)
}
fn read_i32(&mut self) -> i32 {
let bytes = self.code[self.pos..self.pos + 4]
.try_into()
.expect("not enough bytes");
self.pos += 4;
i32::from_le_bytes(bytes)
}
fn read_i64(&mut self) -> i64 {
let bytes = self.code[self.pos..self.pos + 8]
.try_into()
.expect("not enough bytes");
self.pos += 8;
i64::from_le_bytes(bytes)
}
fn read_f64(&mut self) -> f64 {
let bytes = self.code[self.pos..self.pos + 8]
.try_into()
.expect("not enough bytes");
self.pos += 8;
f64::from_le_bytes(bytes)
}
pub fn disassemble(&mut self) -> String {
self.disassemble_impl(false)
}
pub fn disassemble_colored(&mut self) -> String {
self.disassemble_impl(true)
}
fn disassemble_impl(&mut self, color: bool) -> String {
let mut out = String::new();
if color {
let _ = writeln!(out, "{}", "=== Bytecode Disassembly ===".bold().white());
let _ = writeln!(
out,
"{} {}",
"Length:".white(),
format!("{} bytes", self.code.len()).cyan()
);
} else {
let _ = writeln!(out, "=== Bytecode Disassembly ===");
let _ = writeln!(out, "Length: {} bytes", self.code.len());
}
while self.pos < self.code.len() {
let start_pos = self.pos;
let op_byte = self.read_u8();
let (mnemonic, args) = self.decode_instruction(op_byte, start_pos);
let bytes_slice = &self.code[start_pos + 1..self.pos];
for (i, chunk) in bytes_slice.chunks(4).enumerate() {
let bytes_str = {
let mut temp = String::new();
if i == 0 {
let _ = write!(&mut temp, "{:02x}", self.code[start_pos]);
} else {
let _ = write!(&mut temp, " ");
}
for b in chunk.iter() {
let _ = write!(&mut temp, " {:02x}", b);
}
temp
};
if i == 0 {
if color {
let sep = if args.is_empty() { "" } else { " " };
let _ = writeln!(
out,
"{} {:<14} | {}{}{}",
format!("{:04x}", start_pos).dimmed(),
bytes_str.green(),
mnemonic.yellow().bold(),
sep,
args.cyan()
);
} else {
let op_str = if args.is_empty() {
mnemonic.to_string()
} else {
format!("{} {}", mnemonic, args)
};
let _ = writeln!(out, "{:04x} {:<14} | {}", start_pos, bytes_str, op_str);
}
} else {
// checked_ilog2 avoids a panic when the first instruction sits at offset 0
let extra_width = start_pos.checked_ilog2().unwrap_or(0) >> 4;
if color {
let _ = write!(out, " ");
for _ in 0..extra_width {
let _ = write!(out, " ");
}
let _ = writeln!(out, " {:<14} |", bytes_str.green());
} else {
let _ = write!(out, " ");
for _ in 0..extra_width {
let _ = write!(out, " ");
}
let _ = writeln!(out, " {:<14} |", bytes_str);
}
}
}
}
out
}
fn decode_instruction(&mut self, op_byte: u8, current_pc: usize) -> (&'static str, String) {
let op = Op::try_from_primitive(op_byte).expect("invalid op code");
match op {
Op::PushSmi => {
let val = self.read_i32();
("PushSmi", format!("{}", val))
}
Op::PushBigInt => {
let val = self.read_i64();
("PushBigInt", format!("{}", val))
}
Op::PushFloat => {
let val = self.read_f64();
("PushFloat", format!("{}", val))
}
Op::PushString => {
let idx = self.read_u32();
let s = self.ctx.lookup_string(idx);
let len = s.len();
let mut s_fmt = format!("{:?}", s);
if s_fmt.len() > 60 {
s_fmt.truncate(57);
#[allow(clippy::unwrap_used)]
write!(s_fmt, "...\" (total {len} bytes)").unwrap();
}
("PushString", format!("@{} {}", idx, s_fmt))
}
Op::PushNull => ("PushNull", String::new()),
Op::PushTrue => ("PushTrue", String::new()),
Op::PushFalse => ("PushFalse", String::new()),
Op::LoadLocal => {
let idx = self.read_u32();
("LoadLocal", format!("[{}]", idx))
}
Op::LoadOuter => {
let depth = self.read_u8();
let idx = self.read_u32();
("LoadOuter", format!("depth={} [{}]", depth, idx))
}
Op::StoreLocal => {
let idx = self.read_u32();
("StoreLocal", format!("[{}]", idx))
}
Op::AllocLocals => {
let count = self.read_u32();
("AllocLocals", format!("count={}", count))
}
Op::MakeThunk => {
let offset = self.read_u32();
let label_idx = self.read_u32();
let label = self.ctx.lookup_string(label_idx);
("MakeThunk", format!("-> {:04x} label={}", offset, label))
}
Op::MakeClosure => {
let offset = self.read_u32();
let slots = self.read_u32();
("MakeClosure", format!("-> {:04x} slots={}", offset, slots))
}
Op::MakePatternClosure => {
let offset = self.read_u32();
let slots = self.read_u32();
let req_count = self.read_u16();
let opt_count = self.read_u16();
let ellipsis = self.read_u8() != 0;
let mut arg_str = format!(
"-> {:04x} slots={} req={} opt={} ...={}",
offset, slots, req_count, opt_count, ellipsis
);
arg_str.push_str(" Args=[");
for _ in 0..req_count {
let idx = self.read_u32();
arg_str.push_str(&format!("Req({}) ", self.ctx.lookup_string(idx)));
}
for _ in 0..opt_count {
let idx = self.read_u32();
arg_str.push_str(&format!("Opt({}) ", self.ctx.lookup_string(idx)));
}
let total_args = req_count + opt_count;
for _ in 0..total_args {
let _name_idx = self.read_u32();
let _span_id = self.read_u32();
}
arg_str.push(']');
("MakePatternClosure", arg_str)
}
Op::Call => {
let span_id = self.read_u32();
("Call", format!("span={}", span_id))
}
Op::CallNoSpan => ("CallNoSpan", String::new()),
Op::MakeAttrs => {
let count = self.read_u32();
("MakeAttrs", format!("size={}", count))
}
Op::MakeAttrsDyn => {
let static_count = self.read_u32();
let dyn_count = self.read_u32();
(
"MakeAttrsDyn",
format!("static={} dyn={}", static_count, dyn_count),
)
}
Op::MakeEmptyAttrs => ("MakeEmptyAttrs", String::new()),
Op::Select => {
let path_len = self.read_u16();
let span_id = self.read_u32();
("Select", format!("path_len={} span={}", path_len, span_id))
}
Op::SelectDefault => {
let path_len = self.read_u16();
let span_id = self.read_u32();
(
"SelectDefault",
format!("path_len={} span={}", path_len, span_id),
)
}
Op::HasAttr => {
let path_len = self.read_u16();
("HasAttr", format!("path_len={}", path_len))
}
Op::MakeList => {
let count = self.read_u32();
("MakeList", format!("size={}", count))
}
Op::OpAdd => ("OpAdd", String::new()),
Op::OpSub => ("OpSub", String::new()),
Op::OpMul => ("OpMul", String::new()),
Op::OpDiv => ("OpDiv", String::new()),
Op::OpEq => ("OpEq", String::new()),
Op::OpNeq => ("OpNeq", String::new()),
Op::OpLt => ("OpLt", String::new()),
Op::OpGt => ("OpGt", String::new()),
Op::OpLeq => ("OpLeq", String::new()),
Op::OpGeq => ("OpGeq", String::new()),
Op::OpConcat => ("OpConcat", String::new()),
Op::OpUpdate => ("OpUpdate", String::new()),
Op::OpNeg => ("OpNeg", String::new()),
Op::OpNot => ("OpNot", String::new()),
Op::ForceBool => ("ForceBool", String::new()),
Op::JumpIfFalse => {
let offset = self.read_i32();
let target = (current_pc as isize + 1 + 4 + offset as isize) as usize;
(
"JumpIfFalse",
format!("-> {:04x} offset={}", target, offset),
)
}
Op::JumpIfTrue => {
let offset = self.read_i32();
let target = (current_pc as isize + 1 + 4 + offset as isize) as usize;
("JumpIfTrue", format!("-> {:04x} offset={}", target, offset))
}
Op::Jump => {
let offset = self.read_i32();
let target = (current_pc as isize + 1 + 4 + offset as isize) as usize;
("Jump", format!("-> {:04x} offset={}", target, offset))
}
Op::ConcatStrings => {
let count = self.read_u16();
let force = self.read_u8();
("ConcatStrings", format!("count={} force={}", count, force))
}
Op::ResolvePath => ("ResolvePath", String::new()),
Op::Assert => {
let raw_idx = self.read_u32();
let span_id = self.read_u32();
("Assert", format!("text_id={} span={}", raw_idx, span_id))
}
Op::PushWith => ("PushWith", String::new()),
Op::PopWith => ("PopWith", String::new()),
Op::WithLookup => {
let idx = self.read_u32();
let name = self.ctx.lookup_string(idx);
("WithLookup", format!("{:?}", name))
}
Op::LoadBuiltins => ("LoadBuiltins", String::new()),
Op::LoadBuiltin => {
let idx = self.read_u32();
let name = self.ctx.lookup_string(idx);
("LoadBuiltin", format!("{:?}", name))
}
Op::MkPos => {
let span_id = self.read_u32();
("MkPos", format!("id={}", span_id))
}
Op::LoadReplBinding => {
let idx = self.read_u32();
let name = self.ctx.lookup_string(idx);
("LoadReplBinding", format!("{:?}", name))
}
Op::LoadScopedBinding => {
let idx = self.read_u32();
let name = self.ctx.lookup_string(idx);
("LoadScopedBinding", format!("{:?}", name))
}
Op::Return => ("Return", String::new()),
}
}
}
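The three jump opcodes above encode an `i32` offset relative to the byte after the operand, so the decoded target is `pc + 1 (opcode) + 4 (operand) + offset`. A standalone sketch of that arithmetic (the function name is illustrative, not from the diff):

```rust
// Jump targets are relative to the byte after the opcode (1 byte) and
// its i32 operand (4 bytes), matching the decode logic above.
fn jump_target(current_pc: usize, offset: i32) -> usize {
    (current_pc as isize + 1 + 4 + offset as isize) as usize
}

fn main() {
    // A Jump at pc 0x10 with offset 8 lands at 0x10 + 5 + 8 = 0x1d.
    assert_eq!(jump_target(0x10, 8), 0x1d);
    // Negative offsets jump backwards (e.g. a loop back-edge).
    assert_eq!(jump_target(0x20, -16), 0x15);
}
```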

fix/src/downgrade.rs (new file, 1249 lines)

File diff suppressed because it is too large.

fix/src/error.rs (new file, 226 lines)
use std::path::{Path, PathBuf};
use std::sync::Arc;
use miette::{Diagnostic, NamedSource, SourceSpan};
use thiserror::Error;
pub type Result<T> = core::result::Result<T, Box<Error>>;
#[derive(Clone, Debug)]
pub enum SourceType {
/// Inline source evaluated from the command line; the path is the working directory
Eval(Arc<PathBuf>),
/// REPL input; the path is the working directory
Repl(Arc<PathBuf>),
/// Source read from a file on disk
File(Arc<PathBuf>),
/// Virtual source identified by name only, with no backing path
Virtual(Arc<str>),
}
#[derive(Clone, Debug)]
pub struct Source {
pub ty: SourceType,
pub src: Arc<str>,
}
impl TryFrom<&str> for Source {
type Error = Box<Error>;
fn try_from(value: &str) -> Result<Self> {
Source::new_eval(value.into())
}
}
impl From<Source> for NamedSource<Arc<str>> {
fn from(value: Source) -> Self {
let name = value.get_name();
NamedSource::new(name, value.src.clone())
}
}
impl Source {
pub fn new_file(path: PathBuf) -> std::io::Result<Self> {
Ok(Source {
src: std::fs::read_to_string(&path)?.into(),
ty: crate::error::SourceType::File(Arc::new(path)),
})
}
pub fn new_eval(src: String) -> Result<Self> {
Ok(Self {
ty: std::env::current_dir()
.map_err(|err| Error::internal(format!("Failed to get current working dir: {err}")))
.map(Arc::new)
.map(SourceType::Eval)?,
src: src.into(),
})
}
pub fn new_repl(src: String) -> Result<Self> {
Ok(Self {
ty: std::env::current_dir()
.map_err(|err| Error::internal(format!("Failed to get current working dir: {err}")))
.map(Arc::new)
.map(SourceType::Repl)?,
src: src.into(),
})
}
pub fn new_virtual(name: Arc<str>, src: String) -> Self {
Self {
ty: SourceType::Virtual(name),
src: src.into(),
}
}
pub fn get_dir(&self) -> &Path {
use SourceType::*;
match &self.ty {
Eval(dir) | Repl(dir) => dir.as_ref(),
File(file) => file
.as_path()
.parent()
.expect("source file must have a parent dir"),
Virtual(_) => Path::new("/"),
}
}
pub fn get_name(&self) -> String {
match &self.ty {
SourceType::Eval(_) => "«eval»".into(),
SourceType::Repl(_) => "«repl»".into(),
SourceType::File(path) => path.as_os_str().to_string_lossy().to_string(),
SourceType::Virtual(name) => name.to_string(),
}
}
}
#[derive(Error, Debug, Diagnostic)]
pub enum Error {
#[error("Parse error: {message}")]
#[diagnostic(code(nix::parse))]
ParseError {
#[source_code]
src: Option<NamedSource<Arc<str>>>,
#[label("error occurred here")]
span: Option<SourceSpan>,
message: String,
},
#[error("Downgrade error: {message}")]
#[diagnostic(code(nix::downgrade))]
DowngradeError {
#[source_code]
src: Option<NamedSource<Arc<str>>>,
#[label("{message}")]
span: Option<SourceSpan>,
message: String,
},
#[error("Evaluation error: {message}")]
#[diagnostic(code(nix::eval))]
EvalError {
#[source_code]
src: Option<NamedSource<Arc<str>>>,
#[label("error occurred here")]
span: Option<SourceSpan>,
message: String,
#[help]
js_backtrace: Option<String>,
#[related]
stack_trace: Vec<StackFrame>,
},
#[error("Internal error: {message}")]
#[diagnostic(code(nix::internal))]
InternalError { message: String },
#[error("{message}")]
#[diagnostic(code(nix::catchable))]
Catchable { message: String },
#[error("Unknown error")]
#[diagnostic(code(nix::unknown))]
Unknown,
}
impl Error {
pub fn parse_error(msg: String) -> Box<Self> {
Error::ParseError {
src: None,
span: None,
message: msg,
}
.into()
}
pub fn downgrade_error(msg: String, src: Source, span: rnix::TextRange) -> Box<Self> {
Error::DowngradeError {
src: Some(src.into()),
span: Some(text_range_to_source_span(span)),
message: msg,
}
.into()
}
pub fn eval_error(msg: String, backtrace: Option<String>) -> Box<Self> {
Error::EvalError {
src: None,
span: None,
message: msg,
js_backtrace: backtrace,
stack_trace: Vec::new(),
}
.into()
}
pub fn internal(msg: String) -> Box<Self> {
Error::InternalError { message: msg }.into()
}
pub fn catchable(msg: String) -> Box<Self> {
Error::Catchable { message: msg }.into()
}
pub fn with_span(mut self: Box<Self>, span: rnix::TextRange) -> Box<Self> {
use Error::*;
let source_span = Some(text_range_to_source_span(span));
let (ParseError { span, .. } | DowngradeError { span, .. } | EvalError { span, .. }) =
self.as_mut()
else {
return self;
};
*span = source_span;
self
}
pub fn with_source(mut self: Box<Self>, source: Source) -> Box<Self> {
use Error::*;
let new_src = Some(source.into());
let (ParseError { src, .. } | DowngradeError { src, .. } | EvalError { src, .. }) =
self.as_mut()
else {
return self;
};
*src = new_src;
self
}
}
pub fn text_range_to_source_span(range: rnix::TextRange) -> SourceSpan {
let start = usize::from(range.start());
let len = usize::from(range.end()) - start;
SourceSpan::new(start.into(), len)
}
/// Stack frame types from Nix evaluation
#[derive(Debug, Clone, Error, Diagnostic)]
#[error("{message}")]
pub struct StackFrame {
#[label]
pub span: SourceSpan,
#[help]
pub message: String,
#[source_code]
pub src: NamedSource<Arc<str>>,
}

fix/src/fetcher.rs (new file, 305 lines)
use deno_core::OpState;
use deno_core::ToV8;
use deno_core::op2;
use nix_compat::nixhash::HashAlgo;
use nix_compat::nixhash::NixHash;
use tracing::{debug, info, warn};
use crate::store::Store as _;
mod archive;
pub(crate) mod cache;
mod download;
mod git;
mod metadata_cache;
pub use cache::FetcherCache;
pub use download::Downloader;
pub use metadata_cache::MetadataCache;
use crate::nar;
#[derive(ToV8)]
pub struct FetchUrlResult {
pub store_path: String,
pub hash: String,
}
#[derive(ToV8)]
pub struct FetchTarballResult {
pub store_path: String,
pub nar_hash: String,
}
#[derive(ToV8)]
pub struct FetchGitResult {
pub out_path: String,
pub rev: String,
pub short_rev: String,
pub rev_count: u64,
pub last_modified: u64,
pub last_modified_date: String,
pub submodules: bool,
pub nar_hash: Option<String>,
}
#[op2]
pub fn op_fetch_url<Ctx: RuntimeContext>(
state: &mut OpState,
#[string] url: String,
#[string] expected_hash: Option<String>,
#[string] name: Option<String>,
executable: bool,
) -> Result<FetchUrlResult, NixRuntimeError> {
let _span = tracing::info_span!("op_fetch_url", url = %url).entered();
info!("fetchurl started");
let file_name =
name.unwrap_or_else(|| url.rsplit('/').next().unwrap_or("download").to_string());
let metadata_cache =
MetadataCache::new(3600).map_err(|e| NixRuntimeError::from(e.to_string()))?;
let input = serde_json::json!({
"type": "file",
"url": url,
"name": file_name,
"executable": executable,
});
if let Some(cached_entry) = metadata_cache
.lookup(&input)
.map_err(|e| NixRuntimeError::from(e.to_string()))?
{
let cached_hash = cached_entry
.info
.get("hash")
.and_then(|v| v.as_str())
.unwrap_or("");
if let Some(ref expected) = expected_hash {
let normalized_expected = normalize_hash(expected);
if cached_hash != normalized_expected {
warn!("Cached hash mismatch, re-fetching");
} else {
info!("Cache hit");
return Ok(FetchUrlResult {
store_path: cached_entry.store_path.clone(),
hash: cached_hash.to_string(),
});
}
} else {
info!("Cache hit (no hash check)");
return Ok(FetchUrlResult {
store_path: cached_entry.store_path.clone(),
hash: cached_hash.to_string(),
});
}
}
info!("Cache miss, downloading");
let downloader = Downloader::new();
let data = downloader
.download(&url)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!(bytes = data.len(), "Download complete");
let hash = crate::nix_utils::sha256_hex(&data);
if let Some(ref expected) = expected_hash {
let normalized_expected = normalize_hash(expected);
if hash != normalized_expected {
return Err(NixRuntimeError::from(format!(
"hash mismatch for '{}': expected {}, got {}",
url, normalized_expected, hash
)));
}
}
let ctx: &Ctx = state.get_ctx();
let store = ctx.get_store();
let store_path = store
.add_to_store(&file_name, &data, false, vec![])
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!(store_path = %store_path, "Added to store");
#[cfg(unix)]
if executable {
use std::os::unix::fs::PermissionsExt;
if let Ok(metadata) = std::fs::metadata(&store_path) {
let mut perms = metadata.permissions();
perms.set_mode(0o755);
let _ = std::fs::set_permissions(&store_path, perms);
}
}
let info = serde_json::json!({
"hash": hash,
"url": url,
});
metadata_cache
.add(&input, &info, &store_path, true)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
Ok(FetchUrlResult { store_path, hash })
}
#[op2]
pub fn op_fetch_tarball<Ctx: RuntimeContext>(
state: &mut OpState,
#[string] url: String,
#[string] name: Option<String>,
#[string] sha256: Option<String>,
) -> Result<FetchTarballResult, NixRuntimeError> {
let _span = tracing::info_span!("op_fetch_tarball", url = %url).entered();
info!("fetchTarball started");
let dir_name = name.unwrap_or_else(|| "source".to_string());
let metadata_cache =
MetadataCache::new(3600).map_err(|e| NixRuntimeError::from(e.to_string()))?;
let input = serde_json::json!({
"type": "tarball",
"url": url,
"name": dir_name,
});
let expected_sha256 = sha256
.map(
|ref sha256| match NixHash::from_str(sha256, Some(HashAlgo::Sha256)) {
Ok(NixHash::Sha256(digest)) => Ok(digest),
_ => Err(format!("fetchTarball: invalid sha256 '{sha256}'")),
},
)
.transpose()?;
let expected_hex = expected_sha256.map(hex::encode);
if let Some(cached_entry) = metadata_cache
.lookup(&input)
.map_err(|e| NixRuntimeError::from(e.to_string()))?
{
let cached_nar_hash = cached_entry
.info
.get("nar_hash")
.and_then(|v| v.as_str())
.unwrap_or("");
if let Some(ref hex) = expected_hex {
if cached_nar_hash == hex {
info!("Cache hit");
return Ok(FetchTarballResult {
store_path: cached_entry.store_path.clone(),
nar_hash: cached_nar_hash.to_string(),
});
}
} else if !cached_entry.is_expired(3600) {
info!("Cache hit (no hash check)");
return Ok(FetchTarballResult {
store_path: cached_entry.store_path.clone(),
nar_hash: cached_nar_hash.to_string(),
});
}
}
info!("Cache miss, downloading tarball");
let downloader = Downloader::new();
let data = downloader
.download(&url)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!(bytes = data.len(), "Download complete");
info!("Extracting tarball");
let (extracted_path, _temp_dir) = archive::extract_tarball_to_temp(&data)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!("Computing NAR hash");
let nar_hash =
nar::compute_nar_hash(&extracted_path).map_err(|e| NixRuntimeError::from(e.to_string()))?;
let nar_hash_hex = hex::encode(nar_hash);
debug!(
nar_hash = %nar_hash_hex,
"Hash computation complete"
);
if let Some(ref expected) = expected_hex
&& nar_hash_hex != *expected
{
return Err(NixRuntimeError::from(format!(
"Tarball hash mismatch for '{}': expected {}, got {}",
url, expected, nar_hash_hex
)));
}
info!("Adding to store");
let ctx: &Ctx = state.get_ctx();
let store = ctx.get_store();
let store_path = store
.add_to_store_from_path(&dir_name, &extracted_path, vec![])
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
info!(store_path = %store_path, "Added to store");
let info = serde_json::json!({
"nar_hash": nar_hash_hex,
"url": url,
});
let immutable = expected_sha256.is_some();
metadata_cache
.add(&input, &info, &store_path, immutable)
.map_err(|e| NixRuntimeError::from(e.to_string()))?;
Ok(FetchTarballResult {
store_path,
nar_hash: nar_hash_hex,
})
}
#[op2]
pub fn op_fetch_git<Ctx: RuntimeContext>(
state: &mut OpState,
#[string] url: String,
#[string] git_ref: Option<String>,
#[string] rev: Option<String>,
shallow: bool,
submodules: bool,
all_refs: bool,
#[string] name: Option<String>,
) -> Result<FetchGitResult, NixRuntimeError> {
let _span = tracing::info_span!("op_fetch_git", url = %url).entered();
info!("fetchGit started");
let cache = FetcherCache::new().map_err(|e| NixRuntimeError::from(e.to_string()))?;
let dir_name = name.unwrap_or_else(|| "source".to_string());
let ctx: &Ctx = state.get_ctx();
let store = ctx.get_store();
git::fetch_git(
&cache,
store,
&url,
git_ref.as_deref(),
rev.as_deref(),
shallow,
submodules,
all_refs,
&dir_name,
)
.map_err(|e| NixRuntimeError::from(e.to_string()))
}
fn normalize_hash(hash: &str) -> String {
use base64::prelude::*;
if hash.starts_with("sha256-")
&& let Some(b64) = hash.strip_prefix("sha256-")
&& let Ok(bytes) = BASE64_STANDARD.decode(b64)
{
return hex::encode(bytes);
}
hash.to_string()
}

fix/src/fetcher/archive.rs (new file, 173 lines)
use std::fs;
use std::io::Cursor;
use std::os::unix::ffi::OsStrExt;
use std::path::{Path, PathBuf};
use flate2::read::GzDecoder;
#[derive(Debug, Clone, Copy)]
pub enum ArchiveFormat {
TarGz,
TarXz,
TarBz2,
Tar,
}
impl ArchiveFormat {
pub fn detect(url: &str, data: &[u8]) -> Self {
if url.ends_with(".tar.gz") || url.ends_with(".tgz") {
return ArchiveFormat::TarGz;
}
if url.ends_with(".tar.xz") || url.ends_with(".txz") {
return ArchiveFormat::TarXz;
}
if url.ends_with(".tar.bz2") || url.ends_with(".tbz2") {
return ArchiveFormat::TarBz2;
}
if url.ends_with(".tar") {
return ArchiveFormat::Tar;
}
if data.len() >= 2 && data[0] == 0x1f && data[1] == 0x8b {
return ArchiveFormat::TarGz;
}
if data.len() >= 6 && &data[0..6] == b"\xfd7zXZ\x00" {
return ArchiveFormat::TarXz;
}
if data.len() >= 3 && &data[0..3] == b"BZh" {
return ArchiveFormat::TarBz2;
}
ArchiveFormat::TarGz
}
}
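When the URL extension is inconclusive, `detect` falls back to magic bytes: gzip streams begin `1f 8b`, xz streams `fd 37 7a 58 5a 00`, and bzip2 streams `BZh`. A minimal standalone sketch of the same checks (the `sniff` name and string labels are illustrative):

```rust
// Magic-byte sniffing mirroring ArchiveFormat::detect's fallback path.
fn sniff(data: &[u8]) -> &'static str {
    if data.len() >= 2 && data[..2] == [0x1f, 0x8b] {
        "tar.gz" // gzip
    } else if data.len() >= 6 && &data[..6] == b"\xfd7zXZ\x00" {
        "tar.xz" // xz
    } else if data.len() >= 3 && &data[..3] == b"BZh" {
        "tar.bz2" // bzip2
    } else {
        "tar.gz" // default, matching the original's fallback
    }
}

fn main() {
    assert_eq!(sniff(&[0x1f, 0x8b, 0x08]), "tar.gz");
    assert_eq!(sniff(b"\xfd7zXZ\x00data"), "tar.xz");
    assert_eq!(sniff(b"BZh91AY"), "tar.bz2");
}
```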
pub fn extract_tarball(data: &[u8], dest: &Path) -> Result<PathBuf, ArchiveError> {
let format = ArchiveFormat::detect("", data);
let temp_dir = dest.join("_extract_temp");
fs::create_dir_all(&temp_dir)?;
match format {
ArchiveFormat::TarGz => extract_tar_gz(data, &temp_dir)?,
ArchiveFormat::TarXz => extract_tar_xz(data, &temp_dir)?,
ArchiveFormat::TarBz2 => extract_tar_bz2(data, &temp_dir)?,
ArchiveFormat::Tar => extract_tar(data, &temp_dir)?,
}
strip_single_toplevel(&temp_dir, dest)
}
fn extract_tar_gz(data: &[u8], dest: &Path) -> Result<(), ArchiveError> {
let decoder = GzDecoder::new(Cursor::new(data));
let mut archive = tar::Archive::new(decoder);
archive.unpack(dest)?;
Ok(())
}
fn extract_tar_xz(data: &[u8], dest: &Path) -> Result<(), ArchiveError> {
let decoder = xz2::read::XzDecoder::new(Cursor::new(data));
let mut archive = tar::Archive::new(decoder);
archive.unpack(dest)?;
Ok(())
}
fn extract_tar_bz2(data: &[u8], dest: &Path) -> Result<(), ArchiveError> {
let decoder = bzip2::read::BzDecoder::new(Cursor::new(data));
let mut archive = tar::Archive::new(decoder);
archive.unpack(dest)?;
Ok(())
}
fn extract_tar(data: &[u8], dest: &Path) -> Result<(), ArchiveError> {
let mut archive = tar::Archive::new(Cursor::new(data));
archive.unpack(dest)?;
Ok(())
}
fn strip_single_toplevel(temp_dir: &Path, dest: &Path) -> Result<PathBuf, ArchiveError> {
let entries: Vec<_> = fs::read_dir(temp_dir)?
.filter_map(|e| e.ok())
.filter(|e| e.file_name().as_os_str().as_bytes()[0] != b'.')
.collect();
let source_dir = if entries.len() == 1 && entries[0].file_type()?.is_dir() {
entries[0].path()
} else {
temp_dir.to_path_buf()
};
let final_dest = dest.join("content");
if final_dest.exists() {
fs::remove_dir_all(&final_dest)?;
}
if source_dir == *temp_dir {
fs::rename(temp_dir, &final_dest)?;
} else {
copy_dir_recursive(&source_dir, &final_dest)?;
fs::remove_dir_all(temp_dir)?;
}
Ok(final_dest)
}
fn copy_dir_recursive(src: &Path, dst: &Path) -> Result<(), std::io::Error> {
fs::create_dir_all(dst)?;
for entry in fs::read_dir(src)? {
let entry = entry?;
let path = entry.path();
let dest_path = dst.join(entry.file_name());
let metadata = fs::symlink_metadata(&path)?;
if metadata.is_symlink() {
let target = fs::read_link(&path)?;
#[cfg(unix)]
{
std::os::unix::fs::symlink(&target, &dest_path)?;
}
#[cfg(windows)]
{
if target.is_dir() {
std::os::windows::fs::symlink_dir(&target, &dest_path)?;
} else {
std::os::windows::fs::symlink_file(&target, &dest_path)?;
}
}
} else if metadata.is_dir() {
copy_dir_recursive(&path, &dest_path)?;
} else {
fs::copy(&path, &dest_path)?;
}
}
Ok(())
}
pub fn extract_tarball_to_temp(data: &[u8]) -> Result<(PathBuf, tempfile::TempDir), ArchiveError> {
let temp_dir = tempfile::tempdir()?;
let extracted_path = extract_tarball(data, temp_dir.path())?;
Ok((extracted_path, temp_dir))
}
#[derive(Debug)]
pub enum ArchiveError {
IoError(std::io::Error),
}
impl std::fmt::Display for ArchiveError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
ArchiveError::IoError(e) => write!(f, "I/O error: {}", e),
}
}
}
impl std::error::Error for ArchiveError {}
impl From<std::io::Error> for ArchiveError {
fn from(e: std::io::Error) -> Self {
ArchiveError::IoError(e)
}
}

fix/src/fetcher/cache.rs (new file, 29 lines)
use std::fs;
use std::path::PathBuf;
#[derive(Debug)]
pub struct FetcherCache {
base_dir: PathBuf,
}
impl FetcherCache {
pub fn new() -> Result<Self, std::io::Error> {
let base_dir = dirs::cache_dir()
.unwrap_or_else(|| PathBuf::from("/tmp"))
.join("fix")
.join("fetchers");
fs::create_dir_all(&base_dir)?;
Ok(Self { base_dir })
}
fn git_cache_dir(&self) -> PathBuf {
self.base_dir.join("git")
}
pub fn get_git_bare(&self, url: &str) -> PathBuf {
let key = crate::nix_utils::sha256_hex(url.as_bytes());
self.git_cache_dir().join(key)
}
}

(new file, 64 lines)
use std::time::Duration;
use reqwest::blocking::Client;
pub struct Downloader {
client: Client,
}
impl Downloader {
pub fn new() -> Self {
let client = Client::builder()
.timeout(Duration::from_secs(300))
.user_agent("nix-js/0.1")
.build()
.expect("Failed to create HTTP client");
Self { client }
}
pub fn download(&self, url: &str) -> Result<Vec<u8>, DownloadError> {
let response = self
.client
.get(url)
.send()
.map_err(|e| DownloadError::NetworkError(e.to_string()))?;
if !response.status().is_success() {
return Err(DownloadError::HttpError {
url: url.to_string(),
status: response.status().as_u16(),
});
}
response
.bytes()
.map(|b| b.to_vec())
.map_err(|e| DownloadError::NetworkError(e.to_string()))
}
}
impl Default for Downloader {
fn default() -> Self {
Self::new()
}
}
#[derive(Debug)]
pub enum DownloadError {
NetworkError(String),
HttpError { url: String, status: u16 },
}
impl std::fmt::Display for DownloadError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
DownloadError::NetworkError(msg) => write!(f, "Network error: {}", msg),
DownloadError::HttpError { url, status } => {
write!(f, "HTTP error {} for URL: {}", status, url)
}
}
}
}
impl std::error::Error for DownloadError {}

fix/src/fetcher/git.rs (new file, 315 lines)
use std::fs;
use std::path::PathBuf;
use std::process::Command;
use super::FetchGitResult;
use super::cache::FetcherCache;
use crate::store::Store;
#[allow(clippy::too_many_arguments)]
pub fn fetch_git(
cache: &FetcherCache,
store: &dyn Store,
url: &str,
git_ref: Option<&str>,
rev: Option<&str>,
_shallow: bool,
submodules: bool,
all_refs: bool,
name: &str,
) -> Result<FetchGitResult, GitError> {
let bare_repo = cache.get_git_bare(url);
if !bare_repo.exists() {
clone_bare(url, &bare_repo)?;
} else {
fetch_repo(&bare_repo, all_refs)?;
}
let target_rev = resolve_rev(&bare_repo, git_ref, rev)?;
let temp_dir = tempfile::tempdir()?;
let checkout_dir = checkout_rev_to_temp(&bare_repo, &target_rev, submodules, temp_dir.path())?;
let nar_hash = hex::encode(
crate::nar::compute_nar_hash(&checkout_dir)
.map_err(|e| GitError::NarHashError(e.to_string()))?,
);
let store_path = store
.add_to_store_from_path(name, &checkout_dir, vec![])
.map_err(|e| GitError::StoreError(e.to_string()))?;
let rev_count = get_rev_count(&bare_repo, &target_rev)?;
let last_modified = get_last_modified(&bare_repo, &target_rev)?;
let last_modified_date = format_timestamp(last_modified);
let short_rev = if target_rev.len() >= 7 {
target_rev[..7].to_string()
} else {
target_rev.clone()
};
Ok(FetchGitResult {
out_path: store_path,
rev: target_rev,
short_rev,
rev_count,
last_modified,
last_modified_date,
submodules,
nar_hash: Some(nar_hash),
})
}
fn clone_bare(url: &str, dest: &PathBuf) -> Result<(), GitError> {
fs::create_dir_all(dest.parent().unwrap_or(dest))?;
let output = Command::new("git")
.args(["clone", "--bare", url])
.arg(dest)
.output()?;
if !output.status.success() {
return Err(GitError::CommandFailed {
operation: "clone".to_string(),
message: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
Ok(())
}
fn fetch_repo(repo: &PathBuf, all_refs: bool) -> Result<(), GitError> {
let mut args = vec!["fetch", "--prune"];
if all_refs {
args.push("--all");
}
let output = Command::new("git").args(args).current_dir(repo).output()?;
if !output.status.success() {
return Err(GitError::CommandFailed {
operation: "fetch".to_string(),
message: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
Ok(())
}
fn resolve_rev(
repo: &PathBuf,
git_ref: Option<&str>,
rev: Option<&str>,
) -> Result<String, GitError> {
if let Some(rev) = rev {
return Ok(rev.to_string());
}
let ref_to_resolve = git_ref.unwrap_or("HEAD");
let output = Command::new("git")
.args(["rev-parse", ref_to_resolve])
.current_dir(repo)
.output()?;
if !output.status.success() {
let output = Command::new("git")
.args(["rev-parse", &format!("refs/heads/{}", ref_to_resolve)])
.current_dir(repo)
.output()?;
if !output.status.success() {
let output = Command::new("git")
.args(["rev-parse", &format!("refs/tags/{}", ref_to_resolve)])
.current_dir(repo)
.output()?;
if !output.status.success() {
return Err(GitError::CommandFailed {
operation: "rev-parse".to_string(),
message: format!("Could not resolve ref: {}", ref_to_resolve),
});
}
return Ok(String::from_utf8_lossy(&output.stdout).trim().to_string());
}
return Ok(String::from_utf8_lossy(&output.stdout).trim().to_string());
}
Ok(String::from_utf8_lossy(&output.stdout).trim().to_string())
}
fn checkout_rev_to_temp(
bare_repo: &PathBuf,
rev: &str,
submodules: bool,
temp_path: &std::path::Path,
) -> Result<PathBuf, GitError> {
let checkout_dir = temp_path.join("checkout");
fs::create_dir_all(&checkout_dir)?;
let output = Command::new("git")
.args(["--work-tree", checkout_dir.to_str().unwrap_or(".")])
.arg("checkout")
.arg(rev)
.arg("--")
.arg(".")
.current_dir(bare_repo)
.output()?;
if !output.status.success() {
fs::remove_dir_all(&checkout_dir)?;
return Err(GitError::CommandFailed {
operation: "checkout".to_string(),
message: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
if submodules {
let output = Command::new("git")
.args(["submodule", "update", "--init", "--recursive"])
.current_dir(&checkout_dir)
.output()?;
if !output.status.success() {
tracing::warn!(
"failed to initialize submodules: {}",
String::from_utf8_lossy(&output.stderr)
);
}
}
let git_dir = checkout_dir.join(".git");
if git_dir.exists() {
fs::remove_dir_all(&git_dir)?;
}
Ok(checkout_dir)
}
fn get_rev_count(repo: &PathBuf, rev: &str) -> Result<u64, GitError> {
let output = Command::new("git")
.args(["rev-list", "--count", rev])
.current_dir(repo)
.output()?;
if !output.status.success() {
return Ok(0);
}
let count_str = String::from_utf8_lossy(&output.stdout);
count_str.trim().parse().unwrap_or(0).pipe(Ok)
}
fn get_last_modified(repo: &PathBuf, rev: &str) -> Result<u64, GitError> {
let output = Command::new("git")
.args(["log", "-1", "--format=%ct", rev])
.current_dir(repo)
.output()?;
if !output.status.success() {
return Ok(0);
}
let ts_str = String::from_utf8_lossy(&output.stdout);
ts_str.trim().parse().unwrap_or(0).pipe(Ok)
}
fn format_timestamp(ts: u64) -> String {
// `ts` is already seconds since the Unix epoch; no SystemTime round-trip needed.
let secs = ts;
let days_since_epoch = secs / 86400;
let remaining_secs = secs % 86400;
let hours = remaining_secs / 3600;
let minutes = (remaining_secs % 3600) / 60;
let seconds = remaining_secs % 60;
let (year, month, day) = days_to_ymd(days_since_epoch);
format!(
"{:04}{:02}{:02}{:02}{:02}{:02}",
year, month, day, hours, minutes, seconds
)
}
fn days_to_ymd(days: u64) -> (u64, u64, u64) {
let mut y = 1970;
let mut remaining = days as i64;
loop {
let days_in_year = if is_leap_year(y) { 366 } else { 365 };
if remaining < days_in_year {
break;
}
remaining -= days_in_year;
y += 1;
}
let days_in_months: [i64; 12] = if is_leap_year(y) {
[31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
} else {
[31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
};
let mut m = 1;
for days_in_month in days_in_months.iter() {
if remaining < *days_in_month {
break;
}
remaining -= *days_in_month;
m += 1;
}
(y, m, (remaining + 1) as u64)
}
fn is_leap_year(y: u64) -> bool {
(y.is_multiple_of(4) && !y.is_multiple_of(100)) || y.is_multiple_of(400)
}
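`is_leap_year` applies the Gregorian rule with `u64::is_multiple_of(n)`, which is equivalent to `% n == 0`. A standalone sketch of the same rule written with plain remainders:

```rust
// Gregorian leap-year rule: every 4th year, except centuries,
// except every 400th year.
fn is_leap(y: u64) -> bool {
    (y % 4 == 0 && y % 100 != 0) || y % 400 == 0
}

fn main() {
    assert!(is_leap(2000));  // divisible by 400
    assert!(!is_leap(1900)); // century not divisible by 400
    assert!(is_leap(2024));
    assert!(!is_leap(2025));
}
```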
trait Pipe: Sized {
fn pipe<F, R>(self, f: F) -> R
where
F: FnOnce(Self) -> R,
{
f(self)
}
}
impl<T> Pipe for T {}
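The crate-local `Pipe` helper threads a value into a closure, so `x.pipe(Ok)` reads left-to-right instead of wrapping as `Ok(x)`. A self-contained sketch (the trait is copied here only so the example compiles on its own):

```rust
// Minimal universal pipe combinator, as defined above.
trait Pipe: Sized {
    fn pipe<F, R>(self, f: F) -> R
    where
        F: FnOnce(Self) -> R,
    {
        f(self)
    }
}
impl<T> Pipe for T {}

fn main() {
    // Same shape as get_rev_count's tail expression: parse, default, wrap.
    let n: Result<u64, ()> = "42".trim().parse().unwrap_or(0).pipe(Ok);
    assert_eq!(n, Ok(42));
}
```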
#[derive(Debug)]
pub enum GitError {
IoError(std::io::Error),
CommandFailed { operation: String, message: String },
NarHashError(String),
StoreError(String),
}
impl std::fmt::Display for GitError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
GitError::IoError(e) => write!(f, "I/O error: {}", e),
GitError::CommandFailed { operation, message } => {
write!(f, "Git {} failed: {}", operation, message)
}
GitError::NarHashError(e) => write!(f, "NAR hash error: {}", e),
GitError::StoreError(e) => write!(f, "Store error: {}", e),
}
}
}
impl std::error::Error for GitError {}
impl From<std::io::Error> for GitError {
fn from(e: std::io::Error) -> Self {
GitError::IoError(e)
}
}

(new file, 218 lines)
#![allow(dead_code)]
use std::path::PathBuf;
use std::time::{SystemTime, UNIX_EPOCH};
use rusqlite::{Connection, OptionalExtension, params};
use serde::{Deserialize, Serialize};
#[derive(Debug)]
pub enum CacheError {
Database(rusqlite::Error),
Json(serde_json::Error),
}
impl std::fmt::Display for CacheError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
CacheError::Database(e) => write!(f, "Database error: {}", e),
CacheError::Json(e) => write!(f, "JSON error: {}", e),
}
}
}
impl std::error::Error for CacheError {}
impl From<rusqlite::Error> for CacheError {
fn from(e: rusqlite::Error) -> Self {
CacheError::Database(e)
}
}
impl From<serde_json::Error> for CacheError {
fn from(e: serde_json::Error) -> Self {
CacheError::Json(e)
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CacheEntry {
pub input: serde_json::Value,
pub info: serde_json::Value,
pub store_path: String,
pub immutable: bool,
pub timestamp: u64,
}
impl CacheEntry {
pub fn is_expired(&self, ttl_seconds: u64) -> bool {
if self.immutable {
return false;
}
if ttl_seconds == 0 {
return false;
}
let now = SystemTime::now()
.duration_since(UNIX_EPOCH)
.expect("Clock may have gone backwards")
.as_secs();
now > self.timestamp + ttl_seconds
}
}
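`is_expired` treats immutable entries and a TTL of 0 as never expiring; otherwise an entry expires once `ttl_seconds` have elapsed past `timestamp`. A standalone sketch with the clock injected explicitly for testability (names are illustrative):

```rust
// Mirrors CacheEntry::is_expired, with `now` passed in rather than
// read from SystemTime, so the boundary cases are easy to exercise.
fn is_expired(immutable: bool, timestamp: u64, ttl_seconds: u64, now: u64) -> bool {
    if immutable || ttl_seconds == 0 {
        return false;
    }
    now > timestamp + ttl_seconds
}

fn main() {
    assert!(!is_expired(true, 100, 60, 1_000)); // immutable never expires
    assert!(!is_expired(false, 100, 0, 1_000)); // ttl 0 disables expiry
    assert!(is_expired(false, 100, 60, 161));   // past the deadline
    assert!(!is_expired(false, 100, 60, 160));  // exactly at the deadline
}
```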
pub struct MetadataCache {
conn: Connection,
ttl_seconds: u64,
}
impl MetadataCache {
pub fn new(ttl_seconds: u64) -> Result<Self, CacheError> {
let cache_dir = dirs::cache_dir()
.unwrap_or_else(|| PathBuf::from("/tmp"))
.join("nix-js");
// `CacheError` has no dedicated I/O variant, so wrap the error in the
// closest rusqlite variant.
std::fs::create_dir_all(&cache_dir).map_err(|e| {
CacheError::Database(rusqlite::Error::ToSqlConversionFailure(Box::new(e)))
})?;
let db_path = cache_dir.join("fetcher-cache.sqlite");
let conn = Connection::open(db_path)?;
conn.execute(
"CREATE TABLE IF NOT EXISTS cache (
input TEXT NOT NULL PRIMARY KEY,
info TEXT NOT NULL,
store_path TEXT NOT NULL,
immutable INTEGER NOT NULL,
timestamp INTEGER NOT NULL
)",
[],
)?;
Ok(Self { conn, ttl_seconds })
}
pub fn lookup(&self, input: &serde_json::Value) -> Result<Option<CacheEntry>, CacheError> {
let input_str = serde_json::to_string(input)?;
let entry: Option<(String, String, String, i64, i64)> = self
.conn
.query_row(
"SELECT input, info, store_path, immutable, timestamp FROM cache WHERE input = ?1",
params![input_str],
|row| {
Ok((
row.get(0)?,
row.get(1)?,
row.get(2)?,
row.get(3)?,
row.get(4)?,
))
},
)
.optional()?;
match entry {
Some((input_json, info_json, store_path, immutable, timestamp)) => {
let entry = CacheEntry {
input: serde_json::from_str(&input_json)?,
info: serde_json::from_str(&info_json)?,
store_path,
immutable: immutable != 0,
timestamp: timestamp as u64,
};
if entry.is_expired(self.ttl_seconds) {
Ok(None)
} else {
Ok(Some(entry))
}
}
None => Ok(None),
}
}
pub fn lookup_expired(
&self,
input: &serde_json::Value,
) -> Result<Option<CacheEntry>, CacheError> {
let input_str = serde_json::to_string(input)?;
let entry: Option<(String, String, String, i64, i64)> = self
.conn
.query_row(
"SELECT input, info, store_path, immutable, timestamp FROM cache WHERE input = ?1",
params![input_str],
|row| {
Ok((
row.get(0)?,
row.get(1)?,
row.get(2)?,
row.get(3)?,
row.get(4)?,
))
},
)
.optional()?;
match entry {
Some((input_json, info_json, store_path, immutable, timestamp)) => {
Ok(Some(CacheEntry {
input: serde_json::from_str(&input_json)?,
info: serde_json::from_str(&info_json)?,
store_path,
immutable: immutable != 0,
timestamp: timestamp as u64,
}))
}
None => Ok(None),
}
}
pub fn add(
&self,
input: &serde_json::Value,
info: &serde_json::Value,
store_path: &str,
immutable: bool,
) -> Result<(), CacheError> {
let input_str = serde_json::to_string(input)?;
let info_str = serde_json::to_string(info)?;
let timestamp = SystemTime::now()
.duration_since(UNIX_EPOCH)
.expect("system time is before the Unix epoch")
.as_secs();
self.conn.execute(
"INSERT OR REPLACE INTO cache (input, info, store_path, immutable, timestamp)
VALUES (?1, ?2, ?3, ?4, ?5)",
params![
input_str,
info_str,
store_path,
if immutable { 1 } else { 0 },
timestamp as i64
],
)?;
Ok(())
}
pub fn update_timestamp(&self, input: &serde_json::Value) -> Result<(), CacheError> {
let input_str = serde_json::to_string(input)?;
let timestamp = SystemTime::now()
.duration_since(UNIX_EPOCH)
.expect("system time is before the Unix epoch")
.as_secs();
self.conn.execute(
"UPDATE cache SET timestamp = ?1 WHERE input = ?2",
params![timestamp as i64, input_str],
)?;
Ok(())
}
}
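The expiry rule above has two escape hatches: immutable entries (e.g. inputs pinned by a commit hash) never expire, and a TTL of zero disables expiry entirely. A minimal std-only sketch of that rule, with the clock passed in explicitly so it can be exercised deterministically (the free function is illustrative, not part of the crate):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Mirror of `CacheEntry::is_expired`, isolated from the struct.
fn is_expired(immutable: bool, timestamp: u64, ttl_seconds: u64, now: u64) -> bool {
    // Immutable entries never expire; a TTL of zero means "cache forever".
    if immutable || ttl_seconds == 0 {
        return false;
    }
    now > timestamp + ttl_seconds
}

fn main() {
    let now = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system time is before the Unix epoch")
        .as_secs();
    assert!(!is_expired(true, 0, 60, now)); // immutable: never stale
    assert!(!is_expired(false, 0, 0, now)); // ttl 0: never stale
    assert!(is_expired(false, now.saturating_sub(120), 60, now)); // stale mutable entry
    assert!(!is_expired(false, now, 60, now)); // fresh entry
}
```

Injecting `now` also sidesteps the `SystemTime` panic path in tests.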

fix/src/ir.rs Normal file

@@ -0,0 +1,688 @@
use std::{
hash::{Hash, Hasher},
ops::Deref,
};
use bumpalo::{Bump, boxed::Box, collections::Vec};
use gc_arena::Collect;
use ghost_cell::{GhostCell, GhostToken};
use rnix::{TextRange, ast};
use string_interner::symbol::SymbolU32;
pub type HashMap<'ir, K, V> = hashbrown::HashMap<K, V, hashbrown::DefaultHashBuilder, &'ir Bump>;
#[repr(transparent)]
#[derive(Clone, Copy)]
pub struct IrRef<'id, 'ir>(&'ir GhostCell<'id, Ir<'ir, Self>>);
impl<'id, 'ir> IrRef<'id, 'ir> {
pub fn new(ir: &'ir GhostCell<'id, Ir<'ir, Self>>) -> Self {
Self(ir)
}
pub fn alloc(bump: &'ir Bump, ir: Ir<'ir, Self>) -> Self {
Self(bump.alloc(GhostCell::new(ir)))
}
pub fn borrow<'a>(&'a self, token: &'a GhostToken<'id>) -> &'a Ir<'ir, Self> {
self.0.borrow(token)
}
/// Freeze a mutable IR reference into a read-only one, consuming the
/// `GhostToken` to prevent any further mutation.
///
/// # Safety
/// The transmute is sound because:
/// - `GhostCell<'id, T>` is `#[repr(transparent)]` over `T`
/// - `IrRef<'id, 'ir>` is `#[repr(transparent)]` over
/// `&'ir GhostCell<'id, Ir<'ir, Self>>`
/// - `RawIrRef<'ir>` is `#[repr(transparent)]` over `&'ir Ir<'ir, Self>`
/// - `Ir<'ir, Ref>` is `#[repr(C)]` and both ref types are pointer-sized
///
/// Consuming the `GhostToken` guarantees no `borrow_mut` calls can occur
/// afterwards, so the shared `&Ir` references from `RawIrRef::Deref` can
/// never alias with mutable references.
pub fn freeze(self, _token: GhostToken<'id>) -> RawIrRef<'ir> {
unsafe { std::mem::transmute(self) }
}
}
#[repr(transparent)]
#[derive(Clone, Copy)]
pub struct RawIrRef<'ir>(&'ir Ir<'ir, Self>);
impl<'ir> Deref for RawIrRef<'ir> {
type Target = Ir<'ir, RawIrRef<'ir>>;
fn deref(&self) -> &Self::Target {
self.0
}
}
#[repr(C)]
pub enum Ir<'ir, Ref> {
Int(i64),
Float(f64),
Bool(bool),
Null,
Str(Box<'ir, String>),
AttrSet {
stcs: HashMap<'ir, StringId, (Ref, TextRange)>,
dyns: Vec<'ir, (Ref, Ref, TextRange)>,
},
List {
items: Vec<'ir, Ref>,
},
Path(Ref),
ConcatStrings {
parts: Vec<'ir, Ref>,
force_string: bool,
},
// OPs
UnOp {
rhs: Ref,
kind: UnOpKind,
},
BinOp {
lhs: Ref,
rhs: Ref,
kind: BinOpKind,
},
HasAttr {
lhs: Ref,
rhs: Vec<'ir, Attr<Ref>>,
},
Select {
expr: Ref,
attrpath: Vec<'ir, Attr<Ref>>,
default: Option<Ref>,
span: TextRange,
},
// Conditionals
If {
cond: Ref,
consq: Ref,
alter: Ref,
},
Assert {
assertion: Ref,
expr: Ref,
assertion_raw: String,
span: TextRange,
},
With {
namespace: Ref,
body: Ref,
thunks: Vec<'ir, (ThunkId, Ref)>,
},
WithLookup(StringId),
// Function related
Func {
body: Ref,
param: Option<Param<'ir>>,
arg: ArgId,
thunks: Vec<'ir, (ThunkId, Ref)>,
},
Arg(ArgId),
Call {
func: Ref,
arg: Ref,
span: TextRange,
},
// Builtins
Builtins,
Builtin(StringId),
// Misc
TopLevel {
body: Ref,
thunks: Vec<'ir, (ThunkId, Ref)>,
},
Thunk(ThunkId),
CurPos(TextRange),
ReplBinding(StringId),
ScopedImportBinding(StringId),
}
#[repr(transparent)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct ThunkId(pub usize);
#[repr(transparent)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord, Collect)]
#[collect(require_static)]
pub struct StringId(pub SymbolU32);
#[repr(transparent)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct ArgId(pub u32);
#[repr(transparent)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct SpanId(pub u32);
/// Represents a key in an attribute path.
#[allow(unused)]
#[derive(Debug)]
pub enum Attr<Ref> {
/// A dynamic attribute key, which is an expression that must evaluate to a string.
/// Example: `attrs.${key}`
Dynamic(Ref, TextRange),
/// A static attribute key.
/// Example: `attrs.key`
Str(StringId, TextRange),
}
/// The kinds of binary operations supported in Nix.
#[derive(Clone, Copy, Debug, Hash, PartialEq, Eq)]
pub enum BinOpKind {
// Arithmetic
Add,
Sub,
Div,
Mul,
// Comparison
Eq,
Neq,
Lt,
Gt,
Leq,
Geq,
// Logical
And,
Or,
Impl,
// Set/String/Path operations
Con, // List concatenation (`++`)
Upd, // AttrSet update (`//`)
// Not standard, but part of rnix AST
PipeL,
PipeR,
}
impl From<ast::BinOpKind> for BinOpKind {
fn from(op: ast::BinOpKind) -> Self {
use BinOpKind::*;
use ast::BinOpKind as kind;
match op {
kind::Concat => Con,
kind::Update => Upd,
kind::Add => Add,
kind::Sub => Sub,
kind::Mul => Mul,
kind::Div => Div,
kind::And => And,
kind::Equal => Eq,
kind::Implication => Impl,
kind::Less => Lt,
kind::LessOrEq => Leq,
kind::More => Gt,
kind::MoreOrEq => Geq,
kind::NotEqual => Neq,
kind::Or => Or,
kind::PipeLeft => PipeL,
kind::PipeRight => PipeR,
}
}
}
/// The kinds of unary operations.
#[derive(Clone, Copy, Debug, Hash, PartialEq, Eq)]
pub enum UnOpKind {
Neg, // Negation (`-`)
Not, // Logical not (`!`)
}
impl From<ast::UnaryOpKind> for UnOpKind {
fn from(value: ast::UnaryOpKind) -> Self {
match value {
ast::UnaryOpKind::Invert => UnOpKind::Not,
ast::UnaryOpKind::Negate => UnOpKind::Neg,
}
}
}
/// Describes the parameters of a function.
#[derive(Debug)]
pub struct Param<'ir> {
pub required: Vec<'ir, (StringId, TextRange)>,
pub optional: Vec<'ir, (StringId, TextRange)>,
pub ellipsis: bool,
}
#[derive(Clone, Copy)]
pub(crate) struct IrKey<'id, 'ir, 'a>(pub IrRef<'id, 'ir>, pub &'a GhostToken<'id>);
impl std::hash::Hash for IrKey<'_, '_, '_> {
fn hash<H: Hasher>(&self, state: &mut H) {
ir_content_hash(self.0, self.1, state);
}
}
impl PartialEq for IrKey<'_, '_, '_> {
fn eq(&self, other: &Self) -> bool {
ir_content_eq(self.0, other.0, self.1)
}
}
impl Eq for IrKey<'_, '_, '_> {}
fn attr_content_hash<'id>(
attr: &Attr<IrRef<'id, '_>>,
token: &GhostToken<'id>,
state: &mut impl Hasher,
) {
core::mem::discriminant(attr).hash(state);
match attr {
Attr::Dynamic(expr, _) => ir_content_hash(*expr, token, state),
Attr::Str(sym, _) => sym.hash(state),
}
}
fn attr_content_eq<'id, 'ir>(
a: &Attr<IrRef<'id, 'ir>>,
b: &Attr<IrRef<'id, 'ir>>,
token: &GhostToken<'id>,
) -> bool {
match (a, b) {
(Attr::Dynamic(ae, _), Attr::Dynamic(be, _)) => ir_content_eq(*ae, *be, token),
(Attr::Str(a, _), Attr::Str(b, _)) => a == b,
_ => false,
}
}
fn param_content_hash(param: &Param<'_>, state: &mut impl Hasher) {
param.required.len().hash(state);
for (sym, _) in param.required.iter() {
sym.hash(state);
}
param.optional.len().hash(state);
for (sym, _) in param.optional.iter() {
sym.hash(state);
}
param.ellipsis.hash(state);
}
fn param_content_eq(a: &Param<'_>, b: &Param<'_>) -> bool {
a.ellipsis == b.ellipsis
&& a.required.len() == b.required.len()
&& a.optional.len() == b.optional.len()
&& a.required
.iter()
.zip(b.required.iter())
.all(|((a, _), (b, _))| a == b)
&& a.optional
.iter()
.zip(b.optional.iter())
.all(|((a, _), (b, _))| a == b)
}
fn thunks_content_hash<'id>(
thunks: &[(ThunkId, IrRef<'id, '_>)],
token: &GhostToken<'id>,
state: &mut impl Hasher,
) {
thunks.len().hash(state);
for &(id, ir) in thunks {
id.hash(state);
ir_content_hash(ir, token, state);
}
}
fn thunks_content_eq<'id, 'ir>(
a: &[(ThunkId, IrRef<'id, 'ir>)],
b: &[(ThunkId, IrRef<'id, 'ir>)],
token: &GhostToken<'id>,
) -> bool {
a.len() == b.len()
&& a.iter()
.zip(b.iter())
.all(|(&(ai, ae), &(bi, be))| ai == bi && ir_content_eq(ae, be, token))
}
fn ir_content_hash<'id>(ir: IrRef<'id, '_>, token: &GhostToken<'id>, state: &mut impl Hasher) {
let ir = ir.borrow(token);
core::mem::discriminant(ir).hash(state);
match ir {
Ir::Int(x) => x.hash(state),
Ir::Float(x) => x.to_bits().hash(state),
Ir::Bool(x) => x.hash(state),
Ir::Null => {}
Ir::Str(x) => x.hash(state),
Ir::AttrSet { stcs, dyns } => {
stcs.len().hash(state);
let mut combined: u64 = 0;
for (&key, &(val, _)) in stcs.iter() {
let mut h = std::hash::DefaultHasher::new();
key.hash(&mut h);
ir_content_hash(val, token, &mut h);
combined = combined.wrapping_add(h.finish());
}
combined.hash(state);
dyns.len().hash(state);
for &(k, v, _) in dyns.iter() {
ir_content_hash(k, token, state);
ir_content_hash(v, token, state);
}
}
Ir::List { items } => {
items.len().hash(state);
for &item in items.iter() {
ir_content_hash(item, token, state);
}
}
Ir::HasAttr { lhs, rhs } => {
ir_content_hash(*lhs, token, state);
rhs.len().hash(state);
for attr in rhs.iter() {
attr_content_hash(attr, token, state);
}
}
&Ir::BinOp { lhs, rhs, kind } => {
ir_content_hash(lhs, token, state);
ir_content_hash(rhs, token, state);
kind.hash(state);
}
&Ir::UnOp { rhs, kind } => {
ir_content_hash(rhs, token, state);
kind.hash(state);
}
Ir::Select {
expr,
attrpath,
default,
..
} => {
ir_content_hash(*expr, token, state);
attrpath.len().hash(state);
for attr in attrpath.iter() {
attr_content_hash(attr, token, state);
}
default.is_some().hash(state);
if let Some(d) = default {
ir_content_hash(*d, token, state);
}
}
&Ir::If { cond, consq, alter } => {
ir_content_hash(cond, token, state);
ir_content_hash(consq, token, state);
ir_content_hash(alter, token, state);
}
&Ir::Call { func, arg, .. } => {
ir_content_hash(func, token, state);
ir_content_hash(arg, token, state);
}
Ir::Assert {
assertion,
expr,
assertion_raw,
..
} => {
ir_content_hash(*assertion, token, state);
ir_content_hash(*expr, token, state);
assertion_raw.hash(state);
}
Ir::ConcatStrings {
force_string,
parts,
} => {
force_string.hash(state);
parts.len().hash(state);
for &part in parts.iter() {
ir_content_hash(part, token, state);
}
}
&Ir::Path(expr) => ir_content_hash(expr, token, state),
Ir::Func {
body,
arg,
param,
thunks,
} => {
ir_content_hash(*body, token, state);
arg.hash(state);
param.is_some().hash(state);
if let Some(p) = param {
param_content_hash(p, state);
}
thunks_content_hash(thunks, token, state);
}
Ir::TopLevel { body, thunks } => {
ir_content_hash(*body, token, state);
thunks_content_hash(thunks, token, state);
}
Ir::Arg(x) => x.hash(state),
Ir::Thunk(x) => x.hash(state),
Ir::Builtins => {}
Ir::Builtin(x) => x.hash(state),
Ir::CurPos(x) => x.hash(state),
Ir::ReplBinding(x) => x.hash(state),
Ir::ScopedImportBinding(x) => x.hash(state),
&Ir::With {
namespace,
body,
ref thunks,
} => {
ir_content_hash(namespace, token, state);
ir_content_hash(body, token, state);
thunks_content_hash(thunks, token, state);
}
Ir::WithLookup(x) => x.hash(state),
}
}
pub(crate) fn ir_content_eq<'id, 'ir>(
a: IrRef<'id, 'ir>,
b: IrRef<'id, 'ir>,
token: &GhostToken<'id>,
) -> bool {
std::ptr::eq(a.0, b.0)
|| match (a.borrow(token), b.borrow(token)) {
(Ir::Int(a), Ir::Int(b)) => a == b,
(Ir::Float(a), Ir::Float(b)) => a.to_bits() == b.to_bits(),
(Ir::Bool(a), Ir::Bool(b)) => a == b,
(Ir::Null, Ir::Null) => true,
(Ir::Str(a), Ir::Str(b)) => **a == **b,
(
Ir::AttrSet {
stcs: a_stcs,
dyns: a_dyns,
},
Ir::AttrSet {
stcs: b_stcs,
dyns: b_dyns,
},
) => {
a_stcs.len() == b_stcs.len()
&& a_dyns.len() == b_dyns.len()
&& a_stcs.iter().all(|(&k, &(av, _))| {
b_stcs
.get(&k)
.is_some_and(|&(bv, _)| ir_content_eq(av, bv, token))
})
&& a_dyns
.iter()
.zip(b_dyns.iter())
.all(|(&(ak, av, _), &(bk, bv, _))| {
ir_content_eq(ak, bk, token) && ir_content_eq(av, bv, token)
})
}
(Ir::List { items: a }, Ir::List { items: b }) => {
a.len() == b.len()
&& a.iter()
.zip(b.iter())
.all(|(&a, &b)| ir_content_eq(a, b, token))
}
(Ir::HasAttr { lhs: al, rhs: ar }, Ir::HasAttr { lhs: bl, rhs: br }) => {
ir_content_eq(*al, *bl, token)
&& ar.len() == br.len()
&& ar
.iter()
.zip(br.iter())
.all(|(a, b)| attr_content_eq(a, b, token))
}
(
&Ir::BinOp {
lhs: al,
rhs: ar,
kind: ak,
},
&Ir::BinOp {
lhs: bl,
rhs: br,
kind: bk,
},
) => ak == bk && ir_content_eq(al, bl, token) && ir_content_eq(ar, br, token),
(&Ir::UnOp { rhs: ar, kind: ak }, &Ir::UnOp { rhs: br, kind: bk }) => {
ak == bk && ir_content_eq(ar, br, token)
}
(
Ir::Select {
expr: ae,
attrpath: aa,
default: ad,
..
},
Ir::Select {
expr: be,
attrpath: ba,
default: bd,
..
},
) => {
ir_content_eq(*ae, *be, token)
&& aa.len() == ba.len()
&& aa
.iter()
.zip(ba.iter())
.all(|(a, b)| attr_content_eq(a, b, token))
&& match (ad, bd) {
(Some(a), Some(b)) => ir_content_eq(*a, *b, token),
(None, None) => true,
_ => false,
}
}
(
&Ir::If {
cond: ac,
consq: acs,
alter: aa,
},
&Ir::If {
cond: bc,
consq: bcs,
alter: ba,
},
) => {
ir_content_eq(ac, bc, token)
&& ir_content_eq(acs, bcs, token)
&& ir_content_eq(aa, ba, token)
}
(
&Ir::Call {
func: af, arg: aa, ..
},
&Ir::Call {
func: bf, arg: ba, ..
},
) => ir_content_eq(af, bf, token) && ir_content_eq(aa, ba, token),
(
Ir::Assert {
assertion: aa,
expr: ae,
assertion_raw: ar,
..
},
Ir::Assert {
assertion: ba,
expr: be,
assertion_raw: br,
..
},
) => ar == br && ir_content_eq(*aa, *ba, token) && ir_content_eq(*ae, *be, token),
(
Ir::ConcatStrings {
force_string: af,
parts: ap,
},
Ir::ConcatStrings {
force_string: bf,
parts: bp,
},
) => {
af == bf
&& ap.len() == bp.len()
&& ap
.iter()
.zip(bp.iter())
.all(|(&a, &b)| ir_content_eq(a, b, token))
}
(&Ir::Path(a), &Ir::Path(b)) => ir_content_eq(a, b, token),
(
Ir::Func {
body: ab,
arg: aa,
param: ap,
thunks: at,
},
Ir::Func {
body: bb,
arg: ba,
param: bp,
thunks: bt,
},
) => {
ir_content_eq(*ab, *bb, token)
&& aa == ba
&& match (ap, bp) {
(Some(a), Some(b)) => param_content_eq(a, b),
(None, None) => true,
_ => false,
}
&& thunks_content_eq(at, bt, token)
}
(
Ir::TopLevel {
body: ab,
thunks: at,
},
Ir::TopLevel {
body: bb,
thunks: bt,
},
) => ir_content_eq(*ab, *bb, token) && thunks_content_eq(at, bt, token),
(Ir::Arg(a), Ir::Arg(b)) => a == b,
(Ir::Thunk(a), Ir::Thunk(b)) => a == b,
(Ir::Builtins, Ir::Builtins) => true,
(Ir::Builtin(a), Ir::Builtin(b)) => a == b,
(Ir::CurPos(a), Ir::CurPos(b)) => a == b,
(Ir::ReplBinding(a), Ir::ReplBinding(b)) => a == b,
(Ir::ScopedImportBinding(a), Ir::ScopedImportBinding(b)) => a == b,
(
Ir::With {
namespace: a_ns,
body: a_body,
thunks: a_thunks,
},
Ir::With {
namespace: b_ns,
body: b_body,
thunks: b_thunks,
},
) => {
ir_content_eq(*a_ns, *b_ns, token)
&& ir_content_eq(*a_body, *b_body, token)
&& thunks_content_eq(a_thunks, b_thunks, token)
}
(Ir::WithLookup(a), Ir::WithLookup(b)) => a == b,
_ => false,
}
}
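The `Ir::AttrSet` arm of `ir_content_hash` above hashes each static entry into its own hasher and folds the results together with `wrapping_add`, so the digest is independent of `HashMap` iteration order. A std-only sketch of that commutative-combining technique (the slice-based helper is illustrative):

```rust
use std::hash::{DefaultHasher, Hash, Hasher};

// Order-independent hash of key/value pairs: hash each entry separately,
// then combine with a commutative operation (wrapping add), as the
// `Ir::AttrSet` arm does for its static attributes.
fn unordered_hash(entries: &[(&str, i64)]) -> u64 {
    let mut combined: u64 = 0;
    for (k, v) in entries {
        let mut h = DefaultHasher::new();
        k.hash(&mut h);
        v.hash(&mut h);
        combined = combined.wrapping_add(h.finish());
    }
    combined
}

fn main() {
    // Same entries in a different order produce the same combined hash.
    let a = unordered_hash(&[("x", 1), ("y", 2)]);
    let b = unordered_hash(&[("y", 2), ("x", 1)]);
    assert_eq!(a, b);
}
```

Feeding entries directly into one hasher would make the result depend on iteration order, which for `hashbrown` maps differs between otherwise-equal attribute sets.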

fix/src/lib.rs Normal file

@@ -0,0 +1,21 @@
#![warn(clippy::unwrap_used)]
#![allow(dead_code)]
pub mod error;
pub mod logging;
pub mod runtime;
pub mod value;
mod codegen;
mod derivation;
mod disassembler;
mod downgrade;
// mod fetcher;
mod ir;
mod nar;
mod nix_utils;
mod store;
mod string_context;
#[global_allocator]
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;

fix/src/logging.rs Normal file

@@ -0,0 +1,48 @@
use std::env;
use std::io::IsTerminal;
use tracing_subscriber::{EnvFilter, Layer, fmt, layer::SubscriberExt, util::SubscriberInitExt};
pub fn init_logging() {
let is_terminal = std::io::stderr().is_terminal();
let show_time = env::var("NIX_JS_LOG_TIME")
.map(|v| v == "1" || v.to_lowercase() == "true")
.unwrap_or(false);
let filter = EnvFilter::from_default_env();
let fmt_layer = fmt::layer()
.with_target(true)
.with_thread_ids(false)
.with_thread_names(false)
.with_file(false)
.with_line_number(false)
.with_ansi(is_terminal)
.with_level(true);
let fmt_layer = if show_time {
fmt_layer.with_timer(fmt::time::uptime()).boxed()
} else {
fmt_layer.without_time().boxed()
};
tracing_subscriber::registry()
.with(filter)
.with(fmt_layer)
.init();
init_miette_handler();
}
fn init_miette_handler() {
let is_terminal = std::io::stderr().is_terminal();
miette::set_hook(Box::new(move |_| {
Box::new(
miette::MietteHandlerOpts::new()
.terminal_links(is_terminal)
.unicode(is_terminal)
.color(is_terminal)
.build(),
)
}))
.ok();
}

fix/src/main.rs Normal file

@@ -0,0 +1,152 @@
use std::path::PathBuf;
use std::process::exit;
use anyhow::Result;
use clap::{Args, Parser, Subcommand};
use fix::error::Source;
use fix::runtime::Runtime;
use hashbrown::HashSet;
use rustyline::DefaultEditor;
use rustyline::error::ReadlineError;
#[derive(Parser)]
#[command(name = "nix-js", about = "Nix expression evaluator")]
struct Cli {
#[command(subcommand)]
command: Command,
}
#[derive(Subcommand)]
enum Command {
Compile {
#[clap(flatten)]
source: ExprSource,
#[arg(long)]
silent: bool,
},
Eval {
#[clap(flatten)]
source: ExprSource,
},
Repl,
}
#[derive(Args)]
#[group(required = true, multiple = false)]
struct ExprSource {
#[clap(short, long)]
expr: Option<String>,
#[clap(short, long)]
file: Option<PathBuf>,
}
fn run_compile(runtime: &mut Runtime, src: ExprSource, silent: bool) -> Result<()> {
let src = if let Some(expr) = src.expr {
Source::new_eval(expr)?
} else if let Some(file) = src.file {
Source::new_file(file)?
} else {
unreachable!()
};
todo!()
/* match runtime.compile_bytecode(src) {
Ok(compiled) => {
if !silent {
println!("{}", runtime.disassemble_colored(&compiled));
}
}
Err(err) => {
eprintln!("{:?}", miette::Report::new(*err));
exit(1);
}
};
Ok(()) */
}
fn run_eval(runtime: &mut Runtime, src: ExprSource) -> Result<()> {
let src = if let Some(expr) = src.expr {
Source::new_eval(expr)?
} else if let Some(file) = src.file {
Source::new_file(file)?
} else {
unreachable!()
};
match runtime.eval_deep(src) {
Ok(value) => {
println!("{}", value.display_compat());
}
Err(err) => {
eprintln!("{:?}", miette::Report::new(*err));
exit(1);
}
};
Ok(())
}
fn run_repl(runtime: &mut Runtime) -> Result<()> {
let mut rl = DefaultEditor::new()?;
let mut scope = HashSet::new();
const RE: ere::Regex<3> = ere::compile_regex!("^[ \t]*([a-zA-Z_][a-zA-Z0-9_'-]*)[ \t]*(.*)$");
loop {
let readline = rl.readline("nix-js-repl> ");
match readline {
Ok(line) => {
if line.trim().is_empty() {
continue;
}
let _ = rl.add_history_entry(line.as_str());
if let Some([Some(_), Some(ident), Some(rest)]) = RE.exec(&line) {
if let Some(expr) = rest.strip_prefix('=') {
let expr = expr.trim_start();
if expr.is_empty() {
eprintln!("Error: missing expression after '='");
continue;
}
match runtime.add_binding(ident, expr, &mut scope) {
Ok(value) => println!("{} = {}", ident, value),
Err(err) => eprintln!("{:?}", miette::Report::new(*err)),
}
} else {
let src = Source::new_repl(line)?;
match runtime.eval_repl(src, &scope) {
Ok(value) => println!("{value}"),
Err(err) => eprintln!("{:?}", miette::Report::new(*err)),
}
}
} else {
let src = Source::new_repl(line)?;
match runtime.eval_repl(src, &scope) {
Ok(value) => println!("{value}"),
Err(err) => eprintln!("{:?}", miette::Report::new(*err)),
}
}
}
Err(ReadlineError::Interrupted) => {
println!();
}
Err(ReadlineError::Eof) => {
println!("CTRL-D");
break;
}
Err(err) => {
eprintln!("Error: {err:?}");
break;
}
}
}
Ok(())
}
fn main() -> Result<()> {
fix::logging::init_logging();
let cli = Cli::parse();
let mut runtime = Runtime::new()?;
match cli.command {
Command::Compile { source, silent } => run_compile(&mut runtime, source, silent),
Command::Eval { source } => run_eval(&mut runtime, source),
Command::Repl => run_repl(&mut runtime),
}
}
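The REPL loop above classifies each line as either a binding (`name = expr`) or a plain expression, using the `ere` regex plus a `strip_prefix('=')` check. The same classification can be sketched regex-free with std only (`parse_binding` is a hypothetical helper, not part of the crate):

```rust
/// Split a REPL line into `(identifier, expression)` if it has the shape
/// `ident = expr`; otherwise the caller treats the line as an expression.
/// Mirrors `^[ \t]*([a-zA-Z_][a-zA-Z0-9_'-]*)[ \t]*(.*)$` followed by the
/// `strip_prefix('=')` check in the REPL loop.
fn parse_binding(line: &str) -> Option<(&str, &str)> {
    let line = line.trim_start_matches([' ', '\t']);
    let first = line.chars().next()?;
    if !(first.is_ascii_alphabetic() || first == '_') {
        return None;
    }
    // Consume the identifier: letters, digits, `_`, `'`, `-`.
    let end = line
        .char_indices()
        .take_while(|&(_, c)| c.is_ascii_alphanumeric() || matches!(c, '_' | '\'' | '-'))
        .last()
        .map(|(i, c)| i + c.len_utf8())?;
    let ident = &line[..end];
    let rest = line[end..].trim_start_matches([' ', '\t']);
    let expr = rest.strip_prefix('=')?.trim_start();
    (!expr.is_empty()).then_some((ident, expr))
}

fn main() {
    assert_eq!(parse_binding("x = 1 + 1"), Some(("x", "1 + 1")));
    assert_eq!(parse_binding("  foo' = bar"), Some(("foo'", "bar")));
    assert_eq!(parse_binding("1 + 1"), None); // not an identifier
    assert_eq!(parse_binding("x ="), None); // missing expression
}
```

The empty-expression case maps to the REPL's "missing expression after '='" error above.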

fix/src/nar.rs Normal file

@@ -0,0 +1,66 @@
use std::io::Read;
use std::path::Path;
use nix_nar::Encoder;
use sha2::{Digest, Sha256};
use crate::error::{Error, Result};
pub fn compute_nar_hash(path: &Path) -> Result<[u8; 32]> {
let mut hasher = Sha256::new();
std::io::copy(
&mut Encoder::new(path).map_err(|err| Error::internal(err.to_string()))?,
&mut hasher,
)
.map_err(|err| Error::internal(err.to_string()))?;
Ok(hasher.finalize().into())
}
pub fn pack_nar(path: &Path) -> Result<Vec<u8>> {
let mut buffer = Vec::new();
Encoder::new(path)
.map_err(|err| Error::internal(err.to_string()))?
.read_to_end(&mut buffer)
.map_err(|err| Error::internal(err.to_string()))?;
Ok(buffer)
}
#[cfg(test)]
#[allow(clippy::unwrap_used)]
mod tests {
use std::fs;
use tempfile::TempDir;
use super::*;
#[test_log::test]
fn test_simple_file() {
let temp = TempDir::new().unwrap();
let file_path = temp.path().join("test.txt");
fs::write(&file_path, "hello").unwrap();
let hash = hex::encode(compute_nar_hash(&file_path).unwrap());
assert_eq!(
hash,
"0a430879c266f8b57f4092a0f935cf3facd48bbccde5760d4748ca405171e969"
);
}
#[test_log::test]
fn test_directory() {
let temp = TempDir::new().unwrap();
fs::write(temp.path().join("a.txt"), "aaa").unwrap();
fs::write(temp.path().join("b.txt"), "bbb").unwrap();
let hash = hex::encode(compute_nar_hash(temp.path()).unwrap());
assert_eq!(
hash,
"0036c14209749bc9b9631e2077b108b701c322ab53853cd26f2746268a86fc0f"
);
}
}

fix/src/nix_utils.rs Normal file

@@ -0,0 +1,21 @@
use nix_compat::store_path::compress_hash;
use sha2::{Digest as _, Sha256};
pub fn sha256_hex(data: &[u8]) -> String {
let mut hasher = Sha256::new();
hasher.update(data);
hex::encode(hasher.finalize())
}
pub fn make_store_path(store_dir: &str, ty: &str, hash_hex: &str, name: &str) -> String {
let s = format!("{}:sha256:{}:{}:{}", ty, hash_hex, store_dir, name);
let mut hasher = Sha256::new();
hasher.update(s.as_bytes());
let hash: [u8; 32] = hasher.finalize().into();
let compressed = compress_hash::<20>(&hash);
let encoded = nix_compat::nixbase32::encode(&compressed);
format!("{}/{}-{}", store_dir, encoded, name)
}
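`make_store_path` shrinks the 32-byte SHA-256 digest to 20 bytes via `nix_compat::store_path::compress_hash::<20>` before nixbase32-encoding it. A std-only sketch of the folding step, assuming the standard XOR-fold scheme Nix's `compressHash` uses (byte `i` of the input is XORed into output slot `i % OUT`):

```rust
// XOR-fold a digest down to OUT bytes: byte i of the input lands in
// slot i % OUT. Every input byte contributes to the output, and digests
// shorter than OUT pass through padded with zeros.
fn compress_hash<const OUT: usize>(input: &[u8]) -> [u8; OUT] {
    let mut out = [0u8; OUT];
    for (i, b) in input.iter().enumerate() {
        out[i % OUT] ^= *b;
    }
    out
}

fn main() {
    let digest = [0xABu8; 32]; // stand-in for a SHA-256 digest
    let short = compress_hash::<20>(&digest);
    // Slots 0..=11 receive two 0xAB contributions (0xAB ^ 0xAB = 0),
    // slots 12..=19 receive one.
    assert_eq!(short[0], 0x00);
    assert_eq!(short[12], 0xAB);
}
```

The fold is lossy by design; the 20-byte result only needs to be collision-resistant enough for store-path naming, not cryptographically reversible.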

fix/src/runtime.rs Normal file

@@ -0,0 +1,531 @@
use std::hash::BuildHasher;
use bumpalo::Bump;
use gc_arena::{Arena, Rootable, arena::CollectionPhase};
use ghost_cell::{GhostCell, GhostToken};
use hashbrown::{DefaultHashBuilder, HashMap, HashSet, HashTable};
use rnix::TextRange;
use string_interner::DefaultStringInterner;
use crate::codegen::{BytecodeContext, InstructionPtr};
use crate::downgrade::{Downgrade as _, DowngradeContext};
use crate::error::{Error, Result, Source};
use crate::ir::{ArgId, Ir, IrKey, IrRef, RawIrRef, StringId, ThunkId, ir_content_eq};
use crate::runtime::builtins::new_builtins_env;
use crate::store::{DaemonStore, StoreConfig};
use crate::value::Symbol;
mod builtins;
mod stack;
mod value;
mod vm;
use vm::{Action, VM};
pub struct Runtime {
bytecode: Vec<u8>,
global_env: HashMap<StringId, Ir<'static, RawIrRef<'static>>>,
sources: Vec<Source>,
store: DaemonStore,
spans: Vec<(usize, TextRange)>,
thunk_count: usize,
strings: DefaultStringInterner,
arena: Arena<Rootable![VM<'_>]>,
}
impl Runtime {
const COLLECTOR_GRANULARITY: f64 = 1024.0;
pub fn new() -> Result<Self> {
let mut strings = DefaultStringInterner::new();
let global_env = new_builtins_env(&mut strings);
let config = StoreConfig::from_env();
let store = DaemonStore::connect(&config.daemon_socket)?;
Ok(Self {
global_env,
store,
strings,
thunk_count: 0,
bytecode: Vec::new(),
sources: Vec::new(),
spans: Vec::new(),
arena: Arena::new(|mc| VM::new(mc)),
})
}
pub fn eval(&mut self, source: Source) -> Result<crate::value::Value> {
let root = self.downgrade(source, None)?;
let ip = crate::codegen::compile_bytecode(root.as_ref(), self);
self.run(ip)
}
pub fn eval_shallow(&mut self, _source: Source) -> Result<crate::value::Value> {
todo!()
}
pub fn eval_deep(&mut self, source: Source) -> Result<crate::value::Value> {
// FIXME: deep
let root = self.downgrade(source, None)?;
let ip = crate::codegen::compile_bytecode(root.as_ref(), self);
self.run(ip)
}
pub fn eval_repl(
&mut self,
source: Source,
scope: &HashSet<StringId>,
) -> Result<crate::value::Value> {
// FIXME: shallow
let root = self.downgrade(source, Some(Scope::Repl(scope)))?;
let ip = crate::codegen::compile_bytecode(root.as_ref(), self);
self.run(ip)
}
pub fn add_binding(
&mut self,
_ident: &str,
_expr: &str,
_scope: &mut HashSet<StringId>,
) -> Result<crate::value::Value> {
todo!()
}
fn downgrade_ctx<'a, 'bump, 'id>(
&'a mut self,
bump: &'bump Bump,
token: GhostToken<'id>,
extra_scope: Option<Scope<'a>>,
) -> DowngradeCtx<'a, 'id, 'bump> {
let Runtime {
global_env,
sources,
thunk_count,
strings,
..
} = self;
DowngradeCtx {
bump,
token,
strings,
source: sources.last().expect("downgrade_ctx requires a pushed source").clone(),
scopes: std::iter::once(Scope::Global(global_env))
.chain(extra_scope)
.collect(),
with_scope_count: 0,
arg_count: 0,
thunk_count,
thunk_scopes: vec![ThunkScope::new_in(bump)],
}
}
fn downgrade<'a>(&'a mut self, source: Source, extra_scope: Option<Scope<'a>>) -> Result<OwnedIr> {
tracing::debug!("Parsing Nix expression");
self.sources.push(source.clone());
let root = rnix::Root::parse(&source.src);
handle_parse_error(root.errors(), source).map_or(Ok(()), Err)?;
tracing::debug!("Downgrading Nix expression");
let expr = root
.tree()
.expr()
.ok_or_else(|| Error::parse_error("unexpected EOF".into()))?;
let bump = Bump::new();
GhostToken::new(|token| {
let ir = self
.downgrade_ctx(&bump, token, extra_scope)
.downgrade_toplevel(expr)?;
let ir = unsafe { std::mem::transmute::<RawIrRef<'_>, RawIrRef<'static>>(ir) };
Ok(OwnedIr { _bump: bump, ir })
})
}
fn run(&mut self, ip: InstructionPtr) -> Result<crate::value::Value> {
let mut pc = ip.0;
loop {
let Runtime {
bytecode,
strings,
arena,
..
} = self;
let action =
arena.mutate_root(|mc, root| root.run_batch(bytecode, &mut pc, mc, strings));
match action {
Action::NeedGc => {
if self.arena.collection_phase() == CollectionPhase::Sweeping {
self.arena.collect_debt();
} else if let Some(marked) = self.arena.mark_debt() {
marked.start_sweeping();
}
}
Action::Done(done) => {
break done;
}
Action::Continue => (),
Action::IoRequest(_) => todo!(),
}
}
}
}
fn parse_error_span(error: &rnix::ParseError) -> Option<rnix::TextRange> {
use rnix::ParseError::*;
match error {
Unexpected(range)
| UnexpectedExtra(range)
| UnexpectedWanted(_, range, _)
| UnexpectedDoubleBind(range)
| DuplicatedArgs(range, _) => Some(*range),
_ => None,
}
}
fn handle_parse_error<'a>(
errors: impl IntoIterator<Item = &'a rnix::ParseError>,
source: Source,
) -> Option<Box<Error>> {
for err in errors {
if let Some(span) = parse_error_span(err) {
return Some(
Error::parse_error(err.to_string())
.with_source(source)
.with_span(span),
);
}
}
None
}
struct DowngradeCtx<'ctx, 'id, 'ir> {
bump: &'ir Bump,
token: GhostToken<'id>,
strings: &'ctx mut DefaultStringInterner,
source: Source,
scopes: Vec<Scope<'ctx>>,
with_scope_count: u32,
arg_count: u32,
thunk_count: &'ctx mut usize,
thunk_scopes: Vec<ThunkScope<'id, 'ir>>,
}
fn should_thunk<'id>(ir: IrRef<'id, '_>, token: &GhostToken<'id>) -> bool {
!matches!(
ir.borrow(token),
Ir::Builtin(_)
| Ir::Builtins
| Ir::Int(_)
| Ir::Float(_)
| Ir::Bool(_)
| Ir::Null
| Ir::Str(_)
| Ir::Thunk(_)
)
}
impl<'ctx, 'id, 'ir> DowngradeCtx<'ctx, 'id, 'ir> {
fn new(
bump: &'ir Bump,
token: GhostToken<'id>,
symbols: &'ctx mut DefaultStringInterner,
global: &'ctx HashMap<StringId, Ir<'static, RawIrRef<'static>>>,
extra_scope: Option<Scope<'ctx>>,
thunk_count: &'ctx mut usize,
source: Source,
) -> Self {
Self {
bump,
token,
strings: symbols,
source,
scopes: std::iter::once(Scope::Global(global))
.chain(extra_scope)
.collect(),
thunk_count,
arg_count: 0,
with_scope_count: 0,
thunk_scopes: vec![ThunkScope::new_in(bump)],
}
}
}
impl<'ctx: 'ir, 'id, 'ir> DowngradeContext<'id, 'ir> for DowngradeCtx<'ctx, 'id, 'ir> {
fn new_expr(&self, expr: Ir<'ir, IrRef<'id, 'ir>>) -> IrRef<'id, 'ir> {
IrRef::new(self.bump.alloc(GhostCell::new(expr)))
}
fn new_arg(&mut self) -> ArgId {
self.arg_count += 1;
ArgId(self.arg_count - 1)
}
fn maybe_thunk(&mut self, ir: IrRef<'id, 'ir>) -> IrRef<'id, 'ir> {
if !should_thunk(ir, &self.token) {
return ir;
}
let cached = self
.thunk_scopes
.last()
.expect("no active cache scope")
.lookup_cache(ir, &self.token);
if let Some(id) = cached {
return IrRef::alloc(self.bump, Ir::Thunk(id));
}
let id = ThunkId(*self.thunk_count);
*self.thunk_count = self.thunk_count.checked_add(1).expect("thunk id overflow");
self.thunk_scopes
.last_mut()
.expect("no active cache scope")
.add_binding(id, ir, &self.token);
IrRef::alloc(self.bump, Ir::Thunk(id))
}
fn new_sym(&mut self, sym: String) -> StringId {
StringId(self.strings.get_or_intern(sym))
}
fn get_sym(&self, id: StringId) -> Symbol<'_> {
self.strings.resolve(id.0).expect("no symbol found").into()
}
fn lookup(&self, sym: StringId, span: TextRange) -> Result<IrRef<'id, 'ir>> {
for scope in self.scopes.iter().rev() {
match scope {
&Scope::Global(global_scope) => {
if let Some(expr) = global_scope.get(&sym) {
let ir = match expr {
Ir::Builtins => Ir::Builtins,
Ir::Builtin(s) => Ir::Builtin(*s),
Ir::Bool(b) => Ir::Bool(*b),
Ir::Null => Ir::Null,
_ => unreachable!("globals should only contain leaf IR nodes"),
};
return Ok(self.new_expr(ir));
}
}
&Scope::Repl(repl_bindings) => {
if repl_bindings.contains(&sym) {
return Ok(self.new_expr(Ir::ReplBinding(sym)));
}
}
Scope::ScopedImport(scoped_bindings) => {
if scoped_bindings.contains(&sym) {
return Ok(self.new_expr(Ir::ScopedImportBinding(sym)));
}
}
Scope::Let(let_scope) => {
if let Some(&expr) = let_scope.get(&sym) {
return Ok(self.new_expr(Ir::Thunk(expr)));
}
}
&Scope::Param(param_sym, id) => {
if param_sym == sym {
return Ok(self.new_expr(Ir::Arg(id)));
}
}
}
}
if self.with_scope_count > 0 {
Ok(self.new_expr(Ir::WithLookup(sym)))
} else {
Err(Error::downgrade_error(
format!("'{}' not found", self.get_sym(sym)),
self.get_current_source(),
span,
))
}
}
fn get_current_source(&self) -> Source {
self.source.clone()
}
fn with_let_scope<F, R>(&mut self, keys: &[StringId], f: F) -> Result<R>
where
F: FnOnce(&mut Self) -> Result<(bumpalo::collections::Vec<'ir, IrRef<'id, 'ir>>, R)>,
{
let base = *self.thunk_count;
*self.thunk_count = self
.thunk_count
.checked_add(keys.len())
.expect("thunk id overflow");
let iter = keys.iter().enumerate().map(|(offset, &key)| {
// Safety: `base + keys.len()` was checked above, so `base + offset` cannot overflow.
(key, ThunkId(unsafe { base.checked_add(offset).unwrap_unchecked() }))
});
self.scopes.push(Scope::Let(iter.collect()));
let (vals, ret) = {
let mut guard = ScopeGuard { ctx: self };
f(guard.as_ctx())?
};
assert_eq!(keys.len(), vals.len());
let scope = self.thunk_scopes.last_mut().expect("no active thunk scope");
scope.extend_bindings((base..base + keys.len()).map(ThunkId).zip(vals));
Ok(ret)
}
fn with_param_scope<F, R>(&mut self, param: StringId, arg: ArgId, f: F) -> R
where
F: FnOnce(&mut Self) -> R,
{
self.scopes.push(Scope::Param(param, arg));
let mut guard = ScopeGuard { ctx: self };
f(guard.as_ctx())
}
fn with_with_scope<F, R>(&mut self, f: F) -> R
where
F: FnOnce(&mut Self) -> R,
{
self.with_scope_count += 1;
let ret = f(self);
self.with_scope_count -= 1;
ret
}
fn with_thunk_scope<F, R>(
&mut self,
f: F,
) -> (
R,
bumpalo::collections::Vec<'ir, (ThunkId, IrRef<'id, 'ir>)>,
)
where
F: FnOnce(&mut Self) -> R,
{
self.thunk_scopes.push(ThunkScope::new_in(self.bump));
let ret = f(self);
(
ret,
self.thunk_scopes
.pop()
.expect("no active thunk scope")
.bindings,
)
}
fn bump(&self) -> &'ir bumpalo::Bump {
self.bump
}
}
impl<'id, 'ir, 'ctx: 'ir> DowngradeCtx<'ctx, 'id, 'ir> {
fn downgrade_toplevel(mut self, root: rnix::ast::Expr) -> Result<RawIrRef<'ir>> {
let body = root.downgrade(&mut self)?;
let thunks = self
.thunk_scopes
.pop()
.expect("no active thunk scope")
.bindings;
let ir = IrRef::alloc(self.bump, Ir::TopLevel { body, thunks });
Ok(ir.freeze(self.token))
}
}
struct ThunkScope<'id, 'ir> {
bindings: bumpalo::collections::Vec<'ir, (ThunkId, IrRef<'id, 'ir>)>,
cache: HashTable<(IrRef<'id, 'ir>, ThunkId)>,
hasher: DefaultHashBuilder,
}
impl<'id, 'ir> ThunkScope<'id, 'ir> {
fn new_in(bump: &'ir Bump) -> Self {
Self {
bindings: bumpalo::collections::Vec::new_in(bump),
cache: HashTable::new(),
hasher: DefaultHashBuilder::default(),
}
}
fn lookup_cache(&self, key: IrRef<'id, 'ir>, token: &GhostToken<'id>) -> Option<ThunkId> {
let hash = self.hasher.hash_one(IrKey(key, token));
self.cache
.find(hash, |&(ir, _)| ir_content_eq(key, ir, token))
.map(|&(_, id)| id)
}
fn add_binding(&mut self, id: ThunkId, ir: IrRef<'id, 'ir>, token: &GhostToken<'id>) {
self.bindings.push((id, ir));
let hash = self.hasher.hash_one(IrKey(ir, token));
self.cache.insert_unique(hash, (ir, id), |&(ir, _)| {
self.hasher.hash_one(IrKey(ir, token))
});
}
fn extend_bindings(&mut self, iter: impl IntoIterator<Item = (ThunkId, IrRef<'id, 'ir>)>) {
self.bindings.extend(iter);
}
}
enum Scope<'ctx> {
Global(&'ctx HashMap<StringId, Ir<'static, RawIrRef<'static>>>),
Repl(&'ctx HashSet<StringId>),
ScopedImport(HashSet<StringId>),
Let(HashMap<StringId, ThunkId>),
Param(StringId, ArgId),
}
struct ScopeGuard<'a, 'ctx, 'id, 'ir> {
ctx: &'a mut DowngradeCtx<'ctx, 'id, 'ir>,
}
impl Drop for ScopeGuard<'_, '_, '_, '_> {
fn drop(&mut self) {
self.ctx.scopes.pop();
}
}
impl<'id, 'ir, 'ctx> ScopeGuard<'_, 'ctx, 'id, 'ir> {
fn as_ctx(&mut self) -> &mut DowngradeCtx<'ctx, 'id, 'ir> {
self.ctx
}
}
struct OwnedIr {
_bump: Bump,
ir: RawIrRef<'static>,
}
impl OwnedIr {
/// # Safety
///
/// `ir` must have been allocated inside `bump`; the transmute extends its
/// lifetime to `'static`, which is sound only because `_bump` keeps the
/// arena alive for as long as this `OwnedIr` exists.
unsafe fn new(ir: RawIrRef<'_>, bump: Bump) -> Self {
Self {
_bump: bump,
ir: unsafe { std::mem::transmute::<RawIrRef<'_>, RawIrRef<'static>>(ir) },
}
}
fn as_ref(&self) -> RawIrRef<'_> {
self.ir
}
}
impl BytecodeContext for Runtime {
fn intern_string(&mut self, s: &str) -> StringId {
StringId(self.strings.get_or_intern(s))
}
fn register_span(&mut self, range: TextRange) -> u32 {
let id = self.spans.len();
let source_id = self
.sources
.len()
.checked_sub(1)
.expect("current_source not set");
self.spans.push((source_id, range));
id as u32
}
fn get_code(&self) -> &[u8] {
&self.bytecode
}
fn get_code_mut(&mut self) -> &mut Vec<u8> {
&mut self.bytecode
}
}

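The `ThunkScope` cache above deduplicates structurally equal IR before allocating a new thunk id, a form of hash-consing. A minimal self-contained sketch of the same idea (assumption: the real code keys on `GhostCell`-guarded IR equality via `IrKey`/`ir_content_eq`; here a plain hashable expression type stands in for it):

```rust
use std::collections::HashMap;

// Toy expression type standing in for the real `Ir` nodes.
#[derive(Clone, PartialEq, Eq, Hash)]
enum Expr {
    Int(i64),
    Add(Box<Expr>, Box<Expr>),
}

#[derive(Default)]
struct ThunkCache {
    next_id: u32,
    by_content: HashMap<Expr, u32>,
}

impl ThunkCache {
    // Return the existing thunk id for a structurally equal expression,
    // allocating a fresh id only on a cache miss.
    fn intern(&mut self, e: Expr) -> u32 {
        if let Some(&id) = self.by_content.get(&e) {
            return id;
        }
        let id = self.next_id;
        self.next_id = self.next_id.checked_add(1).expect("thunk id overflow");
        self.by_content.insert(e, id);
        id
    }
}
```

Two occurrences of `1 + 2` then share one thunk, so the expression is evaluated at most once at runtime.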
@@ -0,0 +1,51 @@
use hashbrown::HashMap;
use string_interner::DefaultStringInterner;
use crate::ir::{Ir, RawIrRef, StringId};
pub(super) fn new_builtins_env(
interner: &mut DefaultStringInterner,
) -> HashMap<StringId, Ir<'static, RawIrRef<'static>>> {
let mut builtins = HashMap::new();
let builtins_sym = StringId(interner.get_or_intern("builtins"));
builtins.insert(builtins_sym, Ir::Builtins);
let free_globals = [
"abort",
"baseNameOf",
"break",
"dirOf",
"derivation",
"derivationStrict",
"fetchGit",
"fetchMercurial",
"fetchTarball",
"fetchTree",
"fromTOML",
"import",
"isNull",
"map",
"placeholder",
"removeAttrs",
"scopedImport",
"throw",
"toString",
];
let consts = [
("true", Ir::Bool(true)),
("false", Ir::Bool(false)),
("null", Ir::Null),
];
for name in free_globals {
let name = StringId(interner.get_or_intern(name));
let value = Ir::Builtin(name);
builtins.insert(name, value);
}
for (name, value) in consts {
let name = StringId(interner.get_or_intern(name));
builtins.insert(name, value);
}
builtins
}

@@ -0,0 +1,31 @@
drvAttrs@{
outputs ? [ "out" ],
...
}:
let
strict = derivationStrict drvAttrs;
commonAttrs =
drvAttrs
// (builtins.listToAttrs outputsList)
// {
all = map (x: x.value) outputsList;
inherit drvAttrs;
};
outputToAttrListElement = outputName: {
name = outputName;
value = commonAttrs // {
outPath = builtins.getAttr outputName strict;
drvPath = strict.drvPath;
type = "derivation";
inherit outputName;
};
};
outputsList = map outputToAttrListElement outputs;
in
(builtins.head outputsList).value

@@ -0,0 +1,76 @@
{
system ? "", # obsolete
url,
hash ? "", # an SRI hash
# Legacy hash specification
md5 ? "",
sha1 ? "",
sha256 ? "",
sha512 ? "",
outputHash ?
if hash != "" then
hash
else if sha512 != "" then
sha512
else if sha1 != "" then
sha1
else if md5 != "" then
md5
else
sha256,
outputHashAlgo ?
if hash != "" then
""
else if sha512 != "" then
"sha512"
else if sha1 != "" then
"sha1"
else if md5 != "" then
"md5"
else
"sha256",
executable ? false,
unpack ? false,
name ? baseNameOf (toString url),
# still translates to __impure to trigger derivationStrict error checks.
impure ? false,
}:
derivation (
{
builder = "builtin:fetchurl";
# New-style output content requirements.
outputHashMode = if unpack || executable then "recursive" else "flat";
inherit
name
url
executable
unpack
;
system = "builtin";
# No need to double the amount of network traffic
preferLocalBuild = true;
impureEnvVars = [
# We borrow these environment variables from the caller to allow
# easy proxy configuration. This is impure, but a fixed-output
# derivation like fetchurl is allowed to do so since its result is
# by definition pure.
"http_proxy"
"https_proxy"
"ftp_proxy"
"all_proxy"
"no_proxy"
];
# To make "nix-prefetch-url" work.
urls = [ url ];
}
// (if impure then { __impure = true; } else { inherit outputHashAlgo outputHash; })
)

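The `outputHash`/`outputHashAlgo` defaults in the fetchurl wrapper above select the first non-empty hash in the priority order hash > sha512 > sha1 > md5 > sha256, with an empty algo string when an SRI `hash` is given (SRI hashes carry their own algorithm). A hedged Rust sketch of that selection logic:

```rust
// Mirrors the Nix default-argument chain above; not part of the evaluator itself.
fn pick_output_hash<'a>(
    hash: &'a str,
    sha512: &'a str,
    sha1: &'a str,
    md5: &'a str,
    sha256: &'a str,
) -> (&'a str, &'a str) {
    // Returns (outputHash, outputHashAlgo).
    if !hash.is_empty() {
        (hash, "") // SRI hash: algorithm is encoded in the hash string itself
    } else if !sha512.is_empty() {
        (sha512, "sha512")
    } else if !sha1.is_empty() {
        (sha1, "sha1")
    } else if !md5.is_empty() {
        (md5, "md5")
    } else {
        (sha256, "sha256")
    }
}
```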
fix/src/runtime/stack.rs Normal file

@@ -0,0 +1,89 @@
use std::mem::MaybeUninit;
use gc_arena::Collect;
use smallvec::SmallVec;
// FIXME: Drop is not implemented, so live `T`s in `inner` leak when the
// stack is dropped (`MaybeUninit` never runs destructors).
pub(super) struct Stack<const N: usize, T> {
inner: Box<[MaybeUninit<T>; N]>,
len: usize,
}
unsafe impl<'gc, const N: usize, T: Collect<'gc> + 'gc> Collect<'gc> for Stack<N, T> {
const NEEDS_TRACE: bool = true;
fn trace<U: gc_arena::collect::Trace<'gc>>(&self, cc: &mut U) {
for item in self.inner[..self.len].iter() {
unsafe {
item.assume_init_ref().trace(cc);
}
}
}
}
impl<const N: usize, T> Default for Stack<N, T> {
fn default() -> Self {
Self::new()
}
}
impl<const N: usize, T> Stack<N, T> {
pub(super) fn new() -> Self {
Self {
inner: Box::new([const { MaybeUninit::uninit() }; N]),
len: 0,
}
}
pub(super) fn push(&mut self, val: T) -> Result<(), T> {
if self.len == N {
return Err(val);
}
unsafe {
self.inner.get_unchecked_mut(self.len).write(val);
}
self.len += 1;
Ok(())
}
pub(super) fn pop(&mut self) -> Option<T> {
if self.len == 0 {
return None;
}
let ret = unsafe {
self.inner
.get_unchecked_mut(self.len - 1)
.assume_init_read()
};
self.len -= 1;
Some(ret)
}
pub(super) fn tos(&self) -> Option<&T> {
if self.len == 0 {
return None;
}
Some(unsafe { self.inner.get_unchecked(self.len - 1).assume_init_ref() })
}
pub(super) fn tos_mut(&mut self) -> Option<&mut T> {
if self.len == 0 {
return None;
}
Some(unsafe { self.inner.get_unchecked_mut(self.len - 1).assume_init_mut() })
}
pub(super) fn len(&self) -> usize {
self.len
}
pub(super) fn pop_n<const M: usize>(&mut self, n: usize) -> SmallVec<[T; M]> {
assert!(n <= self.len, "pop_n: not enough items on stack");
let mut result = SmallVec::new();
let start = self.len - n;
for i in start..self.len {
result.push(unsafe { self.inner.get_unchecked(i).assume_init_read() });
}
self.len = start;
result
}
}

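The fixed-capacity stack above avoids bounds checks by tracking `len` manually over `MaybeUninit` storage. Its observable semantics can be sketched safely (assumption: this mirrors `Stack<N, T>`, trading the unchecked accesses for a `Vec`): `push` hands the value back instead of panicking on overflow, and `pop_n` yields the removed items bottom-to-top, matching the read order in `Stack::pop_n`.

```rust
struct BoundedStack<T> {
    inner: Vec<T>,
    cap: usize,
}

impl<T> BoundedStack<T> {
    fn new(cap: usize) -> Self {
        Self { inner: Vec::with_capacity(cap), cap }
    }
    // On overflow, return the value to the caller instead of panicking.
    fn push(&mut self, val: T) -> Result<(), T> {
        if self.inner.len() == self.cap {
            return Err(val);
        }
        self.inner.push(val);
        Ok(())
    }
    fn pop(&mut self) -> Option<T> {
        self.inner.pop()
    }
    // Remove the top `n` items, returned in bottom-to-top order.
    fn pop_n(&mut self, n: usize) -> Vec<T> {
        assert!(n <= self.inner.len(), "pop_n: not enough items on stack");
        self.inner.split_off(self.inner.len() - n)
    }
}
```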
fix/src/runtime/value.rs Normal file

@@ -0,0 +1,489 @@
use std::fmt;
use std::marker::PhantomData;
use std::mem::size_of;
use std::ops::Deref;
use boxing::nan::raw::{RawBox, RawStore, RawTag, Value as RawValue};
use gc_arena::{Collect, Gc, Mutation, RefLock, collect::Trace};
use sealed::sealed;
use smallvec::SmallVec;
use string_interner::{Symbol, symbol::SymbolU32};
use crate::ir::StringId;
#[sealed]
pub(crate) trait Storable {
const TAG: (bool, u8);
}
pub(crate) trait InlineStorable: Storable + RawStore {}
pub(crate) trait GcStorable: Storable {}
macro_rules! define_value_types {
(
inline { $($itype:ty => $itag:expr, $iname:literal;)* }
gc { $($gtype:ty => $gtag:expr, $gname:literal;)* }
) => {
$(
#[sealed]
impl Storable for $itype {
const TAG: (bool, u8) = $itag;
}
impl InlineStorable for $itype {}
)*
$(
#[sealed]
impl Storable for $gtype {
const TAG: (bool, u8) = $gtag;
}
impl GcStorable for $gtype {}
)*
const _: () = assert!(size_of::<Value<'static>>() == 8);
$(const _: () = assert!(size_of::<$itype>() <= 6);)*
$(const _: () = { let (_, val) = $itag; assert!(val >= 1 && val <= 7); };)*
$(const _: () = { let (_, val) = $gtag; assert!(val >= 1 && val <= 7); };)*
const _: () = {
let tags: &[(bool, u8)] = &[$($itag),*, $($gtag),*];
let mut mask_false: u8 = 0;
let mut mask_true: u8 = 0;
let mut i = 0;
while i < tags.len() {
let (neg, val) = tags[i];
let bit = 1 << val;
if neg {
assert!(mask_true & bit == 0, "duplicate true tag id");
mask_true |= bit;
} else {
assert!(mask_false & bit == 0, "duplicate false tag id");
mask_false |= bit;
}
i += 1;
}
};
unsafe impl<'gc> Collect<'gc> for Value<'gc> {
const NEEDS_TRACE: bool = true;
fn trace<T: Trace<'gc>>(&self, cc: &mut T) {
let Some(tag) = self.raw.tag() else { return };
match tag.neg_val() {
$(<$gtype as Storable>::TAG => unsafe {
self.load_gc::<$gtype>().trace(cc)
},)*
$(<$itype as Storable>::TAG => (),)*
(neg, val) => unreachable!("invalid tag: neg={neg}, val={val}"),
}
}
}
impl fmt::Debug for Value<'_> {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.tag() {
None => write!(f, "Float({:?})", unsafe {
self.raw.float().unwrap_unchecked()
}),
$(Some(<$itype as Storable>::TAG) => write!(f, "{}({:?})", $iname, unsafe {
self.as_inline::<$itype>().unwrap_unchecked()
}),)*
$(Some(<$gtype as Storable>::TAG) =>
write!(f, "{}({:?})", $gname, unsafe { self.as_gc::<$gtype>().unwrap_unchecked() }),)*
Some((neg, val)) => write!(f, "Unknown(neg={neg}, val={val})"),
}
}
}
};
}
define_value_types! {
inline {
i32 => (false, 1), "SmallInt";
bool => (false, 2), "Bool";
Null => (false, 3), "Null";
StringId => (false, 4), "SmallString";
PrimOp => (false, 5), "PrimOp";
}
gc {
i64 => (false, 6), "BigInt";
NixString => (false, 7), "String";
AttrSet<'_> => (true, 1), "AttrSet";
List<'_> => (true, 2), "List";
Thunk<'_> => (true, 3), "Thunk";
Closure<'_> => (true, 4), "Closure";
PrimOpApp<'_> => (true, 5), "PrimOpApp";
}
}
/// # Nix runtime value representation
///
/// NaN-boxed value fitting in 8 bytes.
#[repr(transparent)]
pub(crate) struct Value<'gc> {
raw: RawBox,
_marker: PhantomData<Gc<'gc, ()>>,
}
impl Clone for Value<'_> {
#[inline]
fn clone(&self) -> Self {
Self {
raw: self.raw.clone(),
_marker: PhantomData,
}
}
}
impl Default for Value<'_> {
#[inline(always)]
fn default() -> Self {
Self::new_inline(Null)
}
}
impl<'gc> Value<'gc> {
#[inline(always)]
fn mk_tag(neg: bool, val: u8) -> RawTag {
// Safety: val is asserted to be in 1..=7.
unsafe { RawTag::new_unchecked(neg, val) }
}
#[inline(always)]
fn from_raw_value(rv: RawValue) -> Self {
Self {
raw: RawBox::from_value(rv),
_marker: PhantomData,
}
}
/// Load a GC pointer stored in this value.
///
/// # Safety
///
/// The value must actually store a `Gc<'gc, T>` with the matching tag.
#[inline(always)]
unsafe fn load_gc<T: GcStorable>(&self) -> Gc<'gc, T> {
unsafe {
let rv = self.raw.value().unwrap_unchecked();
let ptr: *const T = <*const T as RawStore>::from_val(rv);
Gc::from_ptr(ptr)
}
}
/// Returns the `(negative, val)` tag, or `None` for a float.
#[inline(always)]
fn tag(&self) -> Option<(bool, u8)> {
self.raw.tag().map(|t| t.neg_val())
}
}
impl<'gc> Value<'gc> {
#[inline]
pub(crate) fn new_float(val: f64) -> Self {
Self {
raw: RawBox::from_float(val),
_marker: PhantomData,
}
}
#[inline]
pub(crate) fn new_inline<T: InlineStorable>(val: T) -> Self {
Self::from_raw_value(RawValue::store(Self::mk_tag(T::TAG.0, T::TAG.1), val))
}
#[inline]
pub(crate) fn new_gc<T: GcStorable>(gc: Gc<'gc, T>) -> Self {
let ptr = Gc::as_ptr(gc);
Self::from_raw_value(RawValue::store(Self::mk_tag(T::TAG.0, T::TAG.1), ptr))
}
}
impl<'gc> Value<'gc> {
#[inline]
pub(crate) fn is_float(&self) -> bool {
self.raw.is_float()
}
#[inline]
pub(crate) fn is<T: Storable>(&self) -> bool {
self.tag() == Some(T::TAG)
}
}
impl<'gc> Value<'gc> {
#[inline]
pub(crate) fn as_float(&self) -> Option<f64> {
self.raw.float().copied()
}
#[inline]
pub(crate) fn as_inline<T: InlineStorable>(&self) -> Option<T> {
if self.is::<T>() {
Some(unsafe {
let rv = self.raw.value().unwrap_unchecked();
T::from_val(rv)
})
} else {
None
}
}
#[inline]
pub(crate) fn as_gc<T: GcStorable>(&self) -> Option<Gc<'gc, T>> {
if self.is::<T>() {
Some(unsafe {
let rv = self.raw.value().unwrap_unchecked();
let ptr: *const T = <*const T as RawStore>::from_val(rv);
Gc::from_ptr(ptr)
})
} else {
None
}
}
}
#[derive(Clone, Copy, Debug)]
pub(crate) struct Null;
impl RawStore for Null {
fn to_val(self, value: &mut RawValue) {
value.set_data([0; 6]);
}
fn from_val(_: &RawValue) -> Self {
Self
}
}
impl RawStore for StringId {
fn to_val(self, value: &mut RawValue) {
(self.0.to_usize() as u32).to_val(value);
}
fn from_val(value: &RawValue) -> Self {
Self(
SymbolU32::try_from_usize(u32::from_val(value) as usize)
.expect("failed to read StringId from Value"),
)
}
}
/// Heap-allocated Nix string.
///
/// Stored on the GC heap via `Gc<'gc, NixString>`. The string data itself
/// lives in a standard `Box<str>` owned by this struct; the GC only manages
/// the outer allocation.
#[derive(Collect)]
#[collect(require_static)]
pub(crate) struct NixString {
data: Box<str>,
// TODO: string context for derivation dependency tracking
}
impl NixString {
pub(crate) fn new(s: impl Into<Box<str>>) -> Self {
Self { data: s.into() }
}
pub(crate) fn as_str(&self) -> &str {
&self.data
}
}
impl fmt::Debug for NixString {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
fmt::Debug::fmt(&self.data, f)
}
}
#[derive(Collect, Debug)]
#[collect(no_drop)]
pub(crate) struct AttrSet<'gc> {
pub(crate) entries: SmallVec<[(StringId, Value<'gc>); 4]>,
}
impl<'gc> AttrSet<'gc> {
pub(crate) fn from_sorted(entries: SmallVec<[(StringId, Value<'gc>); 4]>) -> Self {
debug_assert!(entries.is_sorted_by_key(|(key, _)| *key));
Self { entries }
}
pub(crate) fn lookup(&self, key: StringId) -> Option<Value<'gc>> {
self.entries
.binary_search_by_key(&key, |(k, _)| *k)
.ok()
.map(|i| self.entries[i].1.clone())
}
pub(crate) fn has(&self, key: StringId) -> bool {
self.entries
.binary_search_by_key(&key, |(k, _)| *k)
.is_ok()
}
pub(crate) fn merge(&self, other: &Self, mc: &Mutation<'gc>) -> Gc<'gc, Self> {
use std::cmp::Ordering::*;
debug_assert!(self.entries.is_sorted_by_key(|(key, _)| *key));
debug_assert!(other.entries.is_sorted_by_key(|(key, _)| *key));
let mut entries = SmallVec::new();
let mut i = 0;
let mut j = 0;
while i < self.entries.len() && j < other.entries.len() {
match self.entries[i].0.cmp(&other.entries[j].0) {
Less => {
entries.push(self.entries[i].clone());
i += 1;
}
Greater => {
entries.push(other.entries[j].clone());
j += 1;
}
Equal => {
entries.push(other.entries[j].clone());
i += 1;
j += 1;
}
}
}
// At most one of the two tails is non-empty at this point.
entries.extend(other.entries[j..].iter().cloned());
entries.extend(self.entries[i..].iter().cloned());
debug_assert!(entries.is_sorted_by_key(|(key, _)| *key));
Gc::new(mc, AttrSet { entries })
}
}
#[derive(Collect, Debug)]
#[collect(no_drop)]
pub(crate) struct List<'gc> {
pub(crate) inner: SmallVec<[Value<'gc>; 4]>,
}
pub(crate) type Thunk<'gc> = RefLock<ThunkState<'gc>>;
#[derive(Collect, Debug)]
#[collect(no_drop)]
pub(crate) enum ThunkState<'gc> {
Pending {
ip: u32,
env: Gc<'gc, RefLock<Env<'gc>>>,
},
Blackhole,
Evaluated(Value<'gc>),
}
#[derive(Collect, Debug)]
#[collect(no_drop)]
pub(crate) struct Env<'gc> {
pub(crate) locals: SmallVec<[Value<'gc>; 4]>,
pub(crate) prev: Option<Gc<'gc, RefLock<Env<'gc>>>>,
}
impl<'gc> Env<'gc> {
pub(crate) fn empty() -> Self {
Env {
locals: SmallVec::new(),
prev: None,
}
}
pub(crate) fn with_arg(
arg: Value<'gc>,
n_locals: u32,
prev: Gc<'gc, RefLock<Env<'gc>>>,
) -> Self {
let mut locals = smallvec::smallvec![Value::default(); 1 + n_locals as usize];
locals[0] = arg;
Env {
locals,
prev: Some(prev),
}
}
}
#[derive(Collect, Debug)]
#[collect(no_drop)]
pub(crate) struct Closure<'gc> {
pub(crate) ip: u32,
pub(crate) n_locals: u32,
pub(crate) env: Gc<'gc, RefLock<Env<'gc>>>,
pub(crate) pattern: Option<Gc<'gc, PatternInfo>>,
}
#[derive(Collect, Debug)]
#[collect(require_static)]
pub(crate) struct PatternInfo {
pub(crate) required: SmallVec<[StringId; 4]>,
pub(crate) optional: SmallVec<[StringId; 4]>,
pub(crate) ellipsis: bool,
pub(crate) param_spans: Box<[(StringId, u32)]>,
}
#[derive(Clone, Copy, Debug, Collect)]
#[collect(require_static)]
pub(crate) struct PrimOp {
pub(crate) id: u8,
pub(crate) arity: u8,
}
impl RawStore for PrimOp {
fn to_val(self, value: &mut RawValue) {
value.set_data([0, 0, 0, 0, self.id, self.arity]);
}
fn from_val(value: &RawValue) -> Self {
let [.., id, arity] = *value.data();
Self { id, arity }
}
}
#[derive(Collect, Debug)]
#[collect(no_drop)]
pub(crate) struct PrimOpApp<'gc> {
pub(crate) primop: PrimOp,
pub(crate) args: SmallVec<[Value<'gc>; 2]>,
}
#[repr(transparent)]
pub(crate) struct StrictValue<'gc>(Value<'gc>);
impl<'gc> StrictValue<'gc> {
#[inline]
pub(crate) fn try_from_forced(val: Value<'gc>) -> Option<Self> {
if !val.is::<Thunk<'gc>>() {
Some(Self(val))
} else {
None
}
}
#[inline]
pub(crate) fn into_relaxed(self) -> Value<'gc> {
self.0
}
}
impl<'gc> Deref for StrictValue<'gc> {
type Target = Value<'gc>;
#[inline]
fn deref(&self) -> &Value<'gc> {
&self.0
}
}
impl Clone for StrictValue<'_> {
#[inline]
fn clone(&self) -> Self {
Self(self.0.clone())
}
}
impl fmt::Debug for StrictValue<'_> {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
fmt::Debug::fmt(&self.0, f)
}
}
unsafe impl<'gc> Collect<'gc> for StrictValue<'gc> {
const NEEDS_TRACE: bool = true;
fn trace<T: gc_arena::collect::Trace<'gc>>(&self, cc: &mut T) {
self.0.trace(cc);
}
}

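The `Value` type in value.rs packs every runtime value into 8 bytes by hiding a 3-bit tag and a payload inside the unused bit patterns of an f64 NaN. A minimal standalone sketch of that NaN-boxing idea (assumption: the actual `boxing::nan::raw` layout differs, e.g. it also uses the sign bit for the `neg` half of the tag):

```rust
// Quiet-NaN prefix: exponent all ones, top mantissa bit set.
const QNAN: u64 = 0x7ff8_0000_0000_0000;
const TAG_SHIFT: u32 = 48;
const PAYLOAD_MASK: u64 = (1 << TAG_SHIFT) - 1;

// Pack a non-zero 3-bit tag and a 48-bit payload into NaN space.
fn box_tagged(tag: u8, payload: u64) -> u64 {
    debug_assert!((1..=7).contains(&tag));
    debug_assert!(payload <= PAYLOAD_MASK);
    QNAN | ((tag as u64) << TAG_SHIFT) | payload
}

// Returns None for ordinary floats, including the canonical NaN (tag 0).
fn unbox(bits: u64) -> Option<(u8, u64)> {
    if bits & QNAN != QNAN {
        return None;
    }
    let tag = ((bits >> TAG_SHIFT) & 0x7) as u8;
    if tag == 0 {
        return None;
    }
    Some((tag, bits & PAYLOAD_MASK))
}
```

Because every non-float value lives inside the NaN space, floats need no tag at all, which is why `Value::tag` returns `None` for them.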
fix/src/runtime/vm.rs Normal file

File diff suppressed because it is too large

fix/src/store.rs Normal file

@@ -0,0 +1,40 @@
use crate::error::Result;
mod config;
mod daemon;
mod error;
mod validation;
pub use config::StoreConfig;
pub use daemon::DaemonStore;
pub use validation::validate_store_path;
pub trait Store: Send + Sync {
fn get_store_dir(&self) -> &str;
fn is_valid_path(&self, path: &str) -> Result<bool>;
fn ensure_path(&self, path: &str) -> Result<()>;
fn add_to_store(
&self,
name: &str,
content: &[u8],
recursive: bool,
references: Vec<String>,
) -> Result<String>;
fn add_to_store_from_path(
&self,
name: &str,
source_path: &std::path::Path,
references: Vec<String>,
) -> Result<String>;
fn add_text_to_store(
&self,
name: &str,
content: &str,
references: Vec<String>,
) -> Result<String>;
}

fix/src/store/config.rs Normal file

@@ -0,0 +1,22 @@
use std::path::PathBuf;
#[derive(Debug, Clone)]
pub struct StoreConfig {
pub daemon_socket: PathBuf,
}
impl StoreConfig {
pub fn from_env() -> Self {
let daemon_socket = std::env::var("NIX_DAEMON_SOCKET")
.map(PathBuf::from)
.unwrap_or_else(|_| PathBuf::from("/nix/var/nix/daemon-socket/socket"));
Self { daemon_socket }
}
}
impl Default for StoreConfig {
fn default() -> Self {
Self::from_env()
}
}

fix/src/store/daemon.rs Normal file

@@ -0,0 +1,773 @@
use std::io::{Error as IoError, ErrorKind as IoErrorKind, Result as IoResult};
use std::path::Path;
use nix_compat::nix_daemon::types::{AddToStoreNarRequest, UnkeyedValidPathInfo};
use nix_compat::nix_daemon::worker_protocol::{ClientSettings, Operation};
use nix_compat::store_path::StorePath;
use nix_compat::wire::ProtocolVersion;
use nix_compat::wire::de::{NixRead, NixReader};
use nix_compat::wire::ser::{NixSerialize, NixWrite, NixWriter, NixWriterBuilder};
use num_enum::{IntoPrimitive, TryFromPrimitive};
use thiserror::Error;
use tokio::io::{AsyncReadExt, AsyncWriteExt, ReadHalf, WriteHalf, split};
use tokio::net::UnixStream;
use tokio::sync::Mutex;
use super::Store;
use crate::error::{Error, Result};
pub struct DaemonStore {
runtime: tokio::runtime::Runtime,
connection: NixDaemonConnection,
}
impl DaemonStore {
pub fn connect(socket_path: &Path) -> Result<Self> {
let runtime = tokio::runtime::Runtime::new()
.map_err(|e| Error::internal(format!("Failed to create tokio runtime: {}", e)))?;
let connection = runtime.block_on(async {
NixDaemonConnection::connect(socket_path)
.await
.map_err(|e| {
Error::internal(format!(
"Failed to connect to nix-daemon at {}: {}",
socket_path.display(),
e
))
})
})?;
Ok(Self {
runtime,
connection,
})
}
fn block_on<F>(&self, future: F) -> F::Output
where
F: std::future::Future,
{
self.runtime.block_on(future)
}
}
impl Store for DaemonStore {
fn get_store_dir(&self) -> &str {
"/nix/store"
}
fn is_valid_path(&self, path: &str) -> Result<bool> {
self.block_on(async {
self.connection
.is_valid_path(path)
.await
.map_err(|e| Error::internal(format!("Daemon error in is_valid_path: {}", e)))
})
}
fn ensure_path(&self, path: &str) -> Result<()> {
self.block_on(async {
self.connection.ensure_path(path).await.map_err(|e| {
Error::eval_error(
format!(
"builtins.storePath: path '{}' is not valid in nix store: {}",
path, e
),
None,
)
})
})
}
fn add_to_store(
&self,
name: &str,
content: &[u8],
recursive: bool,
references: Vec<String>,
) -> Result<String> {
use std::fs;
use nix_compat::nix_daemon::types::AddToStoreNarRequest;
use nix_compat::nixhash::{CAHash, NixHash};
use nix_compat::store_path::{StorePath, build_ca_path};
use sha2::{Digest, Sha256};
use tempfile::NamedTempFile;
let temp_file = NamedTempFile::new()
.map_err(|e| Error::internal(format!("Failed to create temp file: {}", e)))?;
fs::write(temp_file.path(), content)
.map_err(|e| Error::internal(format!("Failed to write temp file: {}", e)))?;
let nar_data = crate::nar::pack_nar(temp_file.path())?;
let nar_hash_arr: [u8; 32] = {
let mut hasher = Sha256::new();
hasher.update(&nar_data);
hasher.finalize().into()
};
let ca_hash = if recursive {
CAHash::Nar(NixHash::Sha256(nar_hash_arr))
} else {
let mut content_hasher = Sha256::new();
content_hasher.update(content);
let content_hash = content_hasher.finalize();
let mut content_hash_arr = [0u8; 32];
content_hash_arr.copy_from_slice(&content_hash);
CAHash::Flat(NixHash::Sha256(content_hash_arr))
};
let ref_store_paths: std::result::Result<Vec<StorePath<String>>, _> = references
.iter()
.map(|r| StorePath::<String>::from_absolute_path(r.as_bytes()))
.collect();
let ref_store_paths = ref_store_paths
.map_err(|e| Error::internal(format!("Invalid reference path: {}", e)))?;
let store_path: StorePath<String> =
build_ca_path(name, &ca_hash, references.clone(), false)
.map_err(|e| Error::internal(format!("Failed to build store path: {}", e)))?;
let store_path_str = store_path.to_absolute_path();
if self.is_valid_path(&store_path_str)? {
return Ok(store_path_str);
}
let request = AddToStoreNarRequest {
path: store_path,
deriver: None,
nar_hash: unsafe {
// Assumption: NarHash is layout-compatible with [u8; 32].
std::mem::transmute::<[u8; 32], nix_compat::nix_daemon::types::NarHash>(
nar_hash_arr,
)
},
references: ref_store_paths,
registration_time: 0,
nar_size: nar_data.len() as u64,
ultimate: false,
signatures: vec![],
ca: Some(ca_hash),
repair: false,
dont_check_sigs: false,
};
self.block_on(async {
self.connection
.add_to_store_nar(request, &nar_data)
.await
.map_err(|e| Error::internal(format!("Failed to add to store: {}", e)))
})?;
Ok(store_path_str)
}
fn add_to_store_from_path(
&self,
name: &str,
source_path: &std::path::Path,
references: Vec<String>,
) -> Result<String> {
use nix_compat::nix_daemon::types::AddToStoreNarRequest;
use nix_compat::nixhash::{CAHash, NixHash};
use nix_compat::store_path::{StorePath, build_ca_path};
use sha2::{Digest, Sha256};
let nar_data = crate::nar::pack_nar(source_path)?;
let nar_hash: [u8; 32] = {
let mut hasher = Sha256::new();
hasher.update(&nar_data);
hasher.finalize().into()
};
let ca_hash = CAHash::Nar(NixHash::Sha256(nar_hash));
let ref_store_paths: std::result::Result<Vec<StorePath<String>>, _> = references
.iter()
.map(|r| StorePath::<String>::from_absolute_path(r.as_bytes()))
.collect();
let ref_store_paths = ref_store_paths
.map_err(|e| Error::internal(format!("Invalid reference path: {}", e)))?;
let store_path: StorePath<String> =
build_ca_path(name, &ca_hash, references.clone(), false)
.map_err(|e| Error::internal(format!("Failed to build store path: {}", e)))?;
let store_path_str = store_path.to_absolute_path();
if self.is_valid_path(&store_path_str)? {
return Ok(store_path_str);
}
let request = AddToStoreNarRequest {
path: store_path,
deriver: None,
nar_hash: unsafe {
std::mem::transmute::<[u8; 32], nix_compat::nix_daemon::types::NarHash>(nar_hash)
},
references: ref_store_paths,
registration_time: 0,
nar_size: nar_data.len() as u64,
ultimate: false,
signatures: vec![],
ca: Some(ca_hash),
repair: false,
dont_check_sigs: false,
};
self.block_on(async {
self.connection
.add_to_store_nar(request, &nar_data)
.await
.map_err(|e| Error::internal(format!("Failed to add to store: {}", e)))
})?;
Ok(store_path_str)
}
fn add_text_to_store(
&self,
name: &str,
content: &str,
references: Vec<String>,
) -> Result<String> {
use std::fs;
use nix_compat::nix_daemon::types::AddToStoreNarRequest;
use nix_compat::nixhash::CAHash;
use nix_compat::store_path::{StorePath, build_text_path};
use sha2::{Digest, Sha256};
use tempfile::NamedTempFile;
let temp_file = NamedTempFile::new()
.map_err(|e| Error::internal(format!("Failed to create temp file: {}", e)))?;
fs::write(temp_file.path(), content.as_bytes())
.map_err(|e| Error::internal(format!("Failed to write temp file: {}", e)))?;
let nar_data = crate::nar::pack_nar(temp_file.path())?;
let nar_hash: [u8; 32] = {
let mut hasher = Sha256::new();
hasher.update(&nar_data);
hasher.finalize().into()
};
let content_hash = {
let mut hasher = Sha256::new();
hasher.update(content.as_bytes());
hasher.finalize().into()
};
let ref_store_paths: std::result::Result<Vec<StorePath<String>>, _> = references
.iter()
.map(|r| StorePath::<String>::from_absolute_path(r.as_bytes()))
.collect();
let ref_store_paths = ref_store_paths
.map_err(|e| Error::internal(format!("Invalid reference path: {}", e)))?;
let store_path: StorePath<String> = build_text_path(name, content, references.clone())
.map_err(|e| Error::internal(format!("Failed to build text store path: {}", e)))?;
let store_path_str = store_path.to_absolute_path();
if self.is_valid_path(&store_path_str)? {
return Ok(store_path_str);
}
let request = AddToStoreNarRequest {
path: store_path,
deriver: None,
nar_hash: unsafe {
std::mem::transmute::<[u8; 32], nix_compat::nix_daemon::types::NarHash>(nar_hash)
},
references: ref_store_paths,
registration_time: 0,
nar_size: nar_data.len() as u64,
ultimate: false,
signatures: vec![],
ca: Some(CAHash::Text(content_hash)),
repair: false,
dont_check_sigs: false,
};
self.block_on(async {
self.connection
.add_to_store_nar(request, &nar_data)
.await
.map_err(|e| Error::internal(format!("Failed to add text to store: {}", e)))
})?;
Ok(store_path_str)
}
}
const PROTOCOL_VERSION: ProtocolVersion = ProtocolVersion::from_parts(1, 37);
// Protocol magic numbers (from nix-compat worker_protocol.rs)
const WORKER_MAGIC_1: u64 = 0x6e697863; // "nixc"
const WORKER_MAGIC_2: u64 = 0x6478696f; // "dxio"
const STDERR_LAST: u64 = 0x616c7473; // "alts"
const STDERR_ERROR: u64 = 0x63787470; // "cxtp"
/// Performs the client handshake with a nix-daemon server
///
/// This is the client-side counterpart to `server_handshake_client`.
/// It exchanges magic numbers, negotiates protocol version, and sends client settings.
async fn client_handshake<RW>(
conn: &mut RW,
client_settings: &ClientSettings,
) -> IoResult<ProtocolVersion>
where
RW: AsyncReadExt + AsyncWriteExt + Unpin,
{
// 1. Send magic number 1
conn.write_u64_le(WORKER_MAGIC_1).await?;
// 2. Receive magic number 2
let magic2 = conn.read_u64_le().await?;
if magic2 != WORKER_MAGIC_2 {
return Err(IoError::new(
IoErrorKind::InvalidData,
format!("Invalid magic number from server: {}", magic2),
));
}
// 3. Receive server protocol version
let server_version_raw = conn.read_u64_le().await?;
let server_version: ProtocolVersion = server_version_raw.try_into().map_err(|e| {
IoError::new(
IoErrorKind::InvalidData,
format!("Invalid protocol version: {}", e),
)
})?;
// 4. Send our protocol version
conn.write_u64_le(PROTOCOL_VERSION.into()).await?;
// Pick the minimum version
let protocol_version = std::cmp::min(PROTOCOL_VERSION, server_version);
// 5. Send obsolete fields based on protocol version
if protocol_version.minor() >= 14 {
// CPU affinity (obsolete, send 0)
conn.write_u64_le(0).await?;
}
if protocol_version.minor() >= 11 {
// Reserve space (obsolete, send 0)
conn.write_u64_le(0).await?;
}
if protocol_version.minor() >= 33 {
// Read Nix version string
let version_len = conn.read_u64_le().await? as usize;
let mut version_bytes = vec![0u8; version_len];
conn.read_exact(&mut version_bytes).await?;
// Padding
let padding = (8 - (version_len % 8)) % 8;
if padding > 0 {
let mut pad = vec![0u8; padding];
conn.read_exact(&mut pad).await?;
}
}
if protocol_version.minor() >= 35 {
// Read trust level
let _trust = conn.read_u64_le().await?;
}
// 6. Read STDERR_LAST
let stderr_last = conn.read_u64_le().await?;
if stderr_last != STDERR_LAST {
return Err(IoError::new(
IoErrorKind::InvalidData,
format!("Expected STDERR_LAST, got: {}", stderr_last),
));
}
// 7. Send SetOptions operation with client settings
conn.write_u64_le(Operation::SetOptions.into()).await?;
conn.flush().await?;
// Serialize client settings
let mut settings_buf = Vec::new();
{
let mut writer = NixWriterBuilder::default()
.set_version(protocol_version)
.build(&mut settings_buf);
writer.write_value(client_settings).await?;
writer.flush().await?;
}
conn.write_all(&settings_buf).await?;
conn.flush().await?;
// 8. Read response to SetOptions
let response = conn.read_u64_le().await?;
if response != STDERR_LAST {
return Err(IoError::new(
IoErrorKind::InvalidData,
format!("Expected STDERR_LAST after SetOptions, got: {}", response),
));
}
Ok(protocol_version)
}
/// Low-level Nix Daemon client
///
/// This struct manages communication with a nix-daemon using the wire protocol.
/// It is NOT thread-safe and should be wrapped in a Mutex for concurrent access.
pub struct NixDaemonClient {
protocol_version: ProtocolVersion,
reader: NixReader<ReadHalf<UnixStream>>,
writer: NixWriter<WriteHalf<UnixStream>>,
_marker: std::marker::PhantomData<std::cell::Cell<()>>,
}
impl NixDaemonClient {
/// Connect to a nix-daemon at the given Unix socket path
pub async fn connect(socket_path: &Path) -> IoResult<Self> {
let stream = UnixStream::connect(socket_path).await?;
Self::from_stream(stream).await
}
/// Create a client from an existing Unix stream
pub async fn from_stream(mut stream: UnixStream) -> IoResult<Self> {
let client_settings = ClientSettings::default();
// Perform handshake
let protocol_version = client_handshake(&mut stream, &client_settings).await?;
// Split stream into reader and writer
let (read_half, write_half) = split(stream);
let reader = NixReader::builder()
.set_version(protocol_version)
.build(read_half);
let writer = NixWriterBuilder::default()
.set_version(protocol_version)
.build(write_half);
Ok(Self {
protocol_version,
reader,
writer,
_marker: Default::default(),
})
}
/// Execute an operation with a single parameter
async fn execute_with<P, T>(&mut self, operation: Operation, param: &P) -> IoResult<T>
where
P: NixSerialize + Send,
T: nix_compat::wire::de::NixDeserialize,
{
// Send operation
self.writer.write_value(&operation).await?;
// Send parameter
self.writer.write_value(param).await?;
self.writer.flush().await?;
self.read_response().await
}
/// Read a response from the daemon
///
/// The daemon sends either:
/// - STDERR_LAST followed by the result
/// - STDERR_ERROR followed by a structured error
async fn read_response<T>(&mut self) -> IoResult<T>
where
T: nix_compat::wire::de::NixDeserialize,
{
loop {
let msg = self.reader.read_number().await?;
if msg == STDERR_LAST {
let result: T = self.reader.read_value().await?;
return Ok(result);
} else if msg == STDERR_ERROR {
let error_msg = self.read_daemon_error().await?;
return Err(IoError::other(error_msg));
} else {
let _data: String = self.reader.read_value().await?;
continue;
}
}
}
async fn read_daemon_error(&mut self) -> IoResult<NixDaemonError> {
let type_marker: String = self.reader.read_value().await?;
assert_eq!(type_marker, "Error");
let level = NixDaemonErrorLevel::try_from_primitive(
self.reader
.read_number()
.await?
.try_into()
.map_err(|_| IoError::other("invalid nix-daemon error level"))?,
)
.map_err(|_| IoError::other("invalid nix-daemon error level"))?;
// Error name: legacy field, ignored by modern Nix but still sent on the wire
let _name: String = self.reader.read_value().await?;
let msg: String = self.reader.read_value().await?;
let have_pos: u64 = self.reader.read_number().await?;
if have_pos != 0 {
return Err(IoError::new(
IoErrorKind::InvalidData,
"unexpected position data in nix-daemon error",
));
}
let nr_traces: u64 = self.reader.read_number().await?;
let mut traces = Vec::new();
for _ in 0..nr_traces {
let _trace_pos: u64 = self.reader.read_number().await?;
let trace_hint: String = self.reader.read_value().await?;
traces.push(trace_hint);
}
Ok(NixDaemonError { level, msg, traces })
}
/// Check if a path is valid in the store
pub async fn is_valid_path(&mut self, path: &str) -> IoResult<bool> {
let store_path = StorePath::<String>::from_absolute_path(path.as_bytes())
.map_err(|e| IoError::new(IoErrorKind::InvalidInput, e.to_string()))?;
self.execute_with(Operation::IsValidPath, &store_path).await
}
/// Query information about a store path
#[allow(dead_code)]
pub async fn query_path_info(&mut self, path: &str) -> IoResult<Option<UnkeyedValidPathInfo>> {
let store_path = StorePath::<String>::from_absolute_path(path.as_bytes())
.map_err(|e| IoError::new(IoErrorKind::InvalidInput, e.to_string()))?;
self.writer.write_value(&Operation::QueryPathInfo).await?;
self.writer.write_value(&store_path).await?;
self.writer.flush().await?;
loop {
let msg = self.reader.read_number().await?;
if msg == STDERR_LAST {
let has_value: bool = self.reader.read_value().await?;
if has_value {
use nix_compat::narinfo::Signature;
use nix_compat::nixhash::CAHash;
let deriver = self.reader.read_value().await?;
let nar_hash: String = self.reader.read_value().await?;
let references = self.reader.read_value().await?;
let registration_time = self.reader.read_value().await?;
let nar_size = self.reader.read_value().await?;
let ultimate = self.reader.read_value().await?;
let signatures: Vec<Signature<String>> = self.reader.read_value().await?;
let ca: Option<CAHash> = self.reader.read_value().await?;
let value = UnkeyedValidPathInfo {
deriver,
nar_hash,
references,
registration_time,
nar_size,
ultimate,
signatures,
ca,
};
return Ok(Some(value));
} else {
return Ok(None);
}
} else if msg == STDERR_ERROR {
let error_msg = self.read_daemon_error().await?;
return Err(IoError::other(error_msg));
} else {
let _data: String = self.reader.read_value().await?;
continue;
}
}
}
/// Ensure a path is available in the store
pub async fn ensure_path(&mut self, path: &str) -> IoResult<()> {
let store_path = StorePath::<String>::from_absolute_path(path.as_bytes())
.map_err(|e| IoError::new(IoErrorKind::InvalidInput, e.to_string()))?;
self.writer.write_value(&Operation::EnsurePath).await?;
self.writer.write_value(&store_path).await?;
self.writer.flush().await?;
loop {
let msg = self.reader.read_number().await?;
if msg == STDERR_LAST {
return Ok(());
} else if msg == STDERR_ERROR {
let error_msg = self.read_daemon_error().await?;
return Err(IoError::other(error_msg));
} else {
let _data: String = self.reader.read_value().await?;
continue;
}
}
}
/// Query which paths are valid
#[allow(dead_code)]
pub async fn query_valid_paths(&mut self, paths: Vec<String>) -> IoResult<Vec<String>> {
let store_paths: IoResult<Vec<StorePath<String>>> = paths
.iter()
.map(|p| {
StorePath::<String>::from_absolute_path(p.as_bytes())
.map_err(|e| IoError::new(IoErrorKind::InvalidInput, e.to_string()))
})
.collect();
let store_paths = store_paths?;
// Send operation
self.writer.write_value(&Operation::QueryValidPaths).await?;
// Manually serialize the request since QueryValidPaths doesn't impl NixSerialize
// QueryValidPaths = { paths: Vec<StorePath>, substitute: bool }
self.writer.write_value(&store_paths).await?;
// For protocol >= 1.27, send substitute flag
if self.protocol_version.minor() >= 27 {
self.writer.write_value(&false).await?;
}
self.writer.flush().await?;
let result: Vec<StorePath<String>> = self.read_response().await?;
Ok(result.into_iter().map(|p| p.to_absolute_path()).collect())
}
/// Add a NAR to the store
pub async fn add_to_store_nar(
&mut self,
request: AddToStoreNarRequest,
nar_data: &[u8],
) -> IoResult<()> {
tracing::debug!(
"add_to_store_nar: path={}, nar_size={}",
request.path.to_absolute_path(),
request.nar_size,
);
self.writer.write_value(&Operation::AddToStoreNar).await?;
self.writer.write_value(&request.path).await?;
self.writer.write_value(&request.deriver).await?;
let nar_hash_hex = hex::encode(request.nar_hash.as_ref());
self.writer.write_value(&nar_hash_hex).await?;
self.writer.write_value(&request.references).await?;
self.writer.write_value(&request.registration_time).await?;
self.writer.write_value(&request.nar_size).await?;
self.writer.write_value(&request.ultimate).await?;
self.writer.write_value(&request.signatures).await?;
self.writer.write_value(&request.ca).await?;
self.writer.write_value(&request.repair).await?;
self.writer.write_value(&request.dont_check_sigs).await?;
if self.protocol_version.minor() >= 23 {
self.writer.write_number(nar_data.len() as u64).await?;
self.writer.write_all(nar_data).await?;
self.writer.write_number(0u64).await?;
} else {
self.writer.write_slice(nar_data).await?;
}
self.writer.flush().await?;
loop {
let msg = self.reader.read_number().await?;
if msg == STDERR_LAST {
return Ok(());
} else if msg == STDERR_ERROR {
let error_msg = self.read_daemon_error().await?;
return Err(IoError::other(error_msg));
} else {
let _data: String = self.reader.read_value().await?;
continue;
}
}
}
}
/// Thread-safe wrapper around NixDaemonClient
pub struct NixDaemonConnection {
client: Mutex<NixDaemonClient>,
}
impl NixDaemonConnection {
/// Connect to a nix-daemon at the given socket path
pub async fn connect(socket_path: &Path) -> IoResult<Self> {
let client = NixDaemonClient::connect(socket_path).await?;
Ok(Self {
client: Mutex::new(client),
})
}
/// Check if a path is valid in the store
pub async fn is_valid_path(&self, path: &str) -> IoResult<bool> {
let mut client = self.client.lock().await;
client.is_valid_path(path).await
}
/// Query information about a store path
#[allow(dead_code)]
pub async fn query_path_info(&self, path: &str) -> IoResult<Option<UnkeyedValidPathInfo>> {
let mut client = self.client.lock().await;
client.query_path_info(path).await
}
/// Ensure a path is available in the store
pub async fn ensure_path(&self, path: &str) -> IoResult<()> {
let mut client = self.client.lock().await;
client.ensure_path(path).await
}
/// Query which paths are valid
#[allow(dead_code)]
pub async fn query_valid_paths(&self, paths: Vec<String>) -> IoResult<Vec<String>> {
let mut client = self.client.lock().await;
client.query_valid_paths(paths).await
}
/// Add a NAR to the store
pub async fn add_to_store_nar(
&self,
request: AddToStoreNarRequest,
nar_data: &[u8],
) -> IoResult<()> {
let mut client = self.client.lock().await;
client.add_to_store_nar(request, nar_data).await
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, IntoPrimitive, TryFromPrimitive)]
#[repr(u8)]
pub enum NixDaemonErrorLevel {
Error = 0,
Warn,
Notice,
Info,
Talkative,
Chatty,
Debug,
Vomit,
}
#[derive(Debug, Error)]
#[error("{msg}")]
pub struct NixDaemonError {
level: NixDaemonErrorLevel,
msg: String,
traces: Vec<String>,
}
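The handshake code above repeatedly reads length-prefixed strings padded to an 8-byte boundary (see the version-string read and its padding computation). That framing can be sketched in isolation; the function name below is illustrative, not part of this file:

```rust
/// Encode a byte string in the Nix daemon wire format: a little-endian
/// u64 length, the payload, then zero padding to an 8-byte boundary —
/// the same layout the handshake code reads back.
fn wire_encode_string(data: &[u8]) -> Vec<u8> {
    let mut out = Vec::with_capacity(8 + data.len() + 7);
    out.extend_from_slice(&(data.len() as u64).to_le_bytes());
    out.extend_from_slice(data);
    // Same padding formula as the reader side: (8 - (len % 8)) % 8.
    let padding = (8 - (data.len() % 8)) % 8;
    out.extend(std::iter::repeat(0u8).take(padding));
    out
}

fn main() {
    let encoded = wire_encode_string(b"2.18.1");
    // 8-byte length prefix + 6 payload bytes + 2 padding bytes = 16
    assert_eq!(encoded.len(), 16);
    assert_eq!(&encoded[..8], &6u64.to_le_bytes());
    assert_eq!(&encoded[8..14], b"2.18.1");
    assert_eq!(&encoded[14..], &[0, 0]);
    println!("ok");
}
```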

fix/src/store/error.rs

@@ -0,0 +1,34 @@
#![allow(dead_code)]
use std::fmt;
#[derive(Debug)]
pub enum StoreError {
DaemonConnectionFailed(String),
OperationFailed(String),
InvalidPath(String),
PathNotFound(String),
Io(std::io::Error),
}
impl fmt::Display for StoreError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
StoreError::DaemonConnectionFailed(msg) => {
write!(f, "Failed to connect to nix-daemon: {}", msg)
}
StoreError::OperationFailed(msg) => write!(f, "Store operation failed: {}", msg),
StoreError::InvalidPath(msg) => write!(f, "Invalid store path: {}", msg),
StoreError::PathNotFound(path) => write!(f, "Path not found in store: {}", path),
StoreError::Io(e) => write!(f, "I/O error: {}", e),
}
}
}
impl std::error::Error for StoreError {}
impl From<std::io::Error> for StoreError {
fn from(e: std::io::Error) -> Self {
StoreError::Io(e)
}
}

fix/src/store/validation.rs

@@ -0,0 +1,153 @@
use crate::error::{Error, Result};
pub fn validate_store_path(store_dir: &str, path: &str) -> Result<()> {
if !path.starts_with(store_dir) {
return Err(Error::eval_error(
format!("path '{}' is not in the Nix store", path),
None,
));
}
let relative = path
.strip_prefix(store_dir)
.and_then(|s| s.strip_prefix('/'))
.ok_or_else(|| Error::eval_error(format!("invalid store path format: {}", path), None))?;
if relative.is_empty() {
return Err(Error::eval_error(
format!("store path cannot be store directory itself: {}", path),
None,
));
}
let parts: Vec<&str> = relative.splitn(2, '-').collect();
if parts.len() != 2 {
return Err(Error::eval_error(
format!("invalid store path format (missing name): {}", path),
None,
));
}
let hash = parts[0];
let name = parts[1];
if hash.len() != 32 {
return Err(Error::eval_error(
format!(
"invalid store path hash length (expected 32, got {}): {}",
hash.len(),
hash
),
None,
));
}
for ch in hash.chars() {
if !matches!(ch, '0'..='9' | 'a'..='d' | 'f'..='n' | 'p'..='s' | 'v'..='z') {
return Err(Error::eval_error(
format!("invalid character '{}' in store path hash: {}", ch, hash),
None,
));
}
}
if name.is_empty() {
return Err(Error::eval_error(
format!("store path has empty name: {}", path),
None,
));
}
if name.starts_with('.') {
return Err(Error::eval_error(
format!("store path name cannot start with '.': {}", name),
None,
));
}
for ch in name.chars() {
if !matches!(ch, '0'..='9' | 'a'..='z' | 'A'..='Z' | '+' | '-' | '.' | '_' | '?' | '=') {
return Err(Error::eval_error(
format!("invalid character '{}' in store path name: {}", ch, name),
None,
));
}
}
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
#[test_log::test]
fn test_valid_store_paths() {
let store_dir = "/nix/store";
let valid_paths = vec![
"/nix/store/0123456789abcdfghijklmnpqrsvwxyz-hello",
"/nix/store/abcdfghijklmnpqrsvwxyz0123456789-hello-1.0",
"/nix/store/00000000000000000000000000000000-test_+-.?=",
];
for path in valid_paths {
assert!(
validate_store_path(store_dir, path).is_ok(),
"Expected {} to be valid, got {:?}",
path,
validate_store_path(store_dir, path)
);
}
}
#[test_log::test]
fn test_invalid_store_paths() {
let store_dir = "/nix/store";
let invalid_paths = vec![
("/tmp/foo", "not in store"),
("/nix/store", "empty relative"),
("/nix/store/tooshort-name", "hash too short"),
(
"/nix/store/abc123defghijklmnopqrstuvwxyz123-name",
"hash too long",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd123e-name",
"e in hash",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd123o-name",
"o in hash",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd123u-name",
"u in hash",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd123t-name",
"t in hash",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd1234-.name",
"name starts with dot",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd1234-na/me",
"slash in name",
),
(
"/nix/store/abcd1234abcd1234abcd1234abcd1234",
"missing name",
),
];
for (path, reason) in invalid_paths {
assert!(
validate_store_path(store_dir, path).is_err(),
"Expected {} to be invalid ({})",
path,
reason
);
}
}
}
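The character ranges in the hash check above (`'0'..='9' | 'a'..='d' | 'f'..='n' | 'p'..='s' | 'v'..='z'`) correspond to Nix's base-32 digest alphabet, which drops `e`, `o`, `t`, and `u` to avoid accidental words. A standalone restatement (helper names are illustrative):

```rust
/// Nix's base-32 digest alphabet: the digits plus a-z minus e, o, t, u.
/// This is exactly the set accepted by the ranges in validate_store_path.
const NIX_BASE32: &str = "0123456789abcdfghijklmnpqrsvwxyz";

fn is_nixbase32_char(c: char) -> bool {
    NIX_BASE32.contains(c)
}

fn main() {
    // 32 symbols, as required for a base-32 encoding.
    assert_eq!(NIX_BASE32.len(), 32);
    assert!(is_nixbase32_char('0') && is_nixbase32_char('z'));
    // The four excluded letters are rejected.
    for c in ['e', 'o', 't', 'u'] {
        assert!(!is_nixbase32_char(c));
    }
    println!("ok");
}
```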

fix/src/string_context.rs

@@ -0,0 +1,209 @@
use std::collections::{BTreeMap, BTreeSet, VecDeque};
pub enum StringContextElem {
Opaque { path: String },
DrvDeep { drv_path: String },
Built { drv_path: String, output: String },
}
impl StringContextElem {
pub fn decode(encoded: &str) -> Self {
if let Some(drv_path) = encoded.strip_prefix('=') {
StringContextElem::DrvDeep {
drv_path: drv_path.to_string(),
}
} else if let Some(rest) = encoded.strip_prefix('!') {
if let Some(second_bang) = rest.find('!') {
let output = rest[..second_bang].to_string();
let drv_path = rest[second_bang + 1..].to_string();
StringContextElem::Built { drv_path, output }
} else {
StringContextElem::Opaque {
path: encoded.to_string(),
}
}
} else {
StringContextElem::Opaque {
path: encoded.to_string(),
}
}
}
}
pub type InputDrvs = BTreeMap<String, BTreeSet<String>>;
pub type Srcs = BTreeSet<String>;
pub fn extract_input_drvs_and_srcs(context: &[String]) -> Result<(InputDrvs, Srcs), String> {
let mut input_drvs: BTreeMap<String, BTreeSet<String>> = BTreeMap::new();
let mut input_srcs: BTreeSet<String> = BTreeSet::new();
for encoded in context {
match StringContextElem::decode(encoded) {
StringContextElem::Opaque { path } => {
input_srcs.insert(path);
}
StringContextElem::DrvDeep { drv_path } => {
compute_fs_closure(&drv_path, &mut input_drvs, &mut input_srcs)?;
}
StringContextElem::Built { drv_path, output } => {
input_drvs.entry(drv_path).or_default().insert(output);
}
}
}
Ok((input_drvs, input_srcs))
}
fn compute_fs_closure(
drv_path: &str,
input_drvs: &mut BTreeMap<String, BTreeSet<String>>,
input_srcs: &mut BTreeSet<String>,
) -> Result<(), String> {
let mut queue: VecDeque<String> = VecDeque::new();
let mut visited: BTreeSet<String> = BTreeSet::new();
queue.push_back(drv_path.to_string());
while let Some(current_path) = queue.pop_front() {
if visited.contains(&current_path) {
continue;
}
visited.insert(current_path.clone());
input_srcs.insert(current_path.clone());
if !current_path.ends_with(".drv") {
continue;
}
let content = std::fs::read_to_string(&current_path)
.map_err(|e| format!("failed to read derivation {}: {}", current_path, e))?;
let inputs = parse_derivation_inputs(&content)
.ok_or_else(|| format!("failed to parse derivation {}", current_path))?;
for src in inputs.input_srcs {
input_srcs.insert(src.clone());
if !visited.contains(&src) {
queue.push_back(src);
}
}
for (dep_drv, outputs) in inputs.input_drvs {
input_srcs.insert(dep_drv.clone());
let entry = input_drvs.entry(dep_drv.clone()).or_default();
for output in outputs {
entry.insert(output);
}
if !visited.contains(&dep_drv) {
queue.push_back(dep_drv);
}
}
}
Ok(())
}
struct DerivationInputs {
input_drvs: Vec<(String, Vec<String>)>,
input_srcs: Vec<String>,
}
fn parse_derivation_inputs(aterm: &str) -> Option<DerivationInputs> {
let aterm = aterm.strip_prefix("Derive([")?;
let mut bracket_count: i32 = 1;
let mut pos = 0;
let bytes = aterm.as_bytes();
while pos < bytes.len() && bracket_count > 0 {
match bytes[pos] {
b'[' => bracket_count += 1,
b']' => bracket_count -= 1,
_ => {}
}
pos += 1;
}
if bracket_count != 0 {
return None;
}
let rest = &aterm[pos..];
let rest = rest.strip_prefix(",[")?;
let mut input_drvs = Vec::new();
let mut bracket_count: i32 = 1;
let mut start = 0;
pos = 0;
let bytes = rest.as_bytes();
while pos < bytes.len() && bracket_count > 0 {
match bytes[pos] {
b'[' => bracket_count += 1,
b']' => bracket_count -= 1,
b'(' if bracket_count == 1 => {
start = pos;
}
b')' if bracket_count == 1 => {
let entry = &rest[start + 1..pos];
if let Some((drv_path, outputs)) = parse_input_drv_entry(entry) {
input_drvs.push((drv_path, outputs));
}
}
_ => {}
}
pos += 1;
}
let rest = &rest[pos..];
let rest = rest.strip_prefix(",[")?;
let mut input_srcs = Vec::new();
bracket_count = 1;
pos = 0;
let bytes = rest.as_bytes();
while pos < bytes.len() && bracket_count > 0 {
match bytes[pos] {
b'[' => bracket_count += 1,
b']' => bracket_count -= 1,
b'"' if bracket_count == 1 => {
pos += 1;
let src_start = pos;
while pos < bytes.len() && bytes[pos] != b'"' {
if bytes[pos] == b'\\' && pos + 1 < bytes.len() {
pos += 2;
} else {
pos += 1;
}
}
let src = std::str::from_utf8(&bytes[src_start..pos]).ok()?;
input_srcs.push(src.to_string());
}
_ => {}
}
pos += 1;
}
Some(DerivationInputs {
input_drvs,
input_srcs,
})
}
fn parse_input_drv_entry(entry: &str) -> Option<(String, Vec<String>)> {
let entry = entry.strip_prefix('"')?;
let quote_end = entry.find('"')?;
let drv_path = entry[..quote_end].to_string();
let rest = &entry[quote_end + 1..];
let rest = rest.strip_prefix(",[")?;
let rest = rest.strip_suffix(']')?;
let mut outputs = Vec::new();
for part in rest.split(',') {
let part = part.trim();
if let Some(name) = part.strip_prefix('"').and_then(|s| s.strip_suffix('"')) {
outputs.push(name.to_string());
}
}
Some((drv_path, outputs))
}
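The three context encodings handled by `StringContextElem::decode` above can be shown side by side; this is a minimal re-statement with illustrative names, not the file's own types:

```rust
// Minimal re-statement of the decoding in StringContextElem::decode:
// "=<drv>"          -> depend on the full closure of a derivation,
// "!<out>!<drv>"    -> depend on one output of a derivation,
// anything else     -> an opaque (plain) store path.
#[derive(Debug, PartialEq)]
enum Elem {
    Opaque(String),
    DrvDeep(String),
    Built { drv: String, output: String },
}

fn decode(encoded: &str) -> Elem {
    if let Some(drv) = encoded.strip_prefix('=') {
        Elem::DrvDeep(drv.to_string())
    } else if let Some(rest) = encoded.strip_prefix('!') {
        match rest.split_once('!') {
            Some((output, drv)) => Elem::Built {
                drv: drv.to_string(),
                output: output.to_string(),
            },
            // A lone '!' with no second separator falls back to opaque,
            // matching the original decode.
            None => Elem::Opaque(encoded.to_string()),
        }
    } else {
        Elem::Opaque(encoded.to_string())
    }
}

fn main() {
    assert_eq!(
        decode("!out!/nix/store/aaa-foo.drv"),
        Elem::Built {
            drv: "/nix/store/aaa-foo.drv".into(),
            output: "out".into()
        }
    );
    assert_eq!(
        decode("=/nix/store/aaa-foo.drv"),
        Elem::DrvDeep("/nix/store/aaa-foo.drv".into())
    );
    assert_eq!(
        decode("/nix/store/bbb-src"),
        Elem::Opaque("/nix/store/bbb-src".into())
    );
    println!("ok");
}
```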

fix/src/value.rs

@@ -0,0 +1,372 @@
use core::fmt::{Debug, Display, Formatter, Result as FmtResult};
use core::hash::Hash;
use core::ops::Deref;
use std::borrow::Cow;
use std::collections::BTreeMap;
use std::ops::DerefMut;
use derive_more::{Constructor, IsVariant, Unwrap};
/// Represents a Nix symbol, which is used as a key in attribute sets.
#[derive(Debug, Clone, Hash, PartialEq, Eq, PartialOrd, Ord, Constructor)]
pub struct Symbol<'a>(Cow<'a, str>);
pub type StaticSymbol = Symbol<'static>;
impl From<String> for Symbol<'_> {
fn from(value: String) -> Self {
Symbol(Cow::Owned(value))
}
}
impl<'a> From<&'a str> for Symbol<'a> {
fn from(value: &'a str) -> Self {
Symbol(Cow::Borrowed(value))
}
}
/// Formats a string slice as a Nix symbol, quoting it if necessary.
pub fn format_symbol<'a>(sym: impl Into<Cow<'a, str>>) -> Cow<'a, str> {
let sym = sym.into();
if Symbol::NORMAL_REGEX.test(&sym) {
sym
} else {
Cow::Owned(escape_quote_string(&sym))
}
}
impl Display for Symbol<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
if self.normal() {
write!(f, "{}", self.0)
} else {
write!(f, "{}", escape_quote_string(&self.0))
}
}
}
impl Symbol<'_> {
const NORMAL_REGEX: ere::Regex<1> = ere::compile_regex!("^[a-zA-Z_][a-zA-Z0-9_'-]*$");
/// Checks if the symbol is a "normal" identifier that doesn't require quotes.
fn normal(&self) -> bool {
Self::NORMAL_REGEX.test(self)
}
}
impl Deref for Symbol<'_> {
type Target = str;
fn deref(&self) -> &Self::Target {
&self.0
}
}
/// Represents a Nix attribute set, which is a map from symbols to values.
#[derive(Constructor, Default, Clone, PartialEq)]
pub struct AttrSet {
data: BTreeMap<StaticSymbol, Value>,
}
impl AttrSet {
/// Gets a value by key (string or Symbol).
pub fn get<'a, 'sym: 'a>(&'a self, key: impl Into<Symbol<'sym>>) -> Option<&'a Value> {
self.data.get(&key.into())
}
/// Checks if a key exists in the attribute set.
pub fn contains_key<'a, 'sym: 'a>(&'a self, key: impl Into<Symbol<'sym>>) -> bool {
self.data.contains_key(&key.into())
}
}
impl Deref for AttrSet {
type Target = BTreeMap<StaticSymbol, Value>;
fn deref(&self) -> &Self::Target {
&self.data
}
}
impl DerefMut for AttrSet {
fn deref_mut(&mut self) -> &mut Self::Target {
&mut self.data
}
}
impl Debug for AttrSet {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
use Value::*;
write!(f, "{{")?;
for (k, v) in self.data.iter() {
write!(f, " {k:?} = ")?;
match v {
List(_) => write!(f, "[ ... ];")?,
AttrSet(_) => write!(f, "{{ ... }};")?,
v => write!(f, "{v:?};")?,
}
}
write!(f, " }}")
}
}
impl Display for AttrSet {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
use Value::*;
if self.data.len() > 1 {
writeln!(f, "{{")?;
for (k, v) in self.data.iter() {
write!(f, " {k} = ")?;
match v {
List(_) => writeln!(f, "[ ... ];")?,
AttrSet(_) => writeln!(f, "{{ ... }};")?,
v => writeln!(f, "{v};")?,
}
}
write!(f, "}}")
} else {
write!(f, "{{")?;
for (k, v) in self.data.iter() {
write!(f, " {k} = ")?;
match v {
List(_) => write!(f, "[ ... ];")?,
AttrSet(_) => write!(f, "{{ ... }};")?,
v => write!(f, "{v};")?,
}
}
write!(f, " }}")
}
}
}
impl AttrSet {
pub fn display_compat(&self) -> AttrSetCompatDisplay<'_> {
AttrSetCompatDisplay(self)
}
}
pub struct AttrSetCompatDisplay<'a>(&'a AttrSet);
impl Display for AttrSetCompatDisplay<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
write!(f, "{{")?;
for (k, v) in self.0.data.iter() {
write!(f, " {k} = {};", v.display_compat())?;
}
write!(f, " }}")
}
}
/// Represents a Nix list, which is a vector of values.
#[derive(Constructor, Default, Clone, Debug, PartialEq)]
pub struct List {
data: Vec<Value>,
}
impl Deref for List {
type Target = Vec<Value>;
fn deref(&self) -> &Self::Target {
&self.data
}
}
impl DerefMut for List {
fn deref_mut(&mut self) -> &mut Self::Target {
&mut self.data
}
}
impl Display for List {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
use Value::*;
if self.data.len() > 1 {
writeln!(f, "[")?;
for v in self.data.iter() {
match v {
List(_) => writeln!(f, " [ ... ]")?,
AttrSet(_) => writeln!(f, " {{ ... }}")?,
v => writeln!(f, " {v}")?,
}
}
write!(f, "]")
} else {
write!(f, "[ ")?;
for v in self.data.iter() {
match v {
List(_) => write!(f, "[ ... ] ")?,
AttrSet(_) => write!(f, "{{ ... }} ")?,
v => write!(f, "{v} ")?,
}
}
write!(f, "]")
}
}
}
impl List {
pub fn display_compat(&self) -> ListCompatDisplay<'_> {
ListCompatDisplay(self)
}
}
pub struct ListCompatDisplay<'a>(&'a List);
impl Display for ListCompatDisplay<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
write!(f, "[ ")?;
for v in self.0.data.iter() {
write!(f, "{} ", v.display_compat())?;
}
write!(f, "]")
}
}
/// Represents any possible Nix value that can be returned from an evaluation.
#[derive(IsVariant, Unwrap, Clone, Debug, PartialEq)]
pub enum Value {
/// An integer value.
Int(i64),
/// A floating-point value.
Float(f64),
/// A boolean value.
Bool(bool),
/// A null value.
Null,
/// A string value.
String(String),
/// A path value (absolute path string).
Path(String),
/// An attribute set.
AttrSet(AttrSet),
/// A list.
List(List),
/// A thunk, representing a delayed computation.
Thunk,
/// A function (lambda).
Func,
/// A primitive (built-in) operation.
PrimOp(String),
/// A partially applied primitive operation.
PrimOpApp(String),
/// A marker for a value that has been seen before during serialization, to break cycles.
/// This is used to prevent infinite recursion when printing or serializing cyclic data structures.
Repeated,
}
fn escape_quote_string(s: &str) -> String {
let mut ret = String::with_capacity(s.len() + 2);
ret.push('"');
let mut iter = s.chars().peekable();
while let Some(c) = iter.next() {
match c {
'\\' => ret.push_str("\\\\"),
'"' => ret.push_str("\\\""),
'\n' => ret.push_str("\\n"),
'\r' => ret.push_str("\\r"),
'\t' => ret.push_str("\\t"),
'$' if iter.peek() == Some(&'{') => ret.push_str("\\$"),
c => ret.push(c),
}
}
ret.push('"');
ret
}
/// Format a float matching C's `printf("%g", x)` with default precision 6.
fn fmt_nix_float(f: &mut Formatter<'_>, x: f64) -> FmtResult {
if !x.is_finite() {
return write!(f, "{x}");
}
if x == 0.0 {
return if x.is_sign_negative() {
write!(f, "-0")
} else {
write!(f, "0")
};
}
let precision: i32 = 6;
let exp = x.abs().log10().floor() as i32;
let formatted = if exp >= -4 && exp < precision {
let decimal_places = (precision - 1 - exp) as usize;
format!("{x:.decimal_places$}")
} else {
let sig_digits = (precision - 1) as usize;
let s = format!("{x:.sig_digits$e}");
let (mantissa, exp_part) = s
.split_once('e')
.expect("scientific notation must contain 'e'");
let (sign, digits) = if let Some(d) = exp_part.strip_prefix('-') {
("-", d)
} else if let Some(d) = exp_part.strip_prefix('+') {
("+", d)
} else {
("+", exp_part)
};
if digits.len() < 2 {
format!("{mantissa}e{sign}0{digits}")
} else {
format!("{mantissa}e{sign}{digits}")
}
};
if formatted.contains('.') {
if let Some(e_pos) = formatted.find('e') {
let trimmed = formatted[..e_pos]
.trim_end_matches('0')
.trim_end_matches('.');
write!(f, "{}{}", trimmed, &formatted[e_pos..])
} else {
let trimmed = formatted.trim_end_matches('0').trim_end_matches('.');
write!(f, "{trimmed}")
}
} else {
write!(f, "{formatted}")
}
}
impl Display for Value {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
use Value::*;
match self {
&Int(x) => write!(f, "{x}"),
&Float(x) => fmt_nix_float(f, x),
&Bool(x) => write!(f, "{x}"),
Null => write!(f, "null"),
String(x) => write!(f, "{}", escape_quote_string(x)),
Path(x) => write!(f, "{x}"),
AttrSet(x) => write!(f, "{x}"),
List(x) => write!(f, "{x}"),
Thunk => write!(f, "«code»"),
Func => write!(f, "«lambda»"),
PrimOp(name) => write!(f, "«primop {name}»"),
PrimOpApp(name) => write!(f, "«partially applied primop {name}»"),
Repeated => write!(f, "«repeated»"),
}
}
}
impl Value {
pub fn display_compat(&self) -> ValueCompatDisplay<'_> {
ValueCompatDisplay(self)
}
}
pub struct ValueCompatDisplay<'a>(&'a Value);
impl Display for ValueCompatDisplay<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> FmtResult {
use Value::*;
match self.0 {
&Int(x) => write!(f, "{x}"),
&Float(x) => fmt_nix_float(f, x),
&Bool(x) => write!(f, "{x}"),
Null => write!(f, "null"),
String(x) => write!(f, "{}", escape_quote_string(x)),
Path(x) => write!(f, "{x}"),
AttrSet(x) => write!(f, "{}", x.display_compat()),
List(x) => write!(f, "{}", x.display_compat()),
Thunk => write!(f, "«thunk»"),
Func => write!(f, "<LAMBDA>"),
PrimOp(_) => write!(f, "<PRIMOP>"),
PrimOpApp(_) => write!(f, "<PRIMOP-APP>"),
Repeated => write!(f, "«repeated»"),
}
}
}
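`escape_quote_string` above escapes `$` only when it is immediately followed by `{`, so that re-printed strings cannot be re-parsed as interpolations. A self-contained copy of the function with a few checks (the copy exists only for illustration):

```rust
// Copy of escape_quote_string from value.rs, to demonstrate the
// conditional "${" escape alongside the usual C-style escapes.
fn escape_quote_string(s: &str) -> String {
    let mut ret = String::with_capacity(s.len() + 2);
    ret.push('"');
    let mut iter = s.chars().peekable();
    while let Some(c) = iter.next() {
        match c {
            '\\' => ret.push_str("\\\\"),
            '"' => ret.push_str("\\\""),
            '\n' => ret.push_str("\\n"),
            '\r' => ret.push_str("\\r"),
            '\t' => ret.push_str("\\t"),
            // '$' is escaped only when it would start an interpolation.
            '$' if iter.peek() == Some(&'{') => ret.push_str("\\$"),
            c => ret.push(c),
        }
    }
    ret.push('"');
    ret
}

fn main() {
    assert_eq!(escape_quote_string("a\"b"), r#""a\"b""#);
    // "${" is escaped; a bare "$" is not.
    assert_eq!(escape_quote_string("${x}"), r#""\${x}""#);
    assert_eq!(escape_quote_string("$x"), r#""$x""#);
    println!("ok");
}
```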


@@ -0,0 +1,69 @@
use fix::value::Value;
use crate::utils::{eval, eval_result};
#[test_log::test]
fn arithmetic() {
assert_eq!(eval("1 + 1"), Value::Int(2));
}
#[test_log::test]
fn simple_function_application() {
assert_eq!(eval("(x: x) 1"), Value::Int(1));
}
#[test_log::test]
fn curried_function() {
assert_eq!(eval("(x: y: x - y) 2 1"), Value::Int(1));
}
#[test_log::test]
fn rec_attrset() {
assert_eq!(eval("rec { b = a; a = 1; }.b"), Value::Int(1));
}
#[test_log::test]
fn let_binding() {
assert_eq!(eval("let b = a; a = 1; in b"), Value::Int(1));
}
#[test_log::test]
fn fibonacci() {
assert_eq!(
eval(
"let fib = n: if n == 1 || n == 2 then 1 else (fib (n - 1)) + (fib (n - 2)); in fib 30"
),
Value::Int(832040)
);
}
#[test_log::test]
fn fixed_point_combinator() {
assert_eq!(
eval("((f: let x = f x; in x)(self: { x = 1; y = self.x + 1; })).y"),
Value::Int(2)
);
}
#[test_log::test]
fn conditional_true() {
assert_eq!(eval("if true then 1 else 0"), Value::Int(1));
}
#[test_log::test]
fn conditional_false() {
assert_eq!(eval("if false then 1 else 0"), Value::Int(0));
}
#[test_log::test]
fn nested_let() {
assert_eq!(
eval("let x = 1; in let y = x + 1; z = y + 1; in z"),
Value::Int(3)
);
}
#[test_log::test]
fn rec_inherit_fails() {
assert!(eval_result("{ inherit x; }").is_err());
}

fix/tests/tests/builtins.rs

@@ -0,0 +1,326 @@
use std::collections::BTreeMap;
use fix::value::{AttrSet, List, Value};
use crate::utils::eval;
#[test_log::test]
fn builtins_accessible() {
let result = eval("builtins");
assert!(matches!(result, Value::AttrSet(_)));
}
#[test_log::test]
fn builtins_self_reference() {
let result = eval("builtins.builtins");
assert!(matches!(result, Value::AttrSet(_)));
}
#[test_log::test]
fn builtins_add() {
assert_eq!(eval("builtins.add 1 2"), Value::Int(3));
}
#[test_log::test]
fn builtins_length() {
assert_eq!(eval("builtins.length [1 2 3]"), Value::Int(3));
}
#[test_log::test]
fn builtins_map() {
assert_eq!(
eval("builtins.map (x: x * 2) [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(4), Value::Int(6)]))
);
}
#[test_log::test]
fn builtins_filter() {
assert_eq!(
eval("builtins.filter (x: x > 1) [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(3)]))
);
}
#[test_log::test]
fn builtins_attrnames() {
let result = eval("builtins.attrNames { a = 1; b = 2; }");
assert!(matches!(result, Value::List(_)));
if let Value::List(list) = result {
assert_eq!(format!("{:?}", list).matches(',').count() + 1, 2);
}
}
#[test_log::test]
fn builtins_head() {
assert_eq!(eval("builtins.head [1 2 3]"), Value::Int(1));
}
#[test_log::test]
fn builtins_tail() {
assert_eq!(
eval("builtins.tail [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(3)]))
);
}
#[test_log::test]
fn builtins_in_let() {
assert_eq!(eval("let b = builtins; in b.add 5 3"), Value::Int(8));
}
#[test_log::test]
fn builtins_in_with() {
assert_eq!(eval("with builtins; add 10 20"), Value::Int(30));
}
#[test_log::test]
fn builtins_nested_calls() {
assert_eq!(
eval("builtins.add (builtins.mul 2 3) (builtins.sub 10 5)"),
Value::Int(11)
);
}
#[test_log::test]
fn builtins_is_list() {
assert_eq!(eval("builtins.isList [1 2 3]"), Value::Bool(true));
}
#[test_log::test]
fn builtins_is_attrs() {
assert_eq!(eval("builtins.isAttrs { a = 1; }"), Value::Bool(true));
}
#[test_log::test]
fn builtins_is_function() {
assert_eq!(eval("builtins.isFunction (x: x)"), Value::Bool(true));
}
#[test_log::test]
fn builtins_is_null() {
assert_eq!(eval("builtins.isNull null"), Value::Bool(true));
}
#[test_log::test]
fn builtins_is_bool() {
assert_eq!(eval("builtins.isBool true"), Value::Bool(true));
}
#[test_log::test]
fn builtins_shadowing() {
assert_eq!(
eval("let builtins = { add = x: y: x - y; }; in builtins.add 5 3"),
Value::Int(2)
);
}
#[test_log::test]
fn builtins_lazy_evaluation() {
let result = eval("builtins.builtins.builtins.add 1 1");
assert_eq!(result, Value::Int(2));
}
#[test_log::test]
fn builtins_foldl() {
assert_eq!(
eval("builtins.foldl' (acc: x: acc + x) 0 [1 2 3 4 5]"),
Value::Int(15)
);
}
#[test_log::test]
fn builtins_elem() {
assert_eq!(eval("builtins.elem 2 [1 2 3]"), Value::Bool(true));
assert_eq!(eval("builtins.elem 5 [1 2 3]"), Value::Bool(false));
}
#[test_log::test]
fn builtins_concat_lists() {
assert_eq!(
eval("builtins.concatLists [[1 2] [3 4] [5]]"),
Value::List(List::new(vec![
Value::Int(1),
Value::Int(2),
Value::Int(3),
Value::Int(4),
Value::Int(5)
]))
);
}
#[test_log::test]
fn builtins_compare_versions_basic() {
assert_eq!(
eval("builtins.compareVersions \"1.0\" \"2.3\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.1\" \"2.3\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3\" \"2.3\""),
Value::Int(0)
);
assert_eq!(
eval("builtins.compareVersions \"2.5\" \"2.3\""),
Value::Int(1)
);
assert_eq!(
eval("builtins.compareVersions \"3.1\" \"2.3\""),
Value::Int(1)
);
}
#[test_log::test]
fn builtins_compare_versions_components() {
assert_eq!(
eval("builtins.compareVersions \"2.3.1\" \"2.3\""),
Value::Int(1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3\" \"2.3.1\""),
Value::Int(-1)
);
}
#[test_log::test]
fn builtins_compare_versions_numeric_vs_alpha() {
// Numeric component comes before alpha component
assert_eq!(
eval("builtins.compareVersions \"2.3.1\" \"2.3a\""),
Value::Int(1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3a\" \"2.3.1\""),
Value::Int(-1)
);
}
#[test_log::test]
fn builtins_compare_versions_pre() {
// "pre" is special: comes before everything except another "pre"
assert_eq!(
eval("builtins.compareVersions \"2.3pre1\" \"2.3\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3pre3\" \"2.3pre12\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3pre1\" \"2.3c\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3pre1\" \"2.3q\""),
Value::Int(-1)
);
}
#[test_log::test]
fn builtins_compare_versions_alpha() {
// Alphabetic comparison
assert_eq!(
eval("builtins.compareVersions \"2.3a\" \"2.3c\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3c\" \"2.3a\""),
Value::Int(1)
);
}
#[test_log::test]
fn builtins_compare_versions_symmetry() {
// Test symmetry: compareVersions(a, b) == -compareVersions(b, a)
assert_eq!(
eval("builtins.compareVersions \"1.0\" \"2.3\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"2.3\" \"1.0\""),
Value::Int(1)
);
}
#[test_log::test]
fn builtins_compare_versions_complex() {
// Complex version strings with multiple components
assert_eq!(
eval("builtins.compareVersions \"1.2.3.4\" \"1.2.3.5\""),
Value::Int(-1)
);
assert_eq!(
eval("builtins.compareVersions \"1.2.10\" \"1.2.9\""),
Value::Int(1)
);
assert_eq!(
eval("builtins.compareVersions \"1.2a3\" \"1.2a10\""),
Value::Int(-1)
);
}
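The ordering exercised by the compareVersions tests above can be sketched as a standalone function. This is a minimal illustration of the Nix rules (not this crate's implementation): version strings split at `.`, `-`, and digit/non-digit boundaries; numeric components compare as integers and sort above alphabetic ones; `"pre"` sorts below everything except another `"pre"`.

```rust
// Split "2.3pre1" into ["2", "3", "pre", "1"]: '.' and '-' are separators,
// and a digit/non-digit boundary also starts a new component.
fn split_components(s: &str) -> Vec<String> {
    let mut out = Vec::new();
    let mut cur = String::new();
    for c in s.chars() {
        if c == '.' || c == '-' {
            if !cur.is_empty() {
                out.push(std::mem::take(&mut cur));
            }
        } else {
            if let Some(last) = cur.chars().last() {
                if last.is_ascii_digit() != c.is_ascii_digit() {
                    out.push(std::mem::take(&mut cur));
                }
            }
            cur.push(c);
        }
    }
    if !cur.is_empty() {
        out.push(cur);
    }
    out
}

fn cmp_component(a: &str, b: &str) -> std::cmp::Ordering {
    use std::cmp::Ordering::*;
    let na = a.parse::<i64>().ok();
    let nb = b.parse::<i64>().ok();
    match (na, nb) {
        (Some(x), Some(y)) => x.cmp(&y),   // both numeric: integer compare
        _ if a == b => Equal,
        _ if a == "pre" => Less,           // "pre" sorts below anything else
        _ if b == "pre" => Greater,
        (Some(_), None) => Greater,        // numeric beats alphabetic
        (None, Some(_)) => Less,
        _ => a.cmp(b),                     // plain string compare otherwise
    }
}

fn compare_versions(a: &str, b: &str) -> i64 {
    let (ca, cb) = (split_components(a), split_components(b));
    for i in 0..ca.len().max(cb.len()) {
        // A missing component compares like the empty string.
        let x = ca.get(i).map(String::as_str).unwrap_or("");
        let y = cb.get(i).map(String::as_str).unwrap_or("");
        match cmp_component(x, y) {
            std::cmp::Ordering::Less => return -1,
            std::cmp::Ordering::Greater => return 1,
            std::cmp::Ordering::Equal => {}
        }
    }
    0
}

fn main() {
    assert_eq!(compare_versions("2.3pre1", "2.3"), -1);
    assert_eq!(compare_versions("1.2.10", "1.2.9"), 1);
    println!("ok");
}
```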
#[test_log::test]
fn builtins_generic_closure() {
assert_eq!(
eval(
"with builtins; length (genericClosure { startSet = [ { key = 1; } ]; operator = { key }: [ { key = key / 1.; } ]; a = 1; })"
),
Value::Int(1),
);
assert_eq!(
eval(
"with builtins; (elemAt (genericClosure { startSet = [ { key = 1; } ]; operator = { key }: [ { key = key / 1.; } ]; a = 1; }) 0).key"
),
Value::Int(1),
);
}
#[test_log::test]
fn builtins_function_args() {
assert_eq!(
eval("builtins.functionArgs (x: 1)"),
Value::AttrSet(AttrSet::default())
);
assert_eq!(
eval("builtins.functionArgs ({}: 1)"),
Value::AttrSet(AttrSet::default())
);
assert_eq!(
eval("builtins.functionArgs ({...}: 1)"),
Value::AttrSet(AttrSet::default())
);
assert_eq!(
eval("builtins.functionArgs ({a}: 1)"),
Value::AttrSet(AttrSet::new(BTreeMap::from([(
"a".into(),
Value::Bool(false)
)])))
);
assert_eq!(
eval("builtins.functionArgs ({a, b ? 1}: 1)"),
Value::AttrSet(AttrSet::new(BTreeMap::from([
("a".into(), Value::Bool(false)),
("b".into(), Value::Bool(true))
])))
);
assert_eq!(
eval("builtins.functionArgs ({a, b ? 1, ...}: 1)"),
Value::AttrSet(AttrSet::new(BTreeMap::from([
("a".into(), Value::Bool(false)),
("b".into(), Value::Bool(true))
])))
);
}
#[test_log::test]
fn builtins_parse_drv_name() {
let result = eval(r#"builtins.parseDrvName "nix-js-0.1.0pre""#).unwrap_attr_set();
assert_eq!(result.get("name"), Some(&Value::String("nix-js".into())));
assert_eq!(
result.get("version"),
Some(&Value::String("0.1.0pre".into()))
);
}
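The splitting rule checked by the test above can be sketched on its own (an illustration of the documented Nix rule, not this crate's code): the version starts at the first `-` that is followed by a non-letter, so hyphens inside the package name survive.

```rust
// Split a derivation name like "nix-js-0.1.0pre" into (name, version).
fn parse_drv_name(s: &str) -> (String, String) {
    let bytes = s.as_bytes();
    for (i, &b) in bytes.iter().enumerate() {
        // A '-' followed by a non-letter marks the start of the version.
        if b == b'-' && bytes.get(i + 1).map_or(false, |c| !c.is_ascii_alphabetic()) {
            return (s[..i].to_string(), s[i + 1..].to_string());
        }
    }
    // No version part at all.
    (s.to_string(), String::new())
}

fn main() {
    assert_eq!(
        parse_drv_name("nix-js-0.1.0pre"),
        ("nix-js".to_string(), "0.1.0pre".to_string())
    );
    println!("ok");
}
```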


@@ -0,0 +1,193 @@
use fix::value::Value;
use crate::utils::eval_result;
#[test_log::test]
fn to_file_simple() {
let result =
eval_result(r#"builtins.toFile "hello.txt" "Hello, World!""#).expect("Failed to evaluate");
match result {
Value::String(path) => {
assert!(path.contains("-hello.txt"));
assert!(std::path::Path::new(&path).exists());
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert_eq!(contents, "Hello, World!");
}
_ => panic!("Expected string, got {:?}", result),
}
}
#[test_log::test]
fn to_file_with_references() {
let result = eval_result(
r#"
let
dep = builtins.toFile "dep.txt" "dependency";
in
builtins.toFile "main.txt" "Reference: ${dep}"
"#,
)
.expect("Failed to evaluate");
match result {
Value::String(path) => {
assert!(path.contains("-main.txt"));
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert!(contents.contains("Reference: "));
assert!(contents.contains("-dep.txt"));
}
_ => panic!("Expected string"),
}
}
#[test_log::test]
fn to_file_invalid_name_with_slash() {
let result = eval_result(r#"builtins.toFile "foo/bar.txt" "content""#);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.to_string()
.contains("name cannot contain '/'")
);
}
#[test_log::test]
fn to_file_invalid_name_dot() {
let result = eval_result(r#"builtins.toFile "." "content""#);
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("invalid name"));
}
#[test_log::test]
fn to_file_invalid_name_dotdot() {
let result = eval_result(r#"builtins.toFile ".." "content""#);
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("invalid name"));
}
#[test_log::test]
fn store_path_validation_not_in_store() {
let result = eval_result(r#"builtins.storePath "/tmp/foo""#);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.to_string()
.contains("not in the Nix store")
);
}
#[test_log::test]
fn store_path_validation_malformed_hash() {
let dummy_file_result = eval_result(r#"builtins.toFile "dummy.txt" "content""#)
.expect("Failed to create dummy file");
let dummy_path = match dummy_file_result {
Value::String(ref p) => p.clone(),
_ => panic!("Expected string"),
};
let store_dir = std::path::Path::new(&dummy_path)
.parent()
.expect("Failed to get parent dir")
.to_str()
.expect("Failed to convert to string");
let test_path = format!("{}/invalid-hash-hello", store_dir);
let result = eval_result(&format!(r#"builtins.storePath "{}""#, test_path));
assert!(result.is_err());
let err_str = result.unwrap_err().to_string();
assert!(
err_str.contains("invalid") || err_str.contains("hash"),
"Expected hash validation error, got: {}",
err_str
);
}
#[test_log::test]
fn store_path_validation_missing_name() {
let dummy_file_result = eval_result(r#"builtins.toFile "dummy.txt" "content""#)
.expect("Failed to create dummy file");
let dummy_path = match dummy_file_result {
Value::String(ref p) => p.clone(),
_ => panic!("Expected string"),
};
let store_dir = std::path::Path::new(&dummy_path)
.parent()
.expect("Failed to get parent dir")
.to_str()
.expect("Failed to convert to string");
let test_path = format!("{}/abcd1234abcd1234abcd1234abcd1234", store_dir);
let result = eval_result(&format!(r#"builtins.storePath "{}""#, test_path));
assert!(result.is_err());
let err_str = result.unwrap_err().to_string();
assert!(
err_str.contains("missing name") || err_str.contains("format"),
"Expected missing name error, got: {}",
err_str
);
}
#[test_log::test]
fn to_file_curried_application() {
let result = eval_result(
r#"
let
makeFile = builtins.toFile "test.txt";
in
makeFile "test content"
"#,
)
.expect("Failed to evaluate");
match result {
Value::String(path) => {
assert!(path.contains("-test.txt"));
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert_eq!(contents, "test content");
}
_ => panic!("Expected string"),
}
}
#[test_log::test]
fn to_file_number_conversion() {
let result = eval_result(r#"builtins.toFile "number.txt" (builtins.toString 42)"#)
.expect("Failed to evaluate");
match result {
Value::String(path) => {
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert_eq!(contents, "42");
}
_ => panic!("Expected string"),
}
}
#[test_log::test]
fn to_file_list_conversion() {
let result = eval_result(
r#"builtins.toFile "list.txt" (builtins.concatStringsSep "\n" ["line1" "line2" "line3"])"#,
)
.expect("Failed to evaluate");
match result {
Value::String(path) => {
let contents = std::fs::read_to_string(&path).expect("Failed to read file");
assert_eq!(contents, "line1\nline2\nline3");
}
_ => panic!("Expected string"),
}
}


@@ -0,0 +1,687 @@
use fix::value::Value;
use crate::utils::{eval_deep, eval_deep_result};
#[test_log::test]
fn add_operator_preserves_derivation_context() {
let result = eval_deep(
r#"
let
dep = derivation { name = "dep"; builder = "/bin/sh"; system = "x86_64-linux"; outputs = ["out" "dev"]; };
getOutput = output: pkg: pkg.${output} or pkg.out or pkg;
user = derivation {
name = "user";
builder = "/bin/sh";
system = "x86_64-linux";
libPath = (getOutput "lib" dep) + "/lib";
devPath = dep.dev + "/include";
};
in user.drvPath
"#,
);
let nix_result = eval_deep(
r#"
let
dep = derivation { name = "dep"; builder = "/bin/sh"; system = "x86_64-linux"; outputs = ["out" "dev"]; };
user = derivation {
name = "user";
builder = "/bin/sh";
system = "x86_64-linux";
libPath = "${dep.out}/lib";
devPath = "${dep.dev}/include";
};
in user.drvPath
"#,
);
assert_eq!(result, nix_result);
}
#[test_log::test]
fn derivation_minimal() {
let result = eval_deep(
r#"derivation { name = "hello"; builder = "/bin/sh"; system = "x86_64-linux"; }"#,
);
match result {
Value::AttrSet(attrs) => {
assert_eq!(attrs.get("type"), Some(&Value::String("derivation".into())));
assert_eq!(attrs.get("name"), Some(&Value::String("hello".into())));
assert_eq!(attrs.get("builder"), Some(&Value::String("/bin/sh".into())));
assert_eq!(
attrs.get("system"),
Some(&Value::String("x86_64-linux".into()))
);
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("drvPath"));
if let Some(Value::String(path)) = attrs.get("outPath") {
assert_eq!(path, "/nix/store/pnwh4xsfs4j508bs9iw6bpkyc4zw6ryx-hello");
} else {
panic!("outPath should be a string");
}
if let Some(Value::String(path)) = attrs.get("drvPath") {
assert_eq!(
path,
"/nix/store/x0sj6ynccvc1a8kxr8fifnlf7qlxw6hd-hello.drv"
);
} else {
panic!("drvPath should be a string");
}
}
_ => panic!("Expected AttrSet, got {:?}", result),
}
}
#[test_log::test]
fn derivation_with_args() {
let result = eval_deep(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
args = ["-c" "echo hello"];
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("args"));
if let Some(Value::List(args)) = attrs.get("args") {
assert_eq!(args.len(), 2);
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn derivation_to_string() {
let result = eval_deep(
r#"toString (derivation { name = "foo"; builder = "/bin/sh"; system = "x86_64-linux"; })"#,
);
match result {
Value::String(s) => assert_eq!(s, "/nix/store/xpcvxsx5sw4rbq666blz6sxqlmsqphmr-foo"),
_ => panic!("Expected String, got {:?}", result),
}
}
#[test_log::test]
fn derivation_missing_name() {
let result =
eval_deep_result(r#"derivation { builder = "/bin/sh"; system = "x86_64-linux"; }"#);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("missing required attribute 'name'"));
}
#[test_log::test]
fn derivation_invalid_name_with_drv_suffix() {
let result = eval_deep_result(
r#"derivation { name = "foo.drv"; builder = "/bin/sh"; system = "x86_64-linux"; }"#,
);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("cannot end with .drv"));
}
#[test_log::test]
fn derivation_missing_builder() {
let result = eval_deep_result(r#"derivation { name = "test"; system = "x86_64-linux"; }"#);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("missing required attribute 'builder'"));
}
#[test_log::test]
fn derivation_missing_system() {
let result = eval_deep_result(r#"derivation { name = "test"; builder = "/bin/sh"; }"#);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("missing required attribute 'system'"));
}
#[test_log::test]
fn derivation_with_env_vars() {
let result = eval_deep(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
MY_VAR = "hello";
ANOTHER = "world";
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert_eq!(attrs.get("MY_VAR"), Some(&Value::String("hello".into())));
assert_eq!(attrs.get("ANOTHER"), Some(&Value::String("world".into())));
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn derivation_strict() {
let result = eval_deep(
r#"builtins.derivationStrict { name = "test"; builder = "/bin/sh"; system = "x86_64-linux"; }"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("out"));
assert!(!attrs.contains_key("type"));
assert!(!attrs.contains_key("outPath"));
}
_ => panic!("Expected AttrSet"),
}
}
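The asserts above rely on the conventional layering between the two primitives: `derivation` is a thin wrapper over `derivationStrict` that merges the caller's attrs with the computed output paths, adding the `type`/`outPath` keys that `derivationStrict` itself omits. A minimal sketch of that merge (an assumption about the usual design, not this crate's code):

```rust
use std::collections::BTreeMap;

// `strict` is what derivationStrict yields, e.g. {"drvPath": ..., "out": ...}.
// The user-facing derivation value is the original attrs plus those paths,
// plus the marker keys that derivationStrict leaves out.
fn wrap_derivation(
    attrs: &BTreeMap<String, String>,
    strict: &BTreeMap<String, String>,
) -> BTreeMap<String, String> {
    let mut result = attrs.clone();
    for (k, v) in strict {
        result.insert(k.clone(), v.clone());
    }
    result.insert("type".into(), "derivation".into());
    if let Some(out) = strict.get("out") {
        // Default output doubles as outPath for toString/backward compat.
        result.insert("outPath".into(), out.clone());
    }
    result
}

fn main() {
    let attrs = BTreeMap::from([("name".to_string(), "test".to_string())]);
    let strict = BTreeMap::from([
        ("drvPath".to_string(), "/nix/store/aaa-test.drv".to_string()),
        ("out".to_string(), "/nix/store/bbb-test".to_string()),
    ]);
    let wrapped = wrap_derivation(&attrs, &strict);
    assert_eq!(wrapped.get("outPath"), strict.get("out"));
    println!("ok");
}
```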
#[test_log::test]
fn derivation_deterministic_paths() {
let expr = r#"derivation { name = "hello"; builder = "/bin/sh"; system = "x86_64-linux"; }"#;
let result1 = eval_deep(expr);
let result2 = eval_deep(expr);
match (result1, result2) {
(Value::AttrSet(attrs1), Value::AttrSet(attrs2)) => {
assert_eq!(attrs1.get("drvPath"), attrs2.get("drvPath"));
assert_eq!(attrs1.get("outPath"), attrs2.get("outPath"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn derivation_escaping_in_aterm() {
let result = eval_deep(
r#"derivation {
name = "test";
builder = "/bin/sh";
system = "x86_64-linux";
args = ["-c" "echo \"hello\nworld\""];
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("outPath"));
}
_ => panic!("Expected AttrSet"),
}
}
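The test above only checks that the derivation is accepted; the interesting part is how quotes and control characters in `args` survive the ATerm serialization. A sketch of the usual escaping rules (assumed from the ATerm string format, not this crate's code):

```rust
// Quote a string for an ATerm .drv file: backslash, double quote, and
// \n/\r/\t get backslash escapes; everything else passes through.
fn aterm_escape(s: &str) -> String {
    let mut out = String::with_capacity(s.len() + 2);
    out.push('"');
    for c in s.chars() {
        match c {
            '"' => out.push_str("\\\""),
            '\\' => out.push_str("\\\\"),
            '\n' => out.push_str("\\n"),
            '\r' => out.push_str("\\r"),
            '\t' => out.push_str("\\t"),
            _ => out.push(c),
        }
    }
    out.push('"');
    out
}

fn main() {
    assert_eq!(aterm_escape("echo \"hello\nworld\""), "\"echo \\\"hello\\nworld\\\"\"");
    println!("ok");
}
```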
#[test_log::test]
fn multi_output_two_outputs() {
let drv = eval_deep(
r#"derivation {
name = "multi";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
);
match drv {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("out"));
assert!(attrs.contains_key("dev"));
assert!(attrs.contains_key("outPath"));
if let Some(Value::String(drv_path)) = attrs.get("drvPath") {
assert_eq!(
drv_path,
"/nix/store/vmyjryfipkn9ss3ya23hk8p3m58l6dsl-multi.drv"
);
} else {
panic!(
"drvPath should be a string, got: {:?}",
attrs.get("drvPath")
);
}
if let Some(Value::String(out_path)) = attrs.get("outPath") {
assert_eq!(
out_path,
"/nix/store/a3d95yg9d215c54n0ybr4npmpnj29229-multi"
);
} else {
panic!("outPath should be a string");
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn multi_output_three_outputs() {
let result = eval_deep(
r#"derivation {
name = "three";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev" "doc"];
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("out"));
assert!(attrs.contains_key("dev"));
assert!(attrs.contains_key("doc"));
// Verify exact paths match CppNix
if let Some(Value::String(drv_path)) = attrs.get("drvPath") {
assert_eq!(
drv_path,
"/nix/store/w08rpwvs5j9yxvdx5f5yg0p5i3ncazdx-three.drv"
);
}
if let Some(Value::String(out_path)) = attrs.get("out") {
assert_eq!(
out_path,
"/nix/store/i479clih5pb6bn2d2b758sbaylvbs2cl-three"
);
}
if let Some(Value::String(dev_path)) = attrs.get("dev") {
assert_eq!(
dev_path,
"/nix/store/gg8v395vci5xg1i9grc8ifh5xagw5f2j-three-dev"
);
}
if let Some(Value::String(doc_path)) = attrs.get("doc") {
assert_eq!(
doc_path,
"/nix/store/p2avgz16qx5k2jgnq3ch04k154xj1ac0-three-doc"
);
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn multi_output_backward_compat() {
let result = eval_deep(
r#"derivation {
name = "compat";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out"];
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("out"));
if let (Some(Value::String(out_path)), Some(Value::String(out))) =
(attrs.get("outPath"), attrs.get("out"))
{
assert_eq!(out_path, out);
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn multi_output_deterministic() {
let result1 = eval_deep(
r#"derivation {
name = "determ";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
);
let result2 = eval_deep(
r#"derivation {
name = "determ";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
}"#,
);
assert_eq!(result1, result2);
}
#[test_log::test]
fn fixed_output_sha256_flat() {
let result = eval_deep(
r#"derivation {
name = "fixed";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
outputHashAlgo = "sha256";
outputHashMode = "flat";
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("drvPath"));
// Verify exact paths match CppNix
if let Some(Value::String(out_path)) = attrs.get("outPath") {
assert_eq!(
out_path,
"/nix/store/ap9h69qwrm5060ldi96axyklh3pr3yjn-fixed"
);
}
if let Some(Value::String(drv_path)) = attrs.get("drvPath") {
assert_eq!(
drv_path,
"/nix/store/kj9gsfz5cngc38n1xlf6ljlgvnsfg0cj-fixed.drv"
);
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn fixed_output_missing_hashalgo() {
assert!(
eval_deep_result(
r#"derivation {
name = "default";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
}"#,
)
.is_err()
);
}
#[test_log::test]
fn fixed_output_recursive_mode() {
let result = eval_deep(
r#"derivation {
name = "recursive";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "1111111111111111111111111111111111111111111111111111111111111111";
outputHashAlgo = "sha256";
outputHashMode = "recursive";
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("drvPath"));
// Verify exact path matches CppNix
if let Some(Value::String(out_path)) = attrs.get("outPath") {
assert_eq!(
out_path,
"/nix/store/qyal5s16hfwxhz5zwpf8h8yv2bs84z56-recursive"
);
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn fixed_output_rejects_multi_output() {
let result = eval_deep_result(
r#"derivation {
name = "invalid";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
outputHashAlgo = "sha256";
outputs = ["out" "dev"];
}"#,
);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("fixed-output") && err_msg.contains("one"));
}
#[test_log::test]
fn fixed_output_invalid_hash_mode() {
let result = eval_deep_result(
r#"derivation {
name = "invalid";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
outputHashMode = "invalid";
}"#,
);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("outputHashMode") && err_msg.contains("invalid"));
}
#[test_log::test]
fn structured_attrs_basic() {
let result = eval_deep(
r#"derivation {
name = "struct";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
foo = "bar";
count = 42;
enabled = true;
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("foo"));
assert!(attrs.contains_key("count"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn structured_attrs_nested() {
let result = eval_deep(
r#"derivation {
name = "nested";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
data = { x = 1; y = [2 3]; };
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("data"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn structured_attrs_rejects_functions() {
let result = eval_deep_result(
r#"derivation {
name = "invalid";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
func = x: x + 1;
}"#,
);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("cannot convert lambda to JSON"));
}
#[test_log::test]
fn structured_attrs_false() {
let result = eval_deep(
r#"derivation {
name = "normal";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = false;
foo = "bar";
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("foo"));
if let Some(Value::String(val)) = attrs.get("foo") {
assert_eq!(val, "bar");
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn ignore_nulls_true() {
let result = eval_deep(
r#"derivation {
name = "ignore";
builder = "/bin/sh";
system = "x86_64-linux";
__ignoreNulls = true;
foo = "bar";
nullValue = null;
}"#,
);
match result {
Value::AttrSet(attrs) => {
// __ignoreNulls only filters the attrs passed to the builder; the
// attrset returned by `derivation` still carries the original nullValue.
assert!(attrs.contains_key("foo"));
assert!(attrs.contains_key("nullValue"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn ignore_nulls_false() {
let result = eval_deep(
r#"derivation {
name = "keep";
builder = "/bin/sh";
system = "x86_64-linux";
__ignoreNulls = false;
nullValue = null;
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("nullValue"));
if let Some(Value::String(val)) = attrs.get("nullValue") {
assert_eq!(val, "");
}
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn ignore_nulls_with_structured_attrs() {
let result = eval_deep(
r#"derivation {
name = "combined";
builder = "/bin/sh";
system = "x86_64-linux";
__structuredAttrs = true;
__ignoreNulls = true;
foo = "bar";
nullValue = null;
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("foo"));
assert!(attrs.contains_key("nullValue"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn all_features_combined() {
let result = eval_deep(
r#"derivation {
name = "all";
builder = "/bin/sh";
system = "x86_64-linux";
outputs = ["out" "dev"];
__structuredAttrs = true;
__ignoreNulls = true;
data = { x = 1; };
nullValue = null;
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("out"));
assert!(attrs.contains_key("dev"));
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("data"));
assert!(attrs.contains_key("nullValue"));
}
_ => panic!("Expected AttrSet"),
}
}
#[test_log::test]
fn fixed_output_with_structured_attrs() {
let result = eval_deep(
r#"derivation {
name = "fixstruct";
builder = "/bin/sh";
system = "x86_64-linux";
outputHash = "0000000000000000000000000000000000000000000000000000000000000000";
outputHashAlgo = "sha256";
__structuredAttrs = true;
data = { key = "value"; };
}"#,
);
match result {
Value::AttrSet(attrs) => {
assert!(attrs.contains_key("outPath"));
assert!(attrs.contains_key("drvPath"));
assert!(attrs.contains_key("data"));
}
_ => panic!("Expected AttrSet"),
}
}


@@ -0,0 +1,36 @@
use crate::utils::eval;
#[test_log::test]
fn test_find_file_corepkg_fetchurl() {
let result = eval(
r#"
let
searchPath = [];
lookupPath = "nix/fetchurl.nix";
in
builtins.findFile searchPath lookupPath
"#,
);
assert!(result.to_string().contains("fetchurl.nix"));
}
#[test_log::test]
fn test_lookup_path_syntax() {
let result = eval(r#"<nix/fetchurl.nix>"#);
assert!(result.to_string().contains("fetchurl.nix"));
}
#[test_log::test]
fn test_import_corepkg() {
let result = eval(
r#"
let
fetchurl = import <nix/fetchurl.nix>;
in
builtins.typeOf fetchurl
"#,
);
assert_eq!(result.to_string(), "\"lambda\"");
}


@@ -0,0 +1,74 @@
use fix::value::{List, Value};
use crate::utils::{eval, eval_result};
#[test_log::test]
fn true_literal() {
assert_eq!(eval("true"), Value::Bool(true));
}
#[test_log::test]
fn false_literal() {
assert_eq!(eval("false"), Value::Bool(false));
}
#[test_log::test]
fn null_literal() {
assert_eq!(eval("null"), Value::Null);
}
#[test_log::test]
fn map_function() {
assert_eq!(
eval("map (x: x * 2) [1 2 3]"),
Value::List(List::new(vec![Value::Int(2), Value::Int(4), Value::Int(6)]))
);
}
#[test_log::test]
fn is_null_function() {
assert_eq!(eval("isNull null"), Value::Bool(true));
assert_eq!(eval("isNull 5"), Value::Bool(false));
}
#[test_log::test]
fn shadow_true() {
assert_eq!(eval("let true = false; in true"), Value::Bool(false));
}
#[test_log::test]
fn shadow_map() {
assert_eq!(eval("let map = x: y: x; in map 1 2"), Value::Int(1));
}
#[test_log::test]
fn mixed_usage() {
assert_eq!(
eval("if true then map (x: x + 1) [1 2] else []"),
Value::List(List::new(vec![Value::Int(2), Value::Int(3)]))
);
}
#[test_log::test]
fn in_let_bindings() {
assert_eq!(
eval("let x = true; y = false; in x && y"),
Value::Bool(false)
);
}
#[test_log::test]
fn shadow_in_function() {
assert_eq!(eval("(true: true) false"), Value::Bool(false));
}
#[test_log::test]
fn throw_function() {
let result = eval_result("throw \"error message\"");
assert!(result.is_err());
}
#[test_log::test]
fn to_string_function() {
assert_eq!(eval("toString 42"), Value::String("42".to_string()));
}


@@ -0,0 +1,120 @@
use fix::value::Value;
use crate::utils::{eval, eval_result};
#[test_log::test]
fn required_parameters() {
assert_eq!(eval("({ a, b }: a + b) { a = 1; b = 2; }"), Value::Int(3));
}
#[test_log::test]
fn missing_required_parameter() {
let result = eval_result("({ a, b }: a + b) { a = 1; }");
assert!(result.is_err());
}
#[test_log::test]
fn all_required_parameters_present() {
assert_eq!(
eval("({ x, y, z }: x + y + z) { x = 1; y = 2; z = 3; }"),
Value::Int(6)
);
}
#[test_log::test]
fn reject_unexpected_arguments() {
let result = eval_result("({ a, b }: a + b) { a = 1; b = 2; c = 3; }");
assert!(result.is_err());
}
#[test_log::test]
fn ellipsis_accepts_extra_arguments() {
assert_eq!(
eval("({ a, b, ... }: a + b) { a = 1; b = 2; c = 3; }"),
Value::Int(3)
);
}
#[test_log::test]
fn default_parameters() {
assert_eq!(eval("({ a, b ? 5 }: a + b) { a = 1; }"), Value::Int(6));
}
#[test_log::test]
fn override_default_parameter() {
assert_eq!(
eval("({ a, b ? 5 }: a + b) { a = 1; b = 10; }"),
Value::Int(11)
);
}
#[test_log::test]
fn at_pattern_alias() {
assert_eq!(
eval("(args@{ a, b }: args.a + args.b) { a = 1; b = 2; }"),
Value::Int(3)
);
}
#[test_log::test]
fn simple_parameter_no_validation() {
assert_eq!(eval("(x: x.a + x.b) { a = 1; b = 2; }"), Value::Int(3));
}
#[test_log::test]
fn simple_parameter_accepts_any_argument() {
assert_eq!(eval("(x: x) 42"), Value::Int(42));
}
#[test_log::test]
fn nested_function_parameters() {
assert_eq!(
eval("({ a }: { b }: a + b) { a = 5; } { b = 3; }"),
Value::Int(8)
);
}
#[test_log::test]
fn pattern_param_simple_reference_in_default() {
assert_eq!(eval("({ a, b ? a }: b) { a = 10; }"), Value::Int(10));
}
#[test_log::test]
fn pattern_param_multiple_references_in_default() {
assert_eq!(
eval("({ a, b ? a + 5, c ? 1 }: b + c) { a = 10; }"),
Value::Int(16)
);
}
#[test_log::test]
fn pattern_param_mutual_reference() {
assert_eq!(
eval("({ a, b ? c + 1, c ? 5 }: b) { a = 1; }"),
Value::Int(6)
);
}
#[test_log::test]
fn pattern_param_override_mutual_reference() {
assert_eq!(
eval("({ a, b ? c + 1, c ? 5 }: b) { a = 1; c = 10; }"),
Value::Int(11)
);
}
#[test_log::test]
fn pattern_param_reference_list() {
assert_eq!(
eval("({ a, b ? [ a 2 ] }: builtins.elemAt b 0) { a = 42; }"),
Value::Int(42)
);
}
#[test_log::test]
fn pattern_param_alias_in_default() {
assert_eq!(
eval("(args@{ a, b ? args.a + 10 }: b) { a = 5; }"),
Value::Int(15)
);
}


@@ -0,0 +1,368 @@
use fix::error::Source;
use fix::runtime::Runtime;
use fix::value::Value;
use crate::utils::{eval, eval_result};
#[test_log::test]
fn import_absolute_path() {
let temp_dir = tempfile::tempdir().unwrap();
let lib_path = temp_dir.path().join("nix_test_lib.nix");
std::fs::write(&lib_path, "{ add = a: b: a + b; }").unwrap();
let expr = format!(r#"(import "{}").add 3 5"#, lib_path.display());
assert_eq!(eval(&expr), Value::Int(8));
}
#[test_log::test]
fn import_nested() {
let temp_dir = tempfile::tempdir().unwrap();
let lib_path = temp_dir.path().join("lib.nix");
std::fs::write(&lib_path, "{ add = a: b: a + b; }").unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = format!(
r#"let lib = import {}; in {{ result = lib.add 10 20; }}"#,
lib_path.display()
);
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"(import "{}").result"#, main_path.display());
assert_eq!(eval(&expr), Value::Int(30));
}
#[test_log::test]
fn import_relative_path() {
let temp_dir = tempfile::tempdir().unwrap();
let subdir = temp_dir.path().join("subdir");
std::fs::create_dir_all(&subdir).unwrap();
let lib_path = temp_dir.path().join("lib.nix");
std::fs::write(&lib_path, "{ multiply = a: b: a * b; }").unwrap();
let helper_path = subdir.join("helper.nix");
std::fs::write(&helper_path, "{ subtract = a: b: a - b; }").unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = r#"
let
lib = import ./lib.nix;
helper = import ./subdir/helper.nix;
in {
result1 = lib.multiply 3 4;
result2 = helper.subtract 10 3;
}
"#;
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"let x = import "{}"; in x.result1"#, main_path.display());
assert_eq!(eval(&expr), Value::Int(12));
let expr = format!(r#"let x = import "{}"; in x.result2"#, main_path.display());
assert_eq!(eval(&expr), Value::Int(7));
}
#[test_log::test]
fn import_returns_function() {
let temp_dir = tempfile::tempdir().unwrap();
let func_path = temp_dir.path().join("nix_test_func.nix");
std::fs::write(&func_path, "x: x * 2").unwrap();
let expr = format!(r#"(import "{}") 5"#, func_path.display());
assert_eq!(eval(&expr), Value::Int(10));
}
#[test_log::test]
fn import_with_complex_dependency_graph() {
let temp_dir = tempfile::tempdir().unwrap();
let utils_path = temp_dir.path().join("utils.nix");
std::fs::write(&utils_path, "{ double = x: x * 2; }").unwrap();
let math_path = temp_dir.path().join("math.nix");
let math_content = r#"let utils = import ./utils.nix; in { triple = x: x + utils.double x; }"#;
std::fs::write(&math_path, math_content).unwrap();
let main_path = temp_dir.path().join("main.nix");
let main_content = r#"let math = import ./math.nix; in math.triple 5"#;
std::fs::write(&main_path, main_content).unwrap();
let expr = format!(r#"import "{}""#, main_path.display());
assert_eq!(eval(&expr), Value::Int(15));
}
// Tests for builtins.path
#[test_log::test]
fn path_with_file() {
let mut ctx = Runtime::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("test.txt");
std::fs::write(&test_file, "Hello, World!").unwrap();
let expr = format!(r#"builtins.path {{ path = {}; }}"#, test_file.display());
let result = ctx.eval(Source::new_eval(expr).unwrap()).unwrap();
// Should return a store path string
if let Value::String(store_path) = result {
assert!(store_path.starts_with("/nix/store"));
assert!(store_path.contains("test.txt"));
} else {
panic!("Expected string, got {:?}", result);
}
}
#[test_log::test]
fn path_with_custom_name() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("original.txt");
std::fs::write(&test_file, "Content").unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; name = "custom-name"; }}"#,
test_file.display()
);
let result = eval(&expr);
if let Value::String(store_path) = result {
assert!(store_path.contains("custom-name"));
assert!(!store_path.contains("original.txt"));
} else {
panic!("Expected string, got {:?}", result);
}
}
#[test_log::test]
fn path_with_directory_recursive() {
let mut ctx = Runtime::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("mydir");
std::fs::create_dir_all(&test_dir).unwrap();
std::fs::write(test_dir.join("file1.txt"), "Content 1").unwrap();
std::fs::write(test_dir.join("file2.txt"), "Content 2").unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; recursive = true; }}"#,
test_dir.display()
);
let result = ctx.eval(Source::new_eval(expr).unwrap()).unwrap();
if let Value::String(store_path) = result {
assert!(store_path.starts_with("/nix/store"));
assert!(store_path.contains("mydir"));
} else {
panic!("Expected string, got {:?}", result);
}
}
#[test_log::test]
fn path_flat_with_file() {
let mut ctx = Runtime::new().unwrap();
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("flat.txt");
std::fs::write(&test_file, "Flat content").unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; recursive = false; }}"#,
test_file.display()
);
let result = ctx.eval(Source::new_eval(expr).unwrap()).unwrap();
if let Value::String(store_path) = result {
assert!(store_path.starts_with("/nix/store"));
} else {
panic!("Expected string, got {:?}", result);
}
}
#[test_log::test]
fn path_flat_with_directory_fails() {
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("mydir");
std::fs::create_dir_all(&test_dir).unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; recursive = false; }}"#,
test_dir.display()
);
let result = eval_result(&expr);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("recursive") || err_msg.contains("regular file"));
}
#[test_log::test]
fn path_nonexistent_fails() {
let expr = r#"builtins.path { path = "/nonexistent/path/that/should/not/exist"; }"#;
let result = eval_result(expr);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("does not exist"));
}
#[test_log::test]
fn path_missing_path_param() {
let expr = r#"builtins.path { name = "test"; }"#;
let result = eval_result(expr);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("path") && err_msg.contains("required"));
}
#[test_log::test]
fn path_with_sha256() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("hash_test.txt");
std::fs::write(&test_file, "Test content for hashing").unwrap();
// First, evaluate without sha256 to obtain the store path
let expr1 = format!(r#"builtins.path {{ path = {}; }}"#, test_file.display());
let result1 = eval(&expr1);
let store_path1 = match result1 {
Value::String(s) => s,
_ => panic!("Expected string"),
};
// Rather than hard-coding the expected hash, evaluate the same path again
// and verify the store path is unchanged. In real usage the user would
// supply the sha256 up front.
let expr2 = format!(r#"builtins.path {{ path = {}; }}"#, test_file.display());
let result2 = eval(&expr2);
let store_path2 = match result2 {
Value::String(s) => s,
_ => panic!("Expected string"),
};
// Same input should produce same output
assert_eq!(store_path1, store_path2);
}
#[test_log::test]
fn path_deterministic() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("deterministic.txt");
std::fs::write(&test_file, "Same content").unwrap();
let expr = format!(
r#"builtins.path {{ path = {}; name = "myfile"; }}"#,
test_file.display()
);
let result1 = eval(&expr);
let result2 = eval(&expr);
// Same inputs should produce same store path
assert_eq!(result1, result2);
}
#[test_log::test]
fn read_file_type_regular_file() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("test.txt");
std::fs::write(&test_file, "Test content").unwrap();
let expr = format!(r#"builtins.readFileType {}"#, test_file.display());
assert_eq!(eval(&expr), Value::String("regular".to_string()));
}
#[test_log::test]
fn read_file_type_directory() {
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("testdir");
std::fs::create_dir(&test_dir).unwrap();
let expr = format!(r#"builtins.readFileType {}"#, test_dir.display());
assert_eq!(eval(&expr), Value::String("directory".to_string()));
}
#[test_log::test]
fn read_file_type_symlink() {
let temp_dir = tempfile::tempdir().unwrap();
let target = temp_dir.path().join("target.txt");
let symlink = temp_dir.path().join("link.txt");
std::fs::write(&target, "Target content").unwrap();
#[cfg(unix)]
{
std::os::unix::fs::symlink(&target, &symlink).unwrap();
let expr = format!(r#"builtins.readFileType {}"#, symlink.display());
assert_eq!(eval(&expr), Value::String("symlink".to_string()));
}
}
#[test_log::test]
fn read_dir_basic() {
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("readdir_test");
std::fs::create_dir(&test_dir).unwrap();
std::fs::write(test_dir.join("file1.txt"), "Content 1").unwrap();
std::fs::write(test_dir.join("file2.txt"), "Content 2").unwrap();
std::fs::create_dir(test_dir.join("subdir")).unwrap();
let expr = format!(r#"builtins.readDir {}"#, test_dir.display());
let result = eval(&expr);
if let Value::AttrSet(attrs) = result {
assert_eq!(
attrs.get("file1.txt"),
Some(&Value::String("regular".to_string()))
);
assert_eq!(
attrs.get("file2.txt"),
Some(&Value::String("regular".to_string()))
);
assert_eq!(
attrs.get("subdir"),
Some(&Value::String("directory".to_string()))
);
assert_eq!(attrs.len(), 3);
} else {
panic!("Expected AttrSet, got {:?}", result);
}
}
#[test_log::test]
fn read_dir_empty() {
let temp_dir = tempfile::tempdir().unwrap();
let test_dir = temp_dir.path().join("empty_dir");
std::fs::create_dir(&test_dir).unwrap();
let expr = format!(r#"builtins.readDir {}"#, test_dir.display());
let result = eval(&expr);
if let Value::AttrSet(attrs) = result {
assert_eq!(attrs.len(), 0);
} else {
panic!("Expected AttrSet, got {:?}", result);
}
}
#[test_log::test]
fn read_dir_nonexistent_fails() {
let expr = r#"builtins.readDir "/nonexistent/directory""#;
let result = eval_result(expr);
assert!(result.is_err());
}
#[test_log::test]
fn read_dir_on_file_fails() {
let temp_dir = tempfile::tempdir().unwrap();
let test_file = temp_dir.path().join("test.txt");
std::fs::write(&test_file, "Test content").unwrap();
let expr = format!(r#"builtins.readDir {}"#, test_file.display());
let result = eval_result(&expr);
assert!(result.is_err());
let err_msg = result.unwrap_err().to_string();
assert!(err_msg.contains("not a directory"));
}

fix/tests/tests/lang.rs

@@ -0,0 +1,319 @@
#![allow(non_snake_case)]
use std::path::PathBuf;
use fix::error::Source;
use fix::runtime::Runtime;
use fix::value::Value;
fn get_lang_dir() -> PathBuf {
PathBuf::from(env!("CARGO_MANIFEST_DIR")).join("tests/tests/lang")
}
fn eval_file(name: &str) -> Result<(Value, Source), String> {
let lang_dir = get_lang_dir();
let nix_path = lang_dir.join(format!("{name}.nix"));
let expr = format!(r#"import "{}""#, nix_path.display());
let mut ctx = Runtime::new().map_err(|e| e.to_string())?;
let source = Source {
ty: fix::error::SourceType::File(nix_path.into()),
src: expr.into(),
};
ctx.eval_deep(source.clone())
.map(|val| (val, source))
.map_err(|e| e.to_string())
}
fn read_expected(name: &str) -> String {
let lang_dir = get_lang_dir();
let exp_path = lang_dir.join(format!("{name}.exp"));
std::fs::read_to_string(exp_path)
.expect("expected file should exist")
.trim_end()
.to_string()
}
fn format_value(value: &Value) -> String {
value.display_compat().to_string()
}
macro_rules! eval_okay_test {
($(#[$attr:meta])* $name:ident$(, $pre:expr)?) => {
$(#[$attr])*
#[test_log::test]
fn $name() {
$(($pre)();)?
let test_name = concat!("eval-okay-", stringify!($name))
.replace("_", "-")
.replace("r#", "");
let result = eval_file(&test_name);
match result {
Ok((value, source)) => {
let actual = format_value(&value);
let actual = actual.replace(
source
.get_dir()
.parent()
.unwrap()
.to_string_lossy()
.as_ref(),
"/pwd",
);
let expected = read_expected(&test_name);
assert_eq!(actual, expected, "Output mismatch for {}", test_name);
}
Err(e) => {
panic!("Test {} failed to evaluate: {}", test_name, e);
}
}
}
};
}
macro_rules! eval_fail_test {
($name:ident) => {
#[test_log::test]
fn $name() {
let test_name = concat!("eval-fail-", stringify!($name))
.replace("_", "-")
.replace("r#", "");
let result = eval_file(&test_name);
assert!(
result.is_err(),
"Test {} should have failed but succeeded with: {:?}",
test_name,
result
);
}
};
}
eval_okay_test!(any_all);
eval_okay_test!(arithmetic);
eval_okay_test!(attrnames);
eval_okay_test!(attrs);
eval_okay_test!(attrs2);
eval_okay_test!(attrs3);
eval_okay_test!(attrs4);
eval_okay_test!(attrs5);
eval_okay_test!(
#[ignore = "__overrides is not supported"]
attrs6
);
eval_okay_test!(
#[ignore = "requires --arg/--argstr CLI flags"]
autoargs
);
eval_okay_test!(backslash_newline_1);
eval_okay_test!(backslash_newline_2);
eval_okay_test!(baseNameOf);
eval_okay_test!(builtins);
eval_okay_test!(builtins_add);
eval_okay_test!(callable_attrs);
eval_okay_test!(catattrs);
eval_okay_test!(closure);
eval_okay_test!(comments);
eval_okay_test!(concat);
eval_okay_test!(concatmap);
eval_okay_test!(concatstringssep);
eval_okay_test!(context);
eval_okay_test!(context_introspection);
eval_okay_test!(convertHash);
eval_okay_test!(curpos);
eval_okay_test!(deepseq);
eval_okay_test!(delayed_with);
eval_okay_test!(delayed_with_inherit);
eval_okay_test!(deprecate_cursed_or);
eval_okay_test!(derivation_legacy);
eval_okay_test!(dynamic_attrs);
eval_okay_test!(dynamic_attrs_2);
eval_okay_test!(dynamic_attrs_bare);
eval_okay_test!(elem);
eval_okay_test!(empty_args);
eval_okay_test!(eq);
eval_okay_test!(eq_derivations);
eval_okay_test!(filter);
eval_okay_test!(
#[ignore = "not implemented: flakeRefToString"]
flake_ref_to_string
);
eval_okay_test!(flatten);
eval_okay_test!(float);
eval_okay_test!(floor_ceil);
eval_okay_test!(foldlStrict);
eval_okay_test!(foldlStrict_lazy_elements);
eval_okay_test!(foldlStrict_lazy_initial_accumulator);
eval_okay_test!(fromjson);
eval_okay_test!(fromjson_escapes);
eval_okay_test!(fromTOML);
eval_okay_test!(
#[ignore = "timestamps are not supported"]
fromTOML_timestamps
);
eval_okay_test!(functionargs);
eval_okay_test!(hashfile);
eval_okay_test!(hashstring);
eval_okay_test!(getattrpos);
eval_okay_test!(getattrpos_functionargs);
eval_okay_test!(getattrpos_undefined);
eval_okay_test!(getenv, || {
unsafe { std::env::set_var("TEST_VAR", "foo") };
});
eval_okay_test!(groupBy);
eval_okay_test!(r#if);
eval_okay_test!(ind_string);
eval_okay_test!(import);
eval_okay_test!(inherit_attr_pos);
eval_okay_test!(
#[ignore = "__overrides is not supported"]
inherit_from
);
eval_okay_test!(intersectAttrs);
eval_okay_test!(r#let);
eval_okay_test!(list);
eval_okay_test!(listtoattrs);
eval_okay_test!(logic);
eval_okay_test!(map);
eval_okay_test!(mapattrs);
eval_okay_test!(merge_dynamic_attrs);
eval_okay_test!(nested_with);
eval_okay_test!(new_let);
eval_okay_test!(null_dynamic_attrs);
eval_okay_test!(
#[ignore = "__overrides is not supported"]
overrides
);
eval_okay_test!(
#[ignore = "not implemented: parseFlakeRef"]
parse_flake_ref
);
eval_okay_test!(partition);
eval_okay_test!(path);
eval_okay_test!(pathexists);
eval_okay_test!(path_string_interpolation, || {
unsafe {
std::env::set_var("HOME", "/fake-home");
}
});
eval_okay_test!(patterns);
eval_okay_test!(print);
eval_okay_test!(readDir);
eval_okay_test!(readfile);
eval_okay_test!(readFileType);
eval_okay_test!(redefine_builtin);
eval_okay_test!(regex_match);
eval_okay_test!(regex_split);
eval_okay_test!(regression_20220122);
eval_okay_test!(regression_20220125);
eval_okay_test!(regrettable_rec_attrset_merge);
eval_okay_test!(remove);
eval_okay_test!(repeated_empty_attrs);
eval_okay_test!(repeated_empty_list);
eval_okay_test!(replacestrings);
eval_okay_test!(
#[ignore = "requires -I CLI flags"]
search_path
);
eval_okay_test!(scope_1);
eval_okay_test!(scope_2);
eval_okay_test!(scope_3);
eval_okay_test!(scope_4);
eval_okay_test!(scope_6);
eval_okay_test!(scope_7);
eval_okay_test!(seq);
eval_okay_test!(sort);
eval_okay_test!(splitversion);
eval_okay_test!(string);
eval_okay_test!(strings_as_attrs_names);
eval_okay_test!(substring);
eval_okay_test!(substring_context);
eval_okay_test!(symlink_resolution);
eval_okay_test!(
#[ignore = "TCO not implemented, also disabled in CppNix"]
tail_call_1
);
eval_okay_test!(tojson);
eval_okay_test!(toxml);
eval_okay_test!(toxml2);
eval_okay_test!(tryeval);
eval_okay_test!(types);
eval_okay_test!(versions);
eval_okay_test!(with);
eval_okay_test!(zipAttrsWith);
eval_fail_test!(fail_abort);
eval_fail_test!(fail_addDrvOutputDependencies_empty_context);
eval_fail_test!(fail_addDrvOutputDependencies_multi_elem_context);
eval_fail_test!(fail_addDrvOutputDependencies_wrong_element_kind);
eval_fail_test!(fail_addErrorRuntime_example);
eval_fail_test!(fail_assert);
eval_fail_test!(fail_assert_equal_attrs_names);
eval_fail_test!(fail_assert_equal_attrs_names_2);
eval_fail_test!(fail_assert_equal_derivations);
eval_fail_test!(fail_assert_equal_derivations_extra);
eval_fail_test!(fail_assert_equal_floats);
eval_fail_test!(fail_assert_equal_function_direct);
eval_fail_test!(fail_assert_equal_int_float);
eval_fail_test!(fail_assert_equal_ints);
eval_fail_test!(fail_assert_equal_list_length);
eval_fail_test!(fail_assert_equal_paths);
eval_fail_test!(fail_assert_equal_type);
eval_fail_test!(fail_assert_equal_type_nested);
eval_fail_test!(fail_assert_nested_bool);
eval_fail_test!(fail_attr_name_type);
eval_fail_test!(fail_attrset_merge_drops_later_rec);
eval_fail_test!(fail_bad_string_interpolation_1);
eval_fail_test!(fail_bad_string_interpolation_2);
eval_fail_test!(fail_bad_string_interpolation_3);
eval_fail_test!(fail_bad_string_interpolation_4);
eval_fail_test!(fail_blackhole);
eval_fail_test!(fail_call_primop);
eval_fail_test!(fail_deepseq);
eval_fail_test!(fail_derivation_name);
eval_fail_test!(fail_dup_dynamic_attrs);
eval_fail_test!(fail_duplicate_traces);
eval_fail_test!(fail_eol_1);
eval_fail_test!(fail_eol_2);
eval_fail_test!(fail_eol_3);
eval_fail_test!(fail_fetchTree_negative);
eval_fail_test!(fail_fetchurl_baseName);
eval_fail_test!(fail_fetchurl_baseName_attrs);
eval_fail_test!(fail_fetchurl_baseName_attrs_name);
eval_fail_test!(fail_flake_ref_to_string_negative_integer);
eval_fail_test!(fail_foldlStrict_strict_op_application);
eval_fail_test!(fail_fromJSON_keyWithNullByte);
eval_fail_test!(fail_fromJSON_overflowing);
eval_fail_test!(fail_fromJSON_valueWithNullByte);
eval_fail_test!(fail_fromTOML_keyWithNullByte);
eval_fail_test!(fail_fromTOML_timestamps);
eval_fail_test!(fail_fromTOML_valueWithNullByte);
eval_fail_test!(fail_hashfile_missing);
eval_fail_test!(fail_infinite_recursion_lambda);
eval_fail_test!(fail_list);
eval_fail_test!(fail_missing_arg);
eval_fail_test!(fail_mutual_recursion);
eval_fail_test!(fail_nested_list_items);
eval_fail_test!(fail_nonexist_path);
eval_fail_test!(fail_not_throws);
eval_fail_test!(fail_overflowing_add);
eval_fail_test!(fail_overflowing_div);
eval_fail_test!(fail_overflowing_mul);
eval_fail_test!(fail_overflowing_sub);
eval_fail_test!(fail_path_slash);
eval_fail_test!(fail_pipe_operators);
eval_fail_test!(fail_recursion);
eval_fail_test!(fail_remove);
eval_fail_test!(fail_scope_5);
eval_fail_test!(fail_seq);
eval_fail_test!(fail_set);
eval_fail_test!(fail_set_override);
eval_fail_test!(fail_string_nul_1);
eval_fail_test!(fail_string_nul_2);
eval_fail_test!(fail_substring);
eval_fail_test!(fail_toJSON);
eval_fail_test!(fail_toJSON_non_utf_8);
eval_fail_test!(fail_to_path);
eval_fail_test!(fail_undeclared_arg);
eval_fail_test!(fail_using_set_as_attr_name);


@@ -0,0 +1 @@
foo


@@ -0,0 +1 @@
"a"


@@ -0,0 +1 @@
"X"


@@ -0,0 +1 @@
"b"


@@ -0,0 +1 @@
"X"


@@ -0,0 +1 @@
"X"


@@ -0,0 +1 @@
"c"


@@ -0,0 +1 @@
"X"


@@ -0,0 +1 @@
"X"


@@ -0,0 +1,8 @@
error:
… while calling the 'abort' builtin
at /pwd/lang/eval-fail-abort.nix:1:14:
1| if true then abort "this should fail" else 1
| ^
2|
error: evaluation aborted with the following error message: 'this should fail'


@@ -0,0 +1 @@
if true then abort "this should fail" else 1


@@ -0,0 +1,8 @@
error:
… while calling the 'addDrvOutputDependencies' builtin
at /pwd/lang/eval-fail-addDrvOutputDependencies-empty-context.nix:1:1:
1| builtins.addDrvOutputDependencies ""
| ^
2|
error: context of string '' must have exactly one element, but has 0


@@ -0,0 +1 @@
builtins.addDrvOutputDependencies ""


@@ -0,0 +1,9 @@
error:
… while calling the 'addDrvOutputDependencies' builtin
at /pwd/lang/eval-fail-addDrvOutputDependencies-multi-elem-context.nix:25:1:
24| in
25| builtins.addDrvOutputDependencies combo-path
| ^
26|
error: context of string '/nix/store/pg9yqs4yd85yhdm3f4i5dyaqp5jahrsz-fail.drv/nix/store/2dxd5frb715z451vbf7s8birlf3argbk-fail-2.drv' must have exactly one element, but has 2


@@ -0,0 +1,25 @@
let
drv0 = derivation {
name = "fail";
builder = "/bin/false";
system = "x86_64-linux";
outputs = [
"out"
"foo"
];
};
drv1 = derivation {
name = "fail-2";
builder = "/bin/false";
system = "x86_64-linux";
outputs = [
"out"
"foo"
];
};
combo-path = "${drv0.drvPath}${drv1.drvPath}";
in
builtins.addDrvOutputDependencies combo-path


@@ -0,0 +1,9 @@
error:
… while calling the 'addDrvOutputDependencies' builtin
at /pwd/lang/eval-fail-addDrvOutputDependencies-wrong-element-kind.nix:13:1:
12| in
13| builtins.addDrvOutputDependencies drv.outPath
| ^
14|
error: `addDrvOutputDependencies` can only act on derivations, not on a derivation output such as 'out'


@@ -0,0 +1,13 @@
let
drv = derivation {
name = "fail";
builder = "/bin/false";
system = "x86_64-linux";
outputs = [
"out"
"foo"
];
};
in
builtins.addDrvOutputDependencies drv.outPath


@@ -0,0 +1,24 @@
error:
… while counting down; n = 10
… while counting down; n = 9
… while counting down; n = 8
… while counting down; n = 7
… while counting down; n = 6
… while counting down; n = 5
… while counting down; n = 4
… while counting down; n = 3
… while counting down; n = 2
… while counting down; n = 1
(stack trace truncated; use '--show-trace' to show the full, detailed trace)
error: kaboom


@@ -0,0 +1,9 @@
let
countDown =
n:
if n == 0 then
throw "kaboom"
else
builtins.addErrorContext "while counting down; n = ${toString n}" ("x" + countDown (n - 1));
in
countDown 10


@@ -0,0 +1,8 @@
error:
… while evaluating the condition of the assertion '({ a = true; } == { a = true; b = true; })'
at /pwd/lang/eval-fail-assert-equal-attrs-names-2.nix:1:1:
1| assert
| ^
2| {
error: attribute names of attribute set '{ a = true; }' differs from attribute set '{ a = true; b = true; }'


@@ -0,0 +1,8 @@
assert
{
a = true;
} == {
a = true;
b = true;
};
throw "unreachable"


@@ -0,0 +1,8 @@
error:
… while evaluating the condition of the assertion '({ a = true; b = true; } == { a = true; })'
at /pwd/lang/eval-fail-assert-equal-attrs-names.nix:1:1:
1| assert
| ^
2| {
error: attribute names of attribute set '{ a = true; b = true; }' differs from attribute set '{ a = true; }'


@@ -0,0 +1,8 @@
assert
{
a = true;
b = true;
} == {
a = true;
};
throw "unreachable"


@@ -0,0 +1,26 @@
error:
… while evaluating the condition of the assertion '({ foo = { outPath = "/nix/store/0"; type = "derivation"; }; } == { foo = { devious = true; outPath = "/nix/store/1"; type = "derivation"; }; })'
at /pwd/lang/eval-fail-assert-equal-derivations-extra.nix:1:1:
1| assert
| ^
2| {
… while comparing attribute 'foo'
… where left hand side is
at /pwd/lang/eval-fail-assert-equal-derivations-extra.nix:3:5:
2| {
3| foo = {
| ^
4| type = "derivation";
… where right hand side is
at /pwd/lang/eval-fail-assert-equal-derivations-extra.nix:8:5:
7| } == {
8| foo = {
| ^
9| type = "derivation";
… while comparing a derivation by its 'outPath' attribute
error: string '"/nix/store/0"' is not equal to string '"/nix/store/1"'


@@ -0,0 +1,14 @@
assert
{
foo = {
type = "derivation";
outPath = "/nix/store/0";
};
} == {
foo = {
type = "derivation";
outPath = "/nix/store/1";
devious = true;
};
};
throw "unreachable"


@@ -0,0 +1,26 @@
error:
… while evaluating the condition of the assertion '({ foo = { ignored = (abort "not ignored"); outPath = "/nix/store/0"; type = "derivation"; }; } == { foo = { ignored = (abort "not ignored"); outPath = "/nix/store/1"; type = "derivation"; }; })'
at /pwd/lang/eval-fail-assert-equal-derivations.nix:1:1:
1| assert
| ^
2| {
… while comparing attribute 'foo'
… where left hand side is
at /pwd/lang/eval-fail-assert-equal-derivations.nix:3:5:
2| {
3| foo = {
| ^
4| type = "derivation";
… where right hand side is
at /pwd/lang/eval-fail-assert-equal-derivations.nix:9:5:
8| } == {
9| foo = {
| ^
10| type = "derivation";
… while comparing a derivation by its 'outPath' attribute
error: string '"/nix/store/0"' is not equal to string '"/nix/store/1"'


@@ -0,0 +1,15 @@
assert
{
foo = {
type = "derivation";
outPath = "/nix/store/0";
ignored = abort "not ignored";
};
} == {
foo = {
type = "derivation";
outPath = "/nix/store/1";
ignored = abort "not ignored";
};
};
throw "unreachable"


@@ -0,0 +1,22 @@
error:
… while evaluating the condition of the assertion '({ b = 1; } == { b = 1.01; })'
at /pwd/lang/eval-fail-assert-equal-floats.nix:1:1:
1| assert { b = 1.0; } == { b = 1.01; };
| ^
2| abort "unreachable"
… while comparing attribute 'b'
… where left hand side is
at /pwd/lang/eval-fail-assert-equal-floats.nix:1:10:
1| assert { b = 1.0; } == { b = 1.01; };
| ^
2| abort "unreachable"
… where right hand side is
at /pwd/lang/eval-fail-assert-equal-floats.nix:1:26:
1| assert { b = 1.0; } == { b = 1.01; };
| ^
2| abort "unreachable"
error: a float with value '1' is not equal to a float with value '1.01'


@@ -0,0 +1,2 @@
assert { b = 1.0; } == { b = 1.01; };
abort "unreachable"


@@ -0,0 +1,9 @@
error:
… while evaluating the condition of the assertion '((x: x) == (x: x))'
at /pwd/lang/eval-fail-assert-equal-function-direct.nix:3:1:
2| # This only compares a direct comparison and makes no claims about functions in nested structures.
3| assert (x: x) == (x: x);
| ^
4| abort "unreachable"
error: distinct functions and immediate comparisons of identical functions compare as unequal


@@ -0,0 +1,4 @@
# Note: functions in nested structures, e.g. attributes, may be optimized away by pointer identity optimization.
# This only compares a direct comparison and makes no claims about functions in nested structures.
assert (x: x) == (x: x);
abort "unreachable"


@@ -0,0 +1,8 @@
error:
… while evaluating the condition of the assertion '(1 == 1.1)'
at /pwd/lang/eval-fail-assert-equal-int-float.nix:1:1:
1| assert 1 == 1.1;
| ^
2| throw "unreachable"
error: an integer with value '1' is not equal to a float with value '1.1'


@@ -0,0 +1,2 @@
assert 1 == 1.1;
throw "unreachable"


@@ -0,0 +1,22 @@
error:
… while evaluating the condition of the assertion '({ b = 1; } == { b = 2; })'
at /pwd/lang/eval-fail-assert-equal-ints.nix:1:1:
1| assert { b = 1; } == { b = 2; };
| ^
2| abort "unreachable"
… while comparing attribute 'b'
… where left hand side is
at /pwd/lang/eval-fail-assert-equal-ints.nix:1:10:
1| assert { b = 1; } == { b = 2; };
| ^
2| abort "unreachable"
… where right hand side is
at /pwd/lang/eval-fail-assert-equal-ints.nix:1:24:
1| assert { b = 1; } == { b = 2; };
| ^
2| abort "unreachable"
error: an integer with value '1' is not equal to an integer with value '2'


@@ -0,0 +1,2 @@
assert { b = 1; } == { b = 2; };
abort "unreachable"


@@ -0,0 +1,8 @@
error:
… while evaluating the condition of the assertion '([ (1) (0) ] == [ (10) ])'
at /pwd/lang/eval-fail-assert-equal-list-length.nix:1:1:
1| assert
| ^
2| [
error: list of size '2' is not equal to list of size '1', left hand side is '[ 1 0 ]', right hand side is '[ 10 ]'


@@ -0,0 +1,6 @@
assert
[
1
0
] == [ 10 ];
throw "unreachable"


@@ -0,0 +1,8 @@
error:
… while evaluating the condition of the assertion '(/pwd/lang/foo == /pwd/lang/bar)'
at /pwd/lang/eval-fail-assert-equal-paths.nix:1:1:
1| assert ./foo == ./bar;
| ^
2| throw "unreachable"
error: path '/pwd/lang/foo' is not equal to path '/pwd/lang/bar'


@@ -0,0 +1,2 @@
assert ./foo == ./bar;
throw "unreachable"


@@ -0,0 +1,22 @@
error:
… while evaluating the condition of the assertion '({ ding = false; } == { ding = null; })'
at /pwd/lang/eval-fail-assert-equal-type-nested.nix:1:1:
1| assert { ding = false; } == { ding = null; };
| ^
2| abort "unreachable"
… while comparing attribute 'ding'
… where left hand side is
at /pwd/lang/eval-fail-assert-equal-type-nested.nix:1:10:
1| assert { ding = false; } == { ding = null; };
| ^
2| abort "unreachable"
… where right hand side is
at /pwd/lang/eval-fail-assert-equal-type-nested.nix:1:31:
1| assert { ding = false; } == { ding = null; };
| ^
2| abort "unreachable"
error: a Boolean of value 'false' is not equal to null of value 'null'
