Compare commits

...

198 Commits
4.6.0 ... 4.7.0

Author SHA1 Message Date
Johannes Altmanninger
e071de3b68 Release 4.7.0
Created by ./build_tools/release.sh 4.7.0
2026-05-05 15:24:27 +08:00
Nahor
ed6fe3f315 tmux-history-search2: fix test on WSL or in console sessions
Those two environments do not use the "return symbol".

Closes #12704
2026-05-05 14:57:42 +08:00
Nahor
4f539dffaf cd: fix path when trying to cd out of the root directory
Part of #12704
2026-05-05 14:57:42 +08:00
Remo Senekowitsch
d885e0efd7 completions/date: add rfc-3339 option
Closes #12703
2026-05-05 14:57:23 +08:00
Johannes Altmanninger
330e897acc Update changelog 2026-05-05 14:55:56 +08:00
Peter Ammon
b638aa198f Make the tmux-history-search2.fish test pass reliably on macOS 2026-05-03 19:50:26 -07:00
Peter Ammon
fd44c23678 Suppress an annoying warning on nightly
Prior to this commit, running

> cargo +nightly bench --features benchmark --no-run

reports:
warning: feature `test` is declared but not used
 --> src/lib.rs:1:58
  |
1 | #![cfg_attr(all(nightly, feature = "benchmark"), feature(test))]
  |                                                          ^^^^
  |
  = note: `#[warn(unused_features)]` (part of `#[warn(unused)]`) on by default

This is a false positive. Allow unused features in this cfg_attr.
2026-05-03 18:03:37 -07:00
David Adam
f84179f8fe RPM/Debian packaging: add pkg-config dependency
This is required by the pcre2 crate to find the system PCRE2 library.
2026-05-03 23:05:03 +08:00
Johannes Altmanninger
71d6ec4ab9 wcsfilecmp: make sure trailing slashes sort first
This command

	cd $(mktemp -d)
	mkdir a "a b"
	complete -C": "

prints

	a b/
	a/

which is the wrong order.
Usually the trailing slash should not be compared.

Fix this by always sorting slashes first.  Not sure if this is correct
for middle slashes but I couldn't find a case where it matters.

Closes #12695
2026-05-03 20:04:21 +08:00
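The ordering fix above can be modeled with a small comparator. This is a sketch, not fish's actual wcsfilecmp (which also handles case folding and numeric runs); it only shows the rule that '/' sorts before every other character, so "a/" comes before "a b/".

```rust
use std::cmp::Ordering;

// Sketch of the ordering rule: map '/' below every other character
// before comparing, so trailing slashes (directories) sort first.
fn slash_first_cmp(a: &str, b: &str) -> Ordering {
    // Key each char: '/' gets the smallest key, everything else keeps
    // its own value at a higher rank.
    let key = |c: char| if c == '/' { (0u8, '\0') } else { (1u8, c) };
    a.chars().map(key).cmp(b.chars().map(key))
}
```

A plain byte comparison would put "a b/" first, because ' ' (0x20) is less than '/' (0x2f); the keyed comparison reverses that.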
Johannes Altmanninger
683e4c8d15 wcsfilecmp: extract function, use shadowing 2026-05-03 20:04:21 +08:00
xtqqczze
d7cc3c7bb6 format: use 2-space indents in toml files
Closes #12699
2026-05-03 20:04:21 +08:00
Johannes Altmanninger
9370830733 Make profiling API a bit harder to misuse 2026-05-03 20:03:46 +08:00
Johannes Altmanninger
161f31f42b run_1_job: remove code clone for profiling 2026-05-03 20:03:46 +08:00
Johannes Altmanninger
5998421410 Remove obsolete Send/Sync impls for ParsedSource 2026-05-03 20:03:46 +08:00
David Adam
5b1e163f22 docs/function: add caution about shadowing builtins
See #3000, #12962
2026-05-02 15:14:33 +08:00
Johannes Altmanninger
7c5fc85d96 builtin commandline: fix unintuitive clone 2026-05-01 18:35:41 +08:00
Johannes Altmanninger
2f9f46b2a5 eval_node: extract function for getting exec counts 2026-05-01 18:35:41 +08:00
Johannes Altmanninger
4b069b51e7 Remove "get_" prefix from some getters
In C++ we can't have a field and method sharing a name,
but in Rust we can.

For some structs, most getters don't have a "get_", so it's weird
that some do.  Remove the "get_" prefix where it's obvious enough.

While at it, give some related getters better names.
2026-05-01 18:35:41 +08:00
Johannes Altmanninger
398fc17b81 reader: use simpler test environment constructor
A following commit wants to pass parser by exclusive reference,
which disallows passing "parser" as well as "parser.vars()"
in one function call.  This use case also doesn't make sense.
The "OperationContext::test_only_foreground" constructor is used to
inject a special "PwdEnvironment" into the context.  When we don't need
this environment, we can use a regular constructor, which already uses
"parser.vars()".
2026-05-01 18:03:13 +08:00
Johannes Altmanninger
12fa0d8b3d reader: make exec_prompt_cmd a free function
A following commit will pass parser as "&mut Parser".  This would
create aliasing issues in our calls to exec_prompt_cmd; make it a
free function so rustc can understand how the borrows are split.
2026-05-01 18:03:13 +08:00
Johannes Altmanninger
0441bdc634 Remove ScopeGuard::commit in favor of drop
As of commit a296ee085c (Stop returning a value from ScopeGuarding::commit,
2025-03-15) "ScopeGuard::commit()" is equivalent to "drop()".
Let's use that instead.
2026-05-01 18:03:13 +08:00
Johannes Altmanninger
d0e47cf58a Move current_filename out of LibraryData
The ScopedRefCell wrapping of library_data
is used for two things:
1. to allow mutating library_data from a &Parser (for this, a RefCell would be enough)
2. to replace "current_filename" for a scope

A following commit wants to pass parser as "&mut Parser", which
voids reason 1.  It will also remove the ScopedRefCell wrapping
from LibraryData because reason 2 alone is not strong enough.  Move
"current_filename" outside of it, next to "current_node", which is
already a ScopedRefCell.  In the future we could maybe consolidate them
into one field, like (or even merge with) ScopedData.
2026-05-01 18:03:13 +08:00
Johannes Altmanninger
d35aa3860a Fix weird initial value for internal job ID
InternalJobId(0) aka InternalJobId::default() is treated specially
so it should not be given to a regular job.
2026-05-01 18:03:13 +08:00
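The invariant behind this fix can be sketched with a hypothetical allocator (names are illustrative, not fish's actual types): the default ID 0 is reserved as "none", so regular jobs are handed IDs starting from 1.

```rust
use std::num::NonZeroU64;

// Hypothetical sketch: 0 is the reserved "default" internal job ID,
// so the allocator hands out IDs starting at 1.
struct JobIdAllocator(u64);

impl JobIdAllocator {
    fn new() -> Self {
        JobIdAllocator(0)
    }

    fn next(&mut self) -> NonZeroU64 {
        self.0 += 1;
        // Never 0, so a real job can never collide with the default ID.
        NonZeroU64::new(self.0).unwrap()
    }
}
```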
Johannes Altmanninger
5971e79c3f Strong type for internal job IDs 2026-05-01 18:03:13 +08:00
Johannes Altmanninger
dad660cda5 Move internal job ID type
Move this type to where its non-default instances are constructed.
2026-05-01 18:03:13 +08:00
Johannes Altmanninger
8328e53050 reader: reduce variable scope 2026-05-01 18:03:13 +08:00
Johannes Altmanninger
4aadeea184 parser: remove unused field
This has been moved to InputData.
2026-05-01 18:03:13 +08:00
Johannes Altmanninger
df5067cc1c parse_execution: remove pipeline node reference from ExecutionContext
Upcoming changes will pass Parser by exclusive reference ("&mut") which
prevents aliasing; let's remove an alias which seems simpler anyway.
2026-05-01 18:03:13 +08:00
Johannes Altmanninger
3d708d6fc1 history: remove redundant argument 2026-05-01 18:03:13 +08:00
Johannes Altmanninger
a564238d82 highlight_and_colorize: remove redundant environment argument
This highlighting function is always called with an operation
context created from a parser. Since parser.context().vars() is the same
as parser.vars(), we can use the former, reducing the number of aliases.
2026-05-01 18:03:13 +08:00
Johannes Altmanninger
64443aa173 Lower OnceLock to LazyLock
LazyLock is less powerful, so we should use it when possible.

Ref: https://github.com/fish-shell/fish-shell/pull/12661#discussion_r3158097032
2026-05-01 18:03:13 +08:00
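The "less powerful" point can be shown with a minimal sketch (the static here is illustrative, not fish's): with `OnceLock`, every call site supplies its own initializer to `get_or_init()` and they can disagree; `LazyLock` fixes the initializer at the definition site, leaving only one way to initialize the value.

```rust
use std::sync::LazyLock;

// The initializer lives with the definition; call sites can only read.
static DEFAULT_SHELL: LazyLock<String> = LazyLock::new(|| String::from("fish"));

fn shell_name() -> &'static str {
    // Deref coercion: &LazyLock<String> -> &String -> &str.
    &DEFAULT_SHELL
}
```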
Johannes Altmanninger
d2c2b23d1f Fix "fish -d reader" crash on left mouse click
See #12693
2026-05-01 18:00:37 +08:00
Daniel Rainer
4e3898d0d7 feat: xtask gettext
Rewrite the PO file handling logic in Rust and make it available via an
xtask. Replaces the
`build_tools/{update_translations,fish_xgettext}.fish` scripts.

Main benefits:
- Better ergonomics
- Better error handling
- Eliminates the need for a fish executable for updating PO files,
  which is particularly useful in CI
- Improved performance, mainly due to concurrent threads working on the
  PO files in parallel

The behavior is mostly unchanged, with the minor exception that section
headers for empty sections are now omitted in PO files.
The interface for invoking the tooling is quite different. Instead of
working with flags, `cargo xtask gettext` has 3 subcommands:
- `update` modifies the PO files to match the current sources
- `check` is like update, but instead of modifying the PO files, it
  shows diffs between the current version of the PO files and what they
  would look like after updating. When there is a difference, the xtask
  exits non-zero, making it useful for checks to detect outdated PO
  files.
- `new` creates a new PO file for the given language.

Both the `update` and `check` commands take any number of file paths to
specify the PO files to consider. If none are specified, all files in
`localization/po/` are considered.

Extracting gettext messages from Rust still requires compiling with the
`gettext-extract` feature active. In situations where compilation is
needed for other purposes as well, it can make sense to only build once
and then tell the gettext xtask about the directory into which the
messages have been extracted. This can be done via the
`--rust-extraction-dir` flag. If we stop having gettext messages in
Rust, this logic can be removed.

Closes #12676
2026-04-30 17:31:03 +00:00
Daniel Rainer
3ad45d8fb1 feat: make as_os_strs easier to use
Make trailing comma optional.

Return array, rather than reference to array, to eliminate lifetime
issues.

Closes #12688
2026-04-30 18:16:43 +08:00
xtqqczze
39bd54cb49 highlight: derive Display trait for HighlightRole 2026-04-28 21:49:51 +02:00
Johannes Altmanninger
281399561b Distinguish builtin read history session ID from private mode
Fixes #12662
2026-04-29 01:55:32 +08:00
Johannes Altmanninger
e5f57b1daf history: fix constructor naming
The only public constructor should be called new().
2026-04-29 01:55:32 +08:00
Johannes Altmanninger
6c04a72697 history: hide private constructor 2026-04-29 01:55:32 +08:00
Johannes Altmanninger
1034945690 Fix regression causing "nosuchcommand || hello" to short-circuit
Commit 3534c07584 (Adopt the new AST in parse_execution, 2020-07-03)
added to parse_execution_context_t::run_job_conjunction an early
return when any job in a job conjunction fails to launch.  This causes
"nosuchcommand || echo hello" to not execute the continuation.

Fix this by restoring the previous behavior.

Fixes #12654
2026-04-29 01:55:32 +08:00
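The restored behavior can be captured in a toy model (not fish's actual executor): a job that fails to launch must count as a failed job, rather than aborting the whole conjunction, so `nosuchcommand || echo hello` still runs the right-hand side.

```rust
#[derive(Clone, Copy)]
enum Connector {
    AndAnd, // &&
    OrOr,   // ||
}

// Toy model of a job conjunction: `launch` is None when the job failed
// to launch (e.g. unknown command). The fix: treat that as a failed job
// instead of returning early from the whole conjunction.
fn run_conjunction(jobs: &[(Option<Connector>, Option<bool>)]) -> bool {
    let mut last_ok = true;
    for &(conn, launch) in jobs {
        match conn {
            Some(Connector::AndAnd) if !last_ok => continue, // skip rhs of &&
            Some(Connector::OrOr) if last_ok => continue,    // skip rhs of ||
            _ => {}
        }
        // A job that failed to launch behaves like a job that failed.
        last_ok = launch.unwrap_or(false);
    }
    last_ok
}
```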
Johannes Altmanninger
e2b18fc5b6 config.fish: don't load default theme in noninteractive shells
We define colors in noninteractive shells for historical reasons
(because colors used to be universal variables).

The other potential reason is to get regular syntax highlighting for
commands like:

	fish -c 'read --shell'

but if anyone actually uses that they can probably load a theme
explicitly.

Stop defining colors in noninteractive shells.  It's usually not
a good idea to make them behave differently from interactive ones,
but color seems only relevant for interactive shells?

Let's see if anyone complains... we may end up reverting this if people
want to use noninteractive fish to query colors... but I'm not sure
why that would be necessary.

Closes #12673
2026-04-29 01:48:47 +08:00
Johannes Altmanninger
319b093ef8 autoload: improve enum naming 2026-04-28 23:11:33 +08:00
Johannes Altmanninger
ab2678082e builtin string: add names to RegexError enum fields 2026-04-28 23:11:33 +08:00
Johannes Altmanninger
81e8eebd8d Use UpperCamelCase for enum variants
Missed in 17ba602acf (Use PascalCase for Enums, 2025-12-14).

Fixes #12647
2026-04-28 23:11:33 +08:00
Johannes Altmanninger
2b41f132be Remove obsolete comment working around late fish_indent bug 2026-04-28 15:41:15 +08:00
Johannes Altmanninger
688d1954a8 Fix unused import on systems without eventfd (Cygwin) 2026-04-28 15:27:35 +08:00
Johannes Altmanninger
96695a2859 Document how to remove workaround for Cygwin select() 2026-04-28 14:49:28 +08:00
Johannes Altmanninger
f2b0706494 reader: repaint commands to not disable "last_cmd"-based UI states (pager etc.)
"commandline -f repaint" might be triggered for various reasons;
since this sets "last_cmd", it will reset some UI states, notably
pager selection:

1. press tab
2. trigger repaint
3. press tab

The repaint prevents us from selecting the first candidate.

Work around this by ignoring repaint events for the last_cmd logic.

Fixes #12683
2026-04-28 14:43:55 +08:00
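The workaround can be sketched as follows (function and event names here are hypothetical, not fish's identifiers): repaint events are excluded from last_cmd bookkeeping, so UI state keyed on the previous command, like pager selection after tab-tab, survives a repaint.

```rust
// Repaints are not user actions, so they must not overwrite the record
// of the last command the user actually ran.
fn update_last_cmd(last_cmd: &mut Option<String>, cmd: &str) {
    if cmd == "repaint" {
        return; // ignore repaints for last_cmd bookkeeping
    }
    *last_cmd = Some(cmd.to_string());
}
```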
Johannes Altmanninger
c91bfba08c env_dispatch: reduce scope of captured $TERM local var 2026-04-28 14:19:51 +08:00
Daniel Rainer
cc40fa4a4c completions: use typst's built-in completions
https://github.com/typst/typst/pull/6568 (merged 2025-07-09), presumably
released in 0.14.0 (2025-10-24), introduced completion generation in
typst. Use it to replace our outdated manual completions.

Closes #12679

Closes #12684
2026-04-28 13:51:24 +08:00
Johannes Altmanninger
ff6ee65deb Assert that FD monitor Drop implementation is really test-only 2026-04-28 13:51:24 +08:00
Nahor
1771a325aa CI: enable check.sh on Windows
Closes #12171
2026-04-28 13:51:24 +08:00
Nahor
58648054c0 fd_monitor: wait for select() to return when removing an item
It is unspecified what `select()` returns if a descriptor is closed
while `select()` uses it. This can result in spurious error messages,
notably in Cygwin.

Also delete corresponding tests since they don't really help with
anything. Any `select()` result is valid when a socket is closed, so
checking that result is pointless. Moreover, fish already does not rely
on any specific result beyond logging.

Part of #12171
2026-04-28 13:51:24 +08:00
Nahor
27fb4d6731 Always heightenize file descriptors
Fixes #12618

Closes #12681
2026-04-28 13:51:24 +08:00
Nahor
6701b7f6c8 parser: remove unused cwd_fd field
Part of #12681
2026-04-28 13:51:24 +08:00
Johannes Altmanninger
e175a317af proc: use shorthand method for reading file /proc/pid/stat 2026-04-28 13:51:24 +08:00
Jaakko Koivisto
7b98a275fe Added 'updates' directory to the kernel module locations.
Linux kernel modules installed by the 'modules_install' target are
installed to '/usr/lib/<kernel>/updates'. This applies both to
out-of-tree kernel modules and to in-tree modules built individually.

Module tools like 'modprobe' and 'modinfo' search the 'updates'
directory automatically, so fish's autocompletion should offer these
modules as well.

Closes #12682
2026-04-28 13:51:24 +08:00
Johannes Altmanninger
b78dc4fbec completions/sudo-rs: fix when sudo is not installed
Fixes #12678
2026-04-28 13:51:24 +08:00
Johannes Altmanninger
12e97ea7fc fd monitor: hide test-only method 2026-04-28 13:15:28 +08:00
Johannes Altmanninger
af8594c611 Fix inconsistent case 2026-04-27 15:18:01 +08:00
Johannes Altmanninger
006fa86ef4 tests/checks/tmux-source.fish: reduce flakiness
As seen in
https://github.com/fish-shell/fish-shell/actions/runs/24944417077/job/73043241890?pr=12171

	Failure:

	  The CHECK on line 12 wants:
	    prompt 1> source -

	  which failed to match line stdout:3:
	    source -

	  Context:
	    prompt 0> source
	    source: missing filename argument or input redirection
	    source - <= no check matches this, previous check on line 11
	    prompt 1> source -
	    prompt 1>
2026-04-26 13:15:23 +08:00
Daniel Rainer
9b04300dc3 refactor: use anyhow for xtask errors
Terminating the process at arbitrary points with `std::process::exit`
when errors occur has several problems. There is a lack of information
about what led up to the error, and it prevents destructors from
running, which in the case of xtasks can, for example, result in
temporary files being left on the file system.

Instead, use `anyhow`, which conveniently integrates with Rust's Result
type, allowing functions to return `anyhow::Result<T>`, an alias for
`Result<T, anyhow::Error>`, which is compatible with any error type that
implements `std::error::Error`. The advantages of using `anyhow` over
plain `Result`s are that it makes it easier to handle different error
types, attach context to errors, and show the call/context stack
associated with the error. Returning an `anyhow::Result<()>` from `main`
is possible because it implements `std::process::Termination`, so we get
automatic error reporting and corresponding exit codes by simply
bubbling up errors to `main`, attaching context as desired, and finally
returning the result from `main`.

In addition to removing the `std::process::exit` calls, this commit also
improves error handling in a few spots in other ways, such as replacing
`unwrap` by returning errors.

Closes #12674
2026-04-26 13:12:25 +08:00
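The context-attachment idea can be sketched using only the standard library (`anyhow::Context` does the same with far less boilerplate; the type and file name below are illustrative, not from the xtask code):

```rust
use std::error::Error;
use std::fmt;

// A wrapper that pairs a low-level error with a message describing what
// we were doing when it occurred -- the core of what `.context()` adds.
#[derive(Debug)]
struct Contextual {
    context: String,
    source: Box<dyn Error>,
}

impl fmt::Display for Contextual {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}: {}", self.context, self.source)
    }
}

impl Error for Contextual {}

fn read_po_file(path: &str) -> Result<String, Contextual> {
    std::fs::read_to_string(path).map_err(|e| Contextual {
        context: format!("reading PO file {path}"),
        source: Box::new(e),
    })
}
```

Because the error bubbles up as a value instead of calling `std::process::exit`, destructors still run and the caller sees what operation failed.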
Daniel Rainer
c80496fad1 cleanup: remove useless variable
Closes #12675
2026-04-25 17:08:03 +08:00
Johannes Altmanninger
ca56949028 release-notes.sh: fix language 2026-04-24 18:28:02 +08:00
Nathaniel
fa74d0fe54 completions/systemctl: add missing subcommands
- reorder subcommand descriptions
- remove unused subcommands
- add extra subcommand descriptions
- remove old version check

Closes #12672
2026-04-24 13:34:22 +08:00
cunlem
59f3719e95 Allow opening script read-only with editor
Closes #12671
2026-04-24 13:30:25 +08:00
Johannes Altmanninger
170c171e85 shellcheck: lower OnceLock to LazyLock 2026-04-24 13:28:54 +08:00
Johannes Altmanninger
c33ca660e3 Replace OnceLock<()> with better(?) alternatives 2026-04-24 13:26:22 +08:00
Johannes Altmanninger
f7c336021b threads: ThreadId type 2026-04-23 19:12:40 +08:00
Johannes Altmanninger
523e25df17 reader: fix improper use of get_or_init() 2026-04-23 19:12:40 +08:00
Johannes Altmanninger
c8b28d4d24 cargo-test: remove unnecessary TTY initialization 2026-04-23 19:12:40 +08:00
Johannes Altmanninger
ba35214e1e Fix exit handlers being called on panic in background threads
Commit 1286745e78 (Remove bits for async-signal-safety of old SIGTERM
handler, 2026-04-11) introduced inconsistency; fix that.
2026-04-23 16:17:45 +08:00
Johannes Altmanninger
d05d8557a7 build_tools/*.sh: fix inconsistent bash shebang 2026-04-22 14:47:51 +08:00
Daniel Rainer
a3dc57873c lint: run shellcheck in CI
Closes #12661
2026-04-22 14:38:22 +08:00
Daniel Rainer
0c078c179d lint: run shellcheck xtask in main checks
Part of #12661
2026-04-22 14:28:45 +08:00
Daniel Rainer
ca443e2e54 lint: add xtask for running ShellCheck
ShellCheck does not have a built-in way of detecting which files it
should check, so we use ripgrep's `ignore` library to find files not
ignored by our gitignore rules, and then look for a non-fish shebang in
the first line of the file. The resulting shell scripts are then passed
to ShellCheck.

Part of #12661
2026-04-22 14:28:45 +08:00
Daniel Rainer
63c3306e6c lint: fix ShellCheck warnings
Part of #12661
2026-04-22 14:23:25 +08:00
Johannes Altmanninger
923d0b7974 config: use default XDG_DATA_DIRS when unset or empty
Installing a program like sway to /usr/local installs fish
completions to /usr/local/share/fish/vendor_completions.d/sway.fish.
When $XDG_DATA_DIRS is empty, these will typically not
be picked up.

(Since "__extra_completionsdir" is usually
"/usr/share/fish/vendor_completions.d/", this issue typically only
affects "/usr/share", not "/usr".)

Fix this by using the correct fallback value for XDG_DATA_DIRS.

Fixes #11349

Closes #12656
2026-04-22 14:21:09 +08:00
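The fix amounts to applying the XDG Base Directory fallback when the variable is unset *or* empty. A minimal sketch (the spec's default for `XDG_DATA_DIRS` is `/usr/local/share/:/usr/share/`):

```rust
// Fall back to the XDG default data dirs when $XDG_DATA_DIRS is unset
// or empty, instead of ending up with no data dirs at all.
fn data_dirs(xdg_data_dirs: Option<&str>) -> Vec<String> {
    match xdg_data_dirs {
        Some(s) if !s.is_empty() => s.split(':').map(String::from).collect(),
        _ => vec!["/usr/local/share".into(), "/usr/share".into()],
    }
}
```

With this, vendor completions under `/usr/local/share/fish/vendor_completions.d/` are found even when the variable is empty.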
joveian
52998635f9 Avoid losing work in funced when no changes between parse errors
Since the initial dd69ca5 commit that started checking whether the file
was modified, the checksum to compare against has been updated inside
the loop, causing funced to silently lose work if you get a parse error,
can't find the issue, and want to look at the error message again.
Closes #12663
2026-04-22 00:53:56 +08:00
Saúl Nogueras
1ccf4ad480 Fix wget completion typo: non-verbose -> no-verbose
Closes #12664
2026-04-22 00:48:26 +08:00
Johannes Altmanninger
23b5b01242 Prune stale gitignore rules
After a few changes to our build system, lots of gitignore rules
are obsolete. Meanwhile, in-tree CMake builds are missing some rules
like "/cargo/".

Drop the obsolete ones, and add the in-tree CMake ones for now.
Also add ".venv/" (used by build_tools/release.sh).
Also limit some rules like .vscode to top-level (?).
2026-04-22 00:09:57 +08:00
Daniel Rainer
ca2b5dc40b checks: run with all features enabled
As discussed in #12649, we should check builds with all Cargo features
enabled. Previously, this did cause issues with the `benchmark` feature,
since that only works with nightly Rust. #12653 resolves that by only
enabling the `benchmark` feature with the nightly toolchain, so now we
can use `--all-features` with stable Rust.

Closes #12657
2026-04-20 21:21:24 +08:00
Armandas Jarušauskas
0dfe06f4c9 webconfig: highlight table entries on hover
- Make it easier to identify which history entry is being deleted.
- Remove the gap between rows that becomes visible on hover.
- Make the delete button a bit nicer looking by centering it and giving it a bit more space from the edge.

Closes #12659
2026-04-20 21:21:24 +08:00
xtqqczze
4e47f47d85 clippy: fix question_mark lint
https://rust-lang.github.io/rust-clippy/master/index.html#question_mark

Closes #12658
2026-04-20 21:21:24 +08:00
xtqqczze
f3e43e932f clippy: fix byte_char_slices lint
https://rust-lang.github.io/rust-clippy/master/index.html#byte_char_slices

Part of #12658
2026-04-20 21:21:24 +08:00
Johannes Altmanninger
1dfc75bb9c Better name for async-signal-safe functions
In Rust, "safety" is usually used in the context of unsafe functions,
which have documented preconditions.  Our async-signal-safe functions
are different; they offer extra safety properties. Rename them to
reduce confusion.

Ref: https://github.com/fish-shell/fish-shell/pull/12625#discussion_r3067819966
2026-04-20 21:21:24 +08:00
Johannes Altmanninger
fa33f6f0e0 tests/checks/disown.fish: improve test robustness
If the job never gets into the stopped state, it will keep running forever.
Narrow the wait condition to prevent a timeout in failure scenarios.
2026-04-20 17:03:09 +08:00
Johannes Altmanninger
31363120aa build_tools/version-available-in-debian.sh: fix for BSD sed
Fixes https://github.com/fish-shell/fish-shell/pull/12651#issuecomment-4275827646
2026-04-20 09:57:42 +08:00
xtqqczze
2304077e0d gate benchmark feature on nightly toolchain
Closes #12653
2026-04-19 17:38:04 +08:00
xtqqczze
86c052b6ba fix non_upper_case_globals lint
Closes #12648
2026-04-19 17:37:41 +08:00
xtqqczze
68472da48a highlight: derive Display trait for HighlightRole
Closes #12645
2026-04-19 17:14:42 +08:00
Daniel D. Beck
4b172fc735 set_color: document output more prominently
Issue: https://github.com/fish-shell/fish-shell/issues/2378

Closes #12644
2026-04-19 17:14:42 +08:00
Nahor
944ab91fab tests: various fixes for Cygwin itself and ACL mounts
Most notably:
- Unlike MSYS, Cygwin seems to always properly handle symlinks (at least
in common scenarios)
- With ACL, the "x" permission also requires "r" to do anything, be it files
or directories

Closes #12642
2026-04-19 17:09:36 +08:00
Johannes Altmanninger
34535fcb61 tests/checks/disown: fix signal delivery race
Intermittent test failure suggests that kill(3p) returns before the
signal is delivered.  Fix the failure by waiting until the signal
has been delivered before continuing the test.

Fixes #12635
2026-04-19 17:07:39 +08:00
Johannes Altmanninger
9e4eb37696 complete: remove stale comment
Commit a4b6348315 (clippy: fix collapsible_match lint, 2026-04-18) made
it so '$' characters are handled here, which contradicts the comment.
Remove it.
2026-04-19 15:55:13 +08:00
Johannes Altmanninger
dda76d7f18 Update to Rust 1.95 2026-04-19 15:53:36 +08:00
Johannes Altmanninger
fdb1d95521 updatecli.d/rust.yml: fix staleness check when using rustup 1.29 2026-04-19 15:53:08 +08:00
xtqqczze
937f3bc6cb Update to Rust 1.94 2026-04-19 00:17:30 +00:00
Daniel Rainer
ebc32adc09 clippy: fix map_unwrap_or lint
https://rust-lang.github.io/rust-clippy/rust-1.95.0/index.html#map_unwrap_or
2026-04-18 23:27:51 +00:00
xtqqczze
a4b6348315 clippy: fix collapsible_match lint
https://rust-lang.github.io/rust-clippy/rust-1.95.0/index.html#collapsible_match
2026-04-18 23:16:34 +00:00
xtqqczze
b21a4a7197 benchmark: fix unresolved import error
```rust
error[E0432]: unresolved import `crate::common::bytes2wcstring`
   --> src/common.rs:714:9
    |
714 |     use crate::common::bytes2wcstring;
    |         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ no `bytes2wcstring` in `common`
```
2026-04-18 21:28:15 +00:00
xtqqczze
0cd227533f highlight: implement Display trait for HighlightRole 2026-04-17 20:27:24 -07:00
Johannes Altmanninger
5eb7687a64 tests/checks/tmux-complete4.fish: fix for macOS sed 2026-04-17 03:22:56 +08:00
Johannes Altmanninger
8d6426295e complete: automatically resolve REPLACES_TOKEN flag
This flag is implied by matches that require replacements.  Reflect that
in Completion::new, reducing the number of places where we raise the
flag.  This slightly simplifies tasks like proving the parent commit.

There are other scenarios (e.g. wildcards) where we currently set
the flag additionally.
2026-04-17 01:31:29 +08:00
Johannes Altmanninger
85e76ba356 Fix option substr completions not being filtered out
Commit 3546ffa3ef (reader handle_completions(): remove dead filtering code,
2026-01-02) gives a proof of correctness that still makes sense;
The first lemma ("if will_replace_token") is trivially true, so no need to
assert it.
The second lemma ("if !will_replace_token") is violated in some edge cases:
we claim that given a token "-c", the option completion "--clip" is an exact match,
which is not true, it's a substring match.

Fix that, asserting the claim.
2026-04-17 01:31:29 +08:00
Johannes Altmanninger
fee4288122 complete: reuse string fuzzy match when completing ~$USER
If we get to this code path, we'll only get completions for user
names, so technically the full StringFuzzyMatch with its ranking of
samecase/smartcase/icase (only showing the best) might be overkill,
but it seems like a good idea to treat this the same way as other
completions.

The occasion for this commit is to correct a wrong
StringFuzzyMatch::exact_match() in the icase branch, which will be
important for a following commit.  Add a test for that.
2026-04-17 01:28:54 +08:00
Johannes Altmanninger
413246a93d reader handle_completions(): move loop-invariant code 2026-04-17 01:28:02 +08:00
Daniel Rainer
3cb939c9a8 fix: actually run without symlinks
The old behavior seems to have been introduced inadvertently:
https://github.com/fish-shell/fish-shell/pull/12636#issuecomment-4254328105

Closes #12636
2026-04-16 15:10:15 +08:00
Daniel Rainer
4790a444d8 lint: disable incorrect warning about unused fn
`cleanup` is used via `trap`.

Part of #12636
2026-04-16 15:10:15 +08:00
Daniel Rainer
da924927a0 cleanup: split up assignment and export
This prevents hiding failures of the `rustc` command.

Part of #12636
2026-04-16 15:10:15 +08:00
Daniel Rainer
29ff2fdd43 lint: disable warning about variable export
Here, we want `"$@"` to be expanded, since its components are the
arguments we want to pass to `export`.

Part of #12636
2026-04-16 15:10:14 +08:00
Daniel Rainer
732c04420b lint: disable warnings about desired behavior
We deliberately create subshells for the export in these cases, so we
don't want warnings about it.

Part of #12636
2026-04-16 15:10:14 +08:00
Daniel Rainer
947abd7464 cleanup: quote shell variables
This is not a functional change, since the variable names don't have
spaces, but it is more robust to changes and removes ShellCheck warnings.

Part of #12636
2026-04-16 15:10:14 +08:00
Daniel Rainer
12cfe59578 fix: don't use echo -e in POSIX shell
The `-e` flag is not defined for `echo` in POSIX shell. Use `printf`
instead.

Part of #12636
2026-04-16 15:10:13 +08:00
Daniel Rainer
4b60d18b44 cleanup: remove unused variable
Part of #12636
2026-04-16 15:10:13 +08:00
Daniel Rainer
dd8e59db03 fix: check if system_tests args are empty
When `system_tests` is called without arguments, `[ -n "$@" ]` becomes
`[ -n ]`, which is true, resulting in running `export`, which lists all
exported variables, unnecessarily cluttering the output.
If `system_tests` is called with more than one argument, the check would
fail because having more than one argument after `-n` is invalid syntax.
Fix this by using `$*`, which concatenates all positional arguments to
`system_tests` into a single value.

Part of #12636
2026-04-16 15:10:13 +08:00
Johannes Altmanninger
47a3757f73 update changelog 2026-04-14 16:56:43 +08:00
Johannes Altmanninger
f278c29733 key: address "non_upper_case_globals" lint on named key constants 2026-04-14 16:56:43 +08:00
Peter Ammon
2bab8b5830 prompt_pwd: strip control characters
If a directory has a control sequence in it, then prompt_pwd (used in
the default prompt) would emit it to the console, which could cause
the terminal to interpret the escape sequence.

Strip control sequences from within prompt_pwd, in the same way as
we do in __fish_paste.fish, to sanitize it.

Closes #12629
2026-04-14 16:56:43 +08:00
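The sanitization step can be sketched in a few lines (this is an illustration of the idea, not the fish-script implementation in prompt_pwd): drop control characters, including ESC, from the directory name before it reaches the prompt.

```rust
// Strip control characters (Unicode category Cc, which covers ESC,
// BEL, and the rest of C0/C1) so a hostile directory name cannot
// smuggle an escape sequence into the terminal.
fn strip_controls(s: &str) -> String {
    s.chars().filter(|c| !c.is_control()).collect()
}
```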
Johannes Altmanninger
1dac221684 doc terminal_compatibility: tab title is OSC 1 2026-04-14 16:56:43 +08:00
David Adam
8dbbe71bc6 disable Linux development builds for now
I'll add the rest of the infrastructure later.
2026-04-13 21:12:51 +08:00
David Adam
d9d9eced98 workflow: build development builds on master branch 2026-04-12 21:11:26 +08:00
David Adam
64a829f0df add workflow to create development build source packages 2026-04-12 18:50:56 +08:00
Nahor
440e7fcbc1 Fix failing system tests on Cygwin
The main changes are:
- disabling some checks related to POSIX file permissions when a filesystem is
mounted with "noacl" (default on MSYS2)
- disabling some checks related to symlinks when using fake ones (file copy)

Windows with ACL hasn't been tested because 1) Cygwin itself does not have any
Rust package yet to compile fish, and 2) MSYS2 defaults to `noacl`.
Part of #12171
2026-04-11 18:55:00 +08:00
Nahor
52495c8124 tests: make realpath tests easier to debug
- Use different strings for different checks to more easily narrow down
where a failure happens
- Move CHECK comments outside an `if...else...end` to avoid giving the
impression that the check only runs in the `if` case.

Part of #12171
2026-04-11 18:44:22 +08:00
Vishrut Sachan
467b03d715 git completions: prioritize recent commits for rebase --interactive
Fixes #12537

Closes #12619
2026-04-11 18:33:05 +08:00
Johannes Altmanninger
b5c40478f6 Fix typo, closes #12586, closes #12577 2026-04-11 18:33:05 +08:00
Johannes Altmanninger
1286745e78 Remove bits for async-signal-safety of old SIGTERM handler
This is implied by the parent commit.

To enable this, stop trying to run cleanup in panic handlers if we
panic on a background thread.
2026-04-11 18:25:26 +08:00
Yakov Till
b99ae291d6 Save history on SIGTERM and SIGHUP before exit
Previously, SIGTERM was immediately re-raised with SIG_DFL, killing
fish without saving history. SIGHUP was deferred via a flag but never
re-raised, so the parent saw a normal exit instead of signal death.

Unify both signals: the handler stores the signal number in a single
AtomicI32, the reader loop exits normally, and throwing_main() saves
history and re-raises with SIG_DFL so the parent sees WIFSIGNALED.

Fixes #10300

Closes #12615
2026-04-11 18:06:02 +08:00
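The unified scheme can be sketched as follows (function names are hypothetical; fish's actual handler and main loop differ): the async-signal-safe handler only records the signal in an atomic, and the main loop later notices it, saves history, and re-raises with the default disposition.

```rust
use std::sync::atomic::{AtomicI32, Ordering};

// 0 means "no termination signal pending"; SIGTERM/SIGHUP store their
// number here. Storing an AtomicI32 is async-signal-safe.
static PENDING_EXIT_SIGNAL: AtomicI32 = AtomicI32::new(0);

fn handle_termination_signal(sig: i32) {
    PENDING_EXIT_SIGNAL.store(sig, Ordering::Relaxed);
}

// Called from the reader loop: takes and clears the pending signal, so
// the caller can save history and re-raise it with SIG_DFL.
fn take_pending_signal() -> Option<i32> {
    match PENDING_EXIT_SIGNAL.swap(0, Ordering::Relaxed) {
        0 => None,
        sig => Some(sig),
    }
}
```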
Daniel Rainer
8ae71c80f4 refactor: extract string escape and unescape funcs
Move the functions for escaping and unescaping strings from
`src/common.rs` into `fish_common`. It might make sense to move them
into a dedicated crate at some point, but for now just move them to the
preexisting crate to unblock other extraction.

Closes #12625
2026-04-11 17:49:50 +08:00
Daniel Rainer
cf6170200c refactor: move const to fish_widestring
Another step to eliminate dependency cycles between `src/expand.rs` and
`src/common.rs`.

Part of #12625
2026-04-11 17:49:49 +08:00
Daniel Rainer
c13038b968 refactor: move char consts to widestring
This time, move char constants from `src/expand.rs` to
`fish_widestring`, which resolves a dependency cycle between
`src/expand.rs` and `src/common.rs`.

Part of #12625
2026-04-11 17:49:48 +08:00
Daniel Rainer
816077281d refactor: move encoding functions to widestring
The decoding functions for our widestrings are already in the
`fish_widestring` crate, so by symmetry, it makes sense to put the
encoding functions there as well. This also makes it easier to depend on
these functions, giving more options when it comes to further code
extraction.

Part of #12625
2026-04-11 17:49:47 +08:00
Daniel Rainer
78ea24a262 refactor: move char definitions into widestring
Use `fish_widestring` as the place where char definitions live. This has
the advantage that all our code can depend on `fish_widestring` without
introducing dependency cycles. Having a common place for character
definitions also makes it easier to see which chars have a special
meaning assigned to them.

This change also unblocks some follow-up refactoring by removing a
dependency cycle between `src/common.rs` and `src/wildcard.rs`.

Part of #12625
2026-04-11 17:49:46 +08:00
Daniel Rainer
6a5b9bcde1 refactor: move PUA-decoding function to widestring
These functions don't depend on `wcstringutil` functionality, so there
is no need for them to be there. The advantage of putting them into our
`widestring` crate is that quite a lot of code depends on it, and
extracting some of that code would result in crate dependency cycles if
the functions stayed in the `wcstringutil` crate. Our `widestring` crate
does not depend on any of our other crates, so there won't be any cyclic
dependency issues with code in it.

Part of #12625
2026-04-11 17:49:45 +08:00
Daniel Rainer
b7b786aabf cleanup: move escape_string_with_quote
It makes a lot more sense to have this function in the same module as
the other escaping functions. There was no usage of this function in
`parse_util` except for the test, so it makes little sense to keep the
function there. Moving it also eliminates a pointless cyclic dependency
between `common` and `parse_util`.

Part of #12625
2026-04-11 17:49:43 +08:00
Daniel Rainer
dc63b7bb20 cleanup: don't export write_loop twice
Exporting it as both `safe_write_loop` and `write_loop` is redundant and
causes inconsistencies. Remove the `pub use` and use `write_loop` for
the function name. It is shorter, and in Rust the default assumption is
that code is safe unless otherwise indicated, so there is no need to be
explicit about it.

Part of #12625
2026-04-11 17:49:42 +08:00
Daniel Rainer
9c819c020e refactor: don't reexport fish_common in common
Not reexporting means that imports have to change to directly import
from `fish_common`. This makes it easier to see which dependencies on
`src/common.rs` actually remain, which helps with identifying candidates
for extraction.

While at it, group some imports.

Part of #12625
2026-04-11 17:49:41 +08:00
Daniel Rainer
dc9b1141c8 refactor: extract fish_reserved_codepoint
Part of #12625
2026-04-11 17:49:40 +08:00
Daniel Rainer
3d364478ee refactor: remove dep on key mod from common
Removing this dependency allows extracting the `fish_reserved_codepoint`
function, and other code depending on it in subsequent commits.

Part of #12625
2026-04-11 17:49:39 +08:00
Daniel Rainer
faf331fdad refactor: use macro for special key char def
Reduce verbosity of const definitions. Define a dedicated const for the
base of the special key encoding range. This range is 256 bytes wide, so
by defining consts via a u8 offset from the base, we can guarantee that
the consts fall into the allocated range. Ideally, we would also check
for collisions, but Rust's const capabilities don't allow for that as
far as I'm aware.

Having `SPECIAL_KEY_ENCODE_BASE` in the `rust-widestring` crate allows
getting rid of the dependency on `key::Backspace` in the
`fish_reserved_codepoint` function, which unblocks code extraction.

Part of #12625
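The range-safety trick described above can be sketched like this; the base value and key names are made up for illustration:

```rust
// Hypothetical private-use-area base for the 256-codepoint encoding range.
const SPECIAL_KEY_ENCODE_BASE: u32 = 0xF600;

// Because the offset is typed as u8, the resulting codepoint is
// guaranteed to fall within [BASE, BASE + 255].
macro_rules! special_key {
    ($name:ident, $offset:expr) => {
        pub const $name: char = {
            let offset: u8 = $offset;
            match char::from_u32(SPECIAL_KEY_ENCODE_BASE + offset as u32) {
                Some(c) => c,
                None => panic!("not a valid codepoint"),
            }
        };
    };
}

special_key!(BACKSPACE, 0);
special_key!(DELETE, 1);
```

As the commit notes, collisions between offsets are not checked at compile time; the u8 bound only guarantees the consts stay inside the allocated range.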
2026-04-11 17:49:37 +08:00
Daniel Rainer
47b6c0aec2 cleanup: rename and document UTF-8 decoding function
While the function is only used to decode single codepoints, nothing in
its implementation limits it to single codepoints, so the name
`decode_one_codepoint_utf8` is misleading. Change it to the simpler and
more accurate `decode_utf8`. Add a doc comment to describe the
function's behavior.

Part of #12625
2026-04-11 17:49:36 +08:00
Daniel Rainer
eb478bfc3e refactor: remove syntactic deps on main crate
Part of #12625
2026-04-11 17:49:35 +08:00
Johannes Altmanninger
63cf79f5f6 Reuse function for creating sighupint topic monitor 2026-04-11 17:27:25 +08:00
Johannes Altmanninger
ff284d642e tests/checks/tmux-abbr.fish: fix on BusyBox less (alpine CI) 2026-04-11 17:03:46 +08:00
Johannes Altmanninger
cc64da62a9 fish_color_valid_path: also apply bg and underline colors
Closes #12622
2026-04-11 17:03:46 +08:00
Johannes Altmanninger
a974fe990f fish_color_valid_path: respect explicit normal foreground 2026-04-11 17:03:46 +08:00
David Adam
39239724ec make_vendor_tarball: drop unused tar search 2026-04-10 05:56:00 +08:00
Daniel Rainer
524a7bac6e l10n: restore translations from error rewrite
Most translations were adjusted correctly, but a few were missed, so
restore them here.

Closes #12623
2026-04-09 01:36:53 +08:00
Nahor
d649c2aab4 string: remove StringError::NotANumber
Use the more generic `StringError::InvalidArgs` instead

Closes #12556
2026-04-08 14:11:31 +08:00
Nahor
30e6aa85e2 error rewrite: use new Error to report errors
To homogenize error reporting format, use a new Error struct. Currently this
is used for builtins and ensures a common cmd/subcmd prefix.

Part of #12556
2026-04-08 14:11:31 +08:00
Nahor
abd7442521 printf: de-duplicate code
Part of #12556
2026-04-08 14:05:05 +08:00
Johannes Altmanninger
f5c48038b5 Skip tmux-abbr's "pipe builtin into less" test on BusyBox less
Fails on Alpine CI.
2026-04-08 14:04:15 +08:00
Daniel Rainer
895a6e7034 l10n(fr): use non-breaking space before colon
Closes #12617
2026-04-08 13:54:04 +08:00
Johannes Altmanninger
2193e88423 Pass job group down to builtin_print_help
Repro:

	$ abbr -h | less
	fish: Job 1, 'abbr --help | less' has stopped

Bisects to 5ece9bec6c (__fish_print_help: use man as-is, 2025-08-29).

Root cause is similar to
https://github.com/fish-shell/fish-shell/issues/8699#issuecomment-1030762748
except it's specific to how __fish_print_help is invoked, since
"__fish_print_help abbr | less" works fine.

Fix this by letting __fish_print_help access the terminal.

Fixes #12612

Solved by Claude in a single prompt:

---

 ▐▛███▜▌   Claude Code v2.1.92
▝▜█████▛▘  Sonnet 4.6
  ▘▘ ▝▝    ~/git/fish-shell

❯ "abbr -h | less" gets a signal, resulting in the message "fish: Job 1, 'abbr -h | less' has stopped"
whereas "__fish_print_help abbr | less" works just fine, even though "abbr -h" should run exactly that.
Why this difference?

[...]

● Now I can give you a clear answer.

  ---
  The difference comes down to job group inheritance when man is
  spawned.

  The deferred execution order

  In abbr -h | less, fish uses the "deferred process" optimization:
  since abbr is a non-external process piping into an external one,
  fish launches less first,
   transfers the terminal to less's process group, then runs the
   deferred abbr builtin. By the time abbr's help function spawns man,
   the terminal already
  belongs to less's pgroup.

  Same thing happens with __fish_print_help abbr | less — the outer
  less is launched first and gets the terminal.

  Where the paths diverge

  Function case (__fish_print_help abbr | less):
  get_performer_for_function (exec.rs:1050) captures job.group.clone()
  and passes it to eval_node. So when command man runs inside the
  function, it's given the outer pipeline's job group — the same
  one that owns the terminal. man's inner less pager joins that group
  and can read from the terminal freely. ✓

  Builtin case (abbr -h | less): builtin_print_help calls
  parser.eval(&cmd, streams.io_chain) (shared.rs:652). parser.eval
  is defined as:

  pub fn eval(&self, cmd: &wstr, io: &IoChain) -> EvalRes {
      self.eval_with(cmd, io, None, BlockType::top, false)
      //                      ^^^^ job_group = None
  }

  job_group = None means man creates a brand-new job group with no
  terminal access. When man spawns its less pager, the pager tries
  to read keyboard input from
   the terminal, but the terminal belongs to the outer less's process
   group. The pager receives SIGTTIN → stops → the whole job is
   reported as stopped.

  The fix

  builtin_print_help should pass streams.job_group to the eval call
  (which is already populated at exec.rs:1192). Changing line 652 from:

  let res = parser.eval(&cmd, streams.io_chain);

  to:

  let res = parser.eval_with(&cmd, streams.io_chain, streams.job_group.as_ref(), BlockType::top, false);

  would give man the outer job's job group, letting its pager access
  the terminal just like the function case.

✻ Sautéed for 13m 0s
2026-04-07 20:21:38 +08:00
Kayce Basques
86c3778c2a docs: Format path as inline code
Closes #12609
2026-04-07 18:52:36 +08:00
Nahor
d2653b7cac tmux-set.fish: fix spurious CI failure
Part of #12556
2026-04-07 17:49:57 +08:00
Johannes Altmanninger
cf16949ce7 contrib/debian/control: remove insufficient mdoc dependency
On Debian, mandoc provides "/usr/bin/mman", not "/usr/bin/man", so that
package alone is not enough.  Users that want to use mandoc could use a
package that "Provides: man", for example by creating a symlink to "mman".

See https://github.com/fish-shell/fish-shell/issues/12596#issuecomment-4188332803
2026-04-07 17:49:57 +08:00
Daniel Rainer
85311546de refactor: extract fish-feature-flags crate
Another step in splitting up the main library crate.

Note that this change requires removing the `#[cfg(test)]` annotations
around the `LOCAL_OVERRIDE_STACK` code, because otherwise the code would
be removed in test builds for other packages, making the `#[cfg(test)]`
functions unusable from other packages, and functions with such feature
gates in their body would have the code guarded by these gates removed
in test builds for tests in other packages.

Closes #12494
2026-04-07 17:49:57 +08:00
Daniel Rainer
c44aa32a15 cleanup: remove syntactic dependency on main crate
This is done in preparation for extracting this file into its own crate.

Part of #12494
2026-04-07 17:49:57 +08:00
Daniel Rainer
65bc9b9e3e refactor: stop aliasing feature_test function
Having a public function named `test` is quite unspecific. Exporting it
both as `test` and `feature_test` results in inconsistent usage. Fix
this by renaming the function to `feature_test` and removing the alias.

Part of #12494
2026-04-07 17:49:57 +08:00
Daniel Rainer
8125f78a84 refactor: use override stack for feature tests
Several features of fish can be toggled at runtime (in practice at
startup). To keep track of the active features, `FEATURES`, an array of
`AtomicBool` is used. This can safely be shared across threads without
requiring locks.

Some of our tests override certain features to test behavior with a
specific value of the feature. Prior to this commit, they did this by
using thread-local versions of `FEATURES` instead of the process-wide
version used in non-test builds. This approach has two downsides:
- It does not allow nested overrides.
- It prevents using the code across package boundaries.
The former is a fairly minor issue, since I don't think we need nested
overrides. The latter prevents splitting up our large library crate,
since `#[cfg(test)]`-guarded code can only be used within a single
package.

To resolve these issues, a new approach to feature overrides in
tests is introduced in this commit: Instead of having a thread-local
version of `FEATURES`, all code, whether test or not, uses the
process-wide `FEATURES`. For non-test code, there is no change. For test
code, `FEATURES` is now also used. To override features in tests, a new
`with_overridden_feature` function is added, which replaces
`scoped_test` and `set`. It works by maintaining a thread-local stack of
feature overrides (`LOCAL_OVERRIDE_STACK`). The overridden `FeatureFlag`
and its new value are pushed to the stack, then the code for which the
override should be active is run, and finally the stack is popped again.
Feature tests now have to scan the stack for the first appearance of the
`FeatureFlag`, or use the value in `FEATURES` if the stack does not
contain the `FeatureFlag`. In most cases, the stack will be empty or
contain very few elements, so scanning it should not take long. For now,
it's only active in test code, so non-test code is unaffected. The plan
is to change this when the feature flag code is extracted from the main
library crate. This would slightly slow down feature tests in non-test
code, but there the stack will always be empty, since we only override
features in tests.

Part of #12494
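A minimal sketch of the scheme described above — process-wide `FEATURES` plus a thread-local override stack consulted first — with a single illustrative flag (names mirror the commit, details are assumptions):

```rust
use std::cell::RefCell;
use std::sync::atomic::{AtomicBool, Ordering};

#[derive(Clone, Copy, PartialEq)]
enum FeatureFlag {
    QmarkNoglob, // illustrative; fish has several flags
}

// Process-wide state, shared by test and non-test code alike.
static FEATURES: [AtomicBool; 1] = [AtomicBool::new(false)];

thread_local! {
    // Test-only stack of (flag, overridden value) pairs.
    static LOCAL_OVERRIDE_STACK: RefCell<Vec<(FeatureFlag, bool)>> =
        RefCell::new(Vec::new());
}

fn feature_test(flag: FeatureFlag) -> bool {
    // Scan the stack for an override (innermost wins), else fall back
    // to the process-wide value.
    LOCAL_OVERRIDE_STACK
        .with(|s| {
            s.borrow()
                .iter()
                .rev()
                .find(|(f, _)| *f == flag)
                .map(|&(_, v)| v)
        })
        .unwrap_or_else(|| FEATURES[flag as usize].load(Ordering::Relaxed))
}

fn with_overridden_feature<R>(flag: FeatureFlag, value: bool, f: impl FnOnce() -> R) -> R {
    LOCAL_OVERRIDE_STACK.with(|s| s.borrow_mut().push((flag, value)));
    let result = f();
    LOCAL_OVERRIDE_STACK.with(|s| {
        s.borrow_mut().pop();
    });
    result
}
```

Because overrides are pushed and popped around the closure, nested overrides compose naturally, and non-test code only pays for an (almost always empty) stack scan.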
2026-04-07 17:49:57 +08:00
Daniel Rainer
f0f48b4859 cleanup: stop needlessly exporting struct fields
Part of #12494
2026-04-07 17:49:57 +08:00
David Adam
a009f87630 build_tools: add sh script to build linux packages 2026-04-07 10:57:41 +08:00
David Adam
edb66d4d4e remove dput_cf_gen, not actually helpful 2026-04-07 10:57:41 +08:00
Nahor
f3f675b4cc Standardized error messages: constant names
Part of #12556
2026-04-05 13:15:47 +08:00
Nahor
434610494f argparse: fix error status code
To homogenize error reporting format, use a new Error struct. Currently this
is used for builtins and ensuring a common cmd/subcmd prefix.

Part of #12556
2026-04-05 13:15:47 +08:00
Dennis Yildirim
3cce1f3f4c Added completions for git verify-commit and verify-tag
Closes #12607
2026-04-05 13:14:31 +08:00
Johannes Altmanninger
a5bde7954e Update changelog 2026-04-05 13:07:29 +08:00
Nahor
99d63c21f1 Add tests to exercise all builtin error messages
With a few exceptions, only one test is added for a given message, even
when there are multiple ways to trigger the same message (e.g. different
invalid option combinations, or triggered in shared functions such as
`builtin_unknown_option`).

Includes a few very minor fixes, such as a missing newline or using the
wrong var name.

Closes #12603
2026-04-05 00:22:42 +08:00
Nahor
c3e3658157 ulimit: remove unreachable error message
When there is no limit value, ulimit will have printed the current one
and exited.

Part of #12603
2026-04-05 00:22:41 +08:00
Nahor
8d92016e72 string: error messages fixes
- fix wrong pattern used in `string replace` error message
- replace unreachable error with `unreachable!` in `string`
- fix cmd being used in place of subcmd 

Part of #12603
2026-04-05 00:22:41 +08:00
Nahor
f6a72b4e19 status: replace unreachable code with an assert!
Part of #12603
2026-04-05 00:22:40 +08:00
Nahor
2f9c2df10d set: report an error when called with -a,-p and no NAME
Previously executing `set -a` or `set -p` would just list all the
variables, which does not make sense since the user specifically ask
for an action (append/prepend).

Update the help page synopsis

Part of #12603
2026-04-05 00:22:40 +08:00
Johannes Altmanninger
0367aaea7d Disable relocatable tree logic when DATADIR or SYSCONFDIR are non-default
If all of

	$PREFIX/bin/fish
	$PREFIX/share/fish
	$PREFIX/etc/fish

exist, then fish assumes it's in a relocatable directory tree.
This is used by homebrew (PREFIX=/usr/local) and maybe also nix(?).

Other Linux distros prefer to use /etc/fish instead of $PREFIX/etc/fish
[1].  To do so, they need to pass -DCMAKE_INSTALL_SYSCONFDIR=/etc.
The relocatable tree logic assumes default data and sysconf dirs
(relative to a potentially relocatable prefix). If the user changes
any of those, and the relocatable tree logic happens to kick in,
that'd overrule user preference, which is surprising.

So a non-default data or sysconf path is a strong enough signal that
we want to disable the relocatable tree logic. Do that.

Closes #10748

[1]: ff2f69cd56/PKGBUILD (L43)
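The decision described above can be sketched as a single predicate; the default relative paths and the function shape are illustrative assumptions, not fish's actual code:

```rust
use std::path::Path;

// Only trust the $PREFIX/{bin,share,etc}/fish heuristic when the user
// kept the default relative data and sysconf dirs; a custom value is a
// strong signal that the explicit configuration should win.
fn use_relocatable_tree(prefix: &Path, datadir: &str, sysconfdir: &str) -> bool {
    datadir == "share/fish"
        && sysconfdir == "etc/fish"
        && prefix.join("bin/fish").exists()
        && prefix.join("share/fish").exists()
        && prefix.join("etc/fish").exists()
}
```

With `-DCMAKE_INSTALL_SYSCONFDIR=/etc`, the first comparison fails and the relocatable-tree logic never kicks in, so the configured path is respected.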
2026-04-04 01:22:54 +08:00
Johannes Altmanninger
e25b4b6f05 tests/checks/realpath.fish: fix on macOS 2026-04-03 23:26:37 +08:00
Yakov Till
90cbfd288e Fix test_history_path_detection panic: call test_init()
test_history_path_detection calls add_pending_with_file_detection(),
which spawns a thread pool task via ThreadPool::perform(). This
requires threads::init() to have been called, otherwise
assert_is_background_thread() panics.

Add the missing test_init() call, matching other tests that use
subsystems requiring initialization.

Closes #12604
2026-04-03 14:48:53 +08:00
Nahor
68453843d4 set: fix unreachable error messages
- Remove unreachable error message in `handle_env_return()`
While we could have put an empty block in `handle_env_return()` and
removed the condition on `NotFound` in `erase()`, we preferred to use
`unreachable!` in case `handle_env_return()` gets called in new scenarios
in the future

- Make reachable the error message when asking to show a slice

Part of #12603
2026-04-03 13:53:42 +08:00
Nahor
fb57f95391 realpath: fix random error message
With empty argument, `realpath` skips all processing, so the error
message, based on `errno`, was unrelated and changed depending on what
failed before. E.g:

```
$ builtin realpath "" /tmp "" /no-exist ""
builtin realpath: : Resource temporarily unavailable
/tmp
builtin realpath: : Invalid argument
/no-exist
builtin realpath: : No such file or directory
```

Part of #12603
2026-04-03 13:53:41 +08:00
Nahor
1ed276292b read: remove unnecessary code
`to_stdout` is set to `true` if and only if `argv` is not empty.
- `argv` length and `to_stdout` are redundant, so we can remove `to_stdout`
- some tests in `validate_read_args` are necessarily false

Part of #12603
2026-04-03 13:53:41 +08:00
Branch Vincent
09e46b00cc completions: update ngrok
Closes #12598
2026-04-03 13:53:41 +08:00
Johannes Altmanninger
695bc293a9 contrib/debian/control: allow any "man" virtual pkg (man-db/mandoc)
We support multiple "man" implementations; at least man-db's and
mandoc's.

So we can relax the mandoc dependency to a dependency on the virtual
package providing "man". Note that as of today, the "mandoc" package
does not declare "Provides: man".

However since Debian policy says in
https://www.debian.org/doc/debian-policy/ch-relationships.html

> To specify which of a set of real packages should be the default
> to satisfy a particular dependency on a virtual package, list the
> real package as an alternative before the virtual one.

we want to list possible real packages anyway, so do that.

Closes #12596
2026-04-03 12:20:08 +08:00
Nahor
344ff7be88 printf: remove unreachable code
Remove an unreachable, yet translated, error string and make the code
more idiomatic

Closes #12594
2026-04-01 09:37:57 +08:00
Nahor
014e3b3aff math: fix error message
Fix badly formatted error message, and make it translatable

Part of #12594
2026-04-01 09:37:57 +08:00
Nahor
68c7baff90 read: remove deprecation error for -i
`-i` has been an error and undocumented for 8 years now (86362e7) but
still requires some translating today. Time to retire it fully.

Part of #12594
2026-04-01 09:37:56 +08:00
Johannes Altmanninger
6eaad2cd80 Remove some redundant translation sources 2026-03-31 14:55:04 +08:00
Johannes Altmanninger
b321e38f5a psub: add missing line endings to error messages
Fixes #12593
2026-03-31 14:43:34 +08:00
Daniel Rainer
a32dd63163 fix: simplify and correct trimming of features
The previous handling was unnecessarily complex and had a bug introduced
by porting from C++ to Rust: The substrings `\0x0B` and `\0x0C` in Rust
mean `\0` (the NUL character) followed by the regular characters `0B`
and `0C`, respectively, so feature names starting or ending with these
characters would have these characters stripped away.

Replace this handling by built-in functionality, and simplify some
syntax. We now trim all whitespace, instead of just certain ASCII
characters, but I think there is no reason to limit trimming to ASCII.

Closes #12592
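The fix boils down to leaning on the standard library; `str::trim` removes all Unicode whitespace, including the vertical tab (U+000B) and form feed (U+000C) that the hand-rolled character list mishandled. A minimal sketch (the function name is illustrative):

```rust
// Trim surrounding whitespace from a feature name using str::trim,
// which matches every codepoint with the Unicode White_Space property.
fn trim_feature_name(name: &str) -> &str {
    name.trim()
}
```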
2026-03-31 14:43:34 +08:00
Bacal Mesfin
ef90afa5b9 feat: add completions for dnf copr
Closes #12585
2026-03-31 14:43:34 +08:00
r-vdp
7bd37dfe55 create_manpage_completions: handle groff \X'...' device control escapes
help2man 1.50 added \X'tty: link URL' hyperlink escapes to generated
man pages. coreutils 9.10 is the first widely-deployed package to ship
these, and it broke completion generation for most of its commands
(only 17/106 man pages parsed successfully).

The escape wraps option text like this:

  \X'tty: link https://example.com/a'\fB\-a, \-\-all\fP\X'tty: link'

Two places needed fixing:

- remove_groff_formatting() didn't strip \X'...', so Type1-4 parsers
  extracted garbage option names like "--all\X'tty"

- Deroffer.esc_char_backslash() didn't recognize \X, falling through
  to the generic single-char escape which stripped only the \, leaving
  "X'tty: link ...'" as literal text. Option lines then started with
  X instead of -, so TypeDeroffManParser's is_option() check failed.

Also handle \Z'...' (zero-width string) which has identical syntax.

Closes #12578
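The stripping described above can be sketched as follows. Note this is an illustration of the parsing rule, not fish's actual implementation (which lives in the Python create_manpage_completions script); the quoting syntax of `\X'...'` and `\Z'...'` is identical, so both are handled by one scan:

```rust
// Remove groff \X'...' (device control) and \Z'...' (zero-width string)
// escapes from a line of man-page source, leaving the rest untouched.
fn strip_device_control_escapes(mut line: &str) -> String {
    let mut out = String::with_capacity(line.len());
    loop {
        // Find the earliest \X' or \Z' opener.
        let next = ["\\X'", "\\Z'"]
            .iter()
            .filter_map(|pat| line.find(*pat))
            .min();
        let Some(start) = next else {
            out.push_str(line);
            return out;
        };
        out.push_str(&line[..start]);
        // The escape body runs to the next single quote.
        match line[start + 3..].find('\'') {
            Some(end) => line = &line[start + 3 + end + 1..],
            None => return out, // unterminated escape: drop the remainder
        }
    }
}
```

Running it on the commit's example input yields only the `\fB\-a, \-\-all\fP` option text, which the existing bold/roman handling already knows how to parse.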
2026-03-31 13:15:52 +08:00
Johannes Altmanninger
14ce56d2a5 Remove unused fish_wcwidth wrapper 2026-03-30 13:57:27 +08:00
Johannes Altmanninger
01ee6f968d Support backward-word-end when cursor is past end
Closes #12581
2026-03-30 13:57:27 +08:00
Johannes Altmanninger
7f6dcde5e0 Fix backward-delete-char not stopping after control characters
Fixes 146384abc6 (Stop using wcwidth entirely, 2026-03-15)

Fixes #12583
2026-03-30 13:57:27 +08:00
Johannes Altmanninger
34fc573668 Modernize wcwidth API
Return None rather than -1 for nonprintables.  We probably still
differ from wcwidth which is bad (given we use the same name), but
hopefully not in a way that matters.

Fixes 146384abc6 (Stop using wcwidth entirely, 2026-03-15).
2026-03-30 13:57:10 +08:00
Johannes Altmanninger
93cbf2a0e8 Reuse wcswidth logic for rendered characters 2026-03-29 17:04:14 +08:00
Nahor
8194c6eb79 cd: replace unreachable code with assert
Closes #12584
2026-03-29 16:49:24 +08:00
Nahor
3194572156 bind: replace fake enum (c_int) with a real Rust enum
Part of #12584
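The general shape of this refactor — a real enum plus a fallible conversion at the boundary where raw integers arrive — looks roughly like this (the variant names are invented for illustration, not bind's actual modes):

```rust
// A real Rust enum replacing a "fake enum" of c_int constants.
#[derive(Debug, PartialEq, Clone, Copy)]
enum BindMode {
    List,
    Insert,
    Erase,
}

impl TryFrom<i32> for BindMode {
    type Error = i32;

    // Accept the raw integers at the boundary, but reject out-of-range
    // values instead of silently carrying them around as a c_int.
    fn try_from(v: i32) -> Result<Self, i32> {
        match v {
            0 => Ok(BindMode::List),
            1 => Ok(BindMode::Insert),
            2 => Ok(BindMode::Erase),
            other => Err(other),
        }
    }
}
```

After the conversion, `match` on the enum is exhaustive, so adding a variant later produces compile errors at every site that needs updating.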
2026-03-29 16:49:24 +08:00
Johannes Altmanninger
e635816b7f Fix Vi mode dl deleting from wrong character
Fixes b9b32ad157 (Fix vi mode dl and dh regressions, 2026-02-25).

Closes #12580
(which describes only the issue already fixed by b9b32ad157).
2026-03-28 21:45:33 +08:00
Johannes Altmanninger
2b3ecf22da start new cycle
Created by ./build_tools/release.sh 4.6.0
2026-03-28 13:16:34 +08:00
283 changed files with 16255 additions and 14802 deletions


@@ -30,5 +30,5 @@ max_line_length = unset
[{COMMIT_EDITMSG,git-revise-todo,*.jjdescription}]
max_line_length = 72
[*.yml]
[*.{toml,yml}]
indent_size = 2


@@ -25,7 +25,7 @@ runs:
set -x
toolchain=$(
case "$toolchain_channel" in
(stable) echo 1.93 ;; # updatecli.d/rust.yml
(stable) echo 1.95 ;; # updatecli.d/rust.yml
(msrv) echo 1.85 ;; # updatecli.d/rust.yml
(*)
printf >&2 "error: unsupported toolchain channel %s" "$toolchain_channel"


@@ -0,0 +1,32 @@
name: Linux development builds
on:
push:
branches:
- buildscript
jobs:
deploy:
runs-on: ubuntu-latest
environment: linux-development
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2, build_tools/update-dependencies.sh
- uses: astral-sh/setup-uv@v7
- name: Update package database
run: sudo apt-get update
- name: Install deps
run: sudo apt install debhelper devscripts dpkg-dev
- name: Create tarball and source packages
run: |
version=$(build_tools/git_version_gen.sh --stdout 2>/dev/null)
mkdir /tmp/gpg
echo "$SIGNING_GPG_KEY" > /tmp/gpg/signing-gpg-key
mkdir /tmp/fish-built
FISH_ARTEFACT_PATH=/tmp/fish-built ./build_tools/make_tarball.sh
FISH_ARTEFACT_PATH=/tmp/fish-built DEB_SIGN_KEYFILE=/tmp/gpg/signing-gpg-key ./build_tools/make_linux_packages.sh $version
- uses: actions/upload-artifact@v6
with:
name: linux-source-packages
path: |
/tmp/fish-built
! /tmp/fish-built/fish-*/* # don't include the unpacked source directory


@@ -21,4 +21,4 @@ jobs:
with:
command: check licenses
arguments: --all-features --locked --exclude-dev
rust-version: 1.93 # updatecli.d/rust.yml
rust-version: 1.95 # updatecli.d/rust.yml


@@ -22,6 +22,27 @@ jobs:
- name: check rustfmt
run: find build.rs crates src -type f -name '*.rs' | xargs rustfmt --check
shellcheck:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2, build_tools/update-dependencies.sh
- uses: ./.github/actions/rust-toolchain@stable
- name: Update package database
run: sudo apt-get update
- name: Install shellcheck
run: sudo apt install shellcheck
- name: shellcheck
run: cargo xtask shellcheck
po_files_up_to_date:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2, build_tools/update-dependencies.sh
- uses: ./.github/actions/rust-toolchain@stable
- name: Install deps
uses: ./.github/actions/install-dependencies
- name: Check PO files
run: cargo xtask gettext check
clippy:
runs-on: ubuntu-latest
@@ -32,6 +53,8 @@ jobs:
features: ""
- rust_version: "stable"
features: "--no-default-features"
- rust_version: "stable"
features: "--all-features"
- rust_version: "msrv"
features: ""
steps:


@@ -32,14 +32,6 @@ jobs:
- name: make fish_run_tests
run: |
make -C build VERBOSE=1 fish_run_tests
- name: translation updates
run: |
# Generate PO files. This should not result it a change in the repo if all translations are
# up to date.
# Ensure that fish is available as an executable.
PATH="$PWD/build:$PATH" build_tools/update_translations.fish
# Show diff output. Fail if there is any.
git --no-pager diff --exit-code || { echo 'There are uncommitted changes after regenerating the gettext PO files. Make sure to update them via `build_tools/update_translations.fish` after changing source files.'; exit 1; }
ubuntu-32bit-static-pcre2:
runs-on: ubuntu-latest
@@ -167,7 +159,7 @@ jobs:
- name: Install deps
# Not using setup-msys2 `install` option to make it easier to copy/paste
run: |
pacman --noconfirm -S --needed git rust
pacman --noconfirm -S --needed git rust python3 diffutils tmux
- name: rebase
env:
MSYS2_LOCATION: ${{ steps.msys2.outputs.msys2-location }}
@@ -177,10 +169,6 @@ jobs:
- name: cargo build
run: |
cargo build
- name: smoketest
# We can't run `build_tools/check.sh` yet, there are just too many failures
# so this is just a quick check to make sure that fish can swim
- name: tests
run: |
set -x
[ "$(target/debug/fish.exe -c 'echo (math 1 + 1)')" = 2 ]
cargo test
cargo xtask check

53
.gitignore vendored

@@ -20,7 +20,6 @@
*.o
*.obj
*.orig
!tests/*.out
*.out
*.pch
*.slo
@@ -36,46 +35,31 @@
Desktop.ini
Thumbs.db
ehthumbs.db
__pycache__/
.directory
.fuse_hidden*
# Directories that only contain transitory files from building and testing.
/doc/
/share/man/
/share/doc/
/test/
/user_doc/
# File names that can appear in the project root that represent artifacts from
# building and testing.
/FISH-BUILD-VERSION-FILE
/command_list.txt
/command_list_toc.txt
/compile_commands.json
/doc.h
# Artifacts from in-tree builds ("cmake .").
/build.ninja
/cargo/
/CMakeCache.txt
/CMakeFiles/
/cmake_install.cmake
/fish
/fish.pc
/fish_indent
/fish_key_reader
/fish_tests
/lexicon.txt
/lexicon_filter
/toc.txt
/version
fish-build-version-witness.txt
__pycache__
/fish-localization-map-cache/
/fish.pc
/fish.pc.noversion
/.ninja_log
# File names that can appear below the project root that represent artifacts
# from building and testing.
/doc_src/commands.hdr
/doc_src/index.hdr
/po/*.gmo
/share/__fish_build_paths.fish
/share/pkgconfig
/tests/*.tmp.*
/tests/.last-check-all-files
/.venv/
# xcode
## Build generated
@@ -83,24 +67,19 @@ __pycache__
*.xccheckout
*.xcscmblueprin
.vscode
/DerivedData/
/build/
/DerivedData/
/tags
xcuserdata/
/xcuserdata/
# Generated by Cargo
# will have compiled files and executables
debug/
target/
/target/
# These are backup files generated by rustfmt
**/*.rs.bk
# MSVC Windows builds of rustc generate these, which store debugging information
*.pdb
# Generated by clangd
/.cache
/.cache/
# JetBrains editors.
.idea/


@@ -1,3 +1,42 @@
fish 4.7.0 (released May 05, 2026)
==================================
Deprecations and removed features
---------------------------------
- The default theme (i.e. the ``fish_color_*`` variables) is no longer set in non-interactive shells.
Interactive improvements
------------------------
- :doc:`prompt_pwd <cmds/prompt_pwd>` now strips control characters.
- Repaint events (as triggered by changes to color variables or by event handlers running ``commandline -f repaint``) no longer reset the completion pager and other transient UI states (:issue:`12683`).
- :envvar:`fish_color_valid_path` now respects background and underline colors (:issue:`12622`).
- :doc:`funced <cmds/funced>` will no longer lose work if there are parse errors multiple times without new changes to the file.
- Fixed a case where directory completions were sorted in a surprising order (:issue:`12695`).
- When at the command token, the :kbd:`alt-o` binding will now open read-only files too (:issue:`12671`).
- Private mode in-memory history (``set fish_history``) is no longer shared with :doc:`builtin read <cmds/read>` (:issue:`12662`).
Other improvements
------------------
- History is no longer corrupted with NUL bytes when fish receives SIGTERM or SIGHUP (:issue:`10300`).
- :doc:`fish_update_completions <cmds/fish_update_completions>` now handles groff ``\X'...'`` device control escapes, fixing completion generation for man pages produced by help2man 1.50 and later (such as coreutils 9.10).
- Removing history entries via the :doc:`web-based config <cmds/fish_config>` is more intuitive.
- If :envvar:`XDG_DATA_DIRS` is empty, the default value is assumed, which means that fish will now also use configuration from paths like ``$PREFIX/share/fish/vendor_completions.d`` (:issue:`11349`).
- Some internal file descriptors were moved to number 10 or higher, to reduce risk of clashes with those used by the user in scripts.
- The wording of error messages has been made consistent, especially for builtin subcommands (:issue:`12556`).
For distributors and developers
-------------------------------
- When the default global config directory (``$PREFIX/etc/fish``) exists but has been overridden via ``-DCMAKE_INSTALL_SYSCONFDIR``, fish will now respect that override (:issue:`10748`).
- ``build_tools/update_translations.fish`` has been replaced by ``cargo xtask gettext {check,new,update}`` (:issue:`12676`).
- ``cargo xtask shellcheck`` to lint shell-scripts.
Regression fixes:
-----------------
- (from 4.6) Vi mode ``dl`` (:issue:`12461`).
- (from 4.6) Backspace after newline (:issue:`12583`).
- (from 4.3.3) Long options were spuriously completed after typing short options (85e76ba3561).
- (from 3.2) ``nosuchcommand || echo hello`` executes the right hand side again (:issue:`12654`).
fish 4.6.0 (released March 28, 2026)
====================================


@@ -271,13 +271,14 @@ Adding translations for a new language
--------------------------------------
Creating new translations requires the Gettext tools.
More specifically, you will need ``msguniq`` and ``msgmerge`` for creating translations for a new
language.
To create a new translation, run::
More specifically, you will need ``msguniq``, ``msgmerge``, and ``msgattrib``
for creating translations for a new language.
To create a PO file for a new language ``ll_CC``, run::
build_tools/update_translations.fish localization/po/ll_CC.po
cargo xtask gettext new ll_CC
This will create a new PO file containing all messages available for translation.
This will create a new PO file in ``localization/po/``
containing all messages available for translation.
If the file already exists, it will be updated.
After modifying a PO file, you can recompile fish, and it will integrate the modifications you made.
@@ -347,10 +348,12 @@ Modifications to strings in source files
----------------------------------------
If a string changes in the sources, the old translations will no longer work.
They will be preserved in the PO files, but commented-out (starting with ``#~``).
If you add/remove/change a translatable strings in a source file,
run ``build_tools/update_translations.fish`` to propagate this to all translation files (``localization/po/*.po``).
run ``cargo xtask gettext update`` to propagate this to all translation files (``localization/po/*.po``).
This is only relevant for developers modifying the source files of fish or fish scripts.
Note translations for messages which are no longer present in the sources will be deleted from the PO files.
If the source string changed in a way which should not affect translations,
consider updating the ``msgid`` in the PO files such that translations are preserved.
Setting Code Up For Translations
--------------------------------

93
Cargo.lock generated

@@ -67,6 +67,12 @@ dependencies = [
"windows-sys",
]
[[package]]
name = "anyhow"
version = "1.0.102"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7f202df86484c868dbad7eaa557ef785d5c66295e41b460ef922eca0723b842c"
[[package]]
name = "assert_matches"
version = "1.5.0"
@@ -183,6 +189,31 @@ dependencies = [
"libc",
]
[[package]]
name = "crossbeam-deque"
version = "0.8.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9dd111b7b7f7d55b72c0a6ae361660ee5853c9af73f70c3c2ef6858b950e2e51"
dependencies = [
"crossbeam-epoch",
"crossbeam-utils",
]
[[package]]
name = "crossbeam-epoch"
version = "0.9.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5b82ac4a3c2ca9c3460964f020e1402edd5753411d7737aa39c3714ad1b5420e"
dependencies = [
"crossbeam-utils",
]
[[package]]
name = "crossbeam-utils"
version = "0.8.21"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d0a5c400df2834b80a4c3327b3aad3a4c4cd4de0629063962b03235697506a28"
[[package]]
name = "crypto-common"
version = "0.1.7"
@@ -260,7 +291,7 @@ checksum = "5baebc0774151f905a1a2cc41989300b1e6fbb29aff0ceffa1064fdd3088d582"
[[package]]
name = "fish"
version = "4.6.0"
version = "4.7.0"
dependencies = [
"assert_matches",
"bitflags",
@@ -272,6 +303,7 @@ dependencies = [
"fish-color",
"fish-common",
"fish-fallback",
"fish-feature-flags",
"fish-gettext",
"fish-gettext-extraction",
"fish-gettext-mo-file-parser",
@@ -295,7 +327,9 @@ dependencies = [
"rand",
"rsconf",
"rust-embed",
"rustc_version",
"serial_test",
"strum_macros",
"unix_path",
"xterm-color",
]
@@ -329,6 +363,7 @@ version = "0.0.0"
dependencies = [
"bitflags",
"fish-build-helper",
"fish-feature-flags",
"fish-widestring",
"libc",
"nix",
@@ -348,6 +383,13 @@ dependencies = [
"widestring",
]
[[package]]
name = "fish-feature-flags"
version = "0.0.0"
dependencies = [
"fish-widestring",
]
[[package]]
name = "fish-gettext"
version = "0.0.0"
@@ -437,6 +479,7 @@ version = "0.0.0"
name = "fish-widestring"
version = "0.0.0"
dependencies = [
"libc",
"unicode-width",
"widestring",
]
@@ -510,6 +553,22 @@ version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2304e00983f87ffb38b55b444b5e3b60a884b5d30c0fca7d82fe33449bbe55ea"
[[package]]
name = "ignore"
version = "0.4.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d3d782a365a015e0f5c04902246139249abf769125006fbe7649e2ee88169b4a"
dependencies = [
"crossbeam-deque",
"globset",
"log",
"memchr",
"regex-automata",
"same-file",
"walkdir",
"winapi-util",
]
[[package]]
name = "is_terminal_polyfill"
version = "1.70.2"
@@ -879,6 +938,15 @@ dependencies = [
"walkdir",
]
[[package]]
name = "rustc_version"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cfcb3a22ef46e85b45de6ee7e79d063319ebb6594faafcf1c225ea92ab6e9b92"
dependencies = [
"semver",
]
[[package]]
name = "same-file"
version = "1.0.6"
@@ -909,6 +977,12 @@ version = "3.0.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "490dcfcbfef26be6800d11870ff2df8774fa6e86d047e3e8c8a76b25655e41ca"
[[package]]
name = "semver"
version = "1.0.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8a7852d02fc848982e0c167ef163aaff9cd91dc640ba85e263cb1ce46fae51cd"
[[package]]
name = "serde"
version = "1.0.228"
@@ -1005,6 +1079,18 @@ version = "0.11.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7da8b5736845d9f2fcb837ea5d9e2628564b3b043a70948a3f0b778838c5fb4f"
[[package]]
name = "strum_macros"
version = "0.28.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab85eea0270ee17587ed4156089e10b9e6880ee688791d45a905f5b1ca36f664"
dependencies = [
"heck",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "syn"
version = "2.0.114"
@@ -1153,9 +1239,14 @@ name = "xtask"
version = "0.0.0"
dependencies = [
"anstyle",
"anyhow",
"clap",
"fish-build-helper",
"fish-common",
"fish-tempfile",
"fish-widestring",
"ignore",
"pcre2",
"walkdir",
]

View File

@@ -12,6 +12,7 @@ license = "GPL-2.0-only AND LGPL-2.0-or-later AND MIT AND PSF-2.0"
[workspace.dependencies]
anstyle = "1.0.13"
anyhow = "1.0.102"
assert_matches = "1.5.0"
bitflags = "2.5.0"
cc = "1.0.94"
@@ -23,6 +24,7 @@ fish-build-man-pages = { path = "crates/build-man-pages" }
fish-color = { path = "crates/color" }
fish-common = { path = "crates/common" }
fish-fallback = { path = "crates/fallback" }
fish-feature-flags = { path = "crates/feature-flags" }
fish-gettext = { path = "crates/gettext" }
fish-gettext-extraction = { path = "crates/gettext-extraction" }
fish-gettext-maps = { path = "crates/gettext-maps" }
@@ -34,6 +36,7 @@ fish-wcstringutil = { path = "crates/wcstringutil" }
fish-widecharwidth = { path = "crates/widecharwidth" }
fish-widestring = { path = "crates/widestring" }
fish-wgetopt = { path = "crates/wgetopt" }
ignore = "0.4.25"
itertools = "0.14.0"
libc = "0.2.177"
# lru pulls in hashbrown by default, which uses a faster (though less DoS resistant) hashing algo.
@@ -41,38 +44,41 @@ libc = "0.2.177"
# files as of 22 April 2024.
lru = "0.16.2"
nix = { version = "0.31.1", default-features = false, features = [
"event",
"fs",
"inotify",
"hostname",
"resource",
"process",
"signal",
"term",
"user",
"event",
"fs",
"inotify",
"hostname",
"resource",
"process",
"signal",
"term",
"user",
] }
num-traits = "0.2.19"
once_cell = "1.19.0"
pcre2 = { git = "https://github.com/fish-shell/rust-pcre2", tag = "0.2.9-utf32", default-features = false, features = [
"utf32",
"utf32",
] }
phf = { version = "0.13", default-features = false }
phf_codegen = "0.13"
portable-atomic = { version = "1", default-features = false, features = [
"fallback",
"fallback",
] }
proc-macro2 = "1.0"
rand = { version = "0.9.2", default-features = false, features = [
"small_rng",
"thread_rng",
"small_rng",
"thread_rng",
] }
regex = "1.12.3"
rsconf = "0.3.0"
rust-embed = { version = "8.11.0", features = [
"deterministic-timestamps",
"include-exclude",
"interpolate-folder-path",
"deterministic-timestamps",
"include-exclude",
"interpolate-folder-path",
] }
rustc_version = "0.4.1"
serial_test = { version = "3", default-features = false }
strum_macros = "0.28.0"
widestring = "1.2.0"
unicode-segmentation = "1.12.0"
unicode-width = "0.2.0"
@@ -90,7 +96,7 @@ debug = true
[package]
name = "fish"
version = "4.6.0"
version = "4.7.0"
edition.workspace = true
rust-version.workspace = true
default-run = "fish"
@@ -108,6 +114,7 @@ fish-build-man-pages = { workspace = true, optional = true }
fish-color.workspace = true
fish-common.workspace = true
fish-fallback.workspace = true
fish-feature-flags.workspace = true
fish-gettext = { workspace = true, optional = true }
fish-gettext-extraction = { workspace = true, optional = true }
fish-printf.workspace = true
@@ -126,6 +133,7 @@ num-traits.workspace = true
once_cell.workspace = true
pcre2.workspace = true
rand.workspace = true
strum_macros.workspace = true
xterm-color.workspace = true
[target.'cfg(not(target_has_atomic = "64"))'.dependencies]
@@ -133,16 +141,16 @@ portable-atomic.workspace = true
[target.'cfg(windows)'.dependencies]
rust-embed = { workspace = true, features = [
"deterministic-timestamps",
"debug-embed",
"include-exclude",
"interpolate-folder-path",
"deterministic-timestamps",
"debug-embed",
"include-exclude",
"interpolate-folder-path",
] }
[target.'cfg(not(windows))'.dependencies]
rust-embed = { workspace = true, features = [
"deterministic-timestamps",
"include-exclude",
"interpolate-folder-path",
"deterministic-timestamps",
"include-exclude",
"interpolate-folder-path",
] }
[dev-dependencies]
@@ -154,6 +162,7 @@ fish-build-helper.workspace = true
fish-gettext-mo-file-parser.workspace = true
phf_codegen = { workspace = true, optional = true }
rsconf.workspace = true
rustc_version.workspace = true
[target.'cfg(windows)'.build-dependencies]
unix_path.workspace = true
@@ -183,16 +192,14 @@ embed-manpages = ["dep:fish-build-man-pages"]
localize-messages = ["dep:fish-gettext"]
# This feature is used to enable extracting messages from the source code for localization.
# It only needs to be enabled if updating these messages (and the corresponding PO files) is
# desired. This happens when running tests via `cargo xtask check` and when calling
# `build_tools/update_translations.fish`, so there should not be a need to enable it manually.
# desired. This happens for the `gettext` xtask, which is also invoked via `cargo xtask check`.
# There should not be a need to enable this feature manually.
gettext-extract = ["dep:fish-gettext-extraction"]
# The following features are auto-detected by the build-script and should not be enabled manually.
tsan = []
[workspace.lints]
rust.non_camel_case_types = "allow"
rust.non_upper_case_globals = "allow"
rust.unknown_lints = { level = "allow", priority = -1 }
rust.unstable_name_collisions = "allow"
rustdoc.private_intra_doc_links = "allow"

View File

@@ -6,6 +6,10 @@
use std::path::{Path, PathBuf};
fn main() {
let is_nightly =
rustc_version::version_meta().unwrap().channel == rustc_version::Channel::Nightly;
rsconf::declare_cfg("nightly", is_nightly);
setup_paths();
// Add our default to enable tools that don't go through CMake, like "cargo test" and the

View File

@@ -8,11 +8,29 @@ if [ "$FISH_CHECK_LINT" = false ]; then
lint=false
fi
case "$(uname)" in
MSYS*)
is_cygwin=true
cygwin_var=MSYS
;;
CYGWIN*)
is_cygwin=true
cygwin_var=CYGWIN
;;
*)
is_cygwin=false
;;
esac
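The case statement above can be sketched as a small standalone helper; the sample `uname` outputs ("MSYS_NT-10.0", "CYGWIN_NT-10.0") are assumed, illustrative values:

```shell
# Minimal sketch of the uname-based detection above: MSYS2 and Cygwin
# both behave like Cygwin for our purposes, everything else does not.
detect_cygwin_like() {
    case "$1" in
        MSYS*) echo true ;;
        CYGWIN*) echo true ;;
        *) echo false ;;
    esac
}
detect_cygwin_like "MSYS_NT-10.0"   # prints "true"
detect_cygwin_like "Linux"          # prints "false"
```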
check_dependency_versions=false
if [ "${FISH_CHECK_DEPENDENCY_VERSIONS:-false}" != false ]; then
check_dependency_versions=true
fi
green='\e[0;32m'
yellow='\e[1;33m'
reset='\e[m'
if $check_dependency_versions; then
command -v curl
command -v jq
@@ -42,6 +60,7 @@ cargo() {
fi
}
# shellcheck disable=2317,2329
cleanup () {
if [ -n "$gettext_template_dir" ] && [ -e "$gettext_template_dir" ]; then
rm -r "$gettext_template_dir"
@@ -71,6 +90,7 @@ fi
gettext_template_dir=$(mktemp -d)
(
# shellcheck disable=2030
export FISH_GETTEXT_EXTRACTION_DIR="$gettext_template_dir"
cargo build --workspace --all-targets --features=gettext-extract
)
@@ -78,17 +98,60 @@ if $lint; then
if command -v cargo-deny >/dev/null; then
cargo deny --all-features --locked --exclude-dev check licenses
fi
if command -v shellcheck >/dev/null || { test -n "$CI" && ! $is_cygwin; }; then
cargo xtask shellcheck
fi
PATH="$build_dir:$PATH" cargo xtask format --all --check
for features in "" --no-default-features; do
for features in "" --no-default-features --all-features; do
cargo clippy --workspace --all-targets $features
done
cargo xtask gettext --rust-extraction-dir="$gettext_template_dir" check
fi
cargo test --no-default-features --workspace --all-targets
# When running `cargo test`, some binaries (e.g. `fish_gettext_extraction`)
# are dynamically linked against Rust's `std-xxx.dll` instead of being
# statically linked as they usually are.
# On Cygwin, `PATH` is not properly updated to point to the `std-xxx.dll`
# location, so we have to do it manually.
# See:
# - https://github.com/rust-lang/rust/issues/149050
# - https://github.com/msys2/MSYS2-packages/issues/5784
(
if $is_cygwin; then
PATH="$PATH:$(rustc --print target-libdir)"
export PATH
fi
cargo test --no-default-features --workspace --all-targets
)
cargo test --doc --workspace
if $lint; then
cargo doc --workspace --no-deps
fi
FISH_GETTEXT_EXTRACTION_DIR=$gettext_template_dir "$workspace_root/tests/test_driver.py" "$build_dir"
# Using "()" not "{}" because we do want a subshell (for the export)
system_tests() (
# shellcheck disable=2163
[ -n "$*" ] && export "$@"
"$workspace_root/tests/test_driver.py" "$build_dir"
)
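The "()" versus "{}" comment above is worth illustrating: a function whose body is written with parentheses runs in a subshell, so `export` does not leak into the caller. A minimal sketch (the function and variable names are hypothetical):

```shell
# A parenthesized function body runs in a subshell: DEMO_VAR is exported
# only inside run_in_env, not in the calling shell.
run_in_env() (
    export "$@"
    printf '%s\n' "${DEMO_VAR:-unset}"
)
run_in_env DEMO_VAR=hello   # prints "hello"
echo "${DEMO_VAR:-unset}"   # prints "unset" in the caller
```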
if $is_cygwin; then
# shellcheck disable=2059
printf "=== Running ${green}integration tests ${yellow}with${green} symlinks${reset}\n"
system_tests "$cygwin_var"=winsymlinks
# shellcheck disable=2059
printf "=== Running ${green}integration tests ${yellow}without${green} symlinks${reset}\n"
system_tests "$cygwin_var"=
else
# shellcheck disable=2059
printf "=== Running ${green}integration tests${reset}\n"
system_tests
fi
exit
}

View File

@@ -1,28 +0,0 @@
#!/bin/sh
# Script to generate the dput.cf for a set of Ubuntu series, prints the filename
# Arguments are the PPA followed by the series names
set -e
outfile=$(mktemp --tmpdir dput.XXXXX.cf)
[ $# -lt 2 ] &&
echo "$0: at least two arguments (a PPA and at least one series) are required" >&2 &&
exit 1
ppa=$1
shift
for series in "$@"; do
cat >> "$outfile" <<EOF
[fish-$ppa-$series]
fqdn = ppa.launchpad.net
method = ftp
login = anonymous
incoming = ~fish-shell/$ppa/ubuntu/$series
EOF
done
echo "$outfile"

View File

@@ -1,150 +0,0 @@
#!/usr/bin/env fish
#
# Tool to generate gettext messages template file.
# Writes to stdout.
# Intended to be called from `update_translations.fish`.
argparse use-existing-template= -- $argv
or exit $status
begin
# Write header. This is required by msguniq.
# Note that this results in the file being overwritten.
# This is desired behavior, to get rid of the results of prior invocations
# of this script.
set -l header 'msgid ""\nmsgstr "Content-Type: text/plain; charset=UTF-8\\\\n"\n\n'
printf $header
set -g workspace_root (path resolve (status dirname)/..)
set -l rust_extraction_dir
if set -l --query _flag_use_existing_template
set rust_extraction_dir $_flag_use_existing_template
else
set rust_extraction_dir (mktemp -d)
# We need to build to ensure that the proc macro for extracting strings runs.
FISH_GETTEXT_EXTRACTION_DIR=$rust_extraction_dir cargo check --features=gettext-extract
or exit 1
end
function mark_section
set -l section_name $argv[1]
echo 'msgid "fish-section-'$section_name'"'
echo 'msgstr ""'
echo ''
end
mark_section tier1-from-rust
# Get rid of duplicates and sort.
begin
# Without providing this header, msguniq complains when a msgid is non-ASCII.
printf $header
find $rust_extraction_dir -type f -exec cat {} +
end |
msguniq --no-wrap --sort-output |
# Remove the header again. Otherwise it would appear twice, breaking the msguniq at the end
# of this file.
sed '/^msgid ""$/ {N; /\nmsgstr "Content-Type: text\/plain; charset=UTF-8\\\\n"$/ {N; d}}'
if not set -l --query _flag_use_existing_template
rm -r $rust_extraction_dir
end
function extract_fish_script_messages_impl
set -l regex $argv[1]
set -e argv[1]
# Using xgettext causes more trouble than it helps.
# This is due to handling of escaping in fish differing from formats xgettext understands
# (e.g. POSIX shell strings).
# We work around this issue by manually writing the file content.
# Steps:
# 1. We extract strings to be translated from the relevant files and drop the rest. This step
# depends on the regex matching the entire line, and the first capture group matching the
# string.
# 2. We unescape. This gets rid of some escaping necessary in fish strings.
# 3. The resulting strings are sorted alphabetically. This step is optional. Not sorting would
# result in strings from the same file appearing together. Removing duplicates is also
# optional, since msguniq takes care of that later on as well.
# 4. Single backslashes are replaced by double backslashes. This results in the backslashes
# being interpreted as literal backslashes by gettext tooling.
# 5. Double quotes are escaped, such that they are not interpreted as the start or end of
# a msgid.
# 6. We transform the string into the format expected in a PO file.
cat $argv |
string replace --filter --regex $regex '$1' |
string unescape |
sort -u |
sed -E -e 's_\\\\_\\\\\\\\_g' -e 's_"_\\\\"_g' -e 's_^(.*)$_msgid "\1"\nmsgstr ""\n_'
end
function extract_fish_script_messages
set -l tier $argv[1]
set -e argv[1]
if not set -q argv[1]
return
end
# This regex handles explicit requests to translate a message. These are more important to translate
# than messages which should be implicitly translated.
set -l explicit_regex '.*\( *_ (([\'"]).+?(?<!\\\\)\\2) *\).*'
mark_section "$tier-from-script-explicitly-added"
extract_fish_script_messages_impl $explicit_regex $argv
# This regex handles descriptions for `complete` and `function` statements. These messages are not
# particularly important to translate. Hence the "implicit" label.
set -l implicit_regex '^(?:\s|and |or )*(?:complete|function).*? (?:-d|--description) (([\'"]).+?(?<!\\\\)\\2).*'
mark_section "$tier-from-script-implicitly-added"
extract_fish_script_messages_impl $implicit_regex $argv
end
set -g share_dir $workspace_root/share
set -l tier1 $share_dir/config.fish
set -l tier2
set -l tier3
for file in $share_dir/completions/*.fish $share_dir/functions/*.fish
# set -l tier (string match -r '^# localization: .*' <$file)
set -l tier (string replace -rf -m1 \
'^# localization: (.*)$' '$1' <$file)
if set -q tier[1]
switch "$tier"
case tier1 tier2 tier3
set -a $tier $file
case 'skip*'
case '*'
echo >&2 "$file:1 unexpected localization tier: $tier"
exit 1
end
continue
end
set -l dirname (path basename (path dirname $file))
set -l command_name (path basename --no-extension $file)
if test $dirname = functions &&
string match -q -- 'fish_*' $command_name
set -a tier1 $file
continue
end
if test $dirname != completions
echo >&2 "$file:1 missing localization tier for function file"
exit 1
end
if test -e $workspace_root/doc_src/cmds/$command_name.rst
set -a tier1 $file
else
set -a tier3 $file
end
end
extract_fish_script_messages tier1 $tier1
extract_fish_script_messages tier2 $tier2
extract_fish_script_messages tier3 $tier3
end |
# At this point, all extracted strings have been written to stdout,
# starting with the ones taken from the Rust sources,
# followed by strings explicitly marked for translation in fish scripts,
# and finally the strings from fish scripts which get translated implicitly.
# Because we do not eliminate duplicates across these categories,
# we do it here, since other gettext tools expect no duplicates.
msguniq --no-wrap

View File

@@ -0,0 +1,83 @@
#!/bin/sh
# This script takes a source tarball (from build_tools/make_tarball.sh) and a vendor tarball (from
# build_tools/make_vendor_tarball.sh, generated if not present), and produces:
# * Appropriately-named symlinks to look like a Debian package
# * Debian .changes and .dsc files with plain names ($version-1) and supported Ubuntu prefixes
# ($version-1~somedistro)
# * An RPM spec file
# By default, input and output files go in ~/fish_built, but this can be controlled with the
# FISH_ARTEFACT_PATH environment variable.
{
set -e
version=$1
[ -n "$version" ] || { echo "Version number required as argument" >&2; exit 1; }
[ -n "$DEB_SIGN_KEYID$DEB_SIGN_KEYFILE" ] ||
echo "Warning: neither the DEB_SIGN_KEYID nor the DEB_SIGN_KEYFILE environment variable is set; you
will need a signing key for the author of the most recent debian/changelog entry." >&2
workpath=${FISH_ARTEFACT_PATH:-~/fish_built}
source_tarball="$workpath"/fish-"$version".tar.xz
vendor_tarball="$workpath"/fish-"$version"-vendor.tar.xz
[ -e "$source_tarball" ] || { echo "Missing source tarball, expected at $source_tarball" >&2; exit 1; }
cd "$workpath"
# Unpack the sources
tar xf "$source_tarball"
sourcepath="$workpath"/fish-"$version"
# Generate the vendor tarball if it is not already present
[ -e "$vendor_tarball" ] || (cd "$sourcepath"; build_tools/make_vendor_tarball.sh;)
# This step requires network access, so do it early in case it fails
# sh has no real array support
ubuntu_versions=$(uv run --script "$sourcepath"/build_tools/supported_ubuntu_versions.py)
# Write the specfile
[ -e "$workpath"/fish.spec ] && { echo "Cowardly refusing to overwrite an existing fish.spec" >&2;
exit 1; }
rpmversion=$(echo "$version" |sed -e 's/-/+/' -e 's/-/./g')
sed -e "s/@version@/$version/g" -e "s/@rpmversion@/$rpmversion/g" \
< "$sourcepath"/fish.spec.in > "$workpath"/fish.spec
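The `rpmversion` mangling above maps fish versions onto RPM-legal ones, since RPM versions cannot contain "-": the first "-" becomes "+" and any further ones become ".". A sketch (the input versions are hypothetical examples):

```shell
# Same sed pipeline as above, wrapped in a helper for illustration.
to_rpmversion() {
    echo "$1" | sed -e 's/-/+/' -e 's/-/./g'
}
to_rpmversion 4.7.0        # prints "4.7.0"
to_rpmversion 4.7.0-rc1    # prints "4.7.0+rc1"
to_rpmversion 4.7.0-rc1-2  # prints "4.7.0+rc1.2"
```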
# Make the symlinks for Debian
ln -s "$source_tarball" "$workpath"/fish_"$version".orig.tar.xz
ln -s "$vendor_tarball" "$workpath"/fish_"$version".orig-cargo-vendor.tar.xz
# Set up the Debian source tree
cd "$sourcepath"
mkdir cargo-vendor
tar -C cargo-vendor -x -f "$vendor_tarball"
cp -r contrib/debian debian
# The vendor tarball contains a new .cargo/config.toml, which has the
# vendoring overrides appended to it. dpkg-source will add this as a
# patch using the flags in debian/
cp cargo-vendor/.cargo/config.toml .cargo/config.toml
# Update the Debian changelog
# The release scripts do this for release builds - skip if it has already been done
if head -n1 debian/changelog | grep --invert-match --quiet --fixed-strings "$version"; then
debchange --newversion "$version-1" --distribution unstable "Snapshot build"
fi
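The idempotence check above only bumps the changelog when its first line does not already mention the version. A sketch of the same `grep --invert-match --quiet` pattern (the sample changelog line is assumed):

```shell
# If the first changelog line lacks the version, a debchange run is needed.
version=4.7.0
first_line='fish (4.6.0-1) stable; urgency=medium'
if echo "$first_line" | grep --invert-match --quiet --fixed-strings "$version"; then
    echo "needs debchange"
fi
```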
# Builds the "plain" Debian package
# debuild runs lintian, which takes ten minutes to run over the vendor directories,
# so just use dpkg-buildpackage directly
dpkg-buildpackage --build=source -d
# Build the Ubuntu packages
# deb-reversion does not work on source packages, so do the whole thing ourselves
for series in $ubuntu_versions; do
sed -i -e "1 s/$version-1)/$version-1~$series)/" -e "1 s/unstable/$series/" debian/changelog
dpkg-buildpackage --build=source -d
sed -i -e "1 s/$version-1~$series)/$version-1)/" -e "1 s/$series/unstable/" debian/changelog
done
}

View File

@@ -16,7 +16,7 @@ manifest=$tmpdir/Cargo.toml
lockfile=$tmpdir/Cargo.lock
sed "s/^version = \".*\"\$/version = \"$VERSION\"/g" Cargo.toml >"$manifest"
awk -v version=$VERSION '
awk -v version="$VERSION" '
/^name = "fish"$/ { ok=1 }
ok == 1 && /^version = ".*"$/ {
ok = 2;

View File

@@ -8,20 +8,6 @@
# Exit on error
set -e
# We need GNU tar as that supports the --mtime and --transform options
TAR=notfound
for try in tar gtar gnutar; do
if $try -Pcf /dev/null --mtime now /dev/null >/dev/null 2>&1; then
TAR=$try
break
fi
done
if [ "$TAR" = "notfound" ]; then
echo 'No suitable tar (supporting --mtime) found as tar/gtar/gnutar in PATH'
exit 1
fi
# Get the current directory, which we'll use for telling Cargo where to find the sources
wd="$PWD"

View File

@@ -48,7 +48,7 @@ if test -z "$CI" || [ "$(git -C "$workspace_root" tag | wc -l)" -gt 1 ]; then {
num_new_authors=$(wc -l <"$relnotes_tmp/committers-new")
printf %s \
"This release brings $num_commits new commits since $previous_version," \
" contributed by $num_authors authors, $num_new_authors of which are new committers."
" contributed by $num_authors authors, $num_new_authors of which are new faces."
echo
echo
)
@@ -86,10 +86,13 @@ if test -z "$CI" || [ "$(git -C "$workspace_root" tag | wc -l)" -gt 1 ]; then {
echo
echo 'Download links:'
echo 'To download the source code for fish, we suggest the file named ``fish-'"$version"'.tar.xz``.'
# shellcheck disable=2016
echo 'The file downloaded from ``Source code (tar.gz)`` will not build correctly.'
# shellcheck disable=2016
echo 'A GPG signature using `this key <'"${FISH_GPG_PUBLIC_KEY_URL:-???}"'>`__ is available as ``fish-'"$version"'.tar.xz.asc``.'
echo
echo 'The files called ``fish-'"$version"'-linux-*.tar.xz`` contain'
# shellcheck disable=2016
echo '`standalone fish binaries <https://github.com/fish-shell/fish-shell/?tab=readme-ov-file#building-fish-with-cargo>`__'
echo 'for any Linux with the given CPU architecture.'
} >"$relnotes_tmp/fake-workspace"/CHANGELOG.rst

View File

@@ -72,7 +72,7 @@ integration_branch=$(
--format='%(refname:strip=2)'
)
[ -n "$integration_branch" ] ||
git merge-base --is-ancestor $remote/master HEAD
git merge-base --is-ancestor "$remote"/master HEAD
sed -n 1p CHANGELOG.rst | grep -q '^fish .*(released .*)$'
sed -n 2p CHANGELOG.rst | grep -q '^===*$'
@@ -113,9 +113,9 @@ CreateCommit "Release $version"
# Tags must be full objects, not lightweight tags, for
# git_version-gen.sh to work.
git -c "user.signingKey=$committer" \
tag --sign --message="Release $version" $version
tag --sign --message="Release $version" "$version"
git push $remote $version
git push "$remote" "$version"
TIMEOUT=
gh() {
@@ -173,6 +173,7 @@ actual_tag_oid=$(git ls-remote "$remote" |
(
cd "$tmpdir/local-tarball/fish-$version"
uv --no-managed-python venv
# shellcheck disable=1091
. .venv/bin/activate
cmake -GNinja -DCMAKE_BUILD_TYPE=Debug .
ninja doc
@@ -180,14 +181,17 @@ actual_tag_oid=$(git ls-remote "$remote" |
CopyDocs() {
rm -rf "$fish_site/site/docs/$1"
cp -r "$tmpdir/local-tarball/fish-$version/cargo/fish-docs/html" "$fish_site/site/docs/$1"
git -C $fish_site add "site/docs/$1"
git -C "$fish_site" add "site/docs/$1"
}
minor_version=${version%.*}
CopyDocs "$minor_version"
latest_release=$(
releases=$(git tag | grep '^[0-9]*\.[0-9]*\.[0-9]*.*' |
sed $(: "De-prioritize release candidates (1.2.3-rc0)") \
's/-/~/g' | LC_ALL=C sort --version-sort)
sed '
# De-prioritize release candidates (1.2.3-rc0)
s/-/~/g
' | LC_ALL=C sort --version-sort
)
printf %s\\n "$releases" | tail -1
)
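The "de-prioritize release candidates" trick above relies on version sort treating "~" as ordering before anything else (the same convention Debian versions use), so "4.7.0~rc1" sorts before "4.7.0". A sketch, assuming GNU coreutils `sort`; the version list is illustrative:

```shell
# Rewrite "-" to "~" so pre-releases sort before the final release,
# then pick the newest version.
latest=$(printf '%s\n' 4.7.0 4.7.0-rc1 4.6.0 |
    sed 's/-/~/g' |
    LC_ALL=C sort --version-sort |
    tail -1)
echo "$latest"  # prints "4.7.0"
```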
if [ "$version" = "$latest_release" ]; then
@@ -272,7 +276,7 @@ done
)
if [ -n "$integration_branch" ]; then {
git push $remote "$version^{commit}":refs/heads/$integration_branch
git push "$remote" "$version^{commit}:refs/heads/$integration_branch"
} else {
changelog=$(cat - CHANGELOG.rst <<EOF
fish ?.?.? (released ???)
@@ -283,7 +287,7 @@ EOF
printf %s\\n "$changelog" >CHANGELOG.rst
git add CHANGELOG.rst
CreateCommit "start new cycle"
git push $remote HEAD:master
git push "$remote" HEAD:master
} fi
milestone_version="$(

View File

@@ -1,156 +0,0 @@
#!/usr/bin/env fish
# Updates the files used for gettext translations.
# By default, the whole xgettext + msgmerge pipeline runs,
# which extracts the messages from the source files into $template_file,
# and updates the PO files for each language from that.
#
# Use cases:
# For developers:
# - Run with no args to update all PO files after making changes to Rust/fish sources.
# For translators:
# - Specify the language you want to work on as an argument, which must be a file in the
# localization/po/ directory. You can specify a language which does not have translations
# yet by specifying the name of a file which does not yet exist.
# Make sure to follow the naming convention.
# For testing:
# - Specify `--dry-run` to see if any updates to the PO files would be applied by this script.
# If this flag is specified, the script will exit with an error if there are outstanding
# changes, and will display the diff. Do not specify other flags if `--dry-run` is specified.
#
# Specify `--use-existing-template=DIR` to prevent running cargo for extracting an up-to-date
# version of the localized strings. This flag is intended for testing setups which make it
# inconvenient to run cargo here, but run it in an earlier step to ensure up-to-date values.
# This argument is passed on to the `fish_xgettext.fish` script and has no other uses.
# `DIR` must be the path to a gettext template file generated from our compilation process.
# It can be obtained by running:
# set -l DIR (mktemp -d)
# FISH_GETTEXT_EXTRACTION_DIR=$DIR cargo check --features=gettext-extract
# The sort utility is locale-sensitive.
# Ensure that sorting output is consistent by setting LC_ALL here.
set -gx LC_ALL C.UTF-8
set -l build_tools (status dirname)
set -l po_dir $build_tools/../localization/po
set -l extract
argparse dry-run use-existing-template= -- $argv
or exit $status
if test -z $argv[1]
# Update everything if not specified otherwise.
set -g po_files $po_dir/*.po
else
set -l po_dir_id (stat --format='%d:%i' -- $po_dir)
for arg in $argv
set -l arg_dir_id (stat --format='%d:%i' -- (dirname $arg) 2>/dev/null)
if test $po_dir_id != "$arg_dir_id"
echo "Argument $arg is not a file in the directory $(realpath $po_dir)."
echo "Non-option arguments must specify paths to files in this directory."
echo ""
echo "If you want to add a new language to the translations, note the following:"
echo "The filename must identify a language, with a two-letter ISO 639-1 language code for the target language (e.g. 'pt' for Portuguese), and use the file extension '.po'."
echo "Optionally, you can specify a regional variant (e.g. 'pt_BR')."
echo "So valid filenames are of the shape 'll.po' or 'll_CC.po'."
exit 1
end
if not basename $arg | grep -qE '^[a-z]{2,3}(_[A-Z]{2})?\.po$'
echo "Filename does not match the expected format ('ll.po' or 'll_CC.po')."
exit 1
end
end
set -g po_files $argv
end
set -g template_file (mktemp)
# Protect from externally set $tmpdir leaking into this script.
set -g tmpdir
function cleanup_exit
set -l exit_status $status
rm $template_file
if set -g --query tmpdir[1]
rm -r $tmpdir
end
exit $exit_status
end
if set -l --query extract
set -l xgettext_args
if set -l --query _flag_use_existing_template
set xgettext_args --use-existing-template=$_flag_use_existing_template
end
$build_tools/fish_xgettext.fish $xgettext_args >$template_file
or cleanup_exit
end
if set -l --query _flag_dry_run
# On a dry run, we do not modify localization/po/ but write to a temporary directory instead
# and check if there is a difference between localization/po/ and the tmpdir after re-generating
# the PO files.
set -g tmpdir (mktemp -d)
# Ensure tmpdir has the same initial state as the po dir.
cp -r $po_dir/* $tmpdir
end
# This is used to identify lines which should be set here via $header_lines.
# Make sure that this prefix does not appear elsewhere in the file and only contains characters
# without special meaning in a sed pattern.
set -g header_prefix "# fish-note-sections: "
function print_header
set -l header_lines \
"Translations are divided into sections, each starting with a fish-section-* pseudo-message." \
"The first few sections are more important." \
"Ignore the tier3 sections unless you have a lot of time."
for line in $header_lines
printf '%s%s\n' $header_prefix $line
end
end
function merge_po_files --argument-names template_file po_file
msgmerge --no-wrap --update --no-fuzzy-matching --backup=none --quiet \
$po_file $template_file
or cleanup_exit
set -l new_po_file (mktemp) # TODO Remove on failure.
# Remove obsolete messages instead of keeping them as #~ entries.
and msgattrib --no-wrap --no-obsolete -o $new_po_file $po_file
or cleanup_exit
begin
print_header
# Paste PO file without old header lines.
sed '/^'$header_prefix'/d' $new_po_file
end >$po_file
rm $new_po_file
end
for po_file in $po_files
if set --query tmpdir[1]
set po_file $tmpdir/(basename $po_file)
end
if test -e $po_file
merge_po_files $template_file $po_file
else
begin
print_header
cat $template_file
end >$po_file
end
end
if set -g --query tmpdir[1]
diff -ur $po_dir $tmpdir
or begin
echo ERROR: translations in localization/po/ are stale. Try running build_tools/update_translations.fish
cleanup_exit
end
end
cleanup_exit

View File

@@ -1,4 +1,4 @@
#!/bin/bash
#!/usr/bin/env bash
set -euo pipefail
@@ -11,6 +11,6 @@ codename=$(
curl -fsS https://sources.debian.org/api/src/"${package}"/ |
jq -r --arg codename "${codename}" '
.versions[] | select(.suites[] == $codename) | .version' |
sed 's/^\([0-9]\+\.[0-9]\+\).*/\1/' |
sed -E 's/^([0-9]+\.[0-9]+).*/\1/' |
sort --version-sort |
tail -1
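The BRE-to-ERE change above avoids the GNU-specific `\+` escape: with `sed -E`, `+` and `()` are ordinary ERE metacharacters, so the pattern is portable. A sketch (the Debian-style version strings are hypothetical examples):

```shell
# Extract the major.minor prefix from a Debian-style version string.
major_minor() {
    echo "$1" | sed -E 's/^([0-9]+\.[0-9]+).*/\1/'
}
major_minor "3.7.1-2+deb12u1"  # prints "3.7"
major_minor "4.6.0"            # prints "4.6"
```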

View File

@@ -1,3 +1,11 @@
fish (4.7.0-1) stable; urgency=medium
* Release of new version 4.7.0.
See https://github.com/fish-shell/fish-shell/releases/tag/4.7.0 for details.
-- Johannes Altmanninger <aclopte@gmail.com> Tue, 05 May 2026 15:24:27 +0800
fish (4.6.0-1) stable; urgency=medium
* Release of new version 4.6.0.

View File

@@ -10,10 +10,12 @@ Build-Depends: debhelper-compat (= 13),
gettext,
libpcre2-dev,
rustc (>= 1.85) | rustc-web (>= 1.85) | rustc-1.85,
# pkg-config is needed for the pcre2 crate to find the pcre2 system library
pkgconf | pkg-config,
python3-sphinx,
# Test dependencies
locales-all,
man-db,
man-db | man,
python3
# 4.6.2 is Debian 12/Ubuntu Noble 24.04; Ubuntu Jammy is 4.6.0.1
Standards-Version: 4.6.2
@@ -26,8 +28,8 @@ Architecture: any
# for col and lock
Depends: bsdextrautils,
file,
# for man
man-db,
# for showing built-in help pages
man-db | man,
# for kill
procps,
python3 (>=3.5),

View File

@@ -98,14 +98,14 @@ pub fn target_os_is_cygwin() -> bool {
#[macro_export]
macro_rules! as_os_strs {
[ $( $x:expr, )* ] => {
[ $( $x:expr ),* $(,)? ] => {
{
use std::ffi::OsStr;
fn as_os_str<S: AsRef<OsStr> + ?Sized>(s: &S) -> &OsStr {
s.as_ref()
}
&[
$( as_os_str($x), )*
[
$( as_os_str($x) ),*
]
}
}

View File

@@ -8,6 +8,7 @@ license.workspace = true
[dependencies]
bitflags.workspace = true
fish-feature-flags.workspace = true
fish-widestring.workspace = true
libc.workspace = true
nix.workspace = true

File diff suppressed because it is too large

View File

@@ -8,12 +8,12 @@
use std::cmp;
use std::sync::{
LazyLock,
atomic::{AtomicIsize, Ordering},
atomic::{AtomicUsize, Ordering},
};
/// Width of ambiguous East Asian characters and, as of TR11, all private-use characters.
/// 1 is the typical default, but we accept any non-negative override via `$fish_ambiguous_width`.
pub static FISH_AMBIGUOUS_WIDTH: AtomicIsize = AtomicIsize::new(1);
pub static FISH_AMBIGUOUS_WIDTH: AtomicUsize = AtomicUsize::new(1);
/// Width of emoji characters.
///
@@ -25,34 +25,33 @@
/// Valid values are 1, and 2. 1 is the typical emoji width used in Unicode 8 while some newer
/// terminals use a width of 2 since Unicode 9.
// For some reason, this is declared here and exposed here, but is set in `env_dispatch`.
pub static FISH_EMOJI_WIDTH: AtomicIsize = AtomicIsize::new(2);
pub static FISH_EMOJI_WIDTH: AtomicUsize = AtomicUsize::new(2);
static WC_LOOKUP_TABLE: LazyLock<WcLookupTable> = LazyLock::new(WcLookupTable::new);
// Big hack to use our versions of wcswidth where we know them to be broken, which is
// EVERYWHERE (https://github.com/fish-shell/fish-shell/issues/2199)
pub fn fish_wcwidth(c: char) -> isize {
pub fn fish_wcwidth(c: char) -> Option<usize> {
// Check for VS16 which selects emoji presentation. This "promotes" a character like U+2764
// (width 1) to an emoji (probably width 2). So treat it as width 1 so the sums work. See #2652.
// VS15 selects text presentation.
let variation_selector_16 = '\u{FE0F}';
let variation_selector_15 = '\u{FE0E}';
if c == variation_selector_16 {
return 1;
return Some(1);
} else if c == variation_selector_15 {
return 0;
return Some(0);
}
// Check for Emoji_Modifier property. Only the Fitzpatrick modifiers have this, in range
// 1F3FB..1F3FF. This is a hack because such an emoji appearing on its own would be drawn as
// width 2, but that's unlikely to be useful. See #8275.
if ('\u{1F3FB}'..='\u{1F3FF}').contains(&c) {
return 0;
return Some(0);
}
let width = WC_LOOKUP_TABLE.classify(c);
match width {
WcWidth::NonCharacter | WcWidth::NonPrint | WcWidth::Combining | WcWidth::Unassigned => 0,
Some(match width {
WcWidth::NonPrint => return None,
WcWidth::NonCharacter | WcWidth::Combining | WcWidth::Unassigned => 0,
WcWidth::Ambiguous | WcWidth::PrivateUse => {
// TR11: "All private-use characters are by default classified as Ambiguous".
FISH_AMBIGUOUS_WIDTH.load(Ordering::Relaxed)
@@ -60,25 +59,25 @@ pub fn fish_wcwidth(c: char) -> isize {
WcWidth::One => 1,
WcWidth::Two => 2,
WcWidth::WidenedIn9 => FISH_EMOJI_WIDTH.load(Ordering::Relaxed),
}
})
}
/// fish's internal versions of wcwidth and wcswidth
pub fn fish_wcswidth(s: &wstr) -> isize {
// ascii fast path; empty iterator returns true for .all()
if s.chars().all(|c| c.is_ascii() && !c.is_ascii_control()) {
return s.len() as isize;
pub fn fish_wcswidth(s: &wstr) -> Option<usize> {
fish_wcswidth_canonicalizing(s, std::convert::identity)
}
pub fn fish_wcswidth_canonicalizing(s: &wstr, canonicalize: fn(char) -> char) -> Option<usize> {
let chars = s.chars().map(canonicalize);
// ascii fast path
if chars.clone().all(|c| c.is_ascii() && !c.is_ascii_control()) {
return Some(s.len());
}
let mut result = 0;
for c in s.chars() {
let w = fish_wcwidth(c);
if w < 0 {
return -1;
}
result += w;
for c in chars {
result += fish_wcwidth(c)?;
}
result
Some(result)
}
pub fn wcscasecmp(lhs: &wstr, rhs: &wstr) -> cmp::Ordering {
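The hunk above migrates `fish_wcwidth`/`fish_wcswidth` from the C-style `isize`/`-1` convention to `Option<usize>`, with `?` propagating the non-printable case through the sum. A toy sketch of that shape, using a tiny stand-in width table rather than fish's real `WcLookupTable`:

```rust
// Per-char width: None means non-printable (the old -1).
// The ranges here are illustrative, not fish's actual classification.
fn width(c: char) -> Option<usize> {
    match c {
        '\u{0}'..='\u{1f}' => None,         // control chars: no width
        '\u{1100}'..='\u{115f}' => Some(2), // e.g. wide Hangul Jamo
        _ => Some(1),
    }
}

// The string sum: one None poisons the whole result via `?`,
// replacing the old `if w < 0 { return -1 }` dance.
fn str_width(s: &str) -> Option<usize> {
    let mut total = 0;
    for c in s.chars() {
        total += width(c)?;
    }
    Some(total)
}

fn main() {
    assert_eq!(str_width("abc"), Some(3));
    assert_eq!(str_width("a\u{1100}"), Some(3));
    assert_eq!(str_width("a\tb"), None);
    println!("ok");
}
```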

View File

@@ -0,0 +1,13 @@
[package]
name = "fish-feature-flags"
edition.workspace = true
rust-version.workspace = true
version = "0.0.0"
repository.workspace = true
license.workspace = true
[dependencies]
fish-widestring.workspace = true
[lints]
workspace = true

View File

@@ -1,15 +1,14 @@
//! Flags to enable upcoming features
use crate::prelude::*;
use std::sync::atomic::AtomicBool;
use std::sync::atomic::Ordering;
#[cfg(test)]
use std::cell::RefCell;
use fish_widestring::{L, WExt as _, wstr};
use std::{
cell::RefCell,
sync::atomic::{AtomicBool, Ordering},
};
/// The list of flags.
#[repr(u8)]
#[derive(Clone, Copy)]
#[derive(Clone, Copy, PartialEq, Eq)]
pub enum FeatureFlag {
/// Whether ^ is supported for stderr redirection.
StderrNoCaret,
@@ -63,10 +62,10 @@ pub struct FeatureMetadata {
pub description: &'static wstr,
/// Default flag value.
pub default_value: bool,
default_value: bool,
/// Whether the value can still be changed or not.
pub read_only: bool,
read_only: bool,
}
/// The metadata, indexed by flag.
@@ -156,31 +155,26 @@ pub struct FeatureMetadata {
];
thread_local!(
#[cfg(test)]
static LOCAL_FEATURES: RefCell<Option<Features>> = const { RefCell::new(None) };
static LOCAL_OVERRIDE_STACK: RefCell<Vec<(FeatureFlag, bool)>> =
const { RefCell::new(Vec::new()) };
);
/// The singleton shared feature set.
static FEATURES: Features = Features::new();
/// Perform a feature test on the global set of features.
pub fn test(flag: FeatureFlag) -> bool {
#[cfg(test)]
{
LOCAL_FEATURES.with(|fc| fc.borrow().as_ref().unwrap_or(&FEATURES).test(flag))
pub fn feature_test(flag: FeatureFlag) -> bool {
if let Some(value) = LOCAL_OVERRIDE_STACK.with(|stack| {
for &(overridden_feature, value) in stack.borrow().iter().rev() {
if flag == overridden_feature {
return Some(value);
}
}
None
}) {
return value;
}
#[cfg(not(test))]
{
FEATURES.test(flag)
}
}
pub use test as feature_test;
/// Set a flag.
#[cfg(test)]
pub fn set(flag: FeatureFlag, value: bool) {
LOCAL_FEATURES.with(|fc| fc.borrow().as_ref().unwrap_or(&FEATURES).set(flag, value));
FEATURES.test(flag)
}
/// Parses a comma-separated feature-flag string, updating ourselves with the values.
@@ -188,20 +182,7 @@ pub fn set(flag: FeatureFlag, value: bool) {
/// The special group name "all" may be used for those who like to live on the edge.
/// Unknown features are silently ignored.
pub fn set_from_string<'a>(str: impl Into<&'a wstr>) {
let wstr: &wstr = str.into();
#[cfg(test)]
{
LOCAL_FEATURES.with(|fc| {
fc.borrow()
.as_ref()
.unwrap_or(&FEATURES)
.set_from_string(wstr);
});
}
#[cfg(not(test))]
{
FEATURES.set_from_string(wstr);
}
FEATURES.set_from_string(str.into());
}
impl Features {
@@ -237,19 +218,14 @@ fn set(&self, flag: FeatureFlag, value: bool) {
}
fn set_from_string(&self, str: &wstr) {
let whitespace = L!("\t\n\0x0B\0x0C\r ").as_char_slice();
for entry in str.as_char_slice().split(|c| *c == ',') {
for entry in str.split(',') {
let entry = entry.trim();
if entry.is_empty() {
continue;
}
// Trim leading and trailing whitespace
let entry = &entry[entry.iter().take_while(|c| whitespace.contains(c)).count()..];
let entry =
&entry[..entry.len() - entry.iter().take_while(|c| whitespace.contains(c)).count()];
// A "no-" prefix inverts the sense.
let (name, value) = match entry.strip_prefix(L!("no-").as_char_slice()) {
let (name, value) = match entry.strip_prefix("no-") {
Some(suffix) => (suffix, false),
None => (entry, true),
};
@@ -275,28 +251,20 @@ fn set_from_string(&self, str: &wstr) {
}
}
#[cfg(test)]
pub fn scoped_test(flag: FeatureFlag, value: bool, test_fn: impl FnOnce()) {
LOCAL_FEATURES.with(|fc| {
assert!(
fc.borrow().is_none(),
"scoped_test() does not support nesting"
);
let f = Features::new();
f.set(flag, value);
*fc.borrow_mut() = Some(f);
/// Run code with a feature overridden.
/// This should only be used in tests.
pub fn with_overridden_feature(flag: FeatureFlag, value: bool, test_fn: impl FnOnce()) {
LOCAL_OVERRIDE_STACK.with(|stack| {
stack.borrow_mut().push((flag, value));
test_fn();
*fc.borrow_mut() = None;
stack.borrow_mut().pop();
});
}
#[cfg(test)]
mod tests {
use super::{FeatureFlag, Features, METADATA, scoped_test, set, test};
use crate::prelude::*;
use super::{FeatureFlag, Features, METADATA, feature_test, with_overridden_feature};
use fish_widestring::L;
#[test]
fn test_feature_flags() {
@@ -322,25 +290,19 @@ fn test_feature_flags() {
}
#[test]
fn test_scoped() {
scoped_test(FeatureFlag::QuestionMarkNoGlob, true, || {
assert!(test(FeatureFlag::QuestionMarkNoGlob));
fn test_overridden_feature() {
with_overridden_feature(FeatureFlag::QuestionMarkNoGlob, true, || {
assert!(feature_test(FeatureFlag::QuestionMarkNoGlob));
});
set(FeatureFlag::QuestionMarkNoGlob, true);
scoped_test(FeatureFlag::QuestionMarkNoGlob, false, || {
assert!(!test(FeatureFlag::QuestionMarkNoGlob));
with_overridden_feature(FeatureFlag::QuestionMarkNoGlob, false, || {
assert!(!feature_test(FeatureFlag::QuestionMarkNoGlob));
});
set(FeatureFlag::QuestionMarkNoGlob, false);
}
#[test]
#[should_panic]
fn test_nested_scopes_not_supported() {
scoped_test(FeatureFlag::QuestionMarkNoGlob, true, || {
scoped_test(FeatureFlag::QuestionMarkNoGlob, false, || {});
with_overridden_feature(FeatureFlag::QuestionMarkNoGlob, false, || {
with_overridden_feature(FeatureFlag::QuestionMarkNoGlob, true, || {
assert!(feature_test(FeatureFlag::QuestionMarkNoGlob));
});
});
}
}
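The feature-flag rewrite above replaces the single non-nestable `LOCAL_FEATURES` test override with a thread-local stack where the innermost override wins — which is what makes the nested `with_overridden_feature` calls in the new test legal. A minimal sketch of that pattern (the `Flag` enum is a stand-in for `FeatureFlag`):

```rust
use std::cell::RefCell;

#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum Flag {
    A,
}

thread_local! {
    // A stack of (flag, value) overrides; lookups scan from the top.
    static OVERRIDES: RefCell<Vec<(Flag, bool)>> = const { RefCell::new(Vec::new()) };
}

fn test_flag(flag: Flag, global_default: bool) -> bool {
    OVERRIDES
        .with(|stack| {
            stack
                .borrow()
                .iter()
                .rev() // innermost override wins
                .find(|(f, _)| *f == flag)
                .map(|&(_, v)| v)
        })
        .unwrap_or(global_default)
}

fn with_override(flag: Flag, value: bool, f: impl FnOnce()) {
    OVERRIDES.with(|s| s.borrow_mut().push((flag, value)));
    f();
    OVERRIDES.with(|s| s.borrow_mut().pop());
}

fn main() {
    assert!(!test_flag(Flag::A, false));
    with_override(Flag::A, true, || {
        assert!(test_flag(Flag::A, false));
        with_override(Flag::A, false, || {
            // Nesting now works: the inner override shadows the outer.
            assert!(!test_flag(Flag::A, false));
        });
    });
    assert!(!test_flag(Flag::A, false));
    println!("ok");
}
```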

View File

@@ -63,14 +63,23 @@ pub fn wcsfilecmp(a: &wstr, b: &wstr) -> Ordering {
continue;
}
// Sort dashes after Z - see #5634
let mut acl = if ac == '-' { '[' } else { ac };
let mut bcl = if bc == '-' { '[' } else { bc };
let transform = |c| {
// Sort dashes after Z - see #5634
if c == '-' {
return '[';
}
if c == '/' {
return '\0';
}
c
};
let ac = transform(ac);
let bc = transform(bc);
// TODO Compare the tail (enabled by Rust's Unicode support).
acl = acl.to_uppercase().next().unwrap();
bcl = bcl.to_uppercase().next().unwrap();
let ac = ac.to_uppercase().next().unwrap();
let bc = bc.to_uppercase().next().unwrap();
match acl.cmp(&bcl) {
match ac.cmp(&bc) {
Ordering::Equal => {
ai += 1;
bi += 1;
@@ -133,9 +142,9 @@ pub fn wcsfilecmp_glob(a: &wstr, b: &wstr) -> Ordering {
}
// TODO Compare the tail (enabled by Rust's Unicode support).
let acl = ac.to_lowercase().next().unwrap();
let bcl = bc.to_lowercase().next().unwrap();
match acl.cmp(&bcl) {
let ac = ac.to_lowercase().next().unwrap();
let bc = bc.to_lowercase().next().unwrap();
match ac.cmp(&bc) {
Ordering::Equal => {
ai += 1;
bi += 1;
@@ -314,5 +323,8 @@ macro_rules! validate {
validate!("a00b", "a0b", Ordering::Less);
validate!("a0b", "a00b", Ordering::Greater);
validate!("a-b", "azb", Ordering::Greater);
validate!("a", "a b", Ordering::Less);
validate!("a/", "a b/", Ordering::Less);
validate!("a/b", "a b", Ordering::Less); // Note this is arbitrary.
}
}
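The `wcsfilecmp` change above funnels both special cases through one `transform` closure: `-` maps to `[` so it sorts just after `Z`, and `/` maps to NUL so trailing slashes sort first. A simplified sketch of the comparison — it omits fish's digit-run handling and full Unicode case folding, so it is not the complete algorithm:

```rust
use std::cmp::Ordering;

// '-' sorts after 'Z' ('[' is the codepoint right past it);
// '/' sorts before everything (NUL).
fn transform(c: char) -> char {
    match c {
        '-' => '[',
        '/' => '\0',
        _ => c,
    }
}

fn filecmp(a: &str, b: &str) -> Ordering {
    let ka = a.chars().map(|c| transform(c).to_ascii_uppercase());
    let kb = b.chars().map(|c| transform(c).to_ascii_uppercase());
    ka.cmp(kb)
}

fn main() {
    // Trailing slash sorts first: "a/" before "a b/" (fixes #12695).
    assert_eq!(filecmp("a/", "a b/"), Ordering::Less);
    // Dash sorts after Z (#5634).
    assert_eq!(filecmp("a-b", "azb"), Ordering::Greater);
    println!("ok");
}
```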

View File

@@ -1,12 +1,7 @@
//! Helper functions for working with wcstring.
use std::{
ffi::{CStr, CString, OsString},
os::unix::ffi::OsStringExt as _,
};
use fish_fallback::{fish_wcwidth, lowercase, lowercase_rev, wcscasecmp, wcscasecmp_fuzzy};
use fish_widestring::{ELLIPSIS_CHAR, decode_byte_from_char, prelude::*};
use fish_widestring::{ELLIPSIS_CHAR, prelude::*};
/// Return the number of newlines in a string.
pub fn count_newlines(s: &wstr) -> usize {
@@ -340,145 +335,6 @@ pub fn string_fuzzy_match_string(
StringFuzzyMatch::try_create(string, match_against, anchor_start)
}
/// Implementation of wcs2bytes that accepts a callback.
/// The first argument can be either a `&str` or `&wstr`.
/// This invokes `func` with byte slices containing the UTF-8 encoding of the characters in the
/// input, doing one invocation per character.
/// If `func` returns false, it stops; otherwise it continues.
/// Return false if the callback returned false, otherwise true.
pub fn str2bytes_callback(input: impl IntoCharIter, mut func: impl FnMut(&[u8]) -> bool) -> bool {
// A `char` represents a Unicode scalar value, which takes up at most 4 bytes when encoded in UTF-8.
let mut converted = [0_u8; 4];
for c in input.chars() {
let bytes = if let Some(byte) = decode_byte_from_char(c) {
converted[0] = byte;
&converted[..=0]
} else {
c.encode_utf8(&mut converted).as_bytes()
};
if !func(bytes) {
return false;
}
}
true
}
/// Returns a newly allocated multibyte character string equivalent of the specified wide character
/// string.
///
/// This function decodes illegal character sequences in a reversible way using the private use
/// area.
pub fn wcs2bytes(input: impl IntoCharIter) -> Vec<u8> {
let mut result = vec![];
wcs2bytes_appending(&mut result, input);
result
}
pub fn wcs2osstring(input: &wstr) -> OsString {
if input.is_empty() {
return OsString::new();
}
let mut result = vec![];
wcs2bytes_appending(&mut result, input);
OsString::from_vec(result)
}
/// Same as [`wcs2bytes`]. Meant to be used when we need a zero-terminated string to feed legacy APIs.
/// Note: if `input` contains any interior NUL bytes, the result will be truncated at the first!
pub fn wcs2zstring(input: &wstr) -> CString {
if input.is_empty() {
return CString::default();
}
let mut vec = Vec::with_capacity(input.len() + 1);
str2bytes_callback(input, |buff| {
vec.extend_from_slice(buff);
true
});
vec.push(b'\0');
match CString::from_vec_with_nul(vec) {
Ok(cstr) => cstr,
Err(err) => {
// `input` contained a NUL in the middle; we can retrieve `vec`, though
let mut vec = err.into_bytes();
let pos = vec.iter().position(|c| *c == b'\0').unwrap();
vec.truncate(pos + 1);
// Safety: We truncated after the first NUL
unsafe { CString::from_vec_with_nul_unchecked(vec) }
}
}
}
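The `wcs2zstring` body shown above (moved verbatim into `fish-widestring` later in this diff) truncates at the first interior NUL rather than failing. A standalone sketch of that recovery path, operating on raw bytes instead of a wide string:

```rust
use std::ffi::CString;

// Build bytes + trailing NUL; if the input contained an interior NUL,
// keep everything up to (and including) the first one.
fn to_zstring(bytes: &[u8]) -> CString {
    let mut vec = bytes.to_vec();
    vec.push(b'\0');
    match CString::from_vec_with_nul(vec) {
        Ok(c) => c,
        Err(err) => {
            // Interior NUL: recover the buffer and truncate at it.
            let mut vec = err.into_bytes();
            let pos = vec.iter().position(|&b| b == b'\0').unwrap();
            vec.truncate(pos + 1);
            // SAFETY: we truncated right after the first NUL.
            unsafe { CString::from_vec_with_nul_unchecked(vec) }
        }
    }
}

fn main() {
    assert_eq!(to_zstring(b"abc").as_bytes(), b"abc");
    // Interior NUL truncates the result at that point.
    assert_eq!(to_zstring(b"ab\0cd").as_bytes(), b"ab");
    println!("ok");
}
```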
/// Like [`wcs2bytes`], but appends to `output` instead of returning a new string.
pub fn wcs2bytes_appending(output: &mut Vec<u8>, input: impl IntoCharIter) {
str2bytes_callback(input, |buff| {
output.extend_from_slice(buff);
true
});
}
/// A trait to make it more convenient to pass ascii/Unicode strings to functions that can take
/// non-Unicode values. The result is nul-terminated and can be passed to OS functions.
///
/// This is only implemented for owned types where an owned instance will skip allocations (e.g.
/// `CString` can return `self`) but not implemented for owned instances where a new allocation is
/// always required (e.g. implemented for `&wstr` but not `WideString`) because you might as well be
/// left with the original item if we're going to allocate from scratch in all cases.
pub trait ToCString {
/// Correctly convert to a nul-terminated [`CString`] that can be passed to OS functions.
fn to_cstring(self) -> CString;
}
impl ToCString for CString {
fn to_cstring(self) -> CString {
self
}
}
impl ToCString for &CStr {
fn to_cstring(self) -> CString {
self.to_owned()
}
}
/// Safely converts from `&wstr` to a nul-terminated `CString` that can be passed to
/// OS functions, taking into account non-Unicode values that have been shifted into the private-use
/// range by using [`wcs2zstring()`].
impl ToCString for &wstr {
/// The wide string may contain non-Unicode bytes mapped to the private-use Unicode range, so we
/// have to use [`wcs2zstring()`](self::wcs2zstring) to convert it correctly.
fn to_cstring(self) -> CString {
self::wcs2zstring(self)
}
}
/// Safely converts from `&WString` to a nul-terminated `CString` that can be passed to OS
/// functions, taking into account non-Unicode values that have been shifted into the private-use
/// range by using [`wcs2zstring()`].
impl ToCString for &WString {
fn to_cstring(self) -> CString {
self.as_utfstr().to_cstring()
}
}
/// Convert a (probably ascii) string to CString that can be passed to OS functions.
impl ToCString for Vec<u8> {
fn to_cstring(mut self) -> CString {
self.push(b'\0');
CString::from_vec_with_nul(self).unwrap()
}
}
/// Convert a (probably ascii) string to nul-terminated CString that can be passed to OS functions.
impl ToCString for &[u8] {
fn to_cstring(self) -> CString {
CString::new(self).unwrap()
}
}
/// Split a string by runs of any of the separator characters provided in `seps`.
/// Note the delimiters are the characters in `seps`, not `seps` itself.
/// `seps` may contain the NUL character.
@@ -665,12 +521,12 @@ fn next(&mut self) -> Option<Self::Item> {
}
}
/// Like fish_wcwidth, but returns 0 for characters with no real width instead of -1.
/// Like fish_wcwidth, but returns 0 for characters with no real width instead of `None`.
pub fn fish_wcwidth_visible(c: char) -> isize {
if c == '\x08' {
return -1;
}
fish_wcwidth(c).max(0)
fish_wcwidth(c).unwrap_or_default().try_into().unwrap()
}
#[cfg(test)]

View File

@@ -7,6 +7,7 @@ repository.workspace = true
license.workspace = true
[dependencies]
libc.workspace = true
unicode-width.workspace = true
widestring.workspace = true

View File

@@ -6,16 +6,34 @@
pub mod word_char;
use std::{iter, slice};
use std::{
ffi::{CStr, CString, OsStr, OsString},
iter,
os::unix::ffi::{OsStrExt as _, OsStringExt as _},
slice,
};
pub use widestring::{Utf32Str as wstr, Utf32String as WString, utf32str as L, utfstr::CharsUtf32};
pub mod prelude {
pub use crate::{IntoCharIter, L, ToWString, WExt, WString, wstr};
}
// Highest legal ASCII value.
pub const ASCII_MAX: char = 127 as char;
// Highest legal 16-bit Unicode value.
pub const UCS2_MAX: char = '\u{FFFF}';
// Highest legal byte value.
pub const BYTE_MAX: char = 0xFF as char;
// Unicode BOM value.
pub const UTF8_BOM_WCHAR: char = '\u{FEFF}';
/// The character to use where the text has been truncated.
pub const ELLIPSIS_CHAR: char = '\u{2026}'; // ('…')
pub const SPECIAL_KEY_ENCODE_BASE: char = '\u{F500}';
// These are in the Unicode private-use range. We really shouldn't use this
// range but have little choice in the matter given how our lexer/parser works.
// We can't use non-characters for these two ranges because there are only 66 of
@@ -28,9 +46,79 @@ pub mod prelude {
// Note: We don't use the highest 8 bit range (0xF800 - 0xF8FF) because we know
// of at least one use of a codepoint in that range: the Apple symbol (0xF8FF)
// on Mac OS X. See http://www.unicode.org/faq/private_use.html.
pub const ENCODE_DIRECT_BASE: char = '\u{F600}';
pub const ENCODE_DIRECT_BASE: char = char_offset(SPECIAL_KEY_ENCODE_BASE, 256);
pub const ENCODE_DIRECT_END: char = char_offset(ENCODE_DIRECT_BASE, 256);
// Use Unicode "non-characters" for internal characters as much as we can. This
// gives us 32 "characters" for internal use that we can guarantee should not
// appear in our input stream. See http://www.unicode.org/faq/private_use.html.
pub const RESERVED_CHAR_BASE: char = '\u{FDD0}';
pub const RESERVED_CHAR_END: char = '\u{FDF0}';
// Split the available non-character values into two ranges to ensure there are
// no conflicts among the places we use these special characters.
pub const EXPAND_RESERVED_BASE: char = RESERVED_CHAR_BASE;
pub const EXPAND_RESERVED_END: char = char_offset(EXPAND_RESERVED_BASE, 16);
pub const WILDCARD_RESERVED_BASE: char = EXPAND_RESERVED_END;
pub const WILDCARD_RESERVED_END: char = char_offset(WILDCARD_RESERVED_BASE, 16);
// Make sure the ranges defined above don't exceed the range for non-characters.
// This is to make sure we didn't do something stupid in subdividing the
// Unicode range for our needs.
const _: () = assert!(WILDCARD_RESERVED_END <= RESERVED_CHAR_END);
/// Character representing any character except '/' (slash).
pub const ANY_CHAR: char = char_offset(WILDCARD_RESERVED_BASE, 0);
/// Character representing any character string not containing '/' (slash).
pub const ANY_STRING: char = char_offset(WILDCARD_RESERVED_BASE, 1);
/// Character representing any character string.
pub const ANY_STRING_RECURSIVE: char = char_offset(WILDCARD_RESERVED_BASE, 2);
/// This is a special pseudo-char that is not used other than to mark the
/// end of the special characters so we can sanity check the enum range.
#[allow(dead_code)]
pub const ANY_SENTINEL: char = char_offset(WILDCARD_RESERVED_BASE, 3);
/// Character representing a home directory.
pub const HOME_DIRECTORY: char = char_offset(EXPAND_RESERVED_BASE, 0);
/// Character representing process expansion for %self.
pub const PROCESS_EXPAND_SELF: char = char_offset(EXPAND_RESERVED_BASE, 1);
/// Character representing variable expansion.
pub const VARIABLE_EXPAND: char = char_offset(EXPAND_RESERVED_BASE, 2);
/// Character representing variable expansion into a single element.
pub const VARIABLE_EXPAND_SINGLE: char = char_offset(EXPAND_RESERVED_BASE, 3);
/// Character representing the start of a bracket expansion.
pub const BRACE_BEGIN: char = char_offset(EXPAND_RESERVED_BASE, 4);
/// Character representing the end of a bracket expansion.
pub const BRACE_END: char = char_offset(EXPAND_RESERVED_BASE, 5);
/// Character representing separation between two bracket elements.
pub const BRACE_SEP: char = char_offset(EXPAND_RESERVED_BASE, 6);
/// Character that takes the place of any whitespace within non-quoted text in braces
pub const BRACE_SPACE: char = char_offset(EXPAND_RESERVED_BASE, 7);
/// Separate subtokens in a token with this character.
pub const INTERNAL_SEPARATOR: char = char_offset(EXPAND_RESERVED_BASE, 8);
/// Character representing an empty variable expansion. Only used transitively while expanding
/// variables.
pub const VARIABLE_EXPAND_EMPTY: char = char_offset(EXPAND_RESERVED_BASE, 9);
const _: () = assert!(
EXPAND_RESERVED_END as u32 > VARIABLE_EXPAND_EMPTY as u32,
"Characters used in expansions must stay within private use area"
);
/// The string represented by PROCESS_EXPAND_SELF
pub const PROCESS_EXPAND_SELF_STR: &wstr = L!("%self");
/// Return true if the character is in a range reserved for fish's private use.
///
/// NOTE: This is used when tokenizing the input. It is also used when reading input, before
/// tokenization, to replace such chars with REPLACEMENT_WCHAR if they're not part of a quoted
/// string. We don't want external input to be able to feed reserved characters into our
/// lexer/parser or code evaluator.
//
// TODO: Actually implement the replacement as documented above.
pub fn fish_reserved_codepoint(c: char) -> bool {
(c >= RESERVED_CHAR_BASE && c < RESERVED_CHAR_END)
|| (c >= SPECIAL_KEY_ENCODE_BASE && c < ENCODE_DIRECT_END)
}
/// Encode a literal byte in a UTF-32 character. This is required for e.g. the echo builtin, whose
/// escape sequences can be used to construct raw byte sequences which are then interpreted as e.g.
/// UTF-8 by the terminal. If we were to interpret each of those bytes as a codepoint and encode it
@@ -43,6 +131,86 @@ pub fn encode_byte_to_char(byte: u8) -> char {
.expect("private-use codepoint should be valid char")
}
/// Returns a newly allocated multibyte character string equivalent of the specified wide character
/// string.
///
/// This function decodes illegal character sequences in a reversible way using the private use
/// area.
pub fn wcs2bytes(input: impl IntoCharIter) -> Vec<u8> {
let mut result = vec![];
wcs2bytes_appending(&mut result, input);
result
}
pub fn wcs2osstring(input: &wstr) -> OsString {
if input.is_empty() {
return OsString::new();
}
let mut result = vec![];
wcs2bytes_appending(&mut result, input);
OsString::from_vec(result)
}
/// Same as [`wcs2bytes`]. Meant to be used when we need a zero-terminated string to feed legacy APIs.
/// Note: if `input` contains any interior NUL bytes, the result will be truncated at the first!
pub fn wcs2zstring(input: &wstr) -> CString {
if input.is_empty() {
return CString::default();
}
let mut vec = Vec::with_capacity(input.len() + 1);
str2bytes_callback(input, |buff| {
vec.extend_from_slice(buff);
true
});
vec.push(b'\0');
match CString::from_vec_with_nul(vec) {
Ok(cstr) => cstr,
Err(err) => {
// `input` contained a NUL in the middle; we can retrieve `vec`, though
let mut vec = err.into_bytes();
let pos = vec.iter().position(|c| *c == b'\0').unwrap();
vec.truncate(pos + 1);
// Safety: We truncated after the first NUL
unsafe { CString::from_vec_with_nul_unchecked(vec) }
}
}
}
/// Like [`wcs2bytes`], but appends to `output` instead of returning a new string.
pub fn wcs2bytes_appending(output: &mut Vec<u8>, input: impl IntoCharIter) {
str2bytes_callback(input, |buff| {
output.extend_from_slice(buff);
true
});
}
/// Implementation of wcs2bytes that accepts a callback.
/// The first argument can be either a `&str` or `&wstr`.
/// This invokes `func` with byte slices containing the UTF-8 encoding of the characters in the
/// input, doing one invocation per character.
/// If `func` returns false, it stops; otherwise it continues.
/// Return false if the callback returned false, otherwise true.
pub fn str2bytes_callback(input: impl IntoCharIter, mut func: impl FnMut(&[u8]) -> bool) -> bool {
// A `char` represents a Unicode scalar value, which takes up at most 4 bytes when encoded in UTF-8.
let mut converted = [0_u8; 4];
for c in input.chars() {
let bytes = if let Some(byte) = decode_byte_from_char(c) {
converted[0] = byte;
&converted[..=0]
} else {
c.encode_utf8(&mut converted).as_bytes()
};
if !func(bytes) {
return false;
}
}
true
}
/// Decode a literal byte from a UTF-32 character.
pub fn decode_byte_from_char(c: char) -> Option<u8> {
if c >= ENCODE_DIRECT_BASE && c < ENCODE_DIRECT_END {
@@ -56,6 +224,65 @@ pub fn decode_byte_from_char(c: char) -> Option<u8> {
}
}
/// A trait to make it more convenient to pass ascii/Unicode strings to functions that can take
/// non-Unicode values. The result is nul-terminated and can be passed to OS functions.
///
/// This is only implemented for owned types where an owned instance will skip allocations (e.g.
/// `CString` can return `self`) but not implemented for owned instances where a new allocation is
/// always required (e.g. implemented for `&wstr` but not `WideString`) because you might as well be
/// left with the original item if we're going to allocate from scratch in all cases.
pub trait ToCString {
/// Correctly convert to a nul-terminated [`CString`] that can be passed to OS functions.
fn to_cstring(self) -> CString;
}
impl ToCString for CString {
fn to_cstring(self) -> CString {
self
}
}
impl ToCString for &CStr {
fn to_cstring(self) -> CString {
self.to_owned()
}
}
/// Safely converts from `&wstr` to a nul-terminated `CString` that can be passed to
/// OS functions, taking into account non-Unicode values that have been shifted into the private-use
/// range by using [`wcs2zstring()`].
impl ToCString for &wstr {
/// The wide string may contain non-Unicode bytes mapped to the private-use Unicode range, so we
/// have to use [`wcs2zstring()`](self::wcs2zstring) to convert it correctly.
fn to_cstring(self) -> CString {
self::wcs2zstring(self)
}
}
/// Safely converts from `&WString` to a nul-terminated `CString` that can be passed to OS
/// functions, taking into account non-Unicode values that have been shifted into the private-use
/// range by using [`wcs2zstring()`].
impl ToCString for &WString {
fn to_cstring(self) -> CString {
self.as_utfstr().to_cstring()
}
}
/// Convert a (probably ascii) string to CString that can be passed to OS functions.
impl ToCString for Vec<u8> {
fn to_cstring(mut self) -> CString {
self.push(b'\0');
CString::from_vec_with_nul(self).unwrap()
}
}
/// Convert a (probably ascii) string to nul-terminated CString that can be passed to OS functions.
impl ToCString for &[u8] {
fn to_cstring(self) -> CString {
CString::new(self).unwrap()
}
}
mod decoder {
use crate::{ENCODE_DIRECT_BASE, ENCODE_DIRECT_END, char_offset, wstr};
use buffer::Buffer;
@@ -276,6 +503,82 @@ pub const fn char_offset(base: char, offset: u32) -> char {
}
}
/// Encodes the bytes in `input` into a [`WString`], encoding non-UTF-8 bytes into private-use-area
/// code-points. Bytes which would be parsed into our reserved PUA range are encoded individually,
/// to allow for correct round-tripping.
pub fn bytes2wcstring(mut input: &[u8]) -> WString {
if input.is_empty() {
return WString::new();
}
let mut result = WString::with_capacity(input.len());
fn append_escaped_str(output: &mut WString, input: &str) {
for (i, c) in input.char_indices() {
if fish_reserved_codepoint(c) {
for byte in &input.as_bytes()[i..i + c.len_utf8()] {
output.push(encode_byte_to_char(*byte));
}
} else {
output.push(c);
}
}
}
while !input.is_empty() {
match std::str::from_utf8(input) {
Ok(parsed_str) => {
append_escaped_str(&mut result, parsed_str);
// The entire remaining input could be parsed, so we are done.
break;
}
Err(e) => {
let (valid, after_valid) = input.split_at(e.valid_up_to());
// SAFETY: The previous `str::from_utf8` call established that the prefix `valid`
// is valid UTF-8. This prefix may be empty.
let parsed_str = unsafe { std::str::from_utf8_unchecked(valid) };
append_escaped_str(&mut result, parsed_str);
// The length of the prefix of `after_valid` which is invalid UTF-8.
// The remaining bytes of `input` (if any) will be parsed in subsequent iterations
// of the loop, starting from the first byte that starts a valid UTF-8-encoded codepoint.
// `error_len` can return `None`, if it sees a byte sequence that could be the
// prefix of a valid code-point encoding at the end of the byte slice.
// This is useful when the input is chunked, but we don't do that, so in this case
// we use our custom encoding for all remaining bytes (at most 3).
let error_len = e.error_len().unwrap_or(after_valid.len());
for byte in &after_valid[..error_len] {
result.push(encode_byte_to_char(*byte));
}
input = &after_valid[error_len..];
}
}
}
result
}
/// Use this rather than [`WString::from_str`] when the input could contain PUA bytes we use to
/// encode non-UTF-8 bytes. Otherwise, when decoding the resulting [`WString`], the PUA bytes in
/// the input would be converted to non-UTF-8 bytes.
pub fn str2wcstring<S: AsRef<str>>(input: S) -> WString {
bytes2wcstring(input.as_ref().as_bytes())
}
pub fn cstr2wcstring<C: AsRef<CStr>>(input: C) -> WString {
bytes2wcstring(input.as_ref().to_bytes())
}
pub fn osstr2wcstring<O: AsRef<OsStr>>(input: O) -> WString {
bytes2wcstring(input.as_ref().as_bytes())
}
/// # SAFETY
///
/// `input` must point to a valid NUL-terminated string.
pub unsafe fn charptr2wcstring(input: *const libc::c_char) -> WString {
let input: &[u8] = unsafe { CStr::from_ptr(input).to_bytes() };
bytes2wcstring(input)
}
/// Finds `needle` in a `haystack` and returns the index of the first matching element, if any.
///
/// # Examples
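The new `bytes2wcstring` above leans on `Utf8Error::valid_up_to`/`error_len` to copy valid UTF-8 runs wholesale and encode only the offending bytes into the private-use area. A stripped-down sketch of that loop — `ENCODE_BASE` stands in for `ENCODE_DIRECT_BASE`, and the reserved-codepoint escaping of `append_escaped_str` is omitted:

```rust
// Stand-in for fish's ENCODE_DIRECT_BASE (private-use area).
const ENCODE_BASE: u32 = 0xF600;

fn bytes_to_chars(mut input: &[u8]) -> Vec<char> {
    let mut out = Vec::new();
    while !input.is_empty() {
        match std::str::from_utf8(input) {
            Ok(s) => {
                // Entire remainder is valid UTF-8: copy and stop.
                out.extend(s.chars());
                break;
            }
            Err(e) => {
                let (valid, rest) = input.split_at(e.valid_up_to());
                // SAFETY: from_utf8 just validated this prefix.
                out.extend(unsafe { std::str::from_utf8_unchecked(valid) }.chars());
                // error_len() is None for a truncated sequence at the
                // end of the slice; encode all remaining bytes then.
                let bad = e.error_len().unwrap_or(rest.len());
                for &b in &rest[..bad] {
                    out.push(char::from_u32(ENCODE_BASE + b as u32).unwrap());
                }
                input = &rest[bad..];
            }
        }
    }
    out
}

fn main() {
    // 0xFF is never valid UTF-8; it round-trips via the PUA range.
    let chars = bytes_to_chars(b"a\xffb");
    assert_eq!(chars.len(), 3);
    assert_eq!(chars[1] as u32, ENCODE_BASE + 0xFF);
    println!("ok");
}
```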

View File

@@ -7,7 +7,12 @@ repository.workspace = true
[dependencies]
anstyle.workspace = true
anyhow.workspace = true
clap.workspace = true
fish-build-helper.workspace = true
fish-common.workspace = true
fish-tempfile.workspace = true
fish-widestring.workspace = true
ignore.workspace = true
pcre2.workspace = true
walkdir.workspace = true

View File

@@ -1,4 +1,5 @@
use anstyle::{AnsiColor, Style};
use anyhow::{Context, Result, bail};
use clap::Args;
use std::{
io::{ErrorKind, Write},
@@ -25,12 +26,12 @@ pub struct FormatArgs {
paths: Vec<PathBuf>,
}
pub fn format(args: FormatArgs) {
pub fn format(args: FormatArgs) -> Result<()> {
if !args.all && args.paths.is_empty() {
println!(
"{YELLOW}warning: No paths specified. Nothing to do. Use the \"--all\" flag to consider all eligible files.{YELLOW:#}"
);
return;
return Ok(());
}
if !args.force && !args.check {
match Command::new("git")
@@ -39,16 +40,22 @@ pub fn format(args: FormatArgs) {
{
Ok(output) => {
if !output.stdout.is_empty() {
std::io::stdout().write_all(&output.stdout).unwrap();
std::io::stdout()
.write_all(&output.stdout)
.context("Could not write to stdout.")?;
print!(
"You have uncommitted changes (listed above). Are you sure you want to format? (y/N): "
);
std::io::stdout().flush().unwrap();
std::io::stdout()
.flush()
.context("Could not flush stdout.")?;
let mut response = String::new();
std::io::stdin().read_line(&mut response).unwrap();
std::io::stdin()
.read_line(&mut response)
.context("Could not read from stdin.")?;
if response.trim_end() != "y" {
println!("Exiting without formatting.");
return;
return Ok(());
}
}
}
@@ -58,22 +65,25 @@ pub fn format(args: FormatArgs) {
"{YELLOW}warning: Did not find git, will proceed without checking for unstaged changes.{YELLOW:#}"
)
} else {
fail!("Failed to run git status:\n{e}")
bail!("Failed to run git status:\n{e}");
}
}
}
}
format_fish(&args);
format_python(&args);
format_rust(&args);
format_fish(&args)?;
format_python(&args)?;
format_rust(&args)?;
Ok(())
}
fn run_formatter(formatter: &mut Command, name: &str) {
fn run_formatter(formatter: &mut Command, name: &str) -> Result<()> {
println!("=== Running {GREEN}{name}{GREEN:#}");
match formatter.status() {
Ok(exit_status) => {
if !exit_status.success() {
fail!("{name:?}: Files are not formatted correctly.");
if exit_status.success() {
Ok(())
} else {
bail!("{name:?}: Files are not formatted correctly.");
}
}
Err(e) => {
@@ -81,15 +91,16 @@ fn run_formatter(formatter: &mut Command, name: &str) {
eprintln!(
"{YELLOW}Formatter not found: {name:?}. Skipping associated files.{YELLOW:#}"
);
Ok(())
} else {
fail!("Error occurred while running {name:?}:\n{e}")
Err(e).with_context(|| format!("Error occurred while running {name:?}"))
}
}
}
}
fn format_fish(args: &FormatArgs) {
let mut fish_paths = files_with_extension(&args.paths, "fish");
fn format_fish(args: &FormatArgs) -> Result<()> {
let mut fish_paths = files_with_extension(&args.paths, "fish")?;
if args.all {
let workspace_root = fish_build_helper::workspace_root();
let fish_formatting_dirs = ["benchmarks", "build_tools", "etc", "share"];
@@ -98,10 +109,10 @@ fn format_fish(args: &FormatArgs) {
.iter()
.map(|dir_name| workspace_root.join(dir_name)),
"fish",
));
)?);
};
if fish_paths.is_empty() {
return;
return Ok(());
}
// TODO: make `fish_indent` available as a Rust library function, to avoid needing a
// `fish_indent` binary in `$PATH`.
@@ -113,40 +124,40 @@ fn format_fish(args: &FormatArgs) {
}
formatter.arg("--");
formatter.args(fish_paths);
run_formatter(&mut formatter, "fish_indent");
run_formatter(&mut formatter, "fish_indent")
}
fn format_python(args: &FormatArgs) {
fn format_python(args: &FormatArgs) -> Result<()> {
let mut formatter = Command::new("ruff");
formatter.arg("format");
if args.check {
formatter.arg("--check");
}
let mut python_files = files_with_extension(&args.paths, "py");
let mut python_files = files_with_extension(&args.paths, "py")?;
if args.all {
python_files.push(fish_build_helper::workspace_root().to_owned());
};
if python_files.is_empty() {
return;
return Ok(());
}
formatter.args(python_files);
run_formatter(&mut formatter, "ruff format");
run_formatter(&mut formatter, "ruff format")
}
fn format_rust(args: &FormatArgs) {
fn format_rust(args: &FormatArgs) -> Result<()> {
let rustfmt_status = Command::new("cargo")
.arg("fmt")
.arg("--version")
.stdout(Stdio::null())
.status()
.unwrap();
.context("Failed to run cargo")?;
if !rustfmt_status.success() {
eprintln!(
"{YELLOW}Please install \"rustfmt\" to format Rust, e.g. via:\n\
rustup component add rustfmt{YELLOW:#}"
);
return;
return Ok(());
}
if args.all {
let mut formatter = Command::new("cargo");
@@ -155,16 +166,17 @@ fn format_rust(args: &FormatArgs) {
if args.check {
formatter.arg("--check");
}
run_formatter(&mut formatter, "cargo fmt");
run_formatter(&mut formatter, "cargo fmt")?;
}
let rust_files = files_with_extension(&args.paths, "rs");
if !rust_files.is_empty() {
let mut formatter = Command::new("rustfmt");
if args.check {
formatter.arg("--check");
formatter.arg("--files-with-diff");
}
formatter.args(rust_files);
run_formatter(&mut formatter, "rustfmt");
let rust_files = files_with_extension(&args.paths, "rs")?;
if rust_files.is_empty() {
return Ok(());
}
let mut formatter = Command::new("rustfmt");
if args.check {
formatter.arg("--check");
formatter.arg("--files-with-diff");
}
formatter.args(rust_files);
run_formatter(&mut formatter, "rustfmt")
}
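The diff above replaces the exit-on-error `fail!` macro with propagated `anyhow` errors. A minimal stdlib sketch of the same pattern, using `std::io::Error` in place of `anyhow` (which is not assumed here), shows the behavior `run_formatter` preserves:

```rust
use std::io::ErrorKind;
use std::process::Command;

// Run a formatter: a missing binary is skipped (Ok), a failing one is an error.
// Mirrors the run_formatter logic above, minus the anyhow context wrappers.
fn run_formatter(formatter: &mut Command, name: &str) -> std::io::Result<()> {
    match formatter.status() {
        Ok(exit_status) if exit_status.success() => Ok(()),
        Ok(_) => Err(std::io::Error::new(
            ErrorKind::Other,
            format!("{name:?}: Files are not formatted correctly."),
        )),
        Err(e) if e.kind() == ErrorKind::NotFound => {
            eprintln!("Formatter not found: {name:?}. Skipping associated files.");
            Ok(())
        }
        Err(e) => Err(e),
    }
}

fn main() {
    // `true` exists and succeeds; a nonsense binary name is skipped, not fatal.
    assert!(run_formatter(&mut Command::new("true"), "true").is_ok());
    assert!(run_formatter(&mut Command::new("no-such-binary-xyz"), "nope").is_ok());
    // `false` exists but exits non-zero, which counts as a formatting failure.
    assert!(run_formatter(&mut Command::new("false"), "false").is_err());
}
```

The key behavioral point, kept from the original: an absent formatter is a skip, not a hard failure.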

crates/xtask/src/gettext.rs Normal file

@@ -0,0 +1,548 @@
use crate::{CARGO, CommandExt, files_with_extension};
use anyhow::{Context as _, Result, bail};
use clap::{Args, Subcommand};
use fish_build_helper::po_dir;
use pcre2::bytes::Regex;
use std::{
fs::OpenOptions,
io::{Write as _, stdout},
path::{Path, PathBuf},
process::Command,
thread::spawn,
};
#[derive(Args)]
pub struct GettextArgs {
/// Path to the directory into which the messages from the Rust sources have been extracted.
/// If this is not specified, fish will be compiled with the `gettext-extract` feature to
/// obtain the messages.
#[arg(long)]
rust_extraction_dir: Option<PathBuf>,
#[command(subcommand)]
task: Task,
}
#[derive(Subcommand)]
enum Task {
/// Check whether the PO files are up to date.
/// Prints a diff and exits non-zero if they are outdated.
/// Considers all our PO files by default; which files to consider can also be specified
/// explicitly.
Check { paths: Vec<PathBuf> },
/// Add a PO file for a new language.
New {
/// An ISO 639-1 language identifier (ISO 639-2 if the former does not exist),
/// optionally followed by an underscore and an ISO 3166-1 country code to specify the variant,
/// e.g. `de` or `pt_BR`.
language: String,
},
/// Update PO files.
/// This will delete entries for msgids which are no longer used in the sources and introduce
/// new, untranslated entries for messages which do not have an entry yet.
/// This tool should run every time the set of messages localized via gettext changes,
/// including changes to fish script files, where many strings are implicitly localized.
/// Considers all our PO files by default; which files to consider can also be specified
/// explicitly.
Update { paths: Vec<PathBuf> },
}
fn get_po_paths<P: AsRef<Path>>(specified_paths: &[P]) -> Result<Vec<PathBuf>> {
let extension = "po";
if specified_paths.is_empty() {
files_with_extension([po_dir()], extension)
} else {
files_with_extension(specified_paths, extension)
}
}
fn update_po_file<P: AsRef<Path>, Q: AsRef<Path>>(file_to_update: P, template: Q) -> Result<()> {
Command::new("msgmerge")
.args([
"--no-wrap",
"--update",
"--no-fuzzy-matching",
"--backup=none",
"--quiet",
])
.arg(file_to_update.as_ref())
.arg(template.as_ref())
.run()?;
let msgattrib_output_file = fish_tempfile::new_file().context("Failed to create temp file")?;
Command::new("msgattrib")
.args(["--no-wrap", "--no-obsolete"])
.arg("-o")
.arg(msgattrib_output_file.path())
.arg(file_to_update.as_ref())
.run()?;
crate::copy_file(msgattrib_output_file.path(), file_to_update.as_ref())?;
Ok(())
}
pub fn gettext(args: GettextArgs) -> Result<()> {
let template = match args.rust_extraction_dir {
Some(dir) => template::Template::new(dir)?,
None => {
let temp_dir = fish_tempfile::new_dir().context("Failed to create temp file")?;
Command::new(CARGO)
.args([
"check",
"--workspace",
"--all-targets",
"--features=gettext-extract",
])
.env("FISH_GETTEXT_EXTRACTION_DIR", temp_dir.path())
.run()?;
template::Template::new(temp_dir.path())?
}
};
let mut template_file = fish_tempfile::new_file().context("Failed to create temp file")?;
template_file
.get_mut()
.write_all(template.serialize())
.with_context(|| format!("Failed to write to temp file {:?}", template_file.path()))?;
template_file
.get_mut()
.flush()
.with_context(|| format!("Failed to flush temporary file {:?}", template_file.path()))?;
match args.task {
Task::Check { paths } => {
let mut thread_handles = vec![];
for path in get_po_paths(&paths)? {
let template_path_buf = template_file.path().to_owned();
let handle = spawn(move || -> Result<Option<Vec<u8>>> {
let tmp_copy =
fish_tempfile::new_file().context("Failed to create temp file")?;
crate::copy_file(&path, tmp_copy.path())?;
update_po_file(tmp_copy.path(), template_path_buf)?;
let diff_output = Command::new("diff")
.arg("-u")
.arg(&path)
.arg(tmp_copy.path())
.output()
.context("Failed to run diff")?;
if diff_output.status.success() {
Ok(None)
} else {
Ok(Some(diff_output.stdout))
}
});
thread_handles.push(handle);
}
let mut found_diff = false;
let mut error = None;
for handle in thread_handles {
// SAFETY: `handle.join()` only returns `Err` if the thread panicked.
// Our threads should not panic, and if they do, it's OK to deal with the unexpected
// behavior by panicking here as well.
match handle.join().unwrap() {
Ok(None) => {}
Ok(Some(diff)) => {
found_diff = true;
stdout()
.write_all(&diff)
.context("Could not write to stdout")?;
}
Err(e) => {
error = Some(e);
}
}
}
if let Some(e) = error {
return Err(e);
}
if found_diff {
bail!("Not all files are up to date");
}
Ok(())
}
Task::New { language } => {
let language_regex = Regex::new("^[a-z]{2,3}(_[A-Z]{2})?$").unwrap();
if !language_regex.is_match(language.as_bytes()).unwrap() {
bail!(
"The language name '{language}' does not match the expected format.\n\
It needs to be a two-letter ISO 639-1 language code, \
or a three-letter ISO 639-2 language code \
if no ISO 639-1 code exists for the language.\n\
Optionally, the language code can be followed by an underscore \
followed by an ISO 3166-1 country code to indicate a regional variant.\n\
Check the existing file names in {:?} for examples.",
po_dir()
);
}
// TODO (MSRV>=1.91): use with_added_extension instead of with_extension
let po_path = po_dir().join(&language).with_extension("po");
let mut new_po_file = OpenOptions::new()
.create_new(true)
.write(true)
.open(&po_path)
.with_context(|| format!("Failed to create file at {po_path:?}"))?;
let mut header = String::new();
let line_prefix = "# fish-note-sections: ";
let lines = [
"Translations are divided into sections, each starting with a fish-section-* pseudo-message.",
"The first few sections are more important.",
"Ignore the tier3 sections unless you have a lot of time.",
];
for line in lines {
use std::fmt::Write as _;
let _ = writeln!(header, "{line_prefix}{line}");
}
new_po_file
.write_all(header.as_bytes())
.with_context(|| format!("Failed to write to {po_path:?}"))?;
new_po_file
.write_all(template.serialize())
.with_context(|| format!("Failed to write to {po_path:?}"))?;
Ok(())
}
Task::Update { paths } => {
let mut thread_handles = vec![];
for path in get_po_paths(&paths)? {
let template_path_buf = template_file.path().to_owned();
let handle =
spawn(move || -> Result<()> { update_po_file(path, template_path_buf) });
thread_handles.push(handle);
}
let mut error = None;
for handle in thread_handles {
// SAFETY: `handle.join()` only returns `Err` if the thread panicked.
// Our threads should not panic, and if they do, it's OK to deal with the unexpected
// behavior by panicking here as well.
if let Err(e) = handle.join().unwrap() {
error = Some(e);
}
}
if let Some(e) = error {
return Err(e);
}
Ok(())
}
}
}
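For illustration, the `^[a-z]{2,3}(_[A-Z]{2})?$` check in `Task::New` can be approximated without the pcre2 dependency; this sketch is not the code fish uses, just the same acceptance rule spelled out:

```rust
// Stdlib approximation of the language-code regex ^[a-z]{2,3}(_[A-Z]{2})?$:
// a 2-3 letter lowercase code, optionally "_" plus a 2-letter uppercase region.
fn is_valid_language_code(s: &str) -> bool {
    let (lang, region) = match s.split_once('_') {
        Some((l, r)) => (l, Some(r)),
        None => (s, None),
    };
    let lang_ok =
        (2..=3).contains(&lang.len()) && lang.chars().all(|c| c.is_ascii_lowercase());
    let region_ok = match region {
        None => true,
        Some(r) => r.len() == 2 && r.chars().all(|c| c.is_ascii_uppercase()),
    };
    lang_ok && region_ok
}

fn main() {
    assert!(is_valid_language_code("de"));
    assert!(is_valid_language_code("pt_BR"));
    assert!(!is_valid_language_code("pt-BR")); // hyphen form is rejected
    assert!(!is_valid_language_code("d"));
}
```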
mod template {
use crate::{CommandExt as _, files_with_extension};
use anyhow::{Context as _, Result, bail};
use fish_build_helper::workspace_root;
use fish_common::{UnescapeFlags, unescape_string};
use fish_widestring::{str2wcstring, wcs2bytes};
use pcre2::bytes::Regex;
use std::{
collections::{HashMap, HashSet},
fmt::Display,
fs::OpenOptions,
io::Read as _,
path::Path,
process::Command,
sync::LazyLock,
};
// Gettext tools require this header to know which encoding is used.
const MINIMAL_HEADER: &str = r#"msgid ""
msgstr "Content-Type: text/plain; charset=UTF-8\n"
"#;
#[derive(PartialEq, Eq, PartialOrd, Ord, Clone, Copy, Hash)]
enum LocalizationTier {
Tier1,
Tier2,
Tier3,
}
impl TryFrom<&str> for LocalizationTier {
type Error = ();
fn try_from(value: &str) -> Result<Self, Self::Error> {
match value {
"tier1" => Ok(Self::Tier1),
"tier2" => Ok(Self::Tier2),
"tier3" => Ok(Self::Tier3),
_ => Err(()),
}
}
}
impl Display for LocalizationTier {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.write_str(match self {
Self::Tier1 => "tier1",
Self::Tier2 => "tier2",
Self::Tier3 => "tier3",
})
}
}
#[derive(Default)]
struct FishScriptMessages {
explicit: HashSet<String>,
implicit: HashSet<String>,
}
pub struct Template {
content: Vec<u8>,
}
impl Template {
pub fn serialize(&self) -> &[u8] {
&self.content
}
/// Create a gettext template.
/// `rust_extraction_dir` must be the path to a directory which contains the messages
/// extracted from the Rust sources.
pub fn new<P: AsRef<Path>>(rust_extraction_dir: P) -> Result<Self> {
let mut template = Self {
content: Vec::from(MINIMAL_HEADER.as_bytes()),
};
template.add_rust_messages(rust_extraction_dir)?;
template.add_fish_script_messages()?;
// TODO: keep internal set of msgids to avoid having to run msguniq. requires parsing
// gettext-extraction output
let msguniq_output = Command::new("msguniq")
.args(["--no-wrap"])
.run_with_stdio(template.content)?;
Ok(Template {
content: msguniq_output,
})
}
/// Expects `extraction_dir` to contain only files whose contents are single PO entries which
/// can be concatenated into a valid PO file.
/// If this is the case, the messages are de-duplicated and sorted by `msguniq`.
/// The result is appended to `template`, with a leading section marker.
/// On failure, an error is returned.
fn add_rust_messages<P: AsRef<Path>>(&mut self, extraction_dir: P) -> Result<()> {
let extraction_dir = extraction_dir.as_ref();
let mut concatenated_content = Vec::from(MINIMAL_HEADER.as_bytes());
// Concatenate the content of all files in `extraction_dir` into `concatenated_content`.
for entry_result in extraction_dir
.read_dir()
.with_context(|| format!("Failed to read directory {extraction_dir:?}"))?
{
let entry = entry_result
.with_context(|| format!("Failed to get entry in {extraction_dir:?}"))?;
let entry_path = entry.path();
if !entry
.file_type()
.with_context(|| format!("Failed to get file type of {entry_path:?}"))?
.is_file()
{
bail!("Entry in {extraction_dir:?} is not a regular file");
}
let mut file = OpenOptions::new()
.read(true)
.open(&entry_path)
.with_context(|| format!("Failed to open file {entry_path:?}"))?;
file.read_to_end(&mut concatenated_content)
.with_context(|| format!("Failed to read file {entry_path:?}"))?;
}
// Get rid of duplicates and sort.
let msguniq_output = Command::new("msguniq")
.args(["--no-wrap", "--sort-output"])
.env("LC_ALL", "C.UTF-8")
.run_with_stdio(concatenated_content)?;
// The Header entry needs to be removed again,
// because it is added outside of this function.
let expected_prefix = MINIMAL_HEADER.as_bytes();
let actual_prefix = &msguniq_output[..expected_prefix.len()];
if expected_prefix != actual_prefix {
bail!(
"Prefix of msguniq output does not match expected header.\nExpected bytes:\n{expected_prefix:02x?}\nActual bytes:\n{actual_prefix:02x?}"
);
}
self.mark_section("tier1-from-rust");
self.content
.extend_from_slice(&msguniq_output[expected_prefix.len()..]);
self.content.push(b'\n');
Ok(())
}
fn mark_section(&mut self, section_name: &str) {
self.content
.extend_from_slice("msgid \"fish-section-".as_bytes());
self.content.extend_from_slice(section_name.as_bytes());
self.content
.extend_from_slice("\"\nmsgstr \"\"\n\n".as_bytes());
}
fn append_messages(&mut self, msgids: &HashSet<String>) -> Result<()> {
let mut unescaped_msgids = HashSet::new();
for msgid in msgids {
let unescaped_wstring = unescape_string(
&str2wcstring(msgid),
fish_common::UnescapeStringStyle::Script(UnescapeFlags::default()),
)
.with_context(|| format!("Failed to unescape the following string:\n{msgid}"))?;
unescaped_msgids.insert(
String::from_utf8(wcs2bytes(&unescaped_wstring))
.context("Parsed msgid is not valid UTF-8")?,
);
}
let mut unescaped_msgids = Vec::from_iter(unescaped_msgids);
unescaped_msgids.sort();
for msgid in &unescaped_msgids {
self.content
.extend_from_slice(format_msgid_for_po(msgid).as_bytes());
}
Ok(())
}
fn add_script_tier(
&mut self,
tier: LocalizationTier,
messages: FishScriptMessages,
) -> Result<()> {
if !messages.explicit.is_empty() {
self.mark_section(&format!("{tier}-from-script-explicitly-added"));
self.append_messages(&messages.explicit)?;
}
if !messages.implicit.is_empty() {
self.mark_section(&format!("{tier}-from-script-implicitly-added"));
self.append_messages(&messages.implicit)?;
}
Ok(())
}
fn add_fish_script_messages(&mut self) -> Result<()> {
let share_dir = workspace_root().join("share");
let relevant_file_paths = files_with_extension(
[
share_dir.join("config.fish"),
share_dir.join("completions"),
share_dir.join("functions"),
],
"fish",
)?;
let mut extracted_messages = HashMap::new();
for path in relevant_file_paths {
extract_messages_from_fish_script(path, &mut extracted_messages)?;
}
let mut messages_sorted_by_tier: Vec<_> = extracted_messages.into_iter().collect();
messages_sorted_by_tier.sort_by_key(|(tier, _)| *tier);
for (tier, messages) in messages_sorted_by_tier {
self.add_script_tier(tier, messages)?;
}
Ok(())
}
}
fn find_localization_tier<P: AsRef<Path>>(
input: &str,
path: P,
) -> Result<Option<LocalizationTier>> {
static L10N_ANNOTATION: LazyLock<Regex> = LazyLock::new(|| {
Regex::new(r"(?:^|\n)# localization: (?<annotation_value>.*)\n").unwrap()
});
if let Some(annotation) = L10N_ANNOTATION.captures(input.as_bytes()).unwrap() {
// SAFETY: `annotation_value` is the name of a capture group in the regex whose captures we are
// looking at. The capture is done on the bytes of a UTF-8 encoded string, so the result will
// also be UTF-8 encoded, and the sub-slice we are looking at will start and end at codepoint
// boundaries.
let annotation_value =
std::str::from_utf8(annotation.name("annotation_value").unwrap().as_bytes())
.unwrap();
if let Ok(tier) = LocalizationTier::try_from(annotation_value) {
return Ok(Some(tier));
}
if annotation_value.starts_with("skip") {
return Ok(None);
}
bail!(
"Unexpected localization annotation in file {:?}: {annotation_value}",
path.as_ref()
);
}
let dirname = path
.as_ref()
.parent()
.with_context(|| {
format!(
"Tried to get the parent of a path which does not have a parent: {:?}",
path.as_ref()
)
})?
.file_name()
.with_context(|| {
format!(
"The parent of {:?} does not have a filename component",
path.as_ref()
)
})?;
let command_name = path
.as_ref()
.file_stem()
.with_context(|| format!("The path {:?} does not have a file stem", path.as_ref()))?;
if dirname == "functions"
&& command_name
.to_str()
.is_some_and(|name| name.starts_with("fish_"))
{
return Ok(Some(LocalizationTier::Tier1));
}
if dirname != "completions" {
bail!(
"Missing localization tier for function file {:?}",
path.as_ref()
);
}
// TODO (MSRV>=1.91): use with_added_extension instead of with_extension
let doc_path = workspace_root()
.join("doc_src")
.join("cmds")
.join(command_name)
.with_extension("rst");
let doc_path_exists = std::fs::exists(&doc_path)
.with_context(|| format!("Failed to check whether a file exists at {doc_path:?}"))?;
Ok(Some(if doc_path_exists {
LocalizationTier::Tier1
} else {
LocalizationTier::Tier3
}))
}
fn extract_messages_from_fish_script<P: AsRef<Path>>(
path: P,
extracted_messages: &mut HashMap<LocalizationTier, FishScriptMessages>,
) -> Result<()> {
let path = path.as_ref();
let file_content = std::fs::read_to_string(path)
.with_context(|| format!("Failed to read from {path:?}"))?;
let Some(tier) = find_localization_tier(&file_content, path)? else {
return Ok(());
};
// TODO: use proper parser instead of regex
static EXPLICIT_MESSAGE: LazyLock<Regex> =
LazyLock::new(|| Regex::new(r#"\( *_ (?<message>(['"]).+?(?<!\\)\2) *\)"#).unwrap());
static IMPLICIT_MESSAGE: LazyLock<Regex> = LazyLock::new(|| {
Regex::new(r#"(?:^|\n)(?:\s|and |or )*(?:complete|function).*? (?:-d|--description) (?<message>(['"]).+?(?<!\\)\2)"#).unwrap()
});
let messages_at_tier = extracted_messages.entry(tier).or_default();
for message in EXPLICIT_MESSAGE.captures_iter(file_content.as_bytes()) {
let message =
std::str::from_utf8(message.unwrap().name("message").unwrap().as_bytes()).unwrap();
messages_at_tier.explicit.insert(message.to_owned());
}
for message in IMPLICIT_MESSAGE.captures_iter(file_content.as_bytes()) {
let message =
std::str::from_utf8(message.unwrap().name("message").unwrap().as_bytes()).unwrap();
messages_at_tier.implicit.insert(message.to_owned());
}
Ok(())
}
fn format_msgid_for_po(msgid: &str) -> String {
let escaped_msgid = msgid.replace("\\", "\\\\").replace("\"", "\\\"");
format!(
"\
msgid \"{escaped_msgid}\"\n\
msgstr \"\"\n\
\n\
"
)
}
}
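As a self-contained illustration of the PO escaping rules implemented in `format_msgid_for_po` above (the same logic, reproduced here so it can run standalone):

```rust
// In PO files, backslashes and double quotes inside a msgid must be escaped;
// backslashes first, so the quote-escaping backslashes are not doubled again.
fn format_msgid_for_po(msgid: &str) -> String {
    let escaped_msgid = msgid.replace("\\", "\\\\").replace("\"", "\\\"");
    format!("msgid \"{escaped_msgid}\"\nmsgstr \"\"\n\n")
}

fn main() {
    let entry = format_msgid_for_po(r#"say "hi\there""#);
    // → msgid "say \"hi\\there\""
    assert!(entry.starts_with("msgid \"say \\\"hi\\\\there\\\"\""));
    assert!(entry.ends_with("msgstr \"\"\n\n"));
}
```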


@@ -1,70 +1,101 @@
use std::{
ffi::OsStr,
io::Write,
path::{Path, PathBuf},
process::Command,
process::{Command, Stdio},
};
use anyhow::{Context, Result, bail};
use walkdir::WalkDir;
macro_rules! fail {
($($arg:tt)+) => {{
eprintln!($($arg)+);
std::process::exit(1);
}}
}
pub mod format;
pub mod gettext;
pub mod shellcheck;
pub trait CommandExt {
fn run_or_fail(&mut self);
fn run(&mut self) -> Result<()>;
fn run_with_stdio(&mut self, stdin: Vec<u8>) -> Result<Vec<u8>>;
}
impl CommandExt for Command {
fn run_or_fail(&mut self) {
match self.status() {
Ok(exit_status) => {
if !exit_status.success() {
fail!("Command did not run successfully: {:?}", self.get_program())
}
}
Err(err) => {
fail!("Failed to run command: {err}")
}
fn run(&mut self) -> Result<()> {
if !self
.status()
.with_context(|| format!("Failed to run {:?}", self.get_program()))?
.success()
{
bail!("Command did not run successfully: {:?}", self.get_program())
}
Ok(())
}
fn run_with_stdio(&mut self, stdin: Vec<u8>) -> Result<Vec<u8>> {
let command_name = self.get_program().to_owned();
let mut child = self
.stdin(Stdio::piped())
.stdout(Stdio::piped())
.spawn()
.with_context(|| format!("Failed to run {command_name:?}"))?;
child
.stdin
.take()
.unwrap()
.write_all(&stdin)
.with_context(|| format!("Failed to write to stdin of {command_name:?}"))?;
let command_output = child
.wait_with_output()
.with_context(|| format!("Failed to read stdout of {command_name:?}"))?;
if !command_output.status.success() {
bail!("{command_name:?} failed");
}
Ok(command_output.stdout)
}
}
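`run_with_stdio` above pipes a buffer through a child process and collects its stdout. A stripped-down stdlib sketch of the same mechanism (the error context and the non-zero-exit check are omitted here for brevity; assumes a Unix-like system with `cat` on `PATH`):

```rust
use std::io::Write;
use std::process::{Command, Stdio};

// Pipe `stdin` into a child process and return whatever it writes to stdout.
fn run_with_stdio(cmd: &mut Command, stdin: &[u8]) -> std::io::Result<Vec<u8>> {
    let mut child = cmd.stdin(Stdio::piped()).stdout(Stdio::piped()).spawn()?;
    // Write all input, then drop the handle so the child sees EOF.
    child.stdin.take().unwrap().write_all(stdin)?;
    let output = child.wait_with_output()?;
    Ok(output.stdout)
}

fn main() {
    // `cat` echoes stdin to stdout unchanged.
    let out = run_with_stdio(&mut Command::new("cat"), b"hello").unwrap();
    assert_eq!(out, b"hello");
}
```

Dropping the `ChildStdin` handle before `wait_with_output` matters: without the EOF, a filter like `msguniq` would block waiting for more input.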
pub fn cargo<I, S>(cargo_args: I)
pub const CARGO: &str = env!("CARGO");
pub fn cargo<I, S>(cargo_args: I) -> Result<()>
where
I: IntoIterator<Item = S>,
S: AsRef<OsStr>,
{
Command::new(env!("CARGO")).args(cargo_args).run_or_fail();
Command::new(CARGO).args(cargo_args).run()
}
fn get_matching_files<P: AsRef<Path>, I: IntoIterator<Item = P>, M: Fn(&Path) -> bool>(
all_paths: I,
matcher: M,
) -> Vec<PathBuf> {
all_paths
.into_iter()
.flat_map(WalkDir::new)
.filter_map(|res| {
let entry = res.unwrap();
let path = entry.path();
if entry.file_type().is_file() && matcher(path) {
Some(path.to_owned())
} else {
None
) -> Result<Vec<PathBuf>> {
let mut matching_files = vec![];
for path in all_paths {
for dir_entry in WalkDir::new(path.as_ref()) {
let dir_entry = dir_entry
.with_context(|| format!("Failed to check paths at {:?}", path.as_ref()))?;
let path = dir_entry.path();
if dir_entry.file_type().is_file() && matcher(path) {
matching_files.push(path.to_owned());
}
})
.collect()
}
}
Ok(matching_files)
}
fn files_with_extension<P: AsRef<Path>, I: IntoIterator<Item = P>>(
all_paths: I,
extension: &str,
) -> Vec<PathBuf> {
) -> Result<Vec<PathBuf>> {
let matcher = |p: &Path| p.extension().is_some_and(|e| e == extension);
get_matching_files(all_paths, matcher)
}
fn copy_file<P: AsRef<Path>, Q: AsRef<Path>>(from: P, to: Q) -> Result<()> {
std::fs::copy(&from, &to)
.with_context(|| {
format!(
"Failed to copy from {:?} to {:?}",
from.as_ref(),
to.as_ref()
)
})
.map(|_| ())
}


@@ -1,7 +1,8 @@
use anyhow::{Context, Result};
use clap::{Parser, Subcommand};
use fish_build_helper::as_os_strs;
use std::{path::PathBuf, process::Command};
use xtask::{CommandExt as _, cargo, format::FormatArgs};
use xtask::{CommandExt, cargo, format::FormatArgs, gettext::GettextArgs, shellcheck::shellcheck};
#[derive(Parser)]
#[command(
@@ -20,6 +21,8 @@ enum Task {
Check,
/// Format files or check if they are correctly formatted.
Format(FormatArgs),
/// Work on the gettext PO files.
Gettext(GettextArgs),
/// Build HTML docs
HtmlDocs {
/// Path to a fish_indent executable. If none is specified, fish_indent will be built.
@@ -28,53 +31,63 @@ enum Task {
},
/// Build man pages
ManPages,
/// Run ShellCheck on non-fish shell scripts
#[command(name = "shellcheck")]
ShellCheck,
}
fn main() {
fn main() -> Result<()> {
let cli = Cli::parse();
match cli.task {
Task::Check => run_checks(),
Task::Format(format_args) => xtask::format::format(format_args),
Task::Gettext(gettext_args) => xtask::gettext::gettext(gettext_args),
Task::HtmlDocs { fish_indent } => build_html_docs(fish_indent),
Task::ManPages => cargo(["build", "--package", "fish-build-man-pages"]),
Task::ShellCheck => shellcheck(),
}
}
fn run_checks() {
fn run_checks() -> Result<()> {
let repo_root_dir = fish_build_helper::workspace_root();
let check_script = repo_root_dir.join("build_tools").join("check.sh");
Command::new(check_script).run_or_fail();
Command::new(check_script).run()
}
fn build_html_docs(fish_indent: Option<PathBuf>) {
let fish_indent_path = fish_indent.unwrap_or_else(|| {
// Build fish_indent if no existing one is specified.
cargo([
"build",
"--bin",
"fish_indent",
"--profile",
"dev",
"--no-default-features",
]);
fish_build_helper::fish_build_dir()
.join("debug")
.join("fish_indent")
});
fn build_html_docs(fish_indent: Option<PathBuf>) -> Result<()> {
let fish_indent_path = match fish_indent {
Some(path) => path,
None => {
// Build fish_indent if no existing one is specified.
cargo([
"build",
"--bin",
"fish_indent",
"--profile",
"dev",
"--no-default-features",
])?;
fish_build_helper::fish_build_dir()
.join("debug")
.join("fish_indent")
}
};
// Set path so `sphinx-build` can find `fish_indent`.
// Create tempdir to store symlink to fish_indent.
// This is done to avoid adding other binaries to the PATH.
let tempdir = fish_tempfile::new_dir().unwrap();
let tempdir = fish_tempfile::new_dir().context("Failed to create tempdir")?;
std::os::unix::fs::symlink(
std::fs::canonicalize(fish_indent_path).unwrap(),
std::fs::canonicalize(&fish_indent_path).with_context(|| {
format!("Failed to canonicalize path to `fish_indent`: {fish_indent_path:?}")
})?,
tempdir.path().join("fish_indent"),
)
.unwrap();
let new_path = format!(
"{}:{}",
tempdir.path().to_str().unwrap(),
fish_build_helper::env_var("PATH").unwrap()
);
.context("Failed to create symlink for fish_indent")?;
let mut new_path = tempdir.path().as_os_str().to_owned();
if let Some(current_path) = std::env::var_os("PATH") {
new_path.push(":");
new_path.push(current_path);
}
let doc_src_dir = fish_build_helper::workspace_root().join("doc_src");
let doctrees_dir = fish_build_helper::fish_doc_dir().join(".doctrees-html");
let html_dir = fish_build_helper::fish_doc_dir().join("html");
@@ -94,5 +107,5 @@ fn build_html_docs(fish_indent: Option<PathBuf>) {
Command::new(option_env!("FISH_SPHINX").unwrap_or("sphinx-build"))
.env("PATH", new_path)
.args(args)
.run_or_fail();
.run()
}


@@ -0,0 +1,52 @@
use anyhow::{Context, Result};
use fish_build_helper::workspace_root;
use ignore::Walk;
use pcre2::bytes::Regex;
use std::{
fs::File,
io::{BufRead, BufReader},
path::{Path, PathBuf},
process::Command,
sync::LazyLock,
};
use crate::CommandExt;
pub fn shellcheck() -> Result<()> {
Command::new("shellcheck")
.args(files_to_check()?)
.current_dir(workspace_root())
.run()
}
fn is_shell_script<P: AsRef<Path>>(path: P) -> Result<bool> {
let file = File::open(&path).with_context(|| format!("Failed to open {:?}", path.as_ref()))?;
let mut first_line = String::new();
let Ok(_) = BufReader::new(file).read_line(&mut first_line) else {
return Ok(false);
};
static SHEBANG_REGEX: LazyLock<Regex> = LazyLock::new(|| Regex::new("^#!.*[^i]sh").unwrap());
Ok(SHEBANG_REGEX
.is_match(first_line.trim().as_bytes())
.unwrap())
}
fn files_to_check() -> Result<Vec<PathBuf>> {
let mut files = vec![];
for dir_entry in Walk::new(workspace_root()) {
let dir_entry = dir_entry.context("Error traversing workspace")?;
if !dir_entry
.file_type()
.with_context(|| format!("Failed to determine file type of {dir_entry:?}"))?
.is_file()
{
continue;
}
let path = dir_entry.into_path();
if !is_shell_script(&path)? {
continue;
}
files.push(path);
}
Ok(files)
}
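The `[^i]sh` in the shebang regex above is there to exclude fish scripts while accepting sh, bash, dash, zsh, and similar. A rough stdlib approximation for illustration (unlike the real regex, it ignores trailing interpreter arguments such as `#!/bin/sh -e`):

```rust
// Accept shebangs ending in "sh" but not "ish" (which would match fish).
// A simplified stand-in for the pcre2 pattern ^#!.*[^i]sh used above.
fn looks_like_shell_script(first_line: &str) -> bool {
    let line = first_line.trim();
    line.starts_with("#!") && line.ends_with("sh") && !line.ends_with("ish")
}

fn main() {
    assert!(looks_like_shell_script("#!/bin/sh"));
    assert!(looks_like_shell_script("#!/usr/bin/env bash"));
    assert!(!looks_like_shell_script("#!/usr/bin/env fish"));
}
```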


@@ -3,22 +3,22 @@
confidence-threshold = 0.93
unused-allowed-license = "allow" # don't warn for unused licenses in this list
allow = [
"BSD-2-Clause",
"BSD-3-Clause",
"BSL-1.0",
"CC0-1.0",
"GPL-2.0",
"GPL-2.0-only",
"ISC",
"LGPL-2.0",
"LGPL-2.0-or-later",
"MIT",
"MPL-2.0",
"PSF-2.0",
"Unicode-DFS-2016",
"Unicode-3.0",
"WTFPL",
"Zlib",
"BSD-2-Clause",
"BSD-3-Clause",
"BSL-1.0",
"CC0-1.0",
"GPL-2.0",
"GPL-2.0-only",
"ISC",
"LGPL-2.0",
"LGPL-2.0-or-later",
"MIT",
"MPL-2.0",
"PSF-2.0",
"Unicode-DFS-2016",
"Unicode-3.0",
"WTFPL",
"Zlib",
]
[sources.allow-org]


@@ -251,7 +251,7 @@ Some *OPTION_SPEC* examples:
- ``n/name=?`` means that both ``-n`` and ``--name`` are valid. It accepts an optional value and can be used at most once. If the flag is seen then ``_flag_n`` and ``_flag_name`` will be set with the value associated with the flag if one was provided else it will be set with no values.
- ``n/name=*`` is similar, but the flag can be used more than once. If the flag is seen then ``_flag_n`` and ``_flag_name`` will be set with the values associated with each occurence. Each value will be the value given to the option, or the empty string if no value was given.
- ``n/name=*`` is similar, but the flag can be used more than once. If the flag is seen then ``_flag_n`` and ``_flag_name`` will be set with the values associated with each occurrence. Each value will be the value given to the option, or the empty string if no value was given.
- ``name=+`` means that only ``--name`` is valid. It requires a value and can be used more than once. If the flag is seen then ``_flag_name`` will be set with the values associated with each occurrence.


@@ -60,6 +60,8 @@ The event handler switches (``on-event``, ``on-variable``, ``on-job-exit``, ``on
Function names cannot be reserved words. These are elements of fish syntax or builtin commands which are essential for the operation of the shell. Current reserved words are ``[``, ``_``, ``and``, ``argparse``, ``begin``, ``break``, ``builtin``, ``case``, ``command``, ``continue``, ``else``, ``end``, ``eval``, ``exec``, ``for``, ``function``, ``if``, ``not``, ``or``, ``read``, ``return``, ``set``, ``status``, ``string``, ``switch``, ``test``, ``time``, and ``while``.
Care should be taken when creating a function of the same name as an existing shell builtin or common program. If the function behaves differently, it is very common for problems to occur within fish or in scripts written by others. Consider writing an :doc:`abbreviation <abbr>` if you want to replace one tool with another for interactive use.
Example
-------


@@ -6,16 +6,17 @@ Synopsis
.. synopsis::
set
set (-f | --function) (-l | --local) (-g | --global) (-U | --universal) [--no-event]
set [-Uflg] NAME [VALUE ...]
set [-Uflg] NAME[[INDEX ...]] [VALUE ...]
set (-x | --export) (-u | --unexport) [-Uflg] NAME [VALUE ...]
set (-a | --append) (-p | --prepend) [-Uflg] NAME VALUE ...
set (-e | --erase) [-Uflg] [-xu] [NAME][[INDEX]] ...]
set (-q | --query) [-Uflg] [-xu] [NAME][[INDEX]] ...]
set [(-f | --function) (-l | --local) (-g | --global) (-U | --universal)]
[(-x | --export) (-u | --unexport)]
set (-S | --show) (-L | --long) [NAME ...]
set [-Uflg] [-xu] [--no-event] NAME [VALUE ...]
set [-Uflg] [--no-event] NAME[[INDEX ...]] [VALUE ...]
set (-a | --append) (-p | --prepend) [-Uflg] [--no-event] NAME VALUE ...
set (-e | --erase) [-Uflg] [--no-event] NAME[[INDEX]] ...
set (-q | --query) [-Uflg] [-xu] NAME[[INDEX]] ...
Description
-----------


@@ -11,8 +11,10 @@ Synopsis
Description
-----------
``set_color`` is used to control the color and styling of text in the terminal.
*VALUE* describes that styling.
``set_color`` controls the color and styling of text in the terminal.
It writes non-printing color and text style escape sequences to standard output.
*VALUE* describes the styling.
*VALUE* can be a reserved color name like **red** or an RGB color value given as 3 or 6 hexadecimal digits ("F27" or "FF2277").
A special keyword **normal** resets text formatting to terminal defaults, however it is not recommended and the **--reset** option is preferred as it is less confusing and more future-proof.
@@ -93,7 +95,9 @@ Notes
3. Because of the risk of confusion, ``set_color --reset`` is recommended over ``set_color normal``.
4. Setting the background color only affects subsequently written characters. Fish provides no way to set the background color for the entire terminal window. Configuring the window background color (and other attributes such as its opacity) has to be done using whatever mechanisms the terminal provides. Look for a config option.
5. Some terminals use the ``--bold`` escape sequence to switch to a brighter color set rather than increasing the weight of text.
6. ``set_color`` works by printing sequences of characters to standard output. If used in command substitution or a pipe, these characters will also be captured. This may or may not be desirable. Checking the exit status of ``isatty stdout`` before using ``set_color`` can be useful to decide not to colorize output in a script.
6. If you use ``set_color`` in a command substitution or a pipe, these characters will also be captured.
This may or may not be desirable.
Checking the exit status of ``isatty stdout`` before using ``set_color`` can be useful to decide not to colorize output in a script.
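The ``isatty stdout`` check described above can be wrapped in a small helper. A sketch — the function name ``maybe_color`` is made up for illustration:

```fish
# Only emit color escape sequences when stdout is a terminal,
# so pipes and command substitutions capture plain text instead
function maybe_color
    if isatty stdout
        set_color $argv
    end
end

maybe_color green
echo "ok"
maybe_color --reset
```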
Examples
--------

View File

@@ -54,7 +54,7 @@ It also provides a large number of program specific scripted completions. Most o
You can also write your own completions or install some you got from someone else. For that, see :doc:`Writing your own completions <completions>`.
Completion scripts are loaded on demand, like :ref:`functions are <syntax-function-autoloading>`. The difference is the ``$fish_complete_path`` :ref:`list <variables-lists>` is used instead of ``$fish_function_path``. Typically you can drop new completions in ~/.config/fish/completions/name-of-command.fish and fish will find them automatically.
Completion scripts are loaded on demand, like :ref:`functions are <syntax-function-autoloading>`. The difference is the ``$fish_complete_path`` :ref:`list <variables-lists>` is used instead of ``$fish_function_path``. Typically you can drop new completions in ``~/.config/fish/completions/<name-of-command>.fish`` and fish will find them automatically.
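As a sketch, a drop-in completion file for a hypothetical command ``mytool`` could look like this, saved as ``~/.config/fish/completions/mytool.fish``:

```fish
# Autoloaded the first time completion is attempted for "mytool"
complete -c mytool -f                                    # disable file completion by default
complete -c mytool -a "build test deploy" -d "Subcommand"
complete -c mytool -s v -l verbose -d "Enable verbose output"
```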
.. _syntax-highlighting:

View File

@@ -237,7 +237,7 @@ Optional Commands
``\e]0; Pt \e\\``
- ts
- Set terminal window title (OSC 0). Used in :doc:`fish_title <cmds/fish_title>`.
* - ``\e]2; Pt \e\\``
* - ``\e]1; Pt \e\\``
- ts
- Set terminal tab title (OSC 1). Used in :doc:`fish_tab_title <cmds/fish_tab_title>`.
* - ``\e]7;file:// Pt / Pt \e\\``
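Fish emits the OSC 0 title sequence through :doc:`fish_title <cmds/fish_title>`, which can be overridden; a minimal sketch:

```fish
# Override the terminal window title (OSC 0): show the running
# command and the current directory
function fish_title
    echo (status current-command) (prompt_pwd)
end
```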

View File

@@ -17,6 +17,9 @@ BuildRequires: /usr/bin/sphinx-build
# OBS: add eg "FileProvides: /usr/bin/sphinx-build python3-sphinx python3-Sphinx" to project config
BuildRequires: /usr/bin/man
# OBS: add eg "FileProvides: /usr/bin/man man-db man" to project config
# pkg-config is needed for the pcre2 crate to find the system PCRE2 library
BuildRequires: /usr/bin/pkg-config
# OBS: add eg "FileProvides: /usr/bin/pkg-config pkgconf-pkg-config pkg-config" to project config
BuildRequires: cmake >= 3.15
# for tests

File diff suppressed because it is too large Load Diff


View File

@@ -8,8 +8,8 @@ dependencies = []
[dependency-groups]
dev = [
"sphinx>=9.1", # updatecli.d/python.yml
"sphinx-markdown-builder",
]
[tool.uv.sources]

View File

@@ -5,6 +5,9 @@ if date --version >/dev/null 2>/dev/null
complete -c date -s I -l iso-8601 -d "Use ISO 8601 output format" -x -a "date hours minutes seconds"
complete -c date -s s -l set -d "Set time" -x
complete -c date -s R -l rfc-2822 -d "Output in RFC 2822 format"
complete -c date -l rfc-3339 -d "Output in RFC 3339 format" -x -a "date\t'Date precision' seconds\t'Second precision' ns\t'Nanosecond precision'"
complete -c date -s r -l reference -d "Display last modification time of file" -r
complete -c date -s u -l utc -d "Print/set UTC time" -f
complete -c date -l universal -d "Print/set UTC time" -f

View File

@@ -10,6 +10,19 @@ function __dnf_list_installed_packages
dnf repoquery --cacheonly "$cur*" --qf "%{name}\n" --installed </dev/null
end
function __dnf_list_copr_repos
set -l copr_repos (dnf copr list)
switch $argv[1]
case enable
string replace -f -- " (disabled)" "" $copr_repos
case disable
string match -v -- "*(disabled)*" $copr_repos
case '*'
string replace -- " (disabled)" "" $copr_repos
end
end
function __dnf_list_available_packages
set -l tok (commandline -ct | string collect)
set -l files (__fish_complete_suffix .rpm)
@@ -86,6 +99,20 @@ complete -c dnf -n "__fish_seen_subcommand_from clean" -xa metadata -d "Removes
complete -c dnf -n "__fish_seen_subcommand_from clean" -xa packages -d "Removes any cached packages"
complete -c dnf -n "__fish_seen_subcommand_from clean" -xa all -d "Removes all cache"
# Copr
set -l coprcommands list enable disable remove debug
complete -c dnf -n __fish_use_subcommand -xa copr -d "Manage Copr repositories"
complete -c dnf -n "__fish_seen_subcommand_from copr; and not __fish_seen_subcommand_from $coprcommands" -xa list -d "List Copr repositories"
complete -c dnf -n "__fish_seen_subcommand_from copr; and not __fish_seen_subcommand_from $coprcommands" -xa enable -d "Install a Copr repository"
complete -c dnf -n "__fish_seen_subcommand_from copr; and not __fish_seen_subcommand_from $coprcommands" -xa disable -d "Disable a Copr repository"
complete -c dnf -n "__fish_seen_subcommand_from copr; and not __fish_seen_subcommand_from $coprcommands" -xa remove -d "Remove a Copr repository"
complete -c dnf -n "__fish_seen_subcommand_from copr; and not __fish_seen_subcommand_from $coprcommands" -xa debug -d "Print system info for debugging"
complete -c dnf -n "__fish_seen_subcommand_from copr; and not __fish_seen_subcommand_from $coprcommands" -l hub -d "Copr hub hostname"
for i in enable disable remove
complete -c dnf -n "__fish_seen_subcommand_from copr; and __fish_seen_subcommand_from $i" -xa "(__dnf_list_copr_repos $i)"
end
# Distro-sync
complete -c dnf -n __fish_use_subcommand -xa distro-sync -d "Synchronizes packages to match the latest"

View File

@@ -2161,6 +2161,7 @@ complete -x -c git -n '__fish_git_using_command push' -l exec -d 'Same as --rece
### rebase
complete -f -c git -n __fish_git_needs_command -a rebase -d 'Reapply commit sequence on a new base'
__fish_git_add_revision_completion -n '__fish_git_using_command rebase'
complete -f -c git -n '__fish_git_using_command rebase' -n 'string match -rq -- "^-i|^--interactive" (commandline -xpc)' -ka '(__fish_git_recent_commits)'
complete -f -c git -n '__fish_git_using_command rebase' -n __fish_git_is_rebasing -l continue -d 'Restart the rebasing process'
complete -f -c git -n '__fish_git_using_command rebase' -n __fish_git_is_rebasing -l abort -d 'Abort the rebase operation'
complete -f -c git -n '__fish_git_using_command rebase' -n __fish_git_is_rebasing -l edit-todo -d 'Edit the todo list'
@@ -2969,11 +2970,24 @@ complete -f -c git -n '__fish_git_using_command update-ref' -l create-reflog -d
complete -f -c git -n '__fish_git_using_command update-ref' -l stdin -d 'Read instructions from stdin'
complete -f -c git -n '__fish_git_using_command update-ref' -s z -d 'NUL-terminated format for stdin'
### verify-commit
complete -f -c git -n __fish_git_needs_command -a verify-commit -d 'Check the GPG signature of commits'
complete -f -c git -n '__fish_git_using_command verify-commit' -ka '(__fish_git_commits)'
complete -f -c git -n '__fish_git_using_command verify-commit' -s v -l verbose -d 'Print commit contents'
complete -f -c git -n '__fish_git_using_command verify-commit' -l raw -d 'Print raw gpg status output'
### verify-pack
complete -f -c git -n __fish_git_needs_command -a verify-pack -d 'Validate packed Git archive files'
complete -f -c git -n '__fish_git_using_command verify-pack' -s v -l verbose -d 'Show objects contained in pack'
complete -f -c git -n '__fish_git_using_command verify-pack' -s s -l stat-only -d 'Only show histogram of delta chain length'
### verify-tag
complete -f -c git -n __fish_git_needs_command -a verify-tag -d 'Check the GPG signature of tags'
complete -f -c git -n '__fish_git_using_command verify-tag' -ka '(__fish_git_tags)'
complete -f -c git -n '__fish_git_using_command verify-tag' -s v -l verbose -d 'Print tag contents'
complete -f -c git -n '__fish_git_using_command verify-tag' -l raw -d 'Print raw gpg status output'
complete -x -c git -n '__fish_git_using_command verify-tag' -l format -d 'Format to use for the output'
### write-tree
complete -f -c git -n __fish_git_needs_command -a write-tree -d 'Create a tree object from the current index'
complete -f -c git -n '__fish_git_using_command write-tree' -l missing-ok -d 'Allow missing objects'

View File

@@ -1,44 +1 @@
# Commands
complete -c ngrok -f -a authtoken -d "Save authtoken to configuration file"
complete -c ngrok -f -a credits -d "Prints author and licensing information"
complete -c ngrok -f -a http -d "Start an HTTP tunnel"
complete -c ngrok -f -a start -d "Start tunnels by name from the configuration file"
complete -c ngrok -f -a tcp -d "Start a TCP tunnel"
complete -c ngrok -f -a tls -d "Start a TLS tunnel"
complete -c ngrok -f -a update -d "Update ngrok to the latest version"
complete -c ngrok -f -a version -d "Print the version string"
complete -c ngrok -f -a help -d "Shows a list of commands or help for one command"
# General Options
complete -c ngrok -l help -e -f
complete -c ngrok -l authtoken -r -d "ngrok.com authtoken identifying a user"
complete -c ngrok -l config -r -d "path to config files; they are merged if multiple"
complete -c ngrok -l log -x -a "false stderr stdout" -d "path to log file, 'stdout', 'stderr' or 'false'"
complete -c ngrok -l log-format -x -a "term logfmt json" -d "log record format: 'term', 'logfmt', 'json'"
complete -c ngrok -l log-level -r -a info -d "logging level"
complete -c ngrok -l region -x -a "us eu au ap" -d "ngrok server region [us, eu, au, ap] (default: us)"
# http & tls's options
complete -c ngrok -l hostname -r -d "host tunnel on custom hostname (requires DNS CNAME)"
complete -c ngrok -l subdomain -r -d "host tunnel on a custom subdomain"
# http's options
complete -c ngrok -l auth -r -d "enforce basic auth on tunnel endpoint, 'user:password'"
complete -c ngrok -l bind-tls -x -a "both https http" -d "listen for http, https or both: true/false/both"
complete -c ngrok -l host-header -r -d "set Host header; if 'rewrite' use local address hostname"
complete -c ngrok -l inspect -d "enable/disable http introspection"
# tls's options
complete -c ngrok -l client-cas -r -d "path to TLS certificate authority to verify client certs"
complete -c ngrok -l crt -r -d "path to a TLS certificate for TLS termination"
complete -c ngrok -l key -r -d "path to a TLS key for TLS termination"
# start's options
complete -c ngrok -l all -d "start all tunnels in the configuration file"
complete -c ngrok -l none -d "start running no tunnels"
# tcp's options
complete -c ngrok -l remote-addr -r -d "bind remote address (requires you reserve an address)"
# update's options
complete -c ngrok -l channel -x -a "stable beta" -d "update channel (stable, beta)"
SHELL=/bin/fish ngrok completion 2>/dev/null | source

View File

@@ -1 +1 @@
complete sudo-rs --wraps sudo
__fish_complete_sudo sudo-rs

View File

@@ -1,63 +1 @@
#
# Completion for sudo
#
function __fish_sudo_print_remaining_args
set -l tokens (commandline -xpc | string escape) (commandline -ct)
set -e tokens[1]
# These are all the options mentioned in the man page for Todd Miller's "sudo.ws" sudo (in that order).
# If any other implementation has different options, this should be harmless, since they shouldn't be used anyway.
set -l opts A/askpass b/background C/close-from= E/preserve-env='?'
# Note that "-h" is both "--host" (which takes an option) and "--help" (which doesn't).
# But `-h` as `--help` only counts when it's the only argument (`sudo -h`),
# so any argument completion after that should take it as "--host".
set -a opts e/edit g/group= H/set-home h/host= 1-help
set -a opts i/login K/remove-timestamp k/reset-timestamp l/list n/non-interactive
set -a opts P/preserve-groups p/prompt= S/stdin s/shell U/other-user=
set -a opts u/user= T/command-timeout= V/version v/validate
argparse -s $opts -- $tokens 2>/dev/null
# The remaining argv is the subcommand with all its options, which is what
# we want.
if test -n "$argv"
and not string match -qr '^-' $argv[1]
string join0 -- $argv
return 0
else
return 1
end
end
function __fish_sudo_no_subcommand
not __fish_sudo_print_remaining_args >/dev/null
end
function __fish_complete_sudo_subcommand
set -l args (__fish_sudo_print_remaining_args | string split0)
set -lx -a PATH /usr/local/sbin /sbin /usr/sbin
__fish_complete_subcommand --commandline $args
end
# All these options should be valid for GNU and OSX sudo
complete -c sudo -n __fish_no_arguments -s h -d "Display help and exit"
complete -c sudo -n __fish_no_arguments -s V -d "Display version information and exit"
complete -c sudo -n __fish_sudo_no_subcommand -s A -d "Ask for password via the askpass or \$SSH_ASKPASS program"
complete -c sudo -n __fish_sudo_no_subcommand -s C -d "Close all file descriptors greater or equal to the given number" -xa "0 1 2 255"
complete -c sudo -n __fish_sudo_no_subcommand -s E -d "Preserve environment"
complete -c sudo -n __fish_sudo_no_subcommand -s H -d "Set home"
complete -c sudo -n __fish_sudo_no_subcommand -s K -d "Remove the credential timestamp entirely"
complete -c sudo -n __fish_sudo_no_subcommand -s P -d "Preserve group vector"
complete -c sudo -n __fish_sudo_no_subcommand -s S -d "Read password from stdin"
complete -c sudo -n __fish_sudo_no_subcommand -s b -d "Run command in the background"
complete -c sudo -n __fish_sudo_no_subcommand -s e -rF -d Edit
complete -c sudo -n __fish_sudo_no_subcommand -s g -a "(__fish_complete_groups)" -x -d "Run command as group"
complete -c sudo -n __fish_sudo_no_subcommand -s i -d "Run a login shell"
complete -c sudo -n __fish_sudo_no_subcommand -s k -d "Reset or ignore the credential timestamp"
complete -c sudo -n __fish_sudo_no_subcommand -s l -d "List the allowed and forbidden commands for the given user"
complete -c sudo -n __fish_sudo_no_subcommand -s n -d "Do not prompt for a password - if one is needed, fail"
complete -c sudo -n __fish_sudo_no_subcommand -s p -d "Specify a custom password prompt"
complete -c sudo -n __fish_sudo_no_subcommand -s s -d "Run the given command in a shell"
complete -c sudo -n __fish_sudo_no_subcommand -s u -a "(__fish_complete_users)" -x -d "Run command as user"
complete -c sudo -n __fish_sudo_no_subcommand -s v -n __fish_no_arguments -d "Validate the credentials, extending timeout"
# Complete the command we are executed under sudo
complete -c sudo -x -n 'not __fish_seen_argument -s e' -a "(__fish_complete_sudo_subcommand)"
__fish_complete_sudo sudo
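The ``argparse``-based detection above relies on ``argparse -s`` (stop at the first non-option) consuming recognized options and leaving the subcommand and its arguments in ``$argv``. In isolation, with a made-up two-option spec:

```fish
# argparse -s stops parsing at the first non-option token, so
# everything from the subcommand onward stays in $argv
function demo
    argparse -s 'u/user=' b/background -- $argv
    or return
    echo "user: $_flag_user"
    echo "rest: $argv"
end

demo -u root vim /etc/hosts
# prints: user: root
#         rest: vim /etc/hosts
```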

View File

@@ -1,17 +1,24 @@
set -l systemd_version (systemctl --version | string match "systemd*" | string replace -r "\D*(\d+)\D.*" '$1')
set -l commands list-units list-sockets start stop reload restart try-restart reload-or-restart reload-or-try-restart \
isolate kill is-active is-failed status show get-cgroup-attr set-cgroup-attr unset-cgroup-attr set-cgroup help \
reset-failed list-unit-files enable disable is-enabled reenable preset mask unmask link load list-jobs cancel dump \
list-dependencies snapshot delete daemon-reload daemon-reexec show-environment set-environment unset-environment \
reset-failed list-unit-files enable disable is-enabled reenable preset mask unmask link list-jobs cancel \
list-dependencies daemon-reload daemon-reexec show-environment set-environment unset-environment \
default rescue emergency halt poweroff reboot kexec exit suspend suspend-then-hibernate hibernate hybrid-sleep switch-root \
list-timers set-property import-environment get-default list-automounts is-system-running try-reload-or-restart freeze \
thaw mount-image bind clean
if test $systemd_version -gt 208 2>/dev/null
set commands $commands cat
if test $systemd_version -gt 217 2>/dev/null
set commands $commands edit
end
thaw mount-image bind clean set-default cat list-machines preset-all add-wants add-requires edit
if test $systemd_version -gt 243 2>/dev/null
set commands $commands log-level log-target service-watchdogs
end
if test $systemd_version -gt 246 2>/dev/null
set commands $commands service-log-level service-log-target
end
if test $systemd_version -gt 253 2>/dev/null
set commands $commands list-paths soft-reboot whoami
end
if test $systemd_version -gt 255 2>/dev/null
set commands $commands sleep
end
set -l types services sockets mounts service_paths targets automounts timers
function __fish_systemd_properties
@@ -28,23 +35,51 @@ end
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a "$commands"
#### Units commands
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a start -d 'Start one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a stop -d 'Stop one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a restart -d 'Restart one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a reload-or-restart -d 'Reload units if supported or restart them'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a try-reload-or-restart -d 'Reload units if supported or restart them, if running'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a status -d 'Runtime status about one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a enable -d 'Enable one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a disable -d 'Disable one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a isolate -d 'Start a unit and dependencies and disable all others'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a set-default -d 'Set the default target to boot into'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a get-default -d 'Show the default target to boot into'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a set-property -d 'Sets one or more properties of a unit'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a list-automounts -d 'List automount units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a is-system-running -d 'Return if system is running/starting/degraded'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a freeze -d 'Freeze units with the cgroup freezer'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a thaw -d 'Unfreeze frozen units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a add-requires -d 'Add Requires dependencies to a target unit'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a add-wants -d 'Add Wants dependencies to a target unit'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a bind -d 'Bind mount a path into the mount namespace of a unit'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a cancel -d 'Cancel one or more jobs'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a cat -d 'Show the backing files of one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a clean -d 'Remove config/state/logs for the given units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a daemon-reload -d 'Reload the configuration of the system manager'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a default -d 'Enter and isolate the default mode'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a disable -d 'Disable one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a edit -d 'Edit a unit file or drop-in snippet'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a enable -d 'Enable one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a emergency -d 'Enter and isolate emergency mode'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a exit -d 'Ask the service manager to exit'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a freeze -d 'Freeze units with the cgroup freezer'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a get-default -d 'Show the default target to boot into'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a help -d 'Show the manual page for a unit'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a import-environment -d 'Import environment variables into the service manager'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a isolate -d 'Start a unit and dependencies and disable all others'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a is-system-running -d 'Return if system is running/starting/degraded'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a kexec -d 'Reboot via kexec'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a kill -d 'Kill one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a link -d 'Link a unit file into the unit search path'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a list-automounts -d 'List automount units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a list-machines -d 'List the host and all running local containers'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a mask -d 'Prevent one or more units from starting'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a mount-image -d 'Mount an image into the mount namespace of a unit'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a preset -d 'Enable/disable a unit depending on preset configuration'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a preset-all -d 'Enable/disable all units depending on preset configuration'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a reenable -d 'Disable and re-enable one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a reload -d 'Request a unit reload its configuration'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a reload-or-restart -d 'Reload units if supported or restart them'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a rescue -d 'Enter and isolate rescue mode'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a restart -d 'Restart one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a set-default -d 'Set the default target to boot into'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a set-property -d 'Sets one or more properties of a unit'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a show -d 'Show properties of one or more units, jobs, or the manager itself'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a show-environment -d 'Dump the systemd manager environment block'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a soft-reboot -d 'Reboot userspace'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a start -d 'Start one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a status -d 'Runtime status about one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a stop -d 'Stop one or more units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a thaw -d 'Unfreeze frozen units'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a try-reload-or-restart -d 'Reload units if supported or restart them, if running'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a unmask -d 'Stop preventing one or more units from starting'
complete -f -c systemctl -n "not __fish_seen_subcommand_from $commands" -a whoami -d 'Query the unit that owns a process'
# Command completion done via argparse.
complete -c systemctl -a '(__fish_systemctl)' -f
@@ -87,13 +122,9 @@ complete -x -c systemctl -s M -l machine -d 'Execute operation on a VM or contai
complete -f -c systemctl -s h -l help -d 'Print a short help and exit'
complete -f -c systemctl -l version -d 'Print a short version and exit'
complete -f -c systemctl -l no-pager -d 'Do not pipe output into a pager'
# New options since systemd 220
if test $systemd_version -gt 219 2>/dev/null
complete -f -c systemctl -l firmware-setup -n "__fish_seen_subcommand_from reboot" -d "Reboot to EFI setup"
complete -f -c systemctl -l now -n "__fish_seen_subcommand_from enable" -d "Also start unit"
complete -f -c systemctl -l now -n "__fish_seen_subcommand_from disable mask" -d "Also stop unit"
end
complete -f -c systemctl -l firmware-setup -n "__fish_seen_subcommand_from reboot" -d "Reboot to EFI setup"
complete -f -c systemctl -l now -n "__fish_seen_subcommand_from enable" -d "Also start unit"
complete -f -c systemctl -l now -n "__fish_seen_subcommand_from disable mask" -d "Also stop unit"
# New options since systemd 242
if test $systemd_version -ge 242 2>/dev/null
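The version detection at the top of this file pipes ``systemctl --version`` through ``string match`` and ``string replace -r``; the extraction step alone, with a hard-coded sample line for illustration:

```fish
# Pull the major version number out of a "systemd NNN (...)" line:
# \D* skips the leading name, (\d+) captures the version, the rest is dropped
echo "systemd 255 (255.4-1.fc40)" | string replace -r '\D*(\d+)\D.*' '$1'
# prints: 255
```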

View File

@@ -1,52 +1 @@
set -l commands compile watch init query fonts update help
# global options
complete -c typst -n __fish_use_subcommand -f -l color -d 'Set when to use color' -a 'auto always never'
complete -c typst -n __fish_use_subcommand -r -l cert -d 'Path to custom CA certificate'
complete -c typst -n __fish_use_subcommand -f -l version -s v -d 'Print version'
# help option/subcommand
complete -c typst -f -l help -s h -d 'Print help'
complete -c typst -f -n __fish_use_subcommand -a help -d 'Print help for the given subcommand(s)'
complete -c typst -n '__fish_seen_subcommand_from help' -x -a "$commands"
# subcommands
complete -c typst -n __fish_use_subcommand -f -a compile -d 'Compile an input file'
complete -c typst -n __fish_use_subcommand -f -a watch -d 'Watch an input file and recompile on changes'
complete -c typst -n __fish_use_subcommand -f -a init -d 'Initialize a new project'
complete -c typst -n __fish_use_subcommand -f -a query -d 'Process an input file to extract metadata'
complete -c typst -n __fish_use_subcommand -f -a fonts -d 'List all discovered fonts'
complete -c typst -n __fish_use_subcommand -f -a update -d 'Self update the Typst CLI'
complete -c typst -n "__fish_seen_subcommand_from $commands" -x
# common subcommand options
# FIXME: only one input file
complete -c typst -n '__fish_seen_subcommand_from compile c watch w query' -x -ka '(__fish_complete_suffix .typ)' -d 'Input file'
#complete -c typst -n '__fish_seen_subcommand_from compile c watch w' -d 'Output file'
complete -c typst -n '__fish_seen_subcommand_from compile c watch w query' -x -l root -a '(__fish_complete_directories)' -d 'Project root (for absolute paths)'
complete -c typst -n '__fish_seen_subcommand_from compile c watch w query' -x -l input -d 'String key-value pair for `sys.inputs`'
complete -c typst -n '__fish_seen_subcommand_from compile c watch w query fonts' -x -l font-path -a '(__fish_complete_directories)' -d 'Additional directories to search for fonts'
complete -c typst -n '__fish_seen_subcommand_from compile c watch w query' -x -l diagnostic-format -a 'human short' -d 'Format to emit diagnostics in'
# compile/watch subcommands
complete -c typst -n '__fish_seen_subcommand_from compile c watch w' -x -l format -s f -a 'pdf png svg' -d 'Format of the output file'
complete -c typst -n '__fish_seen_subcommand_from compile c watch w' -l open -d 'Open output file after compilation'
complete -c typst -n '__fish_seen_subcommand_from compile c watch w' -x -l ppi -d 'Pixels per inch for PNG export'
complete -c typst -n '__fish_seen_subcommand_from compile c watch w' -l timings -d 'Produce performance timings'
# init subcommand
complete -c typst -n '__fish_seen_subcommand_from init' -n '__fish_is_nth_token 2' -x -d 'Template to use'
complete -c typst -n '__fish_seen_subcommand_from init' -n '__fish_is_nth_token 3' -x -a '(__fish_complete_directories)' -d 'Project directory'
# query subcommand
complete -c typst -n '__fish_seen_subcommand_from query' -x -l field -d 'Extract just one field'
complete -c typst -n '__fish_seen_subcommand_from query' -f -l one -d 'Retrieve exactly one element'
complete -c typst -n '__fish_seen_subcommand_from query' -x -l format -a 'json yaml' -d 'Format to serialize in'
# fonts subcommand
complete -c typst -n '__fish_seen_subcommand_from fonts' -f -l variants -d 'List style variants of each family'
# update subcommand
complete -c typst -n '__fish_seen_subcommand_from update' -f -l force -d 'Force a downgrade to an older version'
complete -c typst -n '__fish_seen_subcommand_from update' -f -l revert -d 'Revert to the version from before the last update'
typst completions fish | source

View File

@@ -11,7 +11,7 @@ complete -c wget -s a -l append-output -d "Append all messages to logfile"
complete -c wget -s d -l debug -d "Turn on debug output"
complete -c wget -s q -l quiet -d "Quiet mode"
complete -c wget -s v -l verbose -d "Verbose mode"
complete -c wget -l non-verbose -d "Turn off verbose without being completely quiet"
complete -c wget -l no-verbose -d "Turn off verbose without being completely quiet"
complete -c wget -o nv -d "Turn off verbose without being completely quiet"
complete -c wget -s i -l input-file -d "Read URLs from file" -r
complete -c wget -s F -l force-html -d "Force input to be treated as HTML"

View File

@@ -1,3 +1,5 @@
# localization: tier1
# This file does some internal fish setup.
# It is not recommended to remove or edit it.
#
@@ -36,8 +38,8 @@ status get-file __fish_build_paths.fish | source
# Compute the directories for vendor configuration. We want to include
# all of XDG_DATA_DIRS, as well as the __extra_* dirs defined above.
set -l xdg_data_dirs
if set -q XDG_DATA_DIRS
set -l xdg_data_dirs /usr/local/share/fish /usr/share/fish
if test -n "$XDG_DATA_DIRS"
set --path xdg_data_dirs $XDG_DATA_DIRS
set xdg_data_dirs (string replace -r '([^/])/$' '$1' -- $xdg_data_dirs)/fish
end
@@ -206,8 +208,8 @@ end
if status is-interactive
__fish_migrate
fish_config theme choose default --no-override
end
fish_config theme choose default --no-override
# As last part of initialization, source the conf directories.
# Implement precedence (User > Admin > Extra (e.g. vendors) > Fish) by basically doing "basename".

View File

@@ -0,0 +1,62 @@
# localization: tier3
function __fish_sudo_print_remaining_args
set -l tokens (commandline -xpc | string escape) (commandline -ct)
set -e tokens[1]
# These are all the options mentioned in the man page for Todd Miller's "sudo.ws" sudo (in that order).
# If any other implementation has different options, this should be harmless, since they shouldn't be used anyway.
set -l opts A/askpass b/background C/close-from= E/preserve-env='?'
# Note that "-h" is both "--host" (which takes an option) and "--help" (which doesn't).
# But `-h` as `--help` only counts when it's the only argument (`sudo -h`),
# so any argument completion after that should take it as "--host".
set -a opts e/edit g/group= H/set-home h/host= 1-help
set -a opts i/login K/remove-timestamp k/reset-timestamp l/list n/non-interactive
set -a opts P/preserve-groups p/prompt= S/stdin s/shell U/other-user=
set -a opts u/user= T/command-timeout= V/version v/validate
argparse -s $opts -- $tokens 2>/dev/null
# The remaining argv is the subcommand with all its options, which is what
# we want.
if test -n "$argv"
and not string match -qr '^-' $argv[1]
string join0 -- $argv
return 0
else
return 1
end
end
function __fish_sudo_no_subcommand
not __fish_sudo_print_remaining_args >/dev/null
end
function __fish_complete_sudo_subcommand
set -l args (__fish_sudo_print_remaining_args | string split0)
set -lx -a PATH /usr/local/sbin /sbin /usr/sbin
__fish_complete_subcommand --commandline $args
end
function __fish_complete_sudo -a sudo
# All these options should be valid for GNU and OSX sudo
complete -c $sudo -n __fish_no_arguments -s h -d "Display help and exit"
complete -c $sudo -n __fish_no_arguments -s V -d "Display version information and exit"
complete -c $sudo -n __fish_sudo_no_subcommand -s A -d "Ask for password via the askpass or \$SSH_ASKPASS program"
complete -c $sudo -n __fish_sudo_no_subcommand -s C -d "Close all file descriptors greater or equal to the given number" -xa "0 1 2 255"
complete -c $sudo -n __fish_sudo_no_subcommand -s E -d "Preserve environment"
complete -c $sudo -n __fish_sudo_no_subcommand -s H -d "Set home"
complete -c $sudo -n __fish_sudo_no_subcommand -s K -d "Remove the credential timestamp entirely"
complete -c $sudo -n __fish_sudo_no_subcommand -s P -d "Preserve group vector"
complete -c $sudo -n __fish_sudo_no_subcommand -s S -d "Read password from stdin"
complete -c $sudo -n __fish_sudo_no_subcommand -s b -d "Run command in the background"
complete -c $sudo -n __fish_sudo_no_subcommand -s e -rF -d Edit
complete -c $sudo -n __fish_sudo_no_subcommand -s g -a "(__fish_complete_groups)" -x -d "Run command as group"
complete -c $sudo -n __fish_sudo_no_subcommand -s i -d "Run a login shell"
complete -c $sudo -n __fish_sudo_no_subcommand -s k -d "Reset or ignore the credential timestamp"
complete -c $sudo -n __fish_sudo_no_subcommand -s l -d "List the allowed and forbidden commands for the given user"
complete -c $sudo -n __fish_sudo_no_subcommand -s n -d "Do not prompt for a password - if one is needed, fail"
complete -c $sudo -n __fish_sudo_no_subcommand -s p -d "Specify a custom password prompt"
complete -c $sudo -n __fish_sudo_no_subcommand -s s -d "Run the given command in a shell"
complete -c $sudo -n __fish_sudo_no_subcommand -s u -a "(__fish_complete_users)" -x -d "Run command as user"
complete -c $sudo -n __fish_sudo_no_subcommand -s v -n __fish_no_arguments -d "Validate the credentials, extending timeout"
# Complete the command to be executed under sudo
complete -c $sudo -x -n 'not __fish_seen_argument -s e' -a "(__fish_complete_sudo_subcommand)"
end
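The `__fish_sudo_print_remaining_args` helper above strips sudo's own options to expose the wrapped command. A rough Python sketch of the same idea, with a made-up, much smaller option table than the real argparse spec:

```python
# Simplified sketch: options in TAKES_ARG consume the next token,
# like "u/user=" in the argparse spec above; FLAGS stand alone.
TAKES_ARG = {"-u", "-g", "-p", "-C", "-U", "-T"}
FLAGS = {"-b", "-E", "-H", "-i", "-k", "-K", "-n", "-s", "-S", "-v"}

def remaining_args(tokens):
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok in TAKES_ARG:
            i += 2          # skip the option and its argument
        elif tok in FLAGS:
            i += 1
        else:
            break           # first non-option token: the wrapped command
    return tokens[i:]

print(remaining_args(["-u", "root", "-n", "systemctl", "status"]))
# → ['systemctl', 'status']
```

Like the fish version, the remainder is empty (failure) when nothing follows sudo's own options.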

View File

@@ -0,0 +1,10 @@
# localization: skip(private)
function __fish_cygwin_noacl
# MSYS (default) and Cygwin (non-default) mounts may not support POSIX permissions.
__fish_is_cygwin
and {
mount |
string match "*on $(stat -c %m -- $argv[1]) *" |
string match -qr "[(,]noacl[),]"
}
end
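The flag test boils down to a pattern match against the mount line; a sketch in Python, with an assumed Cygwin-style mount line as input:

```python
import re

# The "[(,]noacl[),]" pattern requires noacl to appear as a whole
# mount flag, delimited by parentheses or commas.
def mount_is_noacl(mount_line):
    return re.search(r"[(,]noacl[),]", mount_line) is not None

line = "C:/msys64 on / type ntfs (binary,noacl,auto)"
print(mount_is_noacl(line))                                   # → True
print(mount_is_noacl("C:/msys64 on / type ntfs (binary,acl)"))  # → False
```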

View File

@@ -12,7 +12,7 @@ function __fish_edit_command_if_at_cursor --description 'If cursor is at the com
or return 1
set -l command_path (command -v -- $command)
or return 1
test -w $command_path
test -r $command_path
or return 1
string match -q 'text/*' (file --brief --mime-type -L -- $command_path)
or return 1
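The hunk relaxes the guard from writable (`test -w`) to readable (`test -r`) before the MIME-type check. A loose Python equivalent of that gate, where a null-byte scan stands in for `file --brief --mime-type`:

```python
import os
import tempfile

def editable(path):
    # Readable, per the relaxed `test -r` check above.
    if not os.access(path, os.R_OK):
        return False
    # Crude "is text" probe in place of `file --brief --mime-type -L`.
    with open(path, "rb") as f:
        return b"\x00" not in f.read(1024)

fd, script = tempfile.mkstemp(suffix=".fish")
os.write(fd, b"echo hello\n")
os.close(fd)
print(editable(script))  # → True
```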

View File

@@ -0,0 +1,4 @@
# localization: skip(private)
function __fish_is_cygwin
__fish_uname | string match -qr "^(MSYS|CYGWIN)"
end

View File

@@ -9,6 +9,21 @@ function __fish_make_cache_dir --description "Create and return XDG_CACHE_HOME"
# So if you call `__fish_make_cache_dir completions`,
# this creates e.g. ~/.cache/fish/completions
if not path is -d $xdg_cache_home/fish/"$argv[1]"
mkdir -m 700 -p $xdg_cache_home/fish/"$argv[1]"
set -l mkdir_options -m 700
# Can't set the permission in Cygwin on a `noacl` mount
if __fish_is_cygwin
# Find the first existing parent so we can `stat` it and get its mountpoint
set -l existing_parent $xdg_cache_home/fish/"$argv[1]"
while not path is -d $existing_parent
set existing_parent (path dirname $existing_parent)
end
if __fish_cygwin_noacl "$existing_parent"
set mkdir_options
end
end
mkdir $mkdir_options -p $xdg_cache_home/fish/"$argv[1]"
end; and echo $xdg_cache_home/fish/"$argv[1]"
end
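The new Cygwin branch walks up from the target until it finds a directory that exists, so it can `stat` that parent and read its mount point. The loop in isolation, sketched in Python:

```python
import os
import tempfile

def first_existing_parent(path):
    # Walk upward until a directory exists, mirroring the while loop
    # added to __fish_make_cache_dir (standalone sketch, not fish code).
    while not os.path.isdir(path):
        parent = os.path.dirname(path)
        if parent == path:  # hit the filesystem root
            break
        path = parent
    return path

base = tempfile.mkdtemp()
target = os.path.join(base, "fish", "completions")  # does not exist yet
print(first_existing_parent(target) == base)  # → True
```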

View File

@@ -1,5 +1,5 @@
# localization: skip(private)
# Helper function for completions that need to enumerate Linux modules
function __fish_print_modules
find /lib/modules/(uname -r)/{kernel,misc} -type f 2>/dev/null | sed -e 's$/.*/\([^/.]*\).*$\1$'
find /lib/modules/(uname -r)/{kernel,misc,updates} -type f 2>/dev/null | sed -e 's$/.*/\([^/.]*\).*$\1$'
end

View File

@@ -382,7 +382,6 @@ function __fish_config_theme_choose
end
end
if not $need_hook || test -n "$fish_terminal_color_theme" ||
# comment to work around fish_indent bug
{
$theme_is_color_theme_aware && test -z "$fish_terminal_color_theme"
}

View File

@@ -129,7 +129,9 @@ function fish_vi_exec_motion
set motion_cmd commandline -f $motion
end
switch $motion[1]
case forward-char backward-char
case forward-char
set -e seq_total[1]
case backward-char
$motion_cmd
set -e seq_total[1]
end

View File

@@ -91,9 +91,8 @@ function funced --description 'Edit function definition'
# Repeatedly edit until it either parses successfully, or the user cancels
# If the editor command itself fails, we assume the user cancelled or the file
# could not be edited, and we do not try again
set -l checksum (__fish_md5 "$tmpname")
while true
set -l checksum (__fish_md5 "$tmpname")
if not $editor $tmpname
echo (_ "Editing failed or was cancelled")
else
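Moving the checksum outside the loop means it is taken once, before any edit, so a change made in any iteration is still detected afterwards. A sketch of that before/after comparison:

```python
import hashlib
import os
import tempfile

def md5_of(path):
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

fd, tmp = tempfile.mkstemp(suffix=".fish")
os.write(fd, b"function greet; echo hi; end\n")
os.close(fd)

checksum = md5_of(tmp)       # taken once, before the edit loop
with open(tmp, "a") as f:    # simulate the user's edit
    f.write("# tweaked\n")
changed = md5_of(tmp) != checksum
print(changed)  # → True
```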

View File

@@ -10,7 +10,10 @@ function isatty -d "Tests if a file descriptor is a tty"
end
if set -q argv[2]
printf (_ "%s: Too many arguments") isatty >&2
{
printf (_ "%s: Too many arguments") isatty
echo
} >&2
return 1
end

View File

@@ -26,7 +26,8 @@ function prompt_pwd --description 'short CWD for the prompt'
or set -l fish_prompt_pwd_full_dirs 1
for path in $argv
set -l tmp (__fish_unexpand_tilde $path)
# Strip control characters to avoid injecting terminal escape sequences into the prompt.
set -l tmp (__fish_unexpand_tilde $path | string replace -ra '[[:cntrl:]]' '')
if test "$fish_prompt_pwd_dir_length" -eq 0
echo $tmp
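The `[[:cntrl:]]` class covers the C0 controls plus DEL; in Python the same scrub looks like:

```python
import re

# Strip ASCII control characters (0x00-0x1f and 0x7f) so a hostile
# directory name cannot smuggle escape sequences into the prompt.
path = "/home/user/\x1b]0;evil\x07dir"
safe = re.sub(r"[\x00-\x1f\x7f]", "", path)
print(safe)  # → "/home/user/]0;evildir"
```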

View File

@@ -14,7 +14,10 @@ function psub --description "Read from stdin into a file and output the filename
set -l funcname
if not status --is-command-substitution
printf (_ "%s: Not inside of command substitution") psub >&2
{
printf (_ "%s: Not inside of command substitution") psub
echo
} >&2
return 1
end

View File

@@ -15,7 +15,10 @@ function setenv
# `setenv` accepts only two arguments: the var name and the value. If there are more than two
# args it is an error. The error message is verbatim from csh.
if set -q argv[3]
printf (_ '%s: Too many arguments\n') setenv >&2
{
printf (_ '%s: Too many arguments') setenv
echo
} >&2
return 1
end

View File

@@ -182,7 +182,10 @@ function umask --description "Set default file permission mask"
return 1
case '*'
printf (_ '%s: Too many arguments\n') umask >&2
{
printf (_ '%s: Too many arguments') umask
echo
} >&2
return 1
end
end

View File

@@ -501,6 +501,19 @@ class Deroffer:
return True
return False
def device_control(self):
# groff \X'...' device control escape (and \Z'...' zero-width).
# help2man 1.50+ uses \X'tty: link URL' for hyperlinks.
# We just skip the entire escape.
if self.str_at(1) in "XZ" and self.str_at(2) == "'":
self.skip_char(3)
while self.str_at(0) and self.str_at(0) != "'":
self.skip_char()
if self.str_at(0) == "'":
self.skip_char()
return True
return False
def var(self):
reg = ""
s0s1 = self.s[0:2]
@@ -650,6 +663,8 @@ class Deroffer:
return self.size()
elif c in "hvwud":
return self.numreq()
elif c in "XZ":
return self.device_control()
elif c in "n*":
return self.var()
elif c == "(":
@@ -1314,6 +1329,9 @@ def built_command(options, description):
def remove_groff_formatting(data):
# Strip groff \X'...' device control escapes (help2man 1.50+ hyperlinks)
# and \Z'...' zero-width escapes.
data = re.sub(r"\\[XZ]'[^']*'", "", data)
data = data.replace("\\fI", "")
data = data.replace("\\fP", "")
data = data.replace("\\f1", "")
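The stripping regex can be exercised directly; the sample input below is invented, in the shape help2man 1.50+ emits:

```python
import re

line = r"See \X'tty: link https://example.com'the manual\X'tty: link' online."
# Remove \X'...' device-control and \Z'...' zero-width escapes wholesale.
cleaned = re.sub(r"\\[XZ]'[^']*'", "", line)
print(cleaned)  # → "See the manual online."
```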

View File

@@ -56,6 +56,7 @@ tt {
.tab:hover,
#tab_contents .master_element:hover,
.color_scheme_choice_container:hover,
.data_table > tr:hover,
.prompt_choices_list > .ng-scope:hover {
background-color: #DDE;
}
@@ -207,7 +208,7 @@ tt {
border-top-left-radius: 5;
border-bottom-left-radius: 5;
/* Pad one less than .master_element, to accomodate our border. */
/* Pad one less than .master_element, to accommodate our border. */
padding-top: 5px;
padding-bottom: 10px;
padding-left: 4px;
@@ -284,6 +285,7 @@ tt {
width: 100%;
padding-left: 10px;
padding-right: 10px;
border-spacing: 0;
}
.data_table_row {}
@@ -310,7 +312,8 @@ tt {
}
.history_delete {
width: 20px;
width: 30px;
text-align: center;
}
.data_table_cell,

View File

@@ -293,7 +293,7 @@ mod tests {
#[test]
#[serial]
fn test_abbreviations() {
let _cleanup = test_init();
test_init();
let parser = TestParser::new();
{
let mut abbrs = abbrs_get_set();
@@ -424,7 +424,7 @@ macro_rules! validate {
#[test]
#[serial]
fn rename_abbrs() {
let _cleanup = test_init();
test_init();
with_abbrs_mut(|abbrs_g| {
let mut add = |name: &wstr, repl: &wstr, position: Position| {

View File

@@ -9,19 +9,21 @@
*
* Most clients will be interested in visiting the nodes of an ast.
*/
use crate::common::{UnescapeStringStyle, unescape_string};
use crate::flog::{flog, flogf};
use crate::parse_constants::{
ERROR_BAD_COMMAND_ASSIGN_ERR_MSG, INVALID_PIPELINE_CMD_ERR_MSG, ParseError, ParseErrorCode,
ParseErrorList, ParseKeyword, ParseTokenType, ParseTreeFlags, SOURCE_OFFSET_INVALID,
SourceRange, StatementDecoration, token_type_user_presentable_description,
};
use crate::parse_tree::ParseToken;
use crate::prelude::*;
use crate::tokenizer::{
TOK_ACCEPT_UNFINISHED, TOK_ARGUMENT_LIST, TOK_CONTINUE_AFTER_ERROR, TOK_SHOW_COMMENTS,
TokFlags, TokenType, Tokenizer, TokenizerError, variable_assignment_equals_pos,
use crate::{
flog::{flog, flogf},
parse_constants::{
ERROR_BAD_COMMAND_ASSIGN_ERR_MSG, INVALID_PIPELINE_CMD_ERR_MSG, ParseError, ParseErrorCode,
ParseErrorList, ParseKeyword, ParseTokenType, ParseTreeFlags, SOURCE_OFFSET_INVALID,
SourceRange, StatementDecoration, token_type_user_presentable_description,
},
parse_tree::ParseToken,
prelude::*,
tokenizer::{
TOK_ACCEPT_UNFINISHED, TOK_ARGUMENT_LIST, TOK_CONTINUE_AFTER_ERROR, TOK_SHOW_COMMENTS,
TokFlags, TokenType, Tokenizer, TokenizerError, variable_assignment_equals_pos,
},
};
use fish_common::{UnescapeStringStyle, unescape_string};
use macro_rules_attribute::derive;
use std::borrow::Cow;
use std::convert::AsMut;
@@ -1963,11 +1965,11 @@ fn spaces(&self) -> usize {
/// Return the parser's status.
fn status(&mut self) -> ParserStatus {
if self.unwinding {
ParserStatus::unwinding
ParserStatus::Unwinding
} else if self.flags.leave_unterminated && self.peek_type(0) == ParseTokenType::Terminate {
ParserStatus::unsourcing
ParserStatus::Unsourcing
} else {
ParserStatus::ok
ParserStatus::Ok
}
}
@@ -1975,7 +1977,7 @@ fn status(&mut self) -> ParserStatus {
fn unsource_leaves(&mut self) -> bool {
matches!(
self.status(),
ParserStatus::unsourcing | ParserStatus::unwinding
ParserStatus::Unsourcing | ParserStatus::Unwinding
)
}
@@ -2732,15 +2734,15 @@ fn visit_maybe_newlines(&mut self, nls: &mut MaybeNewlines) {
/// The status of our parser.
enum ParserStatus {
/// Parsing is going just fine, thanks for asking.
ok,
Ok,
/// We have exhausted the token stream, but the caller was OK with an incomplete parse tree.
/// All further leaf nodes should have the unsourced flag set.
unsourcing,
Unsourcing,
/// We encountered a parse error and are "unwinding."
/// Do not consume any tokens until we get back to a list type which stops unwinding.
unwinding,
Unwinding,
}
/// Return tokenizer flags corresponding to parse tree flags.
@@ -2830,7 +2832,7 @@ mod tests {
#[test]
#[serial]
fn test_ast_parse() {
let _cleanup = test_init();
test_init();
let src = L!("echo");
let ast = ast::parse(src, ParseTreeFlags::default(), None);
assert!(!ast.any_error);
@@ -2887,7 +2889,7 @@ fn test_is_same_node() {
}
// Run with cargo +nightly bench --features=benchmark
#[cfg(feature = "benchmark")]
#[cfg(all(nightly, feature = "benchmark"))]
#[cfg(test)]
mod bench {
extern crate test;

View File

@@ -1,13 +1,14 @@
//! The classes responsible for autoloading functions and completions.
use crate::common::{ScopeGuard, escape};
use crate::env::Environment;
use crate::flogf;
use crate::io::IoChain;
use crate::parser::Parser;
use crate::wutil::{FileId, INVALID_FILE_ID, file_id_for_path};
use fish_wcstringutil::wcs2bytes;
use fish_widestring::{L, WExt as _, WString, wstr};
use crate::{
env::Environment,
flogf,
io::IoChain,
parser::Parser,
wutil::{FileId, INVALID_FILE_ID, file_id_for_path},
};
use fish_common::{ScopeGuard, escape};
use fish_widestring::{L, WExt as _, WString, wcs2bytes, wstr};
use lru::LruCache;
use rust_embed::RustEmbed;
use std::collections::{HashMap, HashSet};
@@ -53,8 +54,8 @@ enum AssetDir {
#[derive(Debug)]
pub enum AutoloadPath {
OnDisk(WString),
Embedded(String),
Path(WString),
}
#[derive(Debug)]
@@ -101,10 +102,7 @@ pub fn resolve_command(&mut self, cmd: &wstr, env: &dyn Environment) -> Autoload
.unwrap_or_default(),
);
match result {
AutoloadResult::Path(AutoloadPath::Embedded(_)) => {
flogf!(autoload, "Embedded: %s", cmd);
}
AutoloadResult::Path(AutoloadPath::Path(ref path)) => {
AutoloadResult::Path(AutoloadPath::OnDisk(ref path)) => {
flogf!(
autoload,
"Loading %s from var %s from path %s",
@@ -113,6 +111,9 @@ pub fn resolve_command(&mut self, cmd: &wstr, env: &dyn Environment) -> Autoload
path
);
}
AutoloadResult::Path(AutoloadPath::Embedded(_)) => {
flogf!(autoload, "Embedded: %s", cmd);
}
AutoloadResult::Loaded | AutoloadResult::Pending | AutoloadResult::None => {}
}
result
@@ -125,15 +126,15 @@ pub fn perform_autoload(path: &AutoloadPath, parser: &Parser) {
// We do the useful part of what exec_subshell does ourselves
// - we source the file.
// We don't create a buffer or check ifs or create a read_limit
let prev_statuses = parser.get_last_statuses();
let prev_statuses = parser.last_statuses();
let _put_back = ScopeGuard::new((), |()| parser.set_last_statuses(prev_statuses));
match path {
AutoloadPath::Path(p) => {
AutoloadPath::OnDisk(p) => {
let script_source = L!("source ").to_owned() + &escape(p)[..];
parser.eval(&script_source, &IoChain::new());
}
AutoloadPath::Embedded(name) => {
use crate::common::bytes2wcstring;
use fish_widestring::bytes2wcstring;
use std::sync::Arc;
flogf!(autoload, "Loading embedded: %s", name);
let emfile = Asset::get(name).expect("Embedded file not found");
@@ -218,8 +219,8 @@ fn resolve_command_impl(&mut self, cmd: &wstr, paths: &[WString]) -> AutoloadRes
};
let file_id = match &file {
AutoloadableFileInfo::FileInfo(file) => &file.file_id,
AutoloadableFileInfo::EmbeddedPath(_) => &INVALID_FILE_ID,
AutoloadableFileInfo::OnDisk { file_id, .. } => file_id,
AutoloadableFileInfo::Embedded { .. } => &INVALID_FILE_ID,
};
// Is this file the same as what we previously autoloaded?
@@ -235,8 +236,8 @@ fn resolve_command_impl(&mut self, cmd: &wstr, paths: &[WString]) -> AutoloadRes
self.autoloaded_files
.insert(cmd.to_owned(), file_id.clone());
AutoloadResult::Path(match file {
AutoloadableFileInfo::FileInfo(path) => AutoloadPath::Path(path.path),
AutoloadableFileInfo::EmbeddedPath(path) => AutoloadPath::Embedded(path),
AutoloadableFileInfo::OnDisk { path, .. } => AutoloadPath::OnDisk(path),
AutoloadableFileInfo::Embedded { path } => AutoloadPath::Embedded(path),
})
}
}
@@ -245,20 +246,12 @@ fn resolve_command_impl(&mut self, cmd: &wstr, paths: &[WString]) -> AutoloadRes
const AUTOLOAD_STALENESS_INTERVALL: u64 = 15;
/// Represents a file that we might want to autoload.
#[derive(Clone)]
struct FileInfo {
/// The path to the file.
path: WString,
/// The metadata for the file.
file_id: FileId,
}
#[derive(Clone)]
enum AutoloadableFileInfo {
/// An on-disk file.
FileInfo(FileInfo),
OnDisk { path: WString, file_id: FileId },
/// An embedded file.
EmbeddedPath(String),
Embedded { path: String },
}
// A timestamp is a monotonic point in time.
@@ -328,7 +321,7 @@ fn check(
// Check hits.
if let Some(value) = self.known_files.get(cmd) {
let embedded = matches!(value.file, AutoloadableFileInfo::EmbeddedPath(_));
let embedded = matches!(value.file, AutoloadableFileInfo::Embedded { .. });
if allow_stale
|| embedded
|| Self::is_fresh(value.last_checked, Self::current_timestamp())
@@ -430,7 +423,7 @@ fn locate_file(
let file_id = file_id_for_path(&path);
if file_id != INVALID_FILE_ID {
// Found it.
return Some(AutoloadableFileInfo::FileInfo(FileInfo { path, file_id }));
return Some(AutoloadableFileInfo::OnDisk { path, file_id });
}
}
None
@@ -444,11 +437,11 @@ fn locate_asset(&self, cmd: &wstr, asset_dir: AssetDir) -> Option<AutoloadableFi
}
let narrow = wcs2bytes(cmd);
let cmdstr = std::str::from_utf8(&narrow).ok()?;
let p = match asset_dir {
let path = match asset_dir {
AssetDir::Functions => "functions/".to_owned() + cmdstr + ".fish",
AssetDir::Completions => "completions/".to_owned() + cmdstr + ".fish",
};
has_asset(&p).then_some(AutoloadableFileInfo::EmbeddedPath(p))
has_asset(&path).then_some(AutoloadableFileInfo::Embedded { path })
}
}
@@ -462,9 +455,9 @@ mod tests {
#[test]
#[serial]
fn test_autoload() {
let _cleanup = test_init();
test_init();
use crate::fds::wopen_cloexec;
use fish_wcstringutil::wcs2zstring;
use fish_widestring::wcs2zstring;
use nix::fcntl::OFlag;
macro_rules! run {

View File

@@ -20,27 +20,23 @@
use fish::{
ast,
builtins::{
error::Error,
fish_indent, fish_key_reader,
shared::{
BUILTIN_ERR_MISSING, BUILTIN_ERR_UNEXP_ARG, BUILTIN_ERR_UNKNOWN, STATUS_CMD_ERROR,
STATUS_CMD_OK, STATUS_CMD_UNKNOWN, VERSION_STRING_TEMPLATE,
},
},
common::{
PACKAGE_NAME, PROFILING_ACTIVE, PROGRAM_NAME, bytes2wcstring, escape, osstr2wcstring,
save_term_foreground_process_group,
shared::{STATUS_CMD_ERROR, STATUS_CMD_OK, STATUS_CMD_UNKNOWN, VERSION_STRING_TEMPLATE},
},
common::{PACKAGE_NAME, PROFILING_ACTIVE, PROGRAM_NAME},
env::{
EnvMode, Statuses,
config_paths::ConfigPaths,
environment::{EnvStack, Environment as _, env_init},
},
eprintf,
eprintf, err_fmt,
event::{self, Event},
fds::heightenize_fd,
flog::{self, activate_flog_categories_by_pattern, flog, flogf, set_flog_file_fd},
fprintf, function, future_feature_flags as features,
fprintf, function,
history::{self, start_private_mode},
io::IoChain,
io::{FdOutputStream, IoChain, OutputStream},
locale::set_libc_locales,
nix::isatty,
panic::panic_handler,
@@ -55,25 +51,28 @@
Pid, get_login, is_interactive_session, mark_login, mark_no_exec, proc_init,
set_interactive_session,
},
reader::{reader_init, reader_read, term_copy_modes},
reader::{reader_exit_signal, reader_init, reader_read, term_copy_modes},
signal::{signal_clear_cancel, signal_unblock_all},
threads::{self},
topic_monitor,
wutil::waccess,
};
use fish_wcstringutil::wcs2bytes;
use libc::STDIN_FILENO;
use fish_common::{escape, save_term_foreground_process_group};
use fish_widestring::{bytes2wcstring, osstr2wcstring, wcs2bytes};
use libc::{STDERR_FILENO, STDIN_FILENO};
use nix::{
sys::resource::{UsageWho, getrusage},
unistd::{AccessFlags, getpid},
};
use std::ffi::{OsStr, OsString};
use std::fs::File;
use std::os::unix::prelude::*;
use std::path::Path;
use std::sync::Arc;
use std::sync::atomic::Ordering;
use std::{env, ops::ControlFlow};
use std::{
env,
ffi::{OsStr, OsString},
fs::File,
ops::ControlFlow,
os::unix::prelude::*,
path::Path,
sync::{Arc, atomic::Ordering},
};
/// container to hold the options specified within the command line
#[derive(Default, Debug)]
@@ -205,7 +204,7 @@ fn run_command_list(parser: &Parser, cmds: &[OsString]) -> Result<(), libc::c_in
if !errored {
// Construct a parsed source ref.
let ps = Arc::new(ParsedSource::new(cmd_wcs, ast));
let _ = parser.eval_parsed_source(&ps, &IoChain::new(), None, BlockType::top, false);
let _ = parser.eval_parsed_source(&ps, &IoChain::new(), None, BlockType::Top, false);
retval = Ok(());
} else {
let backtrace = parser.get_backtrace(&cmd_wcs, &errors);
@@ -318,24 +317,24 @@ fn fish_parse_opt(args: &mut [WString], opts: &mut FishCmdOpts) -> ControlFlow<i
// Either remove it or make it work with flog.
}
'?' => {
eprintf!(
"%s\n\n",
wgettext_fmt!(BUILTIN_ERR_UNKNOWN, "fish", args[w.wopt_index - 1])
);
err_fmt!(Error::UNKNOWN_OPT, args[w.wopt_index - 1])
.cmd(L!("fish"))
.append_to_msg('\n')
.write_to(&mut OutputStream::Fd(FdOutputStream::new(STDERR_FILENO)));
return ControlFlow::Break(1);
}
':' => {
eprintf!(
"%s\n\n",
wgettext_fmt!(BUILTIN_ERR_MISSING, "fish", args[w.wopt_index - 1])
);
err_fmt!(Error::MISSING_OPT_ARG, args[w.wopt_index - 1])
.cmd(L!("fish"))
.append_to_msg('\n')
.write_to(&mut OutputStream::Fd(FdOutputStream::new(STDERR_FILENO)));
return ControlFlow::Break(1);
}
';' => {
eprintf!(
"%s\n\n",
wgettext_fmt!(BUILTIN_ERR_UNEXP_ARG, "fish", args[w.wopt_index - 1])
);
err_fmt!(Error::UNEXP_OPT_ARG, args[w.wopt_index - 1])
.cmd(L!("fish"))
.append_to_msg('\n')
.write_to(&mut OutputStream::Fd(FdOutputStream::new(STDERR_FILENO)));
return ControlFlow::Break(1);
}
_ => panic!("unexpected retval from WGetopter"),
@@ -482,10 +481,10 @@ fn throwing_main() -> i32 {
// command line takes precedence).
if let Some(features_var) = EnvStack::globals().get(L!("fish_features")) {
for s in features_var.as_list() {
features::set_from_string(s.as_utfstr());
fish_feature_flags::set_from_string(s.as_utfstr());
}
}
features::set_from_string(opts.features.as_utfstr());
fish_feature_flags::set_from_string(opts.features.as_utfstr());
proc_init();
reader_init(true);
@@ -519,11 +518,7 @@ fn throwing_main() -> i32 {
// TODO(MSRV>=1.88): feature(let_chains)
if let Some(path) = &opts.profile_startup_output {
if opts.profile_startup_output != opts.profile_output {
parser.emit_profiling(path);
// If we are profiling both, ensure the startup data only
// ends up in the startup file.
parser.clear_profiling();
parser.flush_profiling(path);
}
}
@@ -566,11 +561,11 @@ fn throwing_main() -> i32 {
}
res = reader_read(parser, libc::STDIN_FILENO, &IoChain::new());
} else {
let n = wcs2bytes(&args[my_optind]);
let filename = &args[my_optind];
let n = wcs2bytes(filename);
let path = OsStr::from_bytes(&n);
my_optind += 1;
// Rust sets cloexec by default, see above
// We don't need autoclose_fd_t when we use File; it will be closed on drop.
match File::open(path) {
Err(e) => {
flogf!(
@@ -581,24 +576,23 @@ fn throwing_main() -> i32 {
eprintf!("%s\n", e);
}
Ok(f) => {
let list = &args[my_optind..];
parser.set_var(
L!("argv"),
ParserEnvSetMode::default(),
list.iter().map(|s| s.to_owned()).collect(),
);
let rel_filename = &args[my_optind - 1];
let _filename_push = parser
.library_data
.scoped_set(Some(Arc::new(rel_filename.to_owned())), |s| {
&mut s.current_filename
});
res = reader_read(parser, f.as_raw_fd(), &IoChain::new());
if res.is_err() {
flog!(
warning,
wgettext_fmt!("Error while reading file %s", path.to_string_lossy())
if let Ok(f) = heightenize_fd(f.into(), true).map(File::from) {
let list = &args[my_optind..];
parser.set_var(
L!("argv"),
ParserEnvSetMode::default(),
list.iter().map(|s| s.to_owned()).collect(),
);
let _filename_push = parser
.current_filename
.scoped_replace(Some(Arc::new(filename.to_owned())));
res = reader_read(parser, f.as_raw_fd(), &IoChain::new());
if res.is_err() {
flog!(
warning,
wgettext_fmt!("Error while reading file %s", path.to_string_lossy())
);
}
}
}
}
@@ -607,7 +601,7 @@ fn throwing_main() -> i32 {
let exit_status = if res.is_err() {
STATUS_CMD_UNKNOWN
} else {
parser.get_last_status()
parser.last_status()
};
event::fire(
@@ -623,10 +617,20 @@ fn throwing_main() -> i32 {
);
if let Some(profile_output) = opts.profile_output {
parser.emit_profiling(&profile_output);
parser.flush_profiling(&profile_output);
}
history::save_all();
// If we deferred a fatal signal, re-raise it now so the parent sees WIFSIGNALED.
let exit_sig = reader_exit_signal();
if exit_sig != 0 {
unsafe {
libc::signal(exit_sig, libc::SIG_DFL);
libc::raise(exit_sig);
}
}
if opts.print_rusage_self {
print_rusage_self();
}
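Re-raising with the default disposition restored is what makes the parent observe WIFSIGNALED rather than a plain exit status. The pattern, demonstrated in a throwaway Python child process:

```python
import signal
import subprocess
import sys

# Sketch of the deferred-signal pattern from throwing_main(): restore
# SIG_DFL, then re-raise, so the parent sees death-by-signal.
child_code = (
    "import os, signal\n"
    "signal.signal(signal.SIGTERM, signal.SIG_DFL)\n"
    "os.kill(os.getpid(), signal.SIGTERM)\n"
)
proc = subprocess.run([sys.executable, "-c", child_code])
# On POSIX, subprocess encodes death-by-signal as a negative returncode.
print(proc.returncode)  # → -15
```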

View File

@@ -1,19 +1,24 @@
use super::prelude::*;
use crate::abbrs::{self, Abbreviation, Position};
use crate::common::{EscapeStringStyle, bytes2wcstring, escape, escape_string, valid_func_name};
use crate::env::{EnvMode, EnvStackSetResult};
use crate::highlight::highlight_and_colorize;
use crate::parser::ParserEnvSetMode;
use crate::re::{regex_make_anchored, to_boxed_chars};
use fish_common::help_section;
use crate::{
abbrs::{self, Abbreviation, Position},
builtins::error::Error,
common::valid_func_name,
env::{EnvMode, EnvStackSetResult},
err_fmt, err_str,
highlight::highlight_and_colorize,
parser::ParserEnvSetMode,
re::{regex_make_anchored, to_boxed_chars},
};
use fish_common::{EscapeStringStyle, escape, escape_string, help_section};
use fish_widestring::bytes2wcstring;
use pcre2::utf32::{Regex, RegexBuilder};
localizable_consts! {
NAME_CANNOT_BE_EMPTY
"%s %s: Name cannot be empty"
"Name cannot be empty"
ABBR_CANNOT_HAVE_SPACES
"%s %s: Abbreviation '%s' cannot have spaces in the word"
"Abbreviation '%s' cannot have spaces in the word"
}
const CMD: &wstr = L!("abbr");
@@ -36,7 +41,7 @@ struct Options {
}
impl Options {
fn validate(&mut self, streams: &mut IoStreams) -> bool {
fn validate(&mut self) -> Option<Error<'_>> {
// Duplicate options?
let mut cmds = vec![];
if self.add {
@@ -59,12 +64,7 @@ fn validate(&mut self, streams: &mut IoStreams) -> bool {
}
if cmds.len() > 1 {
streams.err.appendln(&wgettext_fmt!(
"%s: Cannot combine options %s",
CMD,
join(&cmds, L!(", "))
));
return false;
return Some(err_fmt!("Cannot combine options %s", join(&cmds, L!(", "))));
}
// If run with no options, treat it like --add if we have arguments,
@@ -76,54 +76,29 @@ fn validate(&mut self, streams: &mut IoStreams) -> bool {
localizable_consts! {
OPTION_REQUIRES_ARG
"%s: %s option requires %s"
"%s option requires %s"
}
if !self.add && self.position.is_some() {
streams.err.appendln(&wgettext_fmt!(
OPTION_REQUIRES_ARG,
CMD,
"--position",
"--add",
));
return false;
return Some(err_fmt!(OPTION_REQUIRES_ARG, "--position", "--add"));
}
if !self.add && self.regex_pattern.is_some() {
streams
.err
.appendln(&wgettext_fmt!(OPTION_REQUIRES_ARG, CMD, "--regex", "--add"));
return false;
return Some(err_fmt!(OPTION_REQUIRES_ARG, "--regex", "--add"));
}
if !self.add && self.function.is_some() {
streams.err.appendln(&wgettext_fmt!(
OPTION_REQUIRES_ARG,
CMD,
"--function",
"--add",
));
return false;
return Some(err_fmt!(OPTION_REQUIRES_ARG, "--function", "--add"));
}
if !self.add && self.set_cursor_marker.is_some() {
streams.err.appendln(&wgettext_fmt!(
OPTION_REQUIRES_ARG,
CMD,
"--set-cursor",
"--add",
));
return false;
return Some(err_fmt!(OPTION_REQUIRES_ARG, "--set-cursor", "--add"));
}
if self
.set_cursor_marker
.as_ref()
.is_some_and(|m| m.is_empty())
{
streams.err.appendln(&wgettext_fmt!(
"%s: --set-cursor argument cannot be empty",
CMD
));
return false;
return Some(err_str!("--set-cursor argument cannot be empty"));
}
true
None
}
}
@@ -198,7 +173,6 @@ fn abbr_show(opts: &Options, streams: &mut IoStreams, parser: &Parser) -> Builti
streams.out.append(&bytes2wcstring(&highlight_and_colorize(
&result,
&parser.context(),
parser.vars(),
)));
} else {
streams.out.append(&result);
@@ -213,12 +187,9 @@ fn abbr_show(opts: &Options, streams: &mut IoStreams, parser: &Parser) -> Builti
fn abbr_list(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
let subcmd = L!("--list");
if !opts.args.is_empty() {
streams.err.appendln(&wgettext_fmt!(
"%s %s: Unexpected argument -- '%s'",
CMD,
subcmd,
&opts.args[0]
));
err_fmt!("Unexpected argument -- '%s'", &opts.args[0])
.subcmd(CMD, subcmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
abbrs::with_abbrs(|abbrs| {
@@ -237,29 +208,24 @@ fn abbr_rename(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
let subcmd = L!("--rename");
if opts.args.len() != 2 {
streams.err.appendln(&wgettext_fmt!(
"%s %s: Requires exactly two arguments",
CMD,
subcmd
));
err_str!("Requires exactly two arguments")
.subcmd(CMD, subcmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
let old_name = &opts.args[0];
let new_name = &opts.args[1];
if old_name.is_empty() || new_name.is_empty() {
streams
.err
.appendln(&wgettext_fmt!(NAME_CANNOT_BE_EMPTY, CMD, subcmd));
err_str!(NAME_CANNOT_BE_EMPTY)
.subcmd(CMD, subcmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
if contains_whitespace(new_name) {
streams.err.appendln(&wgettext_fmt!(
ABBR_CANNOT_HAVE_SPACES,
CMD,
subcmd,
new_name.as_utfstr()
));
err_fmt!(ABBR_CANNOT_HAVE_SPACES, new_name.as_utfstr())
.subcmd(CMD, subcmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
abbrs::with_abbrs_mut(|abbrs| -> BuiltinResult {
@@ -268,12 +234,12 @@ fn abbr_rename(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
.iter()
.any(|a| a.name == *old_name && a.commands == opts.commands)
{
streams.err.appendln(&wgettext_fmt!(
"%s %s: No abbreviation named %s with the specified command restrictions",
CMD,
subcmd,
err_fmt!(
"No abbreviation named %s with the specified command restrictions",
old_name.as_utfstr()
));
)
.subcmd(CMD, subcmd)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
if abbrs
@@ -282,13 +248,13 @@ fn abbr_rename(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
.any(|a| a.name == *new_name && a.commands == opts.commands)
{
if opts.commands.is_empty() {
streams.err.appendln(&wgettext_fmt!(
"%s %s: Abbreviation %s already exists, cannot rename %s",
CMD,
subcmd,
err_fmt!(
"Abbreviation %s already exists, cannot rename %s",
new_name.as_utfstr(),
old_name.as_utfstr()
));
)
.subcmd(CMD, subcmd)
.finish(streams);
} else {
let style = EscapeStringStyle::Script(Default::default());
let mut cmd_list = WString::new();
@@ -299,14 +265,14 @@ fn abbr_rename(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
cmd_list.push_utfstr(&escape_string(cmd, style));
}
streams.err.appendln(&wgettext_fmt!(
"%s %s: Abbreviation %s already exists for commands %s, cannot rename %s",
CMD,
subcmd,
err_fmt!(
"Abbreviation %s already exists for commands %s, cannot rename %s",
new_name.as_utfstr(),
cmd_list.as_utfstr(),
old_name.as_utfstr()
));
)
.subcmd(CMD, subcmd)
.finish(streams);
}
return Err(STATUS_INVALID_ARGS);
@@ -339,28 +305,23 @@ fn abbr_add(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
let subcmd = L!("--add");
if opts.args.len() < 2 && opts.function.is_none() {
streams.err.appendln(&wgettext_fmt!(
"%s %s: Requires at least two arguments",
CMD,
subcmd
));
err_str!("Requires at least two arguments")
.subcmd(CMD, subcmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
if opts.args.is_empty() || opts.args[0].is_empty() {
streams
.err
.appendln(&wgettext_fmt!(NAME_CANNOT_BE_EMPTY, CMD, subcmd));
err_str!(NAME_CANNOT_BE_EMPTY)
.subcmd(CMD, subcmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
let name = &opts.args[0];
if name.chars().any(|c| c.is_whitespace()) {
streams.err.appendln(&wgettext_fmt!(
ABBR_CANNOT_HAVE_SPACES,
CMD,
subcmd,
name.as_utfstr()
));
err_fmt!(ABBR_CANNOT_HAVE_SPACES, name.as_utfstr())
.subcmd(CMD, subcmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -376,21 +337,16 @@ fn abbr_add(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
let result = builder.build(to_boxed_chars(regex_pattern));
if let Err(error) = result {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_REGEX_COMPILE,
CMD,
error.error_message(),
));
let mut err = err_fmt!(Error::REGEX_COMPILE, error.error_message());
if let Some(offset) = error.offset() {
streams
.err
.append(&sprintf!("%s: %s\n", CMD, regex_pattern.as_utfstr()));
err.append_assign_to_msg(&sprintf!("\n%s: %s", CMD, regex_pattern.as_utfstr()));
// TODO: This is misaligned if `regex_pattern` contains characters which are not
// exactly 1 terminal cell wide.
// exactly 1 terminal cell wide or not on a single line.
let mut marker = " ".repeat(offset.saturating_sub(1));
marker.push('^');
streams.err.append(&sprintf!("%s: %s\n", CMD, marker));
err.append_assign_to_msg(&sprintf!("\n%s: %s", CMD, marker));
}
err.cmd(CMD).finish(streams);
return Err(STATUS_INVALID_ARGS);
}
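The hunk above draws a `^` marker under the offending offset of an invalid regex pattern. As the updated TODO in the diff notes, this is only aligned when every character occupies exactly one terminal cell. A standalone sketch of just the marker construction (the function name is illustrative, not fish's actual code):

```rust
// Build a caret marker line pointing at a regex error offset.
// Mirrors the diff's `" ".repeat(offset.saturating_sub(1))` + '^' logic;
// as the TODO notes, this assumes one terminal cell per character.
fn caret_marker(offset: usize) -> String {
    let mut marker = " ".repeat(offset.saturating_sub(1));
    marker.push('^');
    marker
}
```

`saturating_sub` keeps an offset of 0 from underflowing, so the caret simply lands in column one in that case.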
let anchored = regex_make_anchored(regex_pattern);
@@ -409,20 +365,16 @@ fn abbr_add(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
}
if opts.function.is_some() && opts.args.len() > 1 {
streams
.err
.appendln(&wgettext_fmt!(BUILTIN_ERR_TOO_MANY_ARGUMENTS, L!("abbr")));
err_str!(Error::TOO_MANY_ARGUMENTS).cmd(CMD).finish(streams);
return Err(STATUS_INVALID_ARGS);
}
let replacement = if let Some(ref function) = opts.function {
// Abbreviation function names disallow spaces.
// This is to prevent accidental usage of e.g. `--function 'string replace'`
if !valid_func_name(function) || contains_whitespace(function) {
streams.err.appendln(&wgettext_fmt!(
"%s: Invalid function name: %s",
CMD,
function.as_utfstr()
));
err_fmt!("Invalid function name: %s", function.as_utfstr())
.cmd(CMD)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
function.clone()
@@ -446,10 +398,9 @@ fn abbr_add(opts: &Options, streams: &mut IoStreams) -> BuiltinResult {
}
});
if !opts.commands.is_empty() && position == Position::Command {
streams.err.appendln(&wgettext_fmt!(
"%s: --command cannot be combined with --position=command",
CMD,
));
err_str!("--command cannot be combined with --position=command")
.cmd(CMD)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -564,9 +515,9 @@ pub fn abbr(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
'c' => opts.commands.push(w.woptarg.map(|x| x.to_owned()).unwrap()),
'p' => {
if opts.position.is_some() {
streams
.err
.appendln(&wgettext_fmt!("%s: Cannot specify multiple positions", CMD));
err_str!("Cannot specify multiple positions")
.cmd(CMD)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
if w.woptarg == Some(L!("command")) {
@@ -574,37 +525,34 @@ pub fn abbr(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
} else if w.woptarg == Some(L!("anywhere")) {
opts.position = Some(Position::Anywhere);
} else {
streams.err.appendln(&wgettext_fmt!(
"%s: Invalid position '%s'",
CMD,
w.woptarg.unwrap_or_default()
));
streams.err.appendln(&wgettext_fmt!(
"Position must be one of: %s",
// Use a single argument here to avoid having to update translations when
// the number of options changes.
"command, anywhere",
));
err_fmt!("Invalid position '%s'", w.woptarg.unwrap_or_default())
.append_to_msg('\n')
.append_to_msg(&wgettext_fmt!(
"Position must be one of: %s",
// Use a single argument here to avoid having to update translations when
// the number of options changes.
"command, anywhere",
))
.cmd(CMD)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
'r' => {
if opts.regex_pattern.is_some() {
streams.err.appendln(&wgettext_fmt!(
"%s: Cannot specify multiple regex patterns",
CMD
));
err_str!("Cannot specify multiple regex patterns")
.cmd(CMD)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
opts.regex_pattern = w.woptarg.map(ToOwned::to_owned);
}
SET_CURSOR_SHORT => {
if opts.set_cursor_marker.is_some() {
streams.err.appendln(&wgettext_fmt!(
"%s: Cannot specify multiple set-cursor options",
CMD
));
err_str!("Cannot specify multiple set-cursor options")
.cmd(CMD)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
// The default set-cursor indicator is '%'.
@@ -624,19 +572,20 @@ pub fn abbr(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
'U' => {
// Kept and made ineffective, so we warn.
streams.err.append(&wgettext_fmt!(
"%s: Warning: Option '%s' was removed and is now ignored",
cmd,
err_fmt!(
"Warning: Option '%s' was removed and is now ignored",
argv_read[w.wopt_index - 1]
));
builtin_print_error_trailer(parser, streams.err, cmd);
)
.cmd(CMD)
.full_trailer(parser)
.finish(streams);
}
'h' => {
builtin_print_help(parser, streams, cmd);
return Ok(SUCCESS);
}
':' => {
builtin_missing_argument(parser, streams, cmd, argv[w.wopt_index - 1], true);
builtin_missing_argument(parser, streams, cmd, None, argv[w.wopt_index - 1], true);
return Err(STATUS_INVALID_ARGS);
}
';' => {
@@ -660,7 +609,8 @@ pub fn abbr(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
opts.args.push((*arg).into());
}
if !opts.validate(streams) {
if let Some(err) = opts.validate() {
err.cmd(cmd).finish(streams);
return Err(STATUS_INVALID_ARGS);
}
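The recurring refactor in this diff replaces direct `streams.err.appendln(&wgettext_fmt!(...))` calls with a builder-style chain: `err_fmt!(...)` / `err_str!(...)` followed by `.cmd(...)` or `.subcmd(...)` and `.finish(streams)`. A minimal standalone approximation of that shape (the real macros and `Error` type live in fish's `builtins::error` module; everything below, including names, is an illustrative assumption):

```rust
// Sketch of a builder-style builtin error, reduced to plain Strings.
struct BuiltinError {
    msg: String,
    cmd: Option<String>,
}

impl BuiltinError {
    fn new(msg: impl Into<String>) -> Self {
        BuiltinError { msg: msg.into(), cmd: None }
    }

    // Prefix the message with the builtin's name, e.g. "abbr: ...".
    fn cmd(mut self, cmd: &str) -> Self {
        self.cmd = Some(cmd.to_string());
        self
    }

    // Render the final error line; fish's `finish` writes to streams.err.
    fn finish(self) -> String {
        match self.cmd {
            Some(c) => format!("{}: {}", c, self.msg),
            None => self.msg,
        }
    }
}
```

The benefit visible throughout the diff: call sites shrink to one chained expression, and the `"%s: "` command prefix moves out of every translated format string.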


@@ -2,19 +2,21 @@
use super::prelude::*;
use crate::builtins::error::Error;
use crate::env::{EnvMode, EnvSetMode, EnvStack};
use crate::exec::exec_subshell;
use crate::parser::ParserEnvSetMode;
use crate::wutil::fish_iswalnum;
use crate::{err_fmt, err_str};
const VAR_NAME_PREFIX: &wstr = L!("_flag_");
localizable_consts!(
BUILTIN_ERR_INVALID_OPT_SPEC
"%s: Invalid option spec '%s' at char '%c'"
"Invalid option spec '%s' at char '%c'"
MISSING_DOUBLE_HYPHEN_SEPARATOR
"%s: Missing -- separator"
"Missing -- separator"
);
#[derive(Default)]
@@ -149,13 +151,10 @@ fn check_for_mutually_exclusive_flags(
if flag1 > flag2 {
std::mem::swap(&mut flag1, &mut flag2);
}
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2_EXCLUSIVE,
opts.name,
flag1,
flag2
));
return Err(STATUS_CMD_ERROR);
err_fmt!(Error::COMBO_EXCLUSIVE, flag1, flag2)
.cmd(&opts.name)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
}
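The hunk above swaps the two mutually exclusive flags into sorted order before reporting them, so the error message is deterministic regardless of which flag was parsed first. The pattern in isolation:

```rust
// Normalize a pair so the smaller element comes first, as the
// exclusive-flag error reporting above does before formatting.
fn ordered_pair(mut a: char, mut b: char) -> (char, char) {
    if a > b {
        std::mem::swap(&mut a, &mut b);
    }
    (a, b)
}
```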
@@ -171,11 +170,9 @@ fn parse_exclusive_args(opts: &mut ArgParseCmdOpts, streams: &mut IoStreams) ->
for raw_xflags in &opts.raw_exclusive_flags {
let xflags: Vec<_> = raw_xflags.split(',').collect();
if xflags.len() < 2 {
streams.err.appendln(&wgettext_fmt!(
"%s: exclusive flag string '%s' is not valid",
opts.name,
raw_xflags
));
err_fmt!("exclusive flag string '%s' is not valid", raw_xflags)
.cmd(&opts.name)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
@@ -189,11 +186,9 @@ fn parse_exclusive_args(opts: &mut ArgParseCmdOpts, streams: &mut IoStreams) ->
// It's a long flag we store as its short flag equivalent.
exclusive_set.push(*short_equiv);
} else {
streams.err.appendln(&wgettext_fmt!(
"%s: exclusive flag '%s' is not valid",
opts.name,
flag
));
err_fmt!("exclusive flag '%s' is not valid", flag)
.cmd(&opts.name)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
}
@@ -218,12 +213,13 @@ fn parse_flag_modifiers<'args>(
&& s.char_at(0) != '!'
&& s.char_at(0) != '&'
{
streams.err.appendln(&wgettext_fmt!(
"%s: Implicit int short flag '%c' does not allow modifiers like '%c'",
opts.name,
err_fmt!(
"Implicit int short flag '%c' does not allow modifiers like '%c'",
opt_spec.short_flag,
s.char_at(0)
));
)
.cmd(&opts.name)
.finish(streams);
return false;
}
@@ -253,24 +249,18 @@ fn parse_flag_modifiers<'args>(
if s.char_at(0) == '!' {
if opt_spec.arg_type == ArgType::NoArgument {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_INVALID_OPT_SPEC,
opts.name,
option_spec,
s.char_at(0)
));
err_fmt!(BUILTIN_ERR_INVALID_OPT_SPEC, option_spec, s.char_at(0))
.cmd(&opts.name)
.finish(streams);
}
s = s.slice_from(1);
opt_spec.validation_command = s;
// Move cursor to the end so we don't expect a long flag.
s = s.slice_from(s.char_count());
} else if !s.is_empty() {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_INVALID_OPT_SPEC,
opts.name,
option_spec,
s.char_at(0)
));
err_fmt!(BUILTIN_ERR_INVALID_OPT_SPEC, option_spec, s.char_at(0))
.cmd(&opts.name)
.finish(streams);
return false;
}
@@ -280,11 +270,9 @@ fn parse_flag_modifiers<'args>(
}
if opts.options.contains_key(&opt_spec.short_flag) {
streams.err.appendln(&wgettext_fmt!(
"%s: Short flag '%c' already defined",
opts.name,
opt_spec.short_flag
));
err_fmt!("Short flag '%c' already defined", opt_spec.short_flag)
.cmd(&opts.name)
.finish(streams);
return false;
}
@@ -303,7 +291,7 @@ fn parse_option_spec_sep<'args>(
) -> bool {
localizable_consts! {
IMPLICIT_INT_FLAG_ALREADY_DEFINED
"%s: Implicit int flag '%c' already defined"
"Implicit int flag '%c' already defined"
}
let mut s = *opt_spec_str;
let mut i = 1usize;
@@ -316,11 +304,9 @@ fn parse_option_spec_sep<'args>(
*counter += 1;
}
if opts.implicit_int_flag != '\0' {
streams.err.appendln(&wgettext_fmt!(
IMPLICIT_INT_FLAG_ALREADY_DEFINED,
opts.name,
opts.implicit_int_flag
));
err_fmt!(IMPLICIT_INT_FLAG_ALREADY_DEFINED, opts.implicit_int_flag)
.cmd(&opts.name)
.finish(streams);
return false;
}
opts.implicit_int_flag = opt_spec.short_flag;
@@ -335,34 +321,26 @@ fn parse_option_spec_sep<'args>(
opt_spec.short_flag_valid = false;
i += 1;
if i == s.char_count() {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_INVALID_OPT_SPEC,
opts.name,
option_spec,
s.char_at(i - 1)
));
err_fmt!(BUILTIN_ERR_INVALID_OPT_SPEC, option_spec, s.char_at(i - 1))
.cmd(&opts.name)
.finish(streams);
return false;
}
}
'/' => {
i += 1; // the struct is initialized assuming short_flag_valid should be true
if i == s.char_count() {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_INVALID_OPT_SPEC,
opts.name,
option_spec,
s.char_at(i - 1)
));
err_fmt!(BUILTIN_ERR_INVALID_OPT_SPEC, option_spec, s.char_at(i - 1))
.cmd(&opts.name)
.finish(streams);
return false;
}
}
'#' => {
if opts.implicit_int_flag != '\0' {
streams.err.appendln(&wgettext_fmt!(
IMPLICIT_INT_FLAG_ALREADY_DEFINED,
opts.name,
opts.implicit_int_flag
));
err_fmt!(IMPLICIT_INT_FLAG_ALREADY_DEFINED, opts.implicit_int_flag)
.cmd(&opts.name)
.finish(streams);
return false;
}
opts.implicit_int_flag = opt_spec.short_flag;
@@ -402,21 +380,21 @@ fn parse_option_spec<'args>(
streams: &mut IoStreams,
) -> bool {
if option_spec.is_empty() {
streams.err.appendln(&wgettext_fmt!(
"%s: An option spec must have at least a short or a long flag",
opts.name
));
err_str!("An option spec must have at least a short or a long flag")
.cmd(&opts.name)
.finish(streams);
return false;
}
let mut s = option_spec;
if !fish_iswalnum(s.char_at(0)) && s.char_at(0) != '#' && !(s.char_at(0) == '/' && s.len() > 1)
{
streams.err.appendln(&wgettext_fmt!(
"%s: Short flag '%c' invalid, must be alphanum or '#'",
opts.name,
err_fmt!(
"Short flag '%c' invalid, must be alphanum or '#'",
s.char_at(0)
));
)
.cmd(&opts.name)
.finish(streams);
return false;
}
@@ -439,11 +417,9 @@ fn parse_option_spec<'args>(
if long_flag_char_count > 0 {
opt_spec.long_flag = s.slice_to(long_flag_char_count);
if opts.long_to_short_flag.contains_key(opt_spec.long_flag) {
streams.err.appendln(&wgettext_fmt!(
"%s: Long flag '%s' already defined",
opts.name,
opt_spec.long_flag
));
err_fmt!("Long flag '%s' already defined", opt_spec.long_flag)
.cmd(&opts.name)
.finish(streams);
return false;
}
}
@@ -486,9 +462,9 @@ fn collect_option_specs<'args>(
loop {
if *optind == argc {
streams
.err
.appendln(&wgettext_fmt!(MISSING_DOUBLE_HYPHEN_SEPARATOR, cmd));
err_str!(MISSING_DOUBLE_HYPHEN_SEPARATOR)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -508,9 +484,9 @@ fn collect_option_specs<'args>(
let counter_max = 0xF8FFu32;
if counter > counter_max {
streams
.err
.appendln(&wgettext_fmt!("%s: Too many long-only options", cmd));
err_str!("Too many long-only options")
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -540,12 +516,9 @@ fn parse_cmd_opts<'args>(
's' => opts.stop_nonopt = true,
'i' | 'u' => {
if opts.unknown_handling != UnknownHandling::Error {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2_EXCLUSIVE,
cmd,
"--ignore-unknown",
"--move-unknown"
));
err_fmt!(Error::COMBO_EXCLUSIVE, "--ignore-unknown", "--move-unknown")
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
opts.unknown_handling = if c == 'i' {
@@ -564,11 +537,9 @@ fn parse_cmd_opts<'args>(
} else if kind == L!("none") {
ArgType::NoArgument
} else {
streams.err.appendln(&wgettext_fmt!(
"%s: Invalid --unknown-arguments value '%s'",
cmd,
kind
));
err_fmt!("Invalid --unknown-arguments value '%s'", kind)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
@@ -581,11 +552,9 @@ fn parse_cmd_opts<'args>(
opts.min_args = {
let x = fish_wcstol(w.woptarg.unwrap()).unwrap_or(-1);
if x < 0 {
streams.err.appendln(&wgettext_fmt!(
"%s: Invalid --min-args value '%s'",
cmd,
w.woptarg.unwrap()
));
err_fmt!("Invalid --min-args value '%s'", w.woptarg.unwrap())
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
x.try_into().unwrap()
@@ -595,11 +564,9 @@ fn parse_cmd_opts<'args>(
opts.max_args = {
let x = fish_wcstol(w.woptarg.unwrap()).unwrap_or(-1);
if x < 0 {
streams.err.appendln(&wgettext_fmt!(
"%s: Invalid --max-args value '%s'",
cmd,
w.woptarg.unwrap()
));
err_fmt!("Invalid --max-args value '%s'", w.woptarg.unwrap())
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
x.try_into().unwrap()
@@ -610,6 +577,7 @@ fn parse_cmd_opts<'args>(
parser,
streams,
cmd,
None,
args[w.wopt_index - 1],
/* print_hints */ false,
);
@@ -648,9 +616,9 @@ fn parse_cmd_opts<'args>(
if argc == w.wopt_index {
// The user didn't specify any option specs.
streams
.err
.appendln(&wgettext_fmt!(MISSING_DOUBLE_HYPHEN_SEPARATOR, cmd));
err_str!(MISSING_DOUBLE_HYPHEN_SEPARATOR)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -932,6 +900,7 @@ fn argparse_parse_flags<'args>(
parser,
streams,
&opts.name,
None,
args_read[w.wopt_index - 1],
false,
);
@@ -961,11 +930,9 @@ fn argparse_parse_flags<'args>(
streams,
)?;
} else if opts.unknown_handling == UnknownHandling::Error {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_UNKNOWN,
opts.name,
args_read[w.wopt_index - 1]
));
err_fmt!(Error::UNKNOWN_OPT, args_read[w.wopt_index - 1])
.cmd(&opts.name)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
} else {
// The option is unknown, so there's no long opt index it could have used
@@ -1016,11 +983,9 @@ fn argparse_parse_flags<'args>(
Some(w.argv[w.wopt_index - 1])
} else {
// the option is at the end of argv, so it has no argument
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_MISSING,
opts.name,
args_read[w.wopt_index - 1]
));
err_fmt!(Error::MISSING_OPT_ARG, args_read[w.wopt_index - 1])
.cmd(&opts.name)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
} else {
@@ -1032,11 +997,9 @@ fn argparse_parse_flags<'args>(
&& is_long_flag
&& arg_contents.contains('=')
{
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_UNEXP_ARG,
opts.name,
args_read[w.wopt_index - 1]
));
err_fmt!(Error::UNEXP_OPT_ARG, args_read[w.wopt_index - 1])
.cmd(&opts.name)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -1129,22 +1092,16 @@ fn check_min_max_args_constraints(
let cmd = &opts.name;
if opts.args.len() < opts.min_args {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_MIN_ARG_COUNT1,
cmd,
opts.min_args,
opts.args.len()
));
err_fmt!(Error::MIN_ARG_COUNT, opts.min_args, opts.args.len())
.cmd(cmd)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
if opts.max_args != usize::MAX && opts.args.len() > opts.max_args {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_MAX_ARG_COUNT1,
cmd,
opts.max_args,
opts.args.len()
));
err_fmt!(Error::MAX_ARG_COUNT, opts.max_args, opts.args.len())
.cmd(cmd)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
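The min/max argument checks above follow a simple shape: fail if below the minimum, fail if a finite maximum is exceeded. Reduced to a pure function (names and error strings here are illustrative, not fish's localized messages):

```rust
// Sketch of argparse's min/max args constraint, with usize::MAX
// standing in for "no maximum", as in the diff.
fn check_arg_count(got: usize, min: usize, max: usize) -> Result<(), String> {
    if got < min {
        return Err(format!("expected at least {min} args, got {got}"));
    }
    if max != usize::MAX && got > max {
        return Err(format!("expected at most {max} args, got {got}"));
    }
    Ok(())
}
```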


@@ -2,7 +2,7 @@
use std::{collections::HashSet, rc::Rc};
use crate::proc::Pid;
use crate::{builtins::error::Error, err_fmt, err_str, proc::Pid};
use super::prelude::*;
@@ -17,12 +17,13 @@ fn send_to_bg(
let jobs = parser.jobs();
if !jobs[job_pos].wants_job_control() {
let job = &jobs[job_pos];
streams.err.appendln(&wgettext_fmt!(
"%s: Can't put job %s, '%s' to background because it is not under job control",
cmd,
err_fmt!(
"Can't put job %s, '%s' to background because it is not under job control",
job.job_id().to_wstring(),
job.command()
));
)
.cmd(cmd)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
@@ -66,9 +67,7 @@ pub fn bg(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> Built
};
let Some(job_pos) = job_pos else {
streams
.err
.appendln(&wgettext_fmt!(BUILTIN_ERR_NO_SUITABLE_JOBS, cmd));
err_str!(Error::NO_SUITABLE_JOBS).cmd(cmd).finish(streams);
return Err(STATUS_CMD_ERROR);
};
@@ -101,9 +100,9 @@ pub fn bg(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> Built
send_to_bg(parser, streams, cmd, job_pos)?;
}
} else {
streams
.err
.appendln(&wgettext_fmt!(BUILTIN_ERR_COULD_NOT_FIND_JOB, cmd, pid));
err_fmt!(Error::COULD_NOT_FIND_JOB, pid)
.cmd(cmd)
.finish(streams);
}
}


@@ -1,25 +1,30 @@
//! Implementation of the bind builtin.
use super::prelude::*;
use crate::common::{
EscapeFlags, EscapeStringStyle, bytes2wcstring, escape, escape_string, valid_var_name,
use crate::{
builtins::error::Error,
common::valid_var_name,
err_fmt, err_raw, err_str,
highlight::highlight_and_colorize,
input::{
InputMapping, InputMappingSet, KeyNameStyle, input_function_get_names, input_mappings,
},
key::{
self, KEY_NAMES, Key, MAX_FUNCTION_KEY, Modifiers, char_to_symbol, function_key, parse_keys,
},
};
use crate::highlight::highlight_and_colorize;
use crate::input::{
InputMapping, InputMappingSet, KeyNameStyle, input_function_get_names, input_mappings,
};
use crate::key::{
self, KEY_NAMES, Key, MAX_FUNCTION_KEY, Modifiers, char_to_symbol, function_key, parse_keys,
};
use fish_common::help_section;
use fish_common::{EscapeFlags, EscapeStringStyle, escape, escape_string, help_section};
use fish_widestring::bytes2wcstring;
use std::sync::MutexGuard;
const DEFAULT_BIND_MODE: &wstr = L!("default");
const BIND_INSERT: c_int = 0;
const BIND_ERASE: c_int = 1;
const BIND_KEY_NAMES: c_int = 2;
const BIND_FUNCTION_NAMES: c_int = 3;
enum BindMode {
Insert,
Erase,
KeyNames,
FunctionNames,
}
struct Options {
all: bool,
@@ -30,7 +35,7 @@ struct Options {
user: bool,
have_preset: bool,
preset: bool,
mode: c_int,
mode: BindMode,
bind_mode: Option<WString>,
sets_bind_mode: Option<WString>,
color: ColorEnabled,
@@ -47,7 +52,7 @@ fn new() -> Options {
user: false,
have_preset: false,
preset: false,
mode: BIND_INSERT,
mode: BindMode::Insert,
bind_mode: None,
sets_bind_mode: None,
color: ColorEnabled::default(),
@@ -112,7 +117,7 @@ fn generate_output_string(seq: &[Key], user: bool, bind: &InputMapping) -> WStri
if key.modifiers == Modifiers::ALT {
out.push_utfstr(&char_to_symbol('\x1b', i == 0));
out.push_utfstr(&char_to_symbol(
if key.codepoint == key::Escape {
if key.codepoint == key::ESCAPE {
'\x1b'
} else {
key.codepoint
@@ -163,7 +168,6 @@ fn list_one(
streams.out.append(&bytes2wcstring(&highlight_and_colorize(
&out,
&parser.context(),
parser.vars(),
)));
} else {
streams.out.append(&out);
@@ -261,7 +265,7 @@ fn compute_seq(&self, streams: &mut IoStreams, seq: &wstr) -> Option<Vec<Key>> {
match parse_keys(seq) {
Ok(keys) => Some(keys),
Err(err) => {
streams.err.append(&sprintf!("bind: %s\n", err));
err_raw!(err).cmd(L!("bind")).finish(streams);
None
}
}
@@ -312,12 +316,9 @@ fn insert(
} else {
// Inserting both on the other hand makes no sense.
if self.opts.have_preset && self.opts.have_user {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2_EXCLUSIVE,
cmd,
"--preset",
"--user"
));
err_fmt!(Error::COMBO_EXCLUSIVE, "--preset", "--user")
.cmd(cmd)
.finish(streams);
return true;
}
}
@@ -353,17 +354,13 @@ fn insert(
);
if !self.opts.silent {
if seq.len() == 1 {
streams.err.appendln(&wgettext_fmt!(
"%s: No binding found for key '%s'",
cmd,
seq[0]
));
err_fmt!("No binding found for key '%s'", seq[0])
.cmd(cmd)
.finish(streams);
} else {
streams.err.appendln(&wgettext_fmt!(
"%s: No binding found for key sequence '%s'",
cmd,
eseq
));
err_fmt!("No binding found for key sequence '%s'", eseq)
.cmd(cmd)
.finish(streams);
}
}
return true;
@@ -433,12 +430,13 @@ fn parse_cmd_opts(
let check_mode_name = |streams: &mut IoStreams, mode_name: &wstr| -> Result<(), ErrorCode> {
if !valid_var_name(mode_name) {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_BIND_MODE,
cmd,
err_fmt!(
Error::BIND_MODE,
mode_name,
help_section!("language#shell-variable-and-function-names")
));
)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
Ok(())
@@ -448,17 +446,18 @@ fn parse_cmd_opts(
while let Some(c) = w.next_opt() {
match c {
'a' => opts.all = true,
'e' => opts.mode = BIND_ERASE,
'f' => opts.mode = BIND_FUNCTION_NAMES,
'e' => opts.mode = BindMode::Erase,
'f' => opts.mode = BindMode::FunctionNames,
'h' => opts.print_help = true,
'k' => {
streams.err.appendln(&wgettext_fmt!(
"%s: the -k/--key syntax is no longer supported. See `bind --help` and `bind --key-names`",
cmd,
));
err_str!(
"the -k/--key syntax is no longer supported. See `bind --help` and `bind --key-names`"
)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
'K' => opts.mode = BIND_KEY_NAMES,
'K' => opts.mode = BindMode::KeyNames,
'L' => {
opts.list_modes = true;
return Ok(SUCCESS);
@@ -483,7 +482,7 @@ fn parse_cmd_opts(
opts.user = true;
}
':' => {
builtin_missing_argument(parser, streams, cmd, argv[w.wopt_index - 1], true);
builtin_missing_argument(parser, streams, cmd, None, argv[w.wopt_index - 1], true);
return Err(STATUS_INVALID_ARGS);
}
';' => {
@@ -534,7 +533,7 @@ pub fn bind(
}
match self.opts.mode {
BIND_ERASE => {
BindMode::Erase => {
// If we get both, we erase both.
if self.opts.user
&& self.erase(
@@ -557,19 +556,13 @@ pub fn bind(
return Err(STATUS_CMD_ERROR);
}
}
BIND_INSERT => {
BindMode::Insert => {
if self.insert(optind, argv, parser, streams) {
return Err(STATUS_CMD_ERROR);
}
}
BIND_KEY_NAMES => self.key_names(streams),
BIND_FUNCTION_NAMES => self.function_names(streams),
_ => {
streams
.err
.appendln(&wgettext_fmt!("%s: Invalid state", cmd));
return Err(STATUS_CMD_ERROR);
}
BindMode::KeyNames => self.key_names(streams),
BindMode::FunctionNames => self.function_names(streams),
}
Ok(SUCCESS)
}
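The payoff of replacing the `c_int` `BIND_*` constants with the `BindMode` enum is visible in the deleted `_ => "Invalid state"` arm above: matching on an enum is exhaustive, so the invalid state becomes unrepresentable. A reduced sketch of that design choice (the `describe` helper is illustrative, not fish's code):

```rust
// With an enum, the compiler guarantees every mode is handled;
// the old integer constants needed a defensive catch-all arm.
enum BindMode {
    Insert,
    Erase,
    KeyNames,
    FunctionNames,
}

fn describe(mode: &BindMode) -> &'static str {
    // No `_ =>` arm needed: adding a variant is a compile error
    // here until this match is updated.
    match mode {
        BindMode::Insert => "insert",
        BindMode::Erase => "erase",
        BindMode::KeyNames => "key-names",
        BindMode::FunctionNames => "function-names",
    }
}
```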


@@ -1,5 +1,7 @@
use std::sync::atomic::Ordering;
use crate::err_str;
// Implementation of the block builtin.
use super::prelude::*;
@@ -51,7 +53,7 @@ fn parse_options(
opts.erase = true;
}
':' => {
builtin_missing_argument(parser, streams, cmd, args[w.wopt_index - 1], false);
builtin_missing_argument(parser, streams, cmd, None, args[w.wopt_index - 1], false);
return Err(STATUS_INVALID_ARGS);
}
';' => {
@@ -84,17 +86,14 @@ pub fn block(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> Bu
if opts.erase {
if opts.scope != Scope::Unset {
streams.err.appendln(&wgettext_fmt!(
"%s: Can not specify scope when removing block",
cmd
));
err_str!("Can not specify scope when removing block")
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
if parser.global_event_blocks.load(Ordering::Relaxed) == 0 {
streams
.err
.appendln(&wgettext_fmt!("%s: No blocks defined", cmd));
err_str!("No blocks defined").cmd(cmd).finish(streams);
return Err(STATUS_CMD_ERROR);
}
parser.global_event_blocks.fetch_sub(1, Ordering::Relaxed);


@@ -1,18 +1,17 @@
use super::prelude::*;
use crate::builtins::error::Error;
use crate::parser::{Block, BlockType};
use crate::reader::reader_read;
use crate::{err_fmt, err_str};
use libc::STDIN_FILENO;
/// Implementation of the builtin breakpoint command, used to launch the interactive debugger.
pub fn breakpoint(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> BuiltinResult {
let cmd = argv[0];
if argv.len() != 1 {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_ARG_COUNT1,
cmd,
0,
argv.len() - 1
));
err_fmt!(Error::UNEXP_ARG_COUNT, 0, argv.len() - 1)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -26,12 +25,11 @@ pub fn breakpoint(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr])
{
if parser
.block_at_index(1)
.is_none_or(|b| b.typ() == BlockType::breakpoint)
.is_none_or(|b| b.typ() == BlockType::Breakpoint)
{
streams.err.appendln(&wgettext_fmt!(
"%s: Command not valid at an interactive prompt",
cmd,
));
err_str!("Command not valid at an interactive prompt")
.cmd(cmd)
.finish(streams);
return Err(STATUS_ILLEGAL_CMD);
}
}
@@ -40,5 +38,5 @@ pub fn breakpoint(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr])
let io_chain = &streams.io_chain;
reader_read(parser, STDIN_FILENO, io_chain)?;
parser.pop_block(bpb);
BuiltinResult::from_dynamic(parser.get_last_status())
BuiltinResult::from_dynamic(parser.last_status())
}


@@ -1,3 +1,5 @@
use crate::{builtins::error::Error, err_fmt};
use super::prelude::*;
#[derive(Default)]
@@ -29,7 +31,14 @@ pub fn r#builtin(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -
return Ok(SUCCESS);
}
':' => {
builtin_missing_argument(parser, streams, cmd, argv[w.wopt_index - 1], print_hints);
builtin_missing_argument(
parser,
streams,
cmd,
None,
argv[w.wopt_index - 1],
print_hints,
);
return Err(STATUS_INVALID_ARGS);
}
';' => {
@@ -53,11 +62,12 @@ pub fn r#builtin(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -
}
if opts.query && opts.list_names {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2,
cmd,
err_fmt!(
Error::INVALID_OPT_COMBO_WITH_CTX,
wgettext!("--query and --names are mutually exclusive")
));
)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}


@@ -3,23 +3,22 @@
use super::prelude::*;
use crate::{
env::{EnvMode, Environment as _},
err_fmt, err_raw, err_str,
fds::{BEST_O_SEARCH, wopen_dir},
parser::ParserEnvSetMode,
path::path_apply_cdpath,
wutil::{normalize_path, wreadlink},
};
use errno::Errno;
use fish_util::perror;
use libc::{EACCES, ELOOP, ENOENT, ENOTDIR, EPERM};
use nix::unistd::fchdir;
use std::sync::Arc;
// The cd builtin. Changes the current directory to the one specified or to $HOME if none is
// specified. The directory can be relative to any directory in the CDPATH variable.
pub fn cd(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> BuiltinResult {
localizable_consts! {
DIR_DOES_NOT_EXIST
"%s: The directory '%s' does not exist"
"The directory '%s' does not exist"
}
let Some(&cmd) = args.first() else {
@@ -45,9 +44,9 @@ pub fn cd(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> Built
&tmpstr
}
None => {
streams
.err
.appendln(&wgettext_fmt!("%s: Could not find home directory", cmd));
err_str!("Could not find home directory")
.cmd(cmd)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
}
@@ -55,31 +54,21 @@ pub fn cd(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> Built
// Stop `cd ""` from crashing
if dir_in.is_empty() {
streams.err.appendln(&wgettext_fmt!(
"%s: Empty directory '%s' does not exist",
cmd,
dir_in
));
let mut err = err_fmt!("Empty directory '%s' does not exist", dir_in).cmd(cmd);
if !parser.is_interactive() {
streams.err.append(&parser.current_line());
err = err.stacktrace(parser);
}
err.finish(streams);
return Err(STATUS_CMD_ERROR);
}
let pwd = vars.get_pwd_slash();
let dirs = path_apply_cdpath(dir_in, &pwd, vars);
if dirs.is_empty() {
streams
.err
.appendln(&wgettext_fmt!(DIR_DOES_NOT_EXIST, cmd, dir_in));
if !parser.is_interactive() {
streams.err.append(&parser.current_line());
}
return Err(STATUS_CMD_ERROR);
}
assert!(
!dirs.is_empty(),
"dirs should always contain at least an abs path, or a rel path, or '<PWD>/...'"
);
let mut best_errno = 0;
let mut broken_symlink = WString::new();
@@ -93,46 +82,35 @@ pub fn cd(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> Built
let res = wopen_dir(&norm_dir, BEST_O_SEARCH).map_err(|err| err as i32);
let res = res.and_then(|fd| {
match fchdir(&fd) {
Ok(()) => Ok(fd),
fchdir(&fd).map_err(|_|
// nix::Result::Err contains nix::errno::Errno, which does not offer an API for
// converting to a raw int.
Err(_) => Err(errno::errno().0),
}
errno::errno().0)
});
let fd = match res {
Ok(fd) => fd,
Err(err) => {
// Some errors we skip and only report if nothing worked.
// ENOENT in particular is very low priority
// - if in another directory there was a *file* by the correct name
// we prefer *that* error because it's more specific
if err == ENOENT {
let tmp = wreadlink(&norm_dir);
// clippy doesn't like this is_some/unwrap pair, but using if let is harder to read IMO
#[allow(clippy::unnecessary_unwrap)]
if broken_symlink.is_empty() && tmp.is_some() {
broken_symlink = norm_dir;
broken_symlink_target = tmp.unwrap();
} else if best_errno == 0 {
best_errno = errno::errno().0;
}
continue;
} else if err == ENOTDIR {
best_errno = err;
continue;
if let Err(err) = res {
// Some errors we skip and only report if nothing worked.
// ENOENT in particular is very low priority
// - if in another directory there was a *file* by the correct name
// we prefer *that* error because it's more specific
if err == ENOENT {
let tmp = wreadlink(&norm_dir);
// clippy doesn't like this is_some/unwrap pair, but using if let is harder to read IMO
// TODO: if-let-chains
if let Some(tmp) = tmp.filter(|_| broken_symlink.is_empty()) {
broken_symlink = norm_dir;
broken_symlink_target = tmp;
} else if best_errno == 0 {
best_errno = errno::errno().0;
}
continue;
} else if err == ENOTDIR {
best_errno = err;
break;
continue;
}
};
// We need to keep around the fd for this directory, in the parser.
let dir_fd = Arc::new(fd);
// Stash the fd for the cwd in the parser.
parser.libdata_mut().cwd_fd = Some(dir_fd);
best_errno = err;
break;
}
parser.set_var_and_fire(
L!("PWD"),
@@ -142,44 +120,30 @@ pub fn cd(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> Built
return Ok(SUCCESS);
}
if best_errno == ENOTDIR {
streams
.err
.appendln(&wgettext_fmt!("%s: '%s' is not a directory", cmd, dir_in));
let mut err = if best_errno == ENOTDIR {
err_fmt!("'%s' is not a directory", dir_in)
} else if !broken_symlink.is_empty() {
streams.err.appendln(&wgettext_fmt!(
"%s: '%s' is a broken symbolic link to '%s'",
cmd,
err_fmt!(
"'%s' is a broken symbolic link to '%s'",
broken_symlink,
broken_symlink_target
));
)
} else if best_errno == ELOOP {
streams.err.appendln(&wgettext_fmt!(
"%s: Too many levels of symbolic links: '%s'",
cmd,
dir_in
));
err_fmt!("Too many levels of symbolic links: '%s'", dir_in)
} else if best_errno == ENOENT {
streams
.err
.appendln(&wgettext_fmt!(DIR_DOES_NOT_EXIST, cmd, dir_in));
err_fmt!(DIR_DOES_NOT_EXIST, dir_in)
} else if best_errno == EACCES || best_errno == EPERM {
streams
.err
.appendln(&wgettext_fmt!("%s: Permission denied: '%s'", cmd, dir_in));
err_fmt!("Permission denied: '%s'", dir_in)
} else {
errno::set_errno(Errno(best_errno));
perror("cd");
streams.err.appendln(&wgettext_fmt!(
"%s: Unknown error trying to locate directory '%s'",
cmd,
dir_in
));
}
err_raw!(builtin_strerror()).cmd(L!("cd")).finish(streams);
err_fmt!("Unknown error trying to locate directory '%s'", dir_in)
};
if !parser.is_interactive() {
streams.err.append(&parser.current_line());
err = err.stacktrace(parser);
}
err.cmd(cmd).finish(streams);
Err(STATUS_CMD_ERROR)
}
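Among the `cd` changes above, the clippy-flagged `is_some()` + `unwrap()` pair for the broken-symlink case is rewritten with `Option::filter`, which keeps a value only while a predicate holds. A reduced form of that pattern (function and parameter names are illustrative):

```rust
// Record a readlink target only if no broken symlink was recorded yet,
// mirroring `tmp.filter(|_| broken_symlink.is_empty())` in the diff.
fn first_broken_symlink(
    readlink_target: Option<String>,
    already_recorded: bool,
) -> Option<String> {
    readlink_target.filter(|_| !already_recorded)
}
```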


@@ -36,7 +36,14 @@ pub fn r#command(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -
return Ok(SUCCESS);
}
':' => {
builtin_missing_argument(parser, streams, cmd, argv[w.wopt_index - 1], print_hints);
builtin_missing_argument(
parser,
streams,
cmd,
None,
argv[w.wopt_index - 1],
print_hints,
);
return Err(STATUS_INVALID_ARGS);
}
';' => {


@@ -1,25 +1,29 @@
use super::prelude::*;
use super::read::TokenOutputMode;
use crate::ast::{self, Kind, Leaf as _};
use crate::common::{UnescapeFlags, UnescapeStringStyle, unescape_string};
use crate::complete::Completion;
use crate::expand::{ExpandFlags, ExpandResultCode, expand_string};
use crate::input::input_function_get_code;
use crate::input_common::{CharEvent, ReadlineCmd};
use crate::operation_context::{OperationContext, no_cancel};
use crate::parse_constants::ParseTreeFlags;
use crate::parse_util::{
detect_parse_errors, get_job_extent, get_offset_from_line, get_process_extent,
get_token_extent, lineno,
use crate::{
ast::{self, Kind, Leaf as _},
builtins::error::Error,
complete::Completion,
err_fmt, err_str,
expand::{ExpandFlags, ExpandResultCode, expand_string},
input::input_function_get_code,
input_common::{CharEvent, ReadlineCmd},
operation_context::{OperationContext, no_cancel},
parse_constants::ParseTreeFlags,
parse_util::{
detect_parse_errors, get_job_extent, get_offset_from_line, get_process_extent,
get_token_extent, lineno,
},
prelude::*,
proc::is_interactive_session,
reader::{
JumpDirection, JumpPrecision, commandline_get_state, commandline_set_buffer,
commandline_set_search_field, reader_execute_readline_cmd, reader_jump,
reader_showing_suggestion,
},
tokenizer::{TOK_ACCEPT_UNFINISHED, TokenType, Tokenizer},
};
use crate::prelude::*;
use crate::proc::is_interactive_session;
use crate::reader::{
JumpDirection, JumpPrecision, commandline_get_state, commandline_set_buffer,
commandline_set_search_field, reader_execute_readline_cmd, reader_jump,
reader_showing_suggestion,
};
use crate::tokenizer::{TOK_ACCEPT_UNFINISHED, TokenType, Tokenizer};
use fish_common::{UnescapeFlags, UnescapeStringStyle, unescape_string};
use fish_wcstringutil::join_strings;
use std::ops::Range;
@@ -320,12 +324,13 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
'f' => function_mode = true,
'x' | '\x02' | 'o' => {
if token_mode.is_some() {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2,
cmd,
err_fmt!(
Error::INVALID_OPT_COMBO_WITH_CTX,
wgettext!("--tokens options are mutually exclusive")
));
builtin_print_error_trailer(parser, streams.err, cmd);
)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
token_mode = Some(match c {
@@ -372,7 +377,14 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
return Ok(SUCCESS);
}
':' => {
builtin_missing_argument(parser, streams, cmd, w.argv[w.wopt_index - 1], true);
builtin_missing_argument(
parser,
streams,
cmd,
None,
w.argv[w.wopt_index - 1],
true,
);
return Err(STATUS_INVALID_ARGS);
}
';' => {
@@ -424,23 +436,25 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
|| selection_start_mode
|| selection_end_mode
{
streams.err.appendln(&wgettext_fmt!(BUILTIN_ERR_COMBO, cmd));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!(Error::INVALID_OPT_COMBO)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
if positional_args == 0 {
builtin_missing_argument(parser, streams, cmd, L!("--function"), true);
builtin_missing_argument(parser, streams, cmd, None, L!("--function"), true);
return Err(STATUS_INVALID_ARGS);
}
type RL = ReadlineCmd;
for arg in &w.argv[w.wopt_index..] {
let Some(cmd) = input_function_get_code(arg) else {
streams
.err
.append(&wgettext_fmt!("%s: Unknown input function '%s'", cmd, arg));
builtin_print_error_trailer(parser, streams.err, cmd);
err_fmt!("Unknown input function '%s'", arg)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
};
// Don't enqueue a repaint if we're currently in the middle of one,
@@ -467,20 +481,20 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
// Check for invalid switch combinations.
if (selection_start_mode || selection_end_mode) && positional_args != 0 {
streams
.err
.appendln(&wgettext_fmt!(BUILTIN_ERR_TOO_MANY_ARGUMENTS, cmd));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!(Error::TOO_MANY_ARGUMENTS)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
if (search_mode || line_mode || column_mode || cursor_mode || paging_mode)
&& positional_args > 1
{
streams
.err
.appendln(&wgettext_fmt!(BUILTIN_ERR_TOO_MANY_ARGUMENTS, cmd));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!(Error::TOO_MANY_ARGUMENTS)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -489,24 +503,29 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
// Special case - we allow to get/set cursor position relative to the process/job/token.
&& ((buffer_part.is_none() && !search_field_mode) || !cursor_mode)
{
streams.err.appendln(&wgettext_fmt!(BUILTIN_ERR_COMBO, cmd));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!(Error::INVALID_OPT_COMBO)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
if (token_mode.is_some() || cut_at_cursor) && positional_args != 0 {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2,
cmd,
err_fmt!(
Error::INVALID_OPT_COMBO_WITH_CTX,
"--cut-at-cursor and token options can not be used when setting the commandline"
));
builtin_print_error_trailer(parser, streams.err, cmd);
)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
if search_field_mode && (buffer_part.is_some() || token_mode.is_some()) {
streams.err.appendln(&wgettext_fmt!(BUILTIN_ERR_COMBO, cmd));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!(Error::INVALID_OPT_COMBO)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
@@ -522,26 +541,20 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
if append_mode == AppendMode::InsertSmart {
if search_field_mode {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2_EXCLUSIVE,
cmd,
"--insert-smart",
"--search-field"
));
builtin_print_error_trailer(parser, streams.err, cmd);
err_fmt!(Error::COMBO_EXCLUSIVE, "--insert-smart", "--search-field")
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
match buffer_part {
TextScope::String | TextScope::Job | TextScope::Process => (),
TextScope::Token => {
// To-do: we can support it in command position.
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2_EXCLUSIVE,
cmd,
"--insert-smart",
"--current-token"
));
builtin_print_error_trailer(parser, streams.err, cmd);
err_fmt!(Error::COMBO_EXCLUSIVE, "--insert-smart", "--current-token")
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
@@ -552,19 +565,19 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
let arg = w.argv[w.wopt_index];
let new_coord = match fish_wcstol(arg) {
Err(_) => {
streams
.err
.appendln(&wgettext_fmt!(BUILTIN_ERR_NOT_NUMBER, cmd, arg));
builtin_print_error_trailer(parser, streams.err, cmd);
err_fmt!(Error::NOT_NUMBER, arg)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
0
}
Ok(num) => num - 1,
};
let Ok(new_coord) = usize::try_from(new_coord) else {
streams
.err
.append(&wgettext_fmt!("%s: line/column index starts at 1", cmd));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!("line/column index starts at 1")
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
};
@@ -572,10 +585,10 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
let Some(offset) =
get_offset_from_line(&rstate.text, i32::try_from(new_coord).unwrap())
else {
streams
.err
.appendln(&wgettext_fmt!("%s: there is no line %s", cmd, arg));
builtin_print_error_trailer(parser, streams.err, cmd);
err_fmt!("there is no line %s", arg)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
};
offset
@@ -587,12 +600,10 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
let next_line_offset =
get_offset_from_line(&rstate.text, line_index + 1).unwrap_or(rstate.text.len());
if line_offset + new_coord > next_line_offset {
streams.err.appendln(&wgettext_fmt!(
"%s: column %s exceeds line length",
cmd,
arg
));
builtin_print_error_trailer(parser, streams.err, cmd);
err_fmt!("column %s exceeds line length", arg)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
line_offset + new_coord
@@ -675,14 +686,18 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
current_cursor_pos = current_buffer.len();
} else if parser.libdata().transient_commandline.is_some() {
if cursor_mode && positional_args != 0 {
streams.err.append(&wgettext_fmt!(
"%s: setting cursor while evaluating 'complete --arguments' is not yet supported",
cmd
));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!("setting cursor while evaluating 'complete --arguments' is not yet supported")
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
transient = parser.libdata().transient_commandline.clone().unwrap();
transient = parser
.libdata()
.transient_commandline
.as_ref()
.unwrap()
.clone();
current_buffer = &transient;
current_cursor_pos = transient.len();
} else if parser.interactive_initialized.load() || is_interactive_session() {
@@ -690,11 +705,10 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
current_cursor_pos = rstate.cursor_pos;
} else {
// There is no command line because we are not interactive.
streams.err.append(cmd);
streams
.err
.append(L!(": Can not set commandline in non-interactive mode\n"));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!("Can not set commandline in non-interactive mode")
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
@@ -742,10 +756,10 @@ pub fn commandline(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr])
let arg = w.argv[w.wopt_index];
let new_pos = match fish_wcstol(arg) {
Err(_) => {
streams
.err
.appendln(&wgettext_fmt!(BUILTIN_ERR_NOT_NUMBER, cmd, arg));
builtin_print_error_trailer(parser, streams.err, cmd);
err_fmt!(Error::NOT_NUMBER, arg)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
0
}
Ok(num) => num,
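The `--line`/`--column` hunks above share one arithmetic pattern: parse the user's 1-based coordinate, subtract 1, reject anything below 1 via `usize::try_from`, resolve the line to a buffer offset, and verify the column does not run past the start of the next line. A rough self-contained sketch of that validation follows; `get_offset_from_line` here is a stand-in re-implementation, not fish's actual helper, and the error strings merely echo the messages in the diff:

```rust
// Stand-in for fish's get_offset_from_line: offset of the first char of
// `line` (0-based), or None if the buffer has no such line.
fn get_offset_from_line(text: &str, line: usize) -> Option<usize> {
    let mut offset = 0;
    for (i, l) in text.split_inclusive('\n').enumerate() {
        if i == line {
            return Some(offset);
        }
        offset += l.len();
    }
    // The one-past-the-last line is valid when the text ends in a newline.
    if line == text.split_inclusive('\n').count() {
        Some(offset)
    } else {
        None
    }
}

// Hypothetical sketch of the coordinate checks in the commandline hunks:
// user input is 1-based, so negative-after-subtraction means "index < 1".
fn cursor_offset(text: &str, line_arg: i64, col_arg: i64) -> Result<usize, String> {
    let line = usize::try_from(line_arg - 1)
        .map_err(|_| "line/column index starts at 1".to_string())?;
    let col = usize::try_from(col_arg - 1)
        .map_err(|_| "line/column index starts at 1".to_string())?;
    let line_offset =
        get_offset_from_line(text, line).ok_or(format!("there is no line {line_arg}"))?;
    // As in the diff: the column must not exceed the start of the next line
    // (or the end of the buffer for the final line).
    let next_line_offset = get_offset_from_line(text, line + 1).unwrap_or(text.len());
    if line_offset + col > next_line_offset {
        return Err(format!("column {col_arg} exceeds line length"));
    }
    Ok(line_offset + col)
}

fn main() {
    // Line 2, column 2 of "echo hi\nls\n" lands on the 's' at offset 9.
    println!("{:?}", cursor_offset("echo hi\nls\n", 2, 2));
}
```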

View File

@@ -1,21 +1,22 @@
use super::prelude::*;
use crate::common::{ScopeGuard, UnescapeFlags, UnescapeStringStyle, unescape_string};
use crate::complete::{CompletionRequestOptions, complete_add_wrapper, complete_remove_wrapper};
use crate::highlight::highlight_and_colorize;
use crate::operation_context::OperationContext;
use crate::parse_constants::ParseErrorList;
use crate::parse_util::detect_errors_in_argument_list;
use crate::parse_util::{detect_parse_errors, get_token_extent};
use crate::proc::is_interactive_session;
use crate::reader::{commandline_get_state, completion_apply_to_command_line};
use crate::{
common::bytes2wcstring,
builtins::error::Error,
complete::{
CompleteFlags, CompleteOptionType, CompletionMode, complete_add, complete_print,
complete_remove, complete_remove_all,
CompleteFlags, CompleteOptionType, CompletionMode, CompletionRequestOptions, complete_add,
complete_add_wrapper, complete_print, complete_remove, complete_remove_all,
complete_remove_wrapper,
},
err_fmt, err_raw, err_str,
highlight::highlight_and_colorize,
operation_context::OperationContext,
parse_constants::ParseErrorList,
parse_util::{detect_errors_in_argument_list, detect_parse_errors, get_token_extent},
proc::is_interactive_session,
reader::{commandline_get_state, completion_apply_to_command_line},
};
use fish_common::{ScopeGuard, UnescapeFlags, UnescapeStringStyle, unescape_string};
use fish_wcstringutil::string_suffixes_string;
use fish_widestring::bytes2wcstring;
// builtin_complete_* are a set of rather silly looping functions that make sure that all the proper
// combinations of complete_add or complete_remove get called. This is needed since complete allows
@@ -230,7 +231,6 @@ fn builtin_complete_print(
streams.out.append(&bytes2wcstring(&highlight_and_colorize(
&repr,
&parser.context(),
parser.vars(),
)));
} else {
streams.out.append(&repr);
@@ -245,7 +245,7 @@ fn builtin_complete_print(
pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> BuiltinResult {
localizable_consts! {
OPTION_REQUIRES_NON_EMPTY_STRING
"%s: %s requires a non-empty string"
"%s requires a non-empty string"
}
let cmd = argv[0];
@@ -326,11 +326,9 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
cmd_to_complete.push(tmp);
}
} else {
streams.err.appendln(&wgettext_fmt!(
"%s: Invalid token '%s'",
cmd,
w.woptarg.unwrap()
));
err_fmt!("Invalid token '%s'", w.woptarg.unwrap())
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
@@ -347,11 +345,9 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
let arg = w.woptarg.unwrap();
short_opt.extend(arg.chars());
if arg.is_empty() {
streams.err.appendln(&wgettext_fmt!(
OPTION_REQUIRES_NON_EMPTY_STRING,
cmd,
"-s",
));
err_fmt!(OPTION_REQUIRES_NON_EMPTY_STRING, "-s",)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
@@ -359,11 +355,9 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
let arg = w.woptarg.unwrap();
gnu_opt.push(arg);
if arg.is_empty() {
streams.err.appendln(&wgettext_fmt!(
OPTION_REQUIRES_NON_EMPTY_STRING,
cmd,
"-l",
));
err_fmt!(OPTION_REQUIRES_NON_EMPTY_STRING, "-l",)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
@@ -371,11 +365,9 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
let arg = w.woptarg.unwrap();
old_opt.push(arg);
if arg.is_empty() {
streams.err.appendln(&wgettext_fmt!(
OPTION_REQUIRES_NON_EMPTY_STRING,
cmd,
"-o",
));
err_fmt!(OPTION_REQUIRES_NON_EMPTY_STRING, "-o",)
.cmd(cmd)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
@@ -404,7 +396,7 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
return Ok(SUCCESS);
}
':' => {
builtin_missing_argument(parser, streams, cmd, argv[w.wopt_index - 1], true);
builtin_missing_argument(parser, streams, cmd, None, argv[w.wopt_index - 1], true);
return Err(STATUS_INVALID_ARGS);
}
';' => {
@@ -424,19 +416,21 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
if result_mode.no_files && result_mode.force_files {
if !have_x {
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2,
"complete",
err_fmt!(
Error::INVALID_OPT_COMBO_WITH_CTX,
"'--no-files' and '--force-files'"
));
)
.cmd(cmd)
.finish(streams);
} else {
// The reason for us not wanting files is `-x`,
// which is short for `-rf`.
streams.err.appendln(&wgettext_fmt!(
BUILTIN_ERR_COMBO2,
"complete",
err_fmt!(
Error::INVALID_OPT_COMBO_WITH_CTX,
"'--exclusive' and '--force-files'"
));
)
.cmd(cmd)
.finish(streams);
}
return Err(STATUS_INVALID_ARGS);
}
@@ -450,10 +444,10 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
// Or use one left-over arg as the command to complete
cmd_to_complete.push(argv[argc - 1].to_owned());
} else {
streams
.err
.appendln(&wgettext_fmt!(BUILTIN_ERR_TOO_MANY_ARGUMENTS, cmd));
builtin_print_error_trailer(parser, streams.err, cmd);
err_str!(Error::TOO_MANY_ARGUMENTS)
.cmd(cmd)
.full_trailer(parser)
.finish(streams);
return Err(STATUS_INVALID_ARGS);
}
}
@@ -461,14 +455,16 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
for condition_string in &condition {
let mut errors = ParseErrorList::new();
if detect_parse_errors(condition_string, Some(&mut errors), false).is_err() {
let prefix = WString::from(L!("-n '")) + &condition_string[..] + L!("'");
for error in errors {
let prefix = cmd.to_owned() + L!(": -n '") + &condition_string[..] + L!("'");
streams.err.appendln(&error.describe_with_prefix(
err_raw!(&error.describe_with_prefix(
condition_string,
&prefix,
parser.is_interactive(),
false,
));
))
.cmd(cmd)
.finish(streams);
}
return Err(STATUS_CMD_ERROR);
}
@@ -476,10 +472,10 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
if !comp.is_empty() {
if let Err(err_text) = detect_errors_in_argument_list(&comp, cmd) {
streams
.err
.appendln(&wgettext_fmt!("%s: %s: contains a syntax error", cmd, comp));
streams.err.appendln(&err_text);
let mut err = err_fmt!("%s: contains a syntax error", comp);
err.append_assign_to_msg('\n');
err.append_assign_to_msg(&err_text);
err.cmd(cmd).finish(streams);
return Err(STATUS_CMD_ERROR);
}
}
@@ -491,10 +487,9 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
// No argument given, try to use the current commandline.
let commandline_state = commandline_get_state(true);
if !parser.interactive_initialized.load() && !is_interactive_session() {
streams.err.append(cmd);
streams
.err
.append(L!(": Can not get commandline in non-interactive mode\n"));
err_str!("Can not get commandline in non-interactive mode")
.cmd(cmd)
.finish(streams);
return Err(STATUS_CMD_ERROR);
}
commandline_state.text
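Nearly every hunk in this diff replaces ad-hoc writes to `streams.err` followed by `builtin_print_error_trailer` with a chainable builder: construct the message with `err_str!`/`err_fmt!`/`err_raw!`, then `.cmd(cmd)`, optionally `.full_trailer(parser)`, and `.finish(streams)` to render it once. A minimal sketch of that shape is below; the `BuiltinError` type, its method signatures, and the trailer wording are illustrative stand-ins, not fish's real `builtins::error::Error` API:

```rust
// Hypothetical minimal re-implementation of the chainable error builder
// pattern used throughout this diff.
struct BuiltinError {
    msg: String,
    cmd: Option<String>,
    trailer: bool,
}

impl BuiltinError {
    // Plays the role of err_str!/err_fmt! (which are macros in the diff).
    fn new(msg: impl Into<String>) -> Self {
        BuiltinError { msg: msg.into(), cmd: None, trailer: false }
    }
    // Prefix the message with the builtin's name, like `.cmd(cmd)`.
    fn cmd(mut self, cmd: &str) -> Self {
        self.cmd = Some(cmd.to_string());
        self
    }
    // Request the help trailer, like `.full_trailer(parser)`.
    fn full_trailer(mut self) -> Self {
        self.trailer = true;
        self
    }
    // Render everything to the error stream in one place, so callers no
    // longer interleave appendln() and builtin_print_error_trailer().
    fn finish(self, err_stream: &mut Vec<String>) {
        let line = match &self.cmd {
            Some(c) => format!("{}: {}", c, self.msg),
            None => self.msg.clone(),
        };
        err_stream.push(line);
        if self.trailer {
            if let Some(c) = &self.cmd {
                err_stream.push(format!("(Type 'help {}' for related documentation)", c));
            }
        }
    }
}

fn main() {
    let mut err = Vec::new();
    // Roughly: err_str!(Error::TOO_MANY_ARGUMENTS).cmd(cmd)
    //              .full_trailer(parser).finish(streams);
    BuiltinError::new("too many arguments")
        .cmd("complete")
        .full_trailer()
        .finish(&mut err);
    for line in &err {
        println!("{}", line);
    }
}
```

One payoff visible in the diff: since `.cmd(cmd)` supplies the command-name prefix, format strings like `OPTION_REQUIRES_NON_EMPTY_STRING` drop their leading `%s:` placeholder.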

Some files were not shown because too many files have changed in this diff.