Compare commits


78 Commits
4.3.2 ... 4.3.3

Author SHA1 Message Date
Johannes Altmanninger
c98fd886fd Release 4.3.3
Created by ./build_tools/release.sh 4.3.3
2026-01-07 08:34:20 +01:00
Johannes Altmanninger
94fdb36f6b Revert "cmake: rename WITH_GETTEXT to WITH_MESSAGE_LOCALIZATION"
This reverts commit 77e1aead40 for the
patch release.
2026-01-07 08:32:33 +01:00
Johannes Altmanninger
d1a40ace7d changelog: fix RST syntax 2026-01-07 08:32:20 +01:00
Johannes Altmanninger
dd4d69a288 cirrus: fix FreeBSD pkg install failure
Multiple PRs fail with

	pkg: Repository FreeBSD-ports cannot be opened. 'pkg update' required
	Updating database digests format: . done
	pkg: No packages available to install matching 'cmake-core' have been found in the repositories
2026-01-07 08:01:21 +01:00
Daniel Rainer
e16ea8df11 sync: once_cell::sync::OnceCell -> std::sync::OnceLock
Rust 1.70 stabilized `std::sync::OnceLock`, which replaces
`once_cell::sync::OnceCell`.

With this, we only have a single remaining direct dependency on
`once_cell`: `VAR_DISPATCH_TABLE` in `src/env_dispatch.rs`, where we use
`Lazy::get`. This can be replaced with `LazyLock::get` once our MSRV
reaches 1.94, where the function is stabilized.

At the moment, `serial_test` depends on `once_cell`, so even if we
eliminate it as a direct dependency, it will remain a transitive
dependency.

Closes #12289
2026-01-07 07:59:24 +01:00
Daniel Rainer
80e1942980 sync: once_cell::sync::Lazy -> std::sync::LazyLock
Rust 1.80 stabilized `std::sync::LazyLock`, which replaces
`once_cell::sync::Lazy`. There is one exception in
`src/env_dispatch.rs`, which still uses the `once_cell` variant: the
code there relies on `Lazy::get`, which also exists for `LazyLock` but
will only be stabilized in Rust 1.94, so we can't use it yet.

Part of #12289
2026-01-07 07:59:24 +01:00
Johannes Altmanninger
99109278a6 Enable color theme reporting again on tmux >= 3.7
Color theme reporting has race conditions, so we might want to disable
it until we have fixed those. Not sure.

At least the tmux-specific issue has been fixed,
so treat new tmux like other terminals.
See https://github.com/tmux/tmux/issues/4787#issuecomment-3716135010

See #12261
2026-01-07 07:59:24 +01:00
Johannes Altmanninger
f924a880c8 Update changelog 2026-01-07 07:59:23 +01:00
Johannes Altmanninger
5683d26d24 github: update pull request template
Repeat here that 'Fixes #' should go into commit message, and remove
redundant PR description.
2026-01-07 07:35:30 +01:00
Lumynous
740aef06df l10n(zh-TW): Complete untranslated strings
Closes #12288
2026-01-07 07:35:30 +01:00
Daniel Rainer
557f6d1743 check: allow overriding default Rust toolchain
This is useful for running the checks with a toolchain which is
different from the default toolchain, for example to check if everything
works with our MSRV, or on beta/nightly toolchains. Additionally,
providing a way to run using the nightly toolchain allows writing
wrappers around `check.sh` which make use of nightly-only features.

The toolchain could be changed using `rustup toolchain default`, but if
the toolchain should only be used for a specific run, this is
inconvenient, and it does not allow for concurrent builds using
different toolchains.

Closes #12281
2026-01-06 17:54:56 +00:00
Johannes Altmanninger
8d257f5c57 completions/fastboot: one item per line 2026-01-06 14:29:32 +01:00
Johannes Altmanninger
d880a14b1a changelog: add sections 2026-01-06 14:29:32 +01:00
Next Alone
d4fcc00821 completions(fastboot): sync partitions from Xiaomi images
Signed-off-by: Next Alone <12210746+NextAlone@users.noreply.github.com>

Closes #12283
2026-01-06 14:29:32 +01:00
Johannes Altmanninger
6d8bb292ec fish_config theme choose: overwrite color-aware theme's vars also if called from config
Webconfig persists themes to ~/.config/fish/conf.d/fish_frozen_theme.fish
(the name is due to historical reasons).

That file's color variables have no "--theme=foo" annotations, which
means that fish_config can't distinguish them from other "user-set"
values.  We can change this in future, but that doesn't affect the
following fix.

A "fish_config theme choose foo" command is supposed to
overwrite all variables that are defined in "foo.theme".
If the theme is color-theme-aware *and* this command runs before
$fish_terminal_color_theme is initialized, we delay loading of the
theme until that initialization happens.  But the --on-variable
invocation won't have the override bit set, thus it will not touch
variables that don't have a "--theme=*" value.  Fix this by immediately
clearing the variables mentioned in the theme.

Fixes #12278

While at it, tweak the error message for this command because it's
not an internal error:

	fish -c 'echo yes | fish_config theme save tomorrow'
2026-01-06 14:29:32 +01:00
Lennard Hofmann
c638401469 Speedup syntax highlighting of redirection targets
Instead of checking twice whether the redirection target is a valid file,
use the return value from test_redirection_target().

Closes #12276
2026-01-06 10:39:58 +01:00
Fabian Boehm
5930574d8a README: Mention cargo
A bit pedantic, we're also not mentioning that you need a linker, but
oh well.

Fixes #12277
2026-01-05 17:16:33 +01:00
Daniel Rainer
fdef7c8689 l10n: add initialize_localization function
This replaces `initialize_gettext`. It is only defined when the
`localize-messages` feature is enabled, to avoid giving the impression
that it does anything useful when the feature is disabled.

With this change, Fluent will be initialized as well once it is added,
without requiring any additional code for initialization.

Closes #12190
2026-01-05 15:12:34 +00:00
Daniel Rainer
5c36a1be1b l10n: create gettext submodule
Put the gettext-specific code into `localization/gettext`.

Part of #12190
2026-01-05 15:12:34 +00:00
Daniel Rainer
14f747019b l10n: create localization/settings
Extract the language selection code from the gettext crate, and to a
lesser extent from `src/localization/mod.rs` and put it into
`src/localization/settings.rs`. No functional changes are intended.

Aside from better separation of concerns, this refactoring makes it
feasible to reuse the language selection logic for Fluent later on.

Part of #12190
2026-01-05 15:12:34 +00:00
Johannes Altmanninger
d7d5d2a9be completions/make: fix on OpenBSD/FreeBSD
Tested with the POSIX Makefile from https://github.com/mawww/kakoune

Closes #12272
2026-01-05 12:52:00 +01:00
Johannes Altmanninger
750955171a __fish_migrate: don't leak standard file descriptors
The __fish_migrate.fish function spawns a "sh -c 'sleep 7' &" child
process that inherits fish's stdin/stdout/stderr file descriptors.

This means that if the app running "fish
tests/checks/__fish_migrate.fish" actually waits for fish to close its
standard file descriptors, it will appear to hang for 7 seconds. Fix
that by closing the file descriptors in the background job when
creating it.

Closes #12271
2026-01-05 12:52:00 +01:00
Denys Zhak
36fd93215b fish_indent: Keep braces on same line in if/while conditions
Closes #12270
2026-01-05 12:50:19 +01:00
Lennard Hofmann
6e7353170a Highlight valid paths in redirection targets
Closes #12260
2026-01-05 12:50:19 +01:00
Peter Ammon
62cc117c12 Minor refactoring of add_to_history
Preparation for other refactoring in the future.
2026-01-04 12:33:53 -08:00
Peter Ammon
af00695383 Clean up replace_home_directory_with_tilde
Fix a stale comment and add a test.
2026-01-04 11:08:21 -08:00
Johannes Altmanninger
85ac91eb2b fish_config theme choose: fix Tomorrow always using light version
The backward-compat canonicalization hack caused us to always treat
"tomorrow" as the light theme.

Restrict this hack to the legacy name (Tomorrow); don't do it when
the new canonical name (tomorrow) is used.  The same issue does not
affect other themes because their legacy names always have a "light"
or "dark" suffix, which means that the canonical name is different,
so the legacy hacks don't affect the canonical name.

Fixes #12266
2026-01-04 13:08:26 +01:00
Johannes Altmanninger
d1ed582919 fish_config theme choose: apply backwards compat hacks only to sample themes
This logic exists to not break user configurations as we renamed
themes.  But user-sourced themes haven't been renamed.

(It's also questionable whether we should really have these compat
hacks; they might cause confusion in the long run).
2026-01-04 13:08:26 +01:00
xtqqczze
06a14c4a76 clippy: fix assigning_clones lint
https://rust-lang.github.io/rust-clippy/master/index.html#assigning_clones

Closes #12267
2026-01-04 13:08:26 +01:00
takeokunn
400d5281f4 feat(git-completion): add missing options and completions for commands
- Add missing options and completions for fetch, show-branch, am,
  checkout, archive, grep, pull, push, revert, rm, config, clean, and
  other commands
- Replace TODO comments with actual option completions for improved
  usability
- Ensure all new options have appropriate descriptions and argument
  handling for fish shell completion

Closes #12263
2026-01-04 13:08:26 +01:00
Johannes Altmanninger
50778670fb Disable color theme reporting in tmux for now
Due to the way tmux implements it, color theme reporting
causes issues when typing commands really quickly (such as
when synthesizing keys).  We're working on fixing this, see
https://github.com/tmux/tmux/issues/4787#issuecomment-3707866550
Disable it for now. AFAIK other terminals are not affected.

Closes #12261
2026-01-04 13:08:26 +01:00
WitherZuo
9037cd779d Optimize functions page style of fish_config.
- Fix the background color of .function-body in dark mode to improve readability.
- Switch to Tomorrow Night Bright color theme for better contrast and readability in dark mode.
- Format all stylesheets of fish_config.

Closes #12257
2026-01-04 13:08:26 +01:00
phanium
c23a4cbd9f Add --color option for some builtins
Fixes #9716

Closes #12252
2026-01-04 13:08:26 +01:00
Johannes Altmanninger
5d8f7801f7 builtin fish_indent/fish_key_reader: call help from existing fish process
Something like

	PATH=mypath builtin fish_indent --help

runs "fish -c '__fish_print_help fish_indent'" internally.  Since we
don't call setenv(), the PATH update doesn't reach the child shell.
Fix this by using what other builtins use if we are one (i.e. if we
have a Parser in context).

Fixes #12229
Maybe also #12085
2026-01-04 13:08:26 +01:00
Johannes Altmanninger
756134cf2b test/tmux-complete3: fail more reliably 2026-01-04 13:08:26 +01:00
Johannes Altmanninger
c16677fd6f tty_handoff: use Drop consistently
We sometimes use explicit reclaim() and sometimes rely on the drop
implementation. This adds an unnecessary step to reading all uses of
this code.  Make this consistent. Use drop everywhere though we could
use explicit reclaim too.
2026-01-04 13:08:26 +01:00
Johannes Altmanninger
13bc514aa6 __fish_complete_directories: remove use of empty variable
Closes #12248
2026-01-04 09:42:25 +01:00
Johannes Altmanninger
1c3403825c completions/signify: consistent style
Also, replace use of "ls" with globbing.
2026-01-04 09:42:25 +01:00
LunarEclipse
6f1ac7c949 Prioritize files with matching extensions for flag arguments in signify completions
Closes #12243
2026-01-04 09:42:25 +01:00
LunarEclipse
f5d3fd8a82 Full completions for openbsd signify
Part of #12243
2026-01-04 09:42:25 +01:00
Johannes Altmanninger
0a23a78523 Soft-wrapped autosuggestion to hide right prompt for now
Prior to f417cbc981 (Show soft-wrapped portions in autosuggestions,
2025-12-11), we'd truncate autosuggestions before the right prompt.
We no longer do that for autosuggestions that soft-wrap, which means
we try to draw both right prompt and suggestion in the same space.

Make suggestion paint over right prompt for now, since this seems to
be a simple and robust solution.  We can revisit this later.

Fixes #12255
2026-01-03 15:54:04 +01:00
xtqqczze
725cf33f1a fix: remove never read collection from parse_cmd_opts
Closes #12251
2026-01-03 15:54:04 +01:00
xtqqczze
2d6db3f980 clippy: fix implicit_clone lint
https://rust-lang.github.io/rust-clippy/master/index.html#implicit_clone

Closes #12245
2026-01-03 15:54:04 +01:00
xtqqczze
41b9584bb3 clippy: fix cloned_instead_of_copied lint
https://rust-lang.github.io/rust-clippy/master/index.html#cloned_instead_of_copied

Closes #12244
2026-01-03 15:54:04 +01:00
Tin Lai
c915435417 respect canonical config for untracked files
Signed-off-by: Tin Lai <tin@tinyiu.com>

Closes #11709
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
afcde1222b Update cargo dependencies
cargo update && cargo +nightly -Zunstable-options update --breaking
2026-01-03 15:54:04 +01:00
Benjamin A. Beasley
a3cb512628 Update phf from 0.12 to 0.13
Closes #12222
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
fc71ba07da share: fix typo 2026-01-03 15:54:04 +01:00
Johannes Altmanninger
9c867225ee reader handle_completions(): remove code duplication
We fail to flash the command line if we filter out completions due
to !reader_can_replace. Fix that and de-duplicate the logic.
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
972355e2fc reader handle_completions(): remove stale comment
See 656b39a0b3 (Also show case-insensitive prefix matches in completion pager, 2025-11-23).
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
8f4c80699f reader handle_completions(): don't allocate a second completion list 2026-01-03 15:54:04 +01:00
Johannes Altmanninger
e79b00d9d1 reader handle_completions(): also truncate common prefix when replacing
I don't know why we don't apply the common-prefix truncation logic
when all completions are replacing.
Let's do that.
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
2f6b1eaaf9 reader handle_completions(): don't consider odd replacing completions for common prefix
If "will_replace_token" is set, we generally only consider
appending completions.  This changed in commit 656b39a0b3 (Also show
case-insensitive prefix matches in completion pager, 2025-11-23) which
also allowed icase completions as long as they are also prefix matches.

Such replacing completions might cause the common prefix to be empty,
which breaks the appending completions.

Fix this by not considering these replacing completions for the
common-prefix computation. The pager already doesn't show the prefix
for these completions specifically.

Fixes #12249
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
3546ffa3ef reader handle_completions(): remove dead filtering code
We skip completions where "will_replace_token != c.replaces_token()".
This means that
- if will_replace_token, we filter out non-replacing completions.
  But those do not exist because, by definition, will_replace_token
  is true iff there are no non-replacing completions.
- if !will_replace_token, we filter out replacing completions.
  From the definition of will_replace_token follows that there is
  some non-replacing completion, which must be a prefix or exact match.
  Since we've filtered by rank, any replacing ones must have the same rank.
  So the replacement bit must be due to smartcase.  Smartcase
  completions are already passed through explicitly here since
  656b39a0b3 (Also show case-insensitive prefix matches in completion
  pager, 2025-11-23).

So the cases where we 'continue' here can never happen.
Remove this redundant check.
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
30f96860a7 reader handle_completions(): closed form for pager prefix bool 2026-01-03 15:54:04 +01:00
Johannes Altmanninger
41d50f1a71 reader handle_completions(): don't duplicate pager prefix allocation
While at it, use Cow I guess?
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
1e9c80f34c reader handle_completions(): don't allocate common prefix 2026-01-03 15:54:04 +01:00
Johannes Altmanninger
b88d2ed812 reader handle_completions(): don't clone surviving completions 2026-01-03 15:54:04 +01:00
Johannes Altmanninger
92dd37d3c7 reader handle_completions(): remove dead code for skipping to add prefix
The tuple (will_replace_token, all_matches_exact_or_prefix) can never
be (false, false).

Proof by contradiction:
1. Goal: show unsatisfiability of: !will_replace_token && !all_matches_exact_or_prefix
2. Substitute definitions: !all(replaces) && !all(is_exact_or_prefix)
3. wlog, !replaces(c1) && !is_exact_or_prefix(c2)
4. since c1 and c2 have same rank we know that !is_exact_or_prefix(c1)
5. !is_exact_or_prefix() implies requires_full_replacement()
6. all callers that create a Completion from StringFuzzyMatch::try_create(),
   set CompleteFlags::REPLACE_TOKEN if requires_full_replacement(),
   so requires_full_replacement() implies replaces()
7. From 4-6 follows: !is_exact_or_prefix(c1) implies replaces(c1), which is a contradiction
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
f24cc6a8fc reader handle_completions(): remove code clone
While at it,
1. add assertions to tighten some screws
2. migrate to closed form / inline computation.
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
3117a488ec complete: reuse replaces_token() 2026-01-03 15:54:04 +01:00
Johannes Altmanninger
185b91de13 reader rls: remove redundant initial value
This initial value is weird and None works the same way so use that.
2026-01-03 15:54:04 +01:00
Johannes Altmanninger
e20024f0f0 reader rls: minimize state for tracking the completion pager 2026-01-03 15:54:04 +01:00
Johannes Altmanninger
501ec1905e Test completion pager invalidation behavior 2026-01-03 15:54:04 +01:00
Johannes Altmanninger
7ebd2011ff update-dependencies.sh: fix uv lock --check command 2026-01-03 15:54:04 +01:00
Daniel Rainer
8004f354aa changelog: put new changes into upcoming release
The update to `WITH_GETTEXT` was not released in 4.3.2, so remove it
from there and put it into the section for the upcoming release.

Closes #12254
2026-01-02 00:11:36 +00:00
Daniel Rainer
77e1aead40 cmake: rename WITH_GETTEXT to WITH_MESSAGE_LOCALIZATION
This change is made to make the option name appropriate for Fluent
localization.

While at it, add a feature summary for this feature.

Closes #12208
2026-01-01 23:46:07 +00:00
Peter Ammon
e48a88a4b3 Minor refactoring of history item deletion
Clean this up.
2026-01-01 12:00:18 -08:00
Peter Ammon
1154d9f663 Correct error reporting when rewriting history files
A recent change attempted this:

    let result: std::io::Result<()> = { code()? }

However this doesn't initialize the Result on error - instead it
returns from the function, meaning that the error would be silently
dropped.

Fix that by reporting the error at the call site instead.
2025-12-31 10:20:31 -08:00
Johannes Altmanninger
810a707069 Fix PROMPT_SP hack regression
Commit fbad0ab50a (reset_abandoning_line: remove redundant
allocations, 2025-11-13) uses byte count of ⏎ (3) instead of char
count (1), thus overestimating the number of spaces this symbol takes.

Fixes #12246
2025-12-31 07:46:50 +01:00
Peter Ammon
7fa9e9bfb9 History: use Rust's buffered writing instead of our own
Simplify some code.
2025-12-30 20:26:07 -08:00
Alan Somers
48b0e7e695 Fix installation of prompts and theme files after 4.3.0
Installation of these files was accidentally broken by d8f1a2a.

Fixes #12241
2025-12-31 11:10:34 +08:00
Peter Ammon
848fa57144 Introduce and adopt BorrowedFdFile
Rust has this annoying design where all of the syscall conveniences on
File assume that it owns its fd; in particular this means that we can't
easily construct File from stdin, a raw file descriptor, etc.

The usual workarounds are to construct a File and then mem::forget it
(this is apparently idiomatic Rust!). But this has problems of its own:
for example it can't easily be used in Drop.

Introduce BorrowedFdFile which wraps File with ManuallyDrop and then
never drops the file (i.e. it's always forgotten). Replace some raw FDs
with BorrowedFdFile.
2025-12-30 14:07:39 -08:00
Peter Ammon
4101e831af Make fish_indent stop panicking on closed stdin
Prior to this commit, this code:

    fish_indent <&-

would panic as we would construct a File with a negative fd.
Check for a closed fd as other builtins do.
2025-12-30 14:07:39 -08:00
Peter Ammon
eb803ba6a7 Minor cleanup of shared::Arguments 2025-12-30 14:07:39 -08:00
Fabian Boehm
248a8e7c54 Fix doc test 2025-12-30 20:40:55 +01:00
Peter Ammon
2fa8c8cd7f Allow ctrl-C to work in fish_indent builtin
Since fish_indent became a builtin, it cannot be canceled with control-C,
because Rust's `read_to_end` retries on EINTR. Add our own function which
propagates EINTR and use it.

Fixes #12238
2025-12-30 10:39:19 -08:00
Johannes Altmanninger
8d5f5586dc start new cycle
Created by ./build_tools/release.sh 4.3.2
2025-12-30 17:43:15 +01:00
114 changed files with 8752 additions and 1632 deletions


@@ -29,6 +29,7 @@ freebsd_task:
   freebsd_instance:
     image: freebsd-15-0-release-amd64-ufs # updatecli.d/cirrus-freebsd.yml
   tests_script:
+    - pkg update
     - pkg install -y cmake-core devel/pcre2 devel/ninja gettext git-lite lang/rust misc/py-pexpect
     # libclang.so is a required build dependency for rust-c++ ffi bridge
     - pkg install -y llvm


@@ -1,11 +1,8 @@
 ## Description
 Talk about your changes here.
 Fixes issue #
 ## TODOs:
-<!-- Just check off what what we know been done so far. We can help you with this stuff. -->
+<!-- Check off what what has been done so far. -->
 - [ ] If addressing an issue, a commit message mentions `Fixes issue #<issue-number>`
 - [ ] Changes to fish usage are reflected in user documentation/manpages.
 - [ ] Tests have been added for regressions fixed
-- [ ] User-visible changes noted in CHANGELOG.rst <!-- Don't document changes for completions inside CHANGELOG.rst, there are lot of such edits -->
+- [ ] User-visible changes noted in CHANGELOG.rst <!-- Usually skipped for changes to completions -->


@@ -1,3 +1,21 @@
+fish 4.3.3 (released January 07, 2026)
+======================================
+
+This release fixes the following problems identified in fish 4.3.0:
+
+- Selecting a completion could insert only part of the token (:issue:`12249`).
+- Glitch with soft-wrapped autosuggestions and :doc:`fish_right_prompt <cmds/fish_right_prompt>` (:issue:`12255`).
+- Spurious echo in tmux when typing a command really fast (:issue:`12261`).
+- ``tomorrow`` theme always using the light variant (:issue:`12266`).
+- ``fish_config theme choose`` sometimes not shadowing themes set by e.g. webconfig (:issue:`12278`).
+- The sample prompts and themes are correctly installed (:issue:`12241`).
+- Last line of command output could be hidden when missing newline (:issue:`12246`).
+
+Other improvements include:
+
+- The ``abbr``, ``bind``, ``complete``, ``functions``, ``history`` and ``type`` commands now support a ``--color`` option to control syntax highlighting in their output. Valid values are ``auto`` (default), ``always``, or ``never``.
+- Existing file paths in redirection targets such as ``> file.txt`` are now highlighted using :envvar:`fish_color_valid_path`, indicating that ``file.txt`` will be clobbered (:issue:`12260`).
+
 fish 4.3.2 (released December 30, 2025)
 =======================================

Cargo.lock (generated; 121 lines changed)

@@ -4,9 +4,9 @@ version = 4
[[package]]
name = "aho-corasick"
version = "1.1.3"
version = "1.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8e60d3430d3a69478ad0993f19238d2df97c507009a52b3c10addcd7f6bcb916"
checksum = "ddd31a130427c27518df266943a5308ed92d4b226cc639f5a8f1002816174301"
dependencies = [
"memchr",
]
@@ -40,9 +40,9 @@ dependencies = [
[[package]]
name = "bstr"
version = "1.12.0"
version = "1.12.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "234113d19d0d7d613b40e86fb654acf958910802bcceab913a4f9e7cda03b1a4"
checksum = "63044e1ae8e69f3b5a92c736ca6269b8d12fa7efe39bf34ddb06d102cf0e2cab"
dependencies = [
"memchr",
"serde",
@@ -50,9 +50,9 @@ dependencies = [
[[package]]
name = "cc"
version = "1.2.41"
version = "1.2.51"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ac9fe6cdbb24b6ade63616c0a0688e45bb56732262c158df3c0c4bea4ca47cb7"
checksum = "7a0aeaff4ff1a90589618835a598e545176939b97874f7abc7851caa0618f203"
dependencies = [
"find-msvc-tools",
"jobserver",
@@ -83,9 +83,9 @@ dependencies = [
[[package]]
name = "crypto-common"
version = "0.1.6"
version = "0.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1bfb12502f3fc46cca1bb51ac28df9d618d813cdc3d2f25b9fe775a34af26bb3"
checksum = "78c8292055d1c1df0cce5d180393dc8cce0abec0a7102adb6c7b1eef6016d60a"
dependencies = [
"generic-array",
"typenum",
@@ -146,13 +146,13 @@ checksum = "37909eebbb50d72f9059c3b6d82c0463f2ff062c9e95845c43a6c9c0355411be"
[[package]]
name = "find-msvc-tools"
version = "0.1.4"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "52051878f80a721bb68ebfbc930e07b65ba72f2da88968ea5c06fd6ca3d3a127"
checksum = "645cbb3a84e60b7531617d5ae4e57f7e27308f6445f5abf653209ea76dec8dff"
[[package]]
name = "fish"
version = "4.3.2"
version = "4.3.3"
dependencies = [
"bitflags",
"cc",
@@ -176,7 +176,7 @@ dependencies = [
"num-traits",
"once_cell",
"pcre2",
"phf_codegen 0.12.1",
"phf_codegen 0.13.1",
"portable-atomic",
"rand 0.9.2",
"rsconf",
@@ -209,7 +209,6 @@ version = "0.0.0"
dependencies = [
"libc",
"nix",
"once_cell",
]
[[package]]
@@ -221,7 +220,6 @@ dependencies = [
"fish-wchar",
"fish-widecharwidth",
"libc",
"once_cell",
"rsconf",
"widestring",
]
@@ -231,8 +229,7 @@ name = "fish-gettext"
version = "0.0.0"
dependencies = [
"fish-gettext-maps",
"once_cell",
"phf 0.12.1",
"phf 0.13.1",
]
[[package]]
@@ -250,8 +247,8 @@ version = "0.0.0"
dependencies = [
"fish-build-helper",
"fish-gettext-mo-file-parser",
"phf 0.12.1",
"phf_codegen 0.12.1",
"phf 0.13.1",
"phf_codegen 0.13.1",
"rsconf",
]
@@ -297,15 +294,15 @@ checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1"
[[package]]
name = "foldhash"
version = "0.1.5"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d9c4f5dac5e15c24eb999c26181a6ca40b39fe946cbe4c263c7209467bc83af2"
checksum = "77ce24cb58228fbb8aa041425bb1050850ac19177686ea6e0f41a70416f56fdb"
[[package]]
name = "generic-array"
version = "0.14.9"
version = "0.14.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4bb6743198531e02858aeaea5398fcc883e71851fcbcb5a2f773e2fb6cb1edf2"
checksum = "85649ca51fd72272d7821adaf274ad91c288277713d9c18820d8499a7ff69e9a"
dependencies = [
"typenum",
"version_check",
@@ -336,9 +333,9 @@ dependencies = [
[[package]]
name = "globset"
version = "0.4.16"
version = "0.4.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "54a1028dfc5f5df5da8a56a73e6c153c9a9708ec57232470703592a3f18e49f5"
checksum = "52dfc19153a48bde0cbd630453615c8151bce3a5adfac7a0aebfbf0a1e1f57e3"
dependencies = [
"aho-corasick",
"bstr",
@@ -349,9 +346,9 @@ dependencies = [
[[package]]
name = "hashbrown"
version = "0.15.5"
version = "0.16.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9229cfe53dfd69f0609a49f65461bd93001ea1ef889cd5529dd176593f5338a1"
checksum = "841d1cc9bed7f9236f321df977030373f4a4163ae1a7dbfe1a51a2c1a51d9100"
dependencies = [
"allocator-api2",
"equivalent",
@@ -370,15 +367,15 @@ dependencies = [
[[package]]
name = "libc"
version = "0.2.177"
version = "0.2.178"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2874a2af47a2325c2001a6e6fad9b16a53b802102b528163885171cf92b15976"
checksum = "37c93d8daa9d8a012fd8ab92f088405fb202ea0b6ab73ee2482ae66af4f42091"
[[package]]
name = "libredox"
version = "0.1.10"
version = "0.1.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "416f7e718bdb06000964960ffa43b4335ad4012ae8b99060261aa4a8088d5ccb"
checksum = "df15f6eac291ed1cf25865b1ee60399f57e7c227e7f51bdbd4c5270396a9ed50"
dependencies = [
"bitflags",
"libc",
@@ -395,15 +392,15 @@ dependencies = [
[[package]]
name = "log"
version = "0.4.28"
version = "0.4.29"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34080505efa8e45a4b816c349525ebe327ceaa8559756f0356cba97ef3bf7432"
checksum = "5e5032e24019045c762d3c0f28f5b6b8bbf38563a65908389bf7978758920897"
[[package]]
name = "lru"
version = "0.13.0"
version = "0.16.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "227748d55f2f0ab4735d87fd623798cb6b664512fe979705f829c9f81c934465"
checksum = "96051b46fc183dc9cd4a223960ef37b9af631b55191852a8274bfef064cda20f"
dependencies = [
"hashbrown",
]
@@ -539,11 +536,11 @@ dependencies = [
[[package]]
name = "phf"
version = "0.12.1"
version = "0.13.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "913273894cec178f401a31ec4b656318d95473527be05c0752cc41cdc32be8b7"
checksum = "c1562dc717473dbaa4c1f85a36410e03c047b2e7df7f45ee938fbef64ae7fadf"
dependencies = [
"phf_shared 0.12.1",
"phf_shared 0.13.1",
]
[[package]]
@@ -558,12 +555,12 @@ dependencies = [
[[package]]
name = "phf_codegen"
version = "0.12.1"
version = "0.13.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "efbdcb6f01d193b17f0b9c3360fa7e0e620991b193ff08702f78b3ce365d7e61"
checksum = "49aa7f9d80421bca176ca8dbfebe668cc7a2684708594ec9f3c0db0805d5d6e1"
dependencies = [
"phf_generator 0.12.1",
"phf_shared 0.12.1",
"phf_generator 0.13.1",
"phf_shared 0.13.1",
]
[[package]]
@@ -578,12 +575,12 @@ dependencies = [
[[package]]
name = "phf_generator"
version = "0.12.1"
version = "0.13.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2cbb1126afed61dd6368748dae63b1ee7dc480191c6262a3b4ff1e29d86a6c5b"
checksum = "135ace3a761e564ec88c03a77317a7c6b80bb7f7135ef2544dbe054243b89737"
dependencies = [
"fastrand",
"phf_shared 0.12.1",
"phf_shared 0.13.1",
]
[[package]]
@@ -597,9 +594,9 @@ dependencies = [
[[package]]
name = "phf_shared"
version = "0.12.1"
version = "0.13.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "06005508882fb681fd97892ecff4b7fd0fee13ef1aa569f8695dae7ab9099981"
checksum = "e57fef6bc5981e38c2ce2d63bfa546861309f875b8a75f092d1d54ae2d64f266"
dependencies = [
"siphasher",
]
@@ -612,9 +609,9 @@ checksum = "7edddbd0b52d732b21ad9a5fab5c704c14cd949e5e9a1ec5929a24fded1b904c"
[[package]]
name = "portable-atomic"
version = "1.11.1"
version = "1.12.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f84267b20a16ea918e43c6a88433c2d54fa145c92a811b5b047ccbe153674483"
checksum = "f59e70c4aef1e55797c2e8fd94a4f2a973fc972cfde0e0b05f683667b0cd39dd"
[[package]]
name = "ppv-lite86"
@@ -627,18 +624,18 @@ dependencies = [
[[package]]
name = "proc-macro2"
version = "1.0.101"
version = "1.0.103"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "89ae43fd86e4158d6db51ad8e2b80f313af9cc74f5c0e03ccb87de09998732de"
checksum = "5ee95bc4ef87b8d5ba32e8b7714ccc834865276eab0aed5c9958d00ec45f49e8"
dependencies = [
"unicode-ident",
]
[[package]]
name = "quote"
version = "1.0.41"
version = "1.0.42"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ce25767e7b499d1b604768e7cde645d14cc8584231ea6b295e9c9eb22c02e1d1"
checksum = "a338cc41d27e6cc6dce6cefc13a0729dfbb81c262b1f519331575dd80ef3067f"
dependencies = [
"proc-macro2",
]
@@ -732,9 +729,9 @@ checksum = "7a2d987857b319362043e95f5353c0535c1f58eec5336fdfcf626430af7def58"
[[package]]
name = "rsconf"
version = "0.2.2"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bd2af859f1af0401e7fc7577739c87b0d239d8a5da400d717183bca92336bcdc"
checksum = "06cbd984e96cc891aa018958ac3d09986c0ea7635eedfff670b99a90970f159f"
dependencies = [
"cc",
]
@@ -897,9 +894,9 @@ checksum = "67b1b7a3b5fe4f1376887184045fcf45c69e92af734b7aaddc05fb777b6fbd03"
[[package]]
name = "syn"
version = "2.0.107"
version = "2.0.111"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a26dbd934e5451d21ef060c018dae56fc073894c5a7896f882928a76e6d081b"
checksum = "390cc9a294ab71bdb1aa2e99d13be9c753cd2d7bd6560c77118597410c4d2e87"
dependencies = [
"proc-macro2",
"quote",
@@ -946,9 +943,9 @@ checksum = "562d481066bde0658276a35467c4af00bdc6ee726305698a55b86e61d7ad82bb"
[[package]]
name = "unicode-ident"
version = "1.0.20"
version = "1.0.22"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "462eeb75aeb73aea900253ce739c8e18a67423fadf006037cd3ff27e82748a06"
checksum = "9312f7c4f6ff9069b165498234ce8be658059c6728633667c526e27dc2cf1df5"
[[package]]
name = "unicode-segmentation"
@@ -1052,18 +1049,18 @@ checksum = "4de5f056fb9dc8b7908754867544e26145767187aaac5a98495e88ad7cb8a80f"
[[package]]
name = "zerocopy"
version = "0.8.27"
version = "0.8.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0894878a5fa3edfd6da3f88c4805f4c8558e2b996227a3d864f47fe11e38282c"
checksum = "fd74ec98b9250adb3ca554bdde269adf631549f51d8a8f8f0a10b50f1cb298c3"
dependencies = [
"zerocopy-derive",
]
[[package]]
name = "zerocopy-derive"
version = "0.8.27"
version = "0.8.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "88d2b8d9c68ad2b9e4340d7832716a4d21a22a1154777ad56ea55c51a9cf3831"
checksum = "d8a8d209fdf45cf5138cbb5a506f6b52522a25afccc534d1475dad8e31105c6a"
dependencies = [
"proc-macro2",
"quote",


@@ -31,7 +31,7 @@ libc = "0.2.177"
# lru pulls in hashbrown by default, which uses a faster (though less DoS resistant) hashing algo.
# disabling default features uses the stdlib instead, but it doubles the time to rewrite the history
# files as of 22 April 2024.
lru = "0.13.0"
lru = "0.16.2"
nix = { version = "0.30.1", default-features = false, features = [
"event",
"inotify",
@@ -44,8 +44,8 @@ once_cell = "1.19.0"
pcre2 = { git = "https://github.com/fish-shell/rust-pcre2", tag = "0.2.9-utf32", default-features = false, features = [
"utf32",
] }
phf = { version = "0.12", default-features = false }
phf_codegen = { version = "0.12" }
phf = { version = "0.13", default-features = false }
phf_codegen = "0.13"
portable-atomic = { version = "1", default-features = false, features = [
"fallback",
] }
@@ -54,7 +54,7 @@ rand = { version = "0.9.2", default-features = false, features = [
"small_rng",
"thread_rng",
] }
rsconf = "0.2.2"
rsconf = "0.3.0"
rust-embed = { version = "8.9.0", features = [
"deterministic-timestamps",
"include-exclude",
@@ -79,7 +79,7 @@ debug = true
[package]
name = "fish"
version = "4.3.2"
version = "4.3.3"
edition.workspace = true
rust-version.workspace = true
default-run = "fish"
@@ -182,19 +182,24 @@ rust.non_upper_case_globals = "allow"
rust.unknown_lints = "allow"
rust.unstable_name_collisions = "allow"
rustdoc.private_intra_doc_links = "allow"
clippy.len_without_is_empty = "allow" # we're not a library crate
clippy.let_and_return = "allow"
clippy.manual_range_contains = "allow"
clippy.map_unwrap_or = "warn"
clippy.needless_lifetimes = "allow"
clippy.new_without_default = "allow"
clippy.option_map_unit_fn = "allow"
[workspace.lints.clippy]
assigning_clones = "warn"
implicit_clone = "warn"
cloned_instead_of_copied = "warn"
len_without_is_empty = "allow" # we're not a library crate
let_and_return = "allow"
manual_range_contains = "allow"
map_unwrap_or = "warn"
needless_lifetimes = "allow"
new_without_default = "allow"
option_map_unit_fn = "allow"
# We do not want to use the e?print(ln)?! macros.
# These lints flag their use.
# In the future, they might change to flag other methods of printing.
clippy.print_stdout = "deny"
clippy.print_stderr = "deny"
print_stdout = "deny"
print_stderr = "deny"
[lints]
workspace = true


@@ -117,7 +117,7 @@ Dependencies
Compiling fish requires:
- Rust (version 1.85 or later)
- Rust (version 1.85 or later), including cargo
- CMake (version 3.15 or later)
- a C compiler (for system feature detection and the test helper binary)
- PCRE2 (headers and libraries) - optional, this will be downloaded if missing


@@ -33,8 +33,13 @@ fi
cargo() {
subcmd=$1
shift
# shellcheck disable=2086
command cargo "$subcmd" $cargo_args "$@"
if [ -n "$FISH_CHECK_RUST_TOOLCHAIN" ]; then
# shellcheck disable=2086
command cargo "+$FISH_CHECK_RUST_TOOLCHAIN" "$subcmd" $cargo_args "$@"
else
# shellcheck disable=2086
command cargo "$subcmd" $cargo_args "$@"
fi
}
cleanup () {


@@ -10,7 +10,8 @@ command -v updatecli
command -v uv
sort --version-sort </dev/null
uv lock --check
# TODO This is copied from .github/actions/install-sphinx/action.yml
uv lock --check --exclude-newer="$(awk -F'"' <uv.lock '/^exclude-newer[[:space:]]*=/ {print $2}')"
updatecli "${@:-apply}"


@@ -131,6 +131,14 @@ install(DIRECTORY share/functions/
DESTINATION ${rel_datadir}/fish/functions
FILES_MATCHING PATTERN "*.fish")
install(DIRECTORY share/prompts/
DESTINATION ${rel_datadir}/fish/prompts
FILES_MATCHING PATTERN "*.fish")
install(DIRECTORY share/themes/
DESTINATION ${rel_datadir}/fish/themes
FILES_MATCHING PATTERN "*.theme")
# CONDEMNED_PAGE is managed by the conditional above
# Building the man pages is optional: if sphinx isn't installed, they're not built
install(DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/user_doc/man/man1/
@@ -149,9 +157,7 @@ install(DIRECTORY share/tools/web_config
PATTERN "*.css"
PATTERN "*.html"
PATTERN "*.py"
PATTERN "*.js"
PATTERN "*.theme"
PATTERN "*.fish")
PATTERN "*.js")
# Building the man pages is optional: if Sphinx isn't installed, they're not built
install(FILES ${MANUALS} DESTINATION ${mandir}/man1/ OPTIONAL)


@@ -1,3 +1,11 @@
fish (4.3.3-1) stable; urgency=medium
* Release of new version 4.3.3.
See https://github.com/fish-shell/fish-shell/releases/tag/4.3.3 for details.
-- Johannes Altmanninger <aclopte@gmail.com> Wed, 07 Jan 2026 08:34:20 +0100
fish (4.3.2-1) stable; urgency=medium
* Release of new version 4.3.2.


@@ -9,7 +9,6 @@ license.workspace = true
[dependencies]
libc.workspace = true
nix.workspace = true
once_cell.workspace = true
[lints]
workspace = true


@@ -1,7 +1,7 @@
use libc::STDIN_FILENO;
use once_cell::sync::OnceCell;
use std::env;
use std::os::unix::ffi::OsStrExt;
use std::sync::OnceLock;
// These are in the Unicode private-use range. We really shouldn't use this
// range but have little choice in the matter given how our lexer/parser works.
@@ -39,7 +39,7 @@ pub fn subslice_position<T: Eq>(a: &[T], b: &[T]) -> Option<usize> {
/// session. We err on the side of assuming it's not a console session. This approach isn't
/// bullet-proof and that's OK.
pub fn is_console_session() -> bool {
static IS_CONSOLE_SESSION: OnceCell<bool> = OnceCell::new();
static IS_CONSOLE_SESSION: OnceLock<bool> = OnceLock::new();
// TODO(terminal-workaround)
*IS_CONSOLE_SESSION.get_or_init(|| {
nix::unistd::ttyname(unsafe { std::os::fd::BorrowedFd::borrow_raw(STDIN_FILENO) })
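The replacement is mechanical: `std::sync::OnceLock::get_or_init` has the same shape as `once_cell::sync::OnceCell::get_or_init`. A minimal sketch of the idiom used above, with a trivial stand-in for the real ttyname check:

```rust
use std::sync::OnceLock;

// Computes the value at most once, on first call; later calls return the
// cached result and never run the supplied function again.
fn is_console_session_cached(compute: fn() -> bool) -> bool {
    static IS_CONSOLE_SESSION: OnceLock<bool> = OnceLock::new();
    *IS_CONSOLE_SESSION.get_or_init(compute)
}

fn main() {
    // Stand-in for the real check, which inspects the tty of stdin.
    assert!(is_console_session_cached(|| true));
    // Second call returns the cached value; this closure is never run.
    assert!(is_console_session_cached(|| false));
}
```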


@@ -11,7 +11,6 @@ fish-common.workspace = true
fish-wchar.workspace = true
fish-widecharwidth.workspace = true
libc.workspace = true
once_cell.workspace = true
widestring.workspace = true
[build-dependencies]


@@ -5,9 +5,11 @@
use fish_wchar::prelude::*;
use fish_widecharwidth::{WcLookupTable, WcWidth};
use once_cell::sync::Lazy;
use std::cmp;
use std::sync::atomic::{AtomicIsize, Ordering};
use std::sync::{
LazyLock,
atomic::{AtomicIsize, Ordering},
};
/// Width of ambiguous East Asian characters and, as of TR11, all private-use characters.
/// 1 is the typical default, but we accept any non-negative override via `$fish_ambiguous_width`.
@@ -25,7 +27,7 @@
// For some reason, this is declared here and exposed here, but is set in `env_dispatch`.
pub static FISH_EMOJI_WIDTH: AtomicIsize = AtomicIsize::new(1);
static WC_LOOKUP_TABLE: Lazy<WcLookupTable> = Lazy::new(WcLookupTable::new);
static WC_LOOKUP_TABLE: LazyLock<WcLookupTable> = LazyLock::new(WcLookupTable::new);
/// A safe wrapper around the system `wcwidth()` function
#[cfg(not(cygwin))]
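For reference, the `std::sync::LazyLock` pattern that replaces `once_cell::sync::Lazy` above, sketched with a cheap table standing in for `WcLookupTable`:

```rust
use std::sync::LazyLock;

// Built on first access from any thread, then shared immutably afterwards.
static LOOKUP: LazyLock<Vec<u32>> = LazyLock::new(|| (0..256).collect());

fn main() {
    // Dereferencing (here via indexing/method calls) forces initialization
    // exactly once, just like Lazy::new/deref did.
    assert_eq!(LOOKUP.len(), 256);
    assert_eq!(LOOKUP[10], 10);
}
```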


@@ -8,7 +8,6 @@ license.workspace = true
[dependencies]
fish-gettext-maps.workspace = true
once_cell.workspace = true
phf.workspace = true
[lints]


@@ -1,259 +1,19 @@
use fish_gettext_maps::CATALOGS;
use once_cell::sync::Lazy;
use std::{collections::HashSet, sync::Mutex};
use std::{
collections::HashMap,
sync::{LazyLock, Mutex},
};
type Catalog = &'static phf::Map<&'static str, &'static str>;
pub struct SetLanguageLints<'a> {
pub duplicates: Vec<&'a str>,
pub non_existing: Vec<&'a str>,
}
#[derive(PartialEq, Eq, Clone, Copy)]
pub enum LanguagePrecedenceOrigin {
Default,
LocaleVariable(LocaleVariable),
LanguageEnvVar,
StatusLanguage,
}
#[derive(PartialEq, Eq, Clone, Copy)]
pub enum LocaleVariable {
#[allow(clippy::upper_case_acronyms)]
LANG,
#[allow(non_camel_case_types)]
LC_MESSAGES,
#[allow(non_camel_case_types)]
LC_ALL,
}
impl LocaleVariable {
fn as_language_precedence_origin(&self) -> LanguagePrecedenceOrigin {
LanguagePrecedenceOrigin::LocaleVariable(*self)
}
pub fn as_str(&self) -> &'static str {
match self {
Self::LANG => "LANG",
Self::LC_MESSAGES => "LC_MESSAGES",
Self::LC_ALL => "LC_ALL",
}
}
}
impl std::fmt::Display for LocaleVariable {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.as_str())
}
}
struct InternalLocalizationState {
precedence_origin: LanguagePrecedenceOrigin,
language_precedence: Vec<(String, Catalog)>,
}
pub struct PublicLocalizationState {
pub precedence_origin: LanguagePrecedenceOrigin,
pub language_precedence: Vec<String>,
}
/// Stores the current localization status.
/// `is_active` indicates whether localization is currently active, and the reason if it is
/// not.
/// The `origin` indicates where the values in `language_precedence` were taken from.
/// `language_precedence` stores the catalogs in the order they should be used.
///
/// This struct should be updated when the relevant variables change or `status language` is used
/// to modify the localization state.
static LOCALIZATION_STATE: Lazy<Mutex<InternalLocalizationState>> =
Lazy::new(|| Mutex::new(InternalLocalizationState::new()));
impl InternalLocalizationState {
fn new() -> Self {
Self {
precedence_origin: LanguagePrecedenceOrigin::Default,
language_precedence: vec![],
}
}
fn to_public(&self) -> PublicLocalizationState {
PublicLocalizationState {
precedence_origin: self.precedence_origin,
language_precedence: self
.language_precedence
.iter()
.map(|(lang, _)| lang.to_owned())
.collect(),
}
}
fn update_from_env(
&mut self,
message_locale: Option<(LocaleVariable, String)>,
language_var: Option<Vec<String>>,
) {
// Do not override values set via `status language`.
if self.precedence_origin == LanguagePrecedenceOrigin::StatusLanguage {
return;
}
if let Some((precedence_origin, locale)) = &message_locale {
// Regular locale names start with lowercase letters (`ll_CC`, followed by some suffix).
// The C or POSIX locale is special, and often used to disable localization.
// Their names are upper-case, but variants with suffixes (`C.UTF-8`) exist.
// To ensure that such variants are accounted for, we match on prefixes of the
// locale name.
// https://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap07.html#tag_07_02
fn is_c_locale(locale: &str) -> bool {
locale.starts_with('C') || locale.starts_with("POSIX")
}
if is_c_locale(locale) {
self.precedence_origin =
LanguagePrecedenceOrigin::LocaleVariable(*precedence_origin);
self.language_precedence.clear();
return;
}
}
let (precedence_origin, language_list) = if let Some(list) = language_var {
(LanguagePrecedenceOrigin::LanguageEnvVar, list)
} else if let Some((precedence_origin, locale)) = message_locale {
let mut normalized_name = String::new();
// Strip off encoding and modifier. (We always expect UTF-8 and don't support modifiers.)
for c in locale.chars() {
if c.is_alphabetic() || c == '_' {
normalized_name.push(c);
} else {
break;
}
}
// At this point, the normalized_name should have the shape `ll` or `ll_CC`.
(
precedence_origin.as_language_precedence_origin(),
vec![normalized_name],
)
} else {
(LanguagePrecedenceOrigin::Default, vec![])
};
let mut seen_languages = HashSet::new();
self.language_precedence = language_list
.into_iter()
.flat_map(|lang| find_existing_catalogs(&lang))
.filter(|(lang, _)| seen_languages.insert(lang.to_owned()))
.collect();
self.precedence_origin = precedence_origin;
}
fn update_from_status_language_builtin<'a, 'b: 'a, S: AsRef<str> + 'a>(
&mut self,
langs: &'b [S],
) -> SetLanguageLints<'a> {
let mut seen = HashSet::new();
let mut duplicates = vec![];
for lang in langs {
let lang = lang.as_ref();
if !seen.insert(lang) {
duplicates.push(lang)
}
}
let mut existing_langs = vec![];
let mut non_existing = vec![];
for lang in langs {
let lang = lang.as_ref();
if let Some(catalog) = CATALOGS.get(lang) {
existing_langs.push((lang.to_owned(), *catalog));
} else {
non_existing.push(lang);
}
}
let mut seen = HashSet::new();
let unique_langs = existing_langs
.into_iter()
.filter(|(lang, _)| seen.insert(lang.to_owned()))
.collect();
self.language_precedence = unique_langs;
self.precedence_origin = LanguagePrecedenceOrigin::StatusLanguage;
SetLanguageLints {
duplicates,
non_existing,
}
}
}
/// Tries to find catalogs for `language`.
/// `language` must be an ISO 639 language code, optionally followed by an underscore and an ISO
/// 3166 country/territory code.
/// Uses the catalog with the exact same name as `language` if it exists.
/// If a country code is present (`ll_CC`), only the catalog named `ll` will be considered as a fallback.
/// If no country code is present (`ll`), all catalogs whose names start with `ll_` will be used in
/// arbitrary order.
fn find_existing_catalogs(language: &str) -> Vec<(String, Catalog)> {
// Try the exact name first.
// If there already is a corresponding catalog return the language.
if let Some(catalog) = CATALOGS.get(language) {
return vec![(language.to_owned(), catalog)];
}
let language_without_country_code = language.split_once('_').map_or(language, |(ll, _cc)| ll);
if language == language_without_country_code {
// We have `ll` format. In this case, try to find any catalog whose name starts with `ll_`.
// Note that it is important to include the underscore in the pattern, otherwise `ll` might
// fall back to `llx_CC`, where `llx` is a 3-letter language identifier.
let ll_prefix = format!("{language}_");
let mut lang_catalogs = vec![];
for (&lang_name, &catalog) in CATALOGS.entries() {
if lang_name.starts_with(&ll_prefix) {
lang_catalogs.push((lang_name.to_owned(), catalog));
}
}
lang_catalogs
} else {
// If `language` contained a country code, we only try to fall back to a catalog
// without a country code.
if let Some(catalog) = CATALOGS.get(language_without_country_code) {
vec![(language_without_country_code.to_owned(), catalog)]
} else {
vec![]
}
}
}
pub fn update_from_env(
locale: Option<(LocaleVariable, String)>,
language_var: Option<Vec<String>>,
) {
let mut localization_state = LOCALIZATION_STATE.lock().unwrap();
localization_state.update_from_env(locale, language_var);
}
pub fn update_from_status_language_builtin<'a, 'b: 'a, S: AsRef<str> + 'a>(
langs: &'b [S],
) -> SetLanguageLints<'a> {
let mut localization_state = LOCALIZATION_STATE.lock().unwrap();
localization_state.update_from_status_language_builtin(langs)
}
pub fn unset_from_status_language_builtin(
locale: Option<(LocaleVariable, String)>,
language_var: Option<Vec<String>>,
) {
let mut localization_state = LOCALIZATION_STATE.lock().unwrap();
localization_state.precedence_origin = LanguagePrecedenceOrigin::Default;
localization_state.update_from_env(locale, language_var);
}
pub fn status_language() -> PublicLocalizationState {
let localization_state = LOCALIZATION_STATE.lock().unwrap();
localization_state.to_public()
}
static LANGUAGE_PRECEDENCE: LazyLock<Mutex<Vec<(&'static str, Catalog)>>> =
LazyLock::new(|| Mutex::new(vec![]));
pub fn gettext(message_str: &'static str) -> Option<&'static str> {
let localization_state = LOCALIZATION_STATE.lock().unwrap();
let language_precedence = LANGUAGE_PRECEDENCE.lock().unwrap();
// Use the localization from the highest-precedence language that has one available.
for (_, catalog) in localization_state.language_precedence.iter() {
for (_, catalog) in language_precedence.iter() {
if let Some(localized_str) = catalog.get(message_str) {
return Some(localized_str);
}
@@ -261,8 +21,40 @@ pub fn gettext(message_str: &'static str) -> Option<&'static str> {
None
}
pub fn list_available_languages() -> Vec<&'static str> {
let mut langs: Vec<_> = CATALOGS.entries().map(|(&lang, _)| lang).collect();
langs.sort();
langs
#[derive(Clone, Copy)]
pub struct GettextLocalizationLanguage {
language: &'static str,
}
static AVAILABLE_LANGUAGES: LazyLock<HashMap<&'static str, GettextLocalizationLanguage>> =
LazyLock::new(|| {
HashMap::from_iter(
CATALOGS
.entries()
.map(|(&language, _)| (language, GettextLocalizationLanguage { language })),
)
});
pub fn get_available_languages() -> &'static HashMap<&'static str, GettextLocalizationLanguage> {
&AVAILABLE_LANGUAGES
}
pub fn set_language_precedence(new_precedence: &[GettextLocalizationLanguage]) {
let catalogs = new_precedence
.iter()
.map(|lang| {
(
lang.language,
*CATALOGS
.get(lang.language)
.expect("Only languages for which catalogs exist may be passed to gettext."),
)
})
.collect();
*LANGUAGE_PRECEDENCE.lock().unwrap() = catalogs;
}
pub fn get_language_precedence() -> Vec<&'static str> {
let language_precedence = LANGUAGE_PRECEDENCE.lock().unwrap();
language_precedence.iter().map(|&(lang, _)| lang).collect()
}
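The lookup the refactored `gettext` performs reduces to "first catalog in precedence order that contains the message wins". A minimal sketch of that rule, with `std::collections::HashMap` standing in for the generated `phf::Map` catalogs:

```rust
use std::collections::HashMap;

// Walk the catalogs in precedence order and return the first hit.
fn gettext(
    precedence: &[(&str, &HashMap<&'static str, &'static str>)],
    message: &str,
) -> Option<&'static str> {
    precedence
        .iter()
        .find_map(|(_, catalog)| catalog.get(message).copied())
}

fn main() {
    let de = HashMap::from([("hello", "hallo")]);
    let fr = HashMap::from([("hello", "bonjour"), ("bye", "au revoir")]);
    // "de" outranks "fr" in this precedence list.
    let precedence = [("de", &de), ("fr", &fr)];
    assert_eq!(gettext(&precedence, "hello"), Some("hallo"));
    // Falls through to the next catalog when the first has no entry.
    assert_eq!(gettext(&precedence, "bye"), Some("au revoir"));
    assert_eq!(gettext(&precedence, "missing"), None);
}
```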


@@ -10,7 +10,7 @@ Synopsis
[--set-cursor[=MARKER]] ([-f | --function FUNCTION] | EXPANSION)
abbr --erase [ [-c | --command COMMAND]... ] NAME ...
abbr --rename [ [-c | --command COMMAND]... ] OLD_WORD NEW_WORD
abbr --show
abbr [--show] [--color WHEN]
abbr --list
abbr --query NAME ...
@@ -75,7 +75,6 @@ With **--set-cursor=MARKER**, the cursor is moved to the first occurrence of **M
With **-f FUNCTION** or **--function FUNCTION**, **FUNCTION** is treated as the name of a fish function instead of a literal replacement. When the abbreviation matches, the function will be called with the matching token as an argument. If the function's exit status is 0 (success), the token will be replaced by the function's output; otherwise the token will be left unchanged. No **EXPANSION** may be given separately.
Examples
########


@@ -6,8 +6,8 @@ Synopsis
.. synopsis::
bind [(-M | --mode) MODE] [(-m | --sets-mode) NEW_MODE] [--preset | --user] [-s | --silent] KEYS COMMAND ...
bind [(-M | --mode) MODE] [--preset] [--user] [KEYS]
bind [-a | --all] [--preset] [--user]
bind [(-M | --mode) MODE] [--preset] [--user] [--color WHEN] [KEYS]
bind [-a | --all] [--preset] [--user] [--color WHEN]
bind (-f | --function-names)
bind (-K | --key-names)
bind (-L | --list-modes)
@@ -104,6 +104,10 @@ The following options are available:
**-s** or **--silent**
Silences some of the error messages, including for unknown key names and unbound sequences.
**--color** *WHEN*
Controls when to use syntax highlighting colors when listing bindings.
*WHEN* can be ``auto`` (the default, colorize if the output :doc:`is a terminal <isatty>`), ``always``, or ``never``.
**-h** or **--help**
Displays help about using this command.


@@ -6,7 +6,7 @@ Synopsis
.. synopsis::
complete ((-c | --command) | (-p | --path)) COMMAND [OPTIONS]
complete ((-c | --command) | (-p | --path)) COMMAND [OPTIONS] [--color WHEN]
complete (-C | --do-complete) [--escape] STRING
Description
@@ -74,6 +74,10 @@ The following options are available:
**--escape**
When used with ``-C``, escape special characters in completions.
**--color** *WHEN*
Controls when to use syntax highlighting colors when printing completions.
*WHEN* can be ``auto`` (the default, colorize if the output :doc:`is a terminal <isatty>`), ``always``, or ``never``.
**-h** or **--help**
Displays help about using this command.


@@ -6,8 +6,8 @@ Synopsis
.. synopsis::
functions [-a | --all] [-n | --names]
functions [-D | --details] [-v] FUNCTION
functions [-a | --all] [-n | --names] [--color WHEN]
functions [-D | --details] [-v] [--color WHEN] FUNCTION
functions -c OLDNAME NEWNAME
functions -d DESCRIPTION FUNCTION
functions [-e | -q] FUNCTION ...
@@ -60,6 +60,10 @@ The following options are available:
**-t** or **--handlers-type** *TYPE*
Show all event handlers matching the given *TYPE*.
**--color** *WHEN*
Controls when to use syntax highlighting colors when printing function definitions.
*WHEN* can be ``auto`` (the default, colorize if the output :doc:`is a terminal <isatty>`), ``always``, or ``never``.
**-h** or **--help**
Displays help about using this command.


@@ -75,6 +75,10 @@ These flags can appear before or immediately after one of the sub-commands liste
**-R** or **--reverse**
Causes the history search results to be ordered oldest to newest, which is the order used by most shells. The default is newest to oldest.
**--color** *WHEN*
Controls when to use syntax highlighting colors for the history entries.
*WHEN* can be ``auto`` (the default, colorize if the output :doc:`is a terminal <isatty>`), ``always``, or ``never``.
**-h** or **--help**
Displays help for this command.


@@ -41,6 +41,10 @@ The following options are available:
**-q** or **--query**
Suppresses all output; this is useful when testing the exit status. For compatibility with old fish versions this is also **--quiet**.
**--color** *WHEN*
Controls when to use syntax highlighting colors when printing function definitions.
*WHEN* can be ``auto`` (the default, colorize if the output :doc:`is a terminal <isatty>`), ``always``, or ``never``.
**-h** or **--help**
Displays help about using this command.


@@ -133,7 +133,7 @@ Variable Meaning
.. envvar:: fish_color_end process separators like ``;`` and ``&``
.. envvar:: fish_color_error syntax errors
.. envvar:: fish_color_param ordinary command parameters
.. envvar:: fish_color_valid_path parameters that are filenames (if the file exists)
.. envvar:: fish_color_valid_path parameters and redirection targets that are filenames (if the file exists)
.. envvar:: fish_color_option options starting with "-", up to the first "--" parameter
.. envvar:: fish_color_comment comments like '# important'
.. envvar:: fish_color_selection selected text in vi visual mode

8 file diffs suppressed because they are too large


@@ -9,9 +9,12 @@ complete -c abbr -f -n $__fish_abbr_not_add_cond -l rename -d 'Rename an abbrevi
complete -c abbr -f -n $__fish_abbr_not_add_cond -s e -l erase -d 'Erase abbreviation' -xa '(abbr --list)'
complete -c abbr -f -n $__fish_abbr_not_add_cond -s s -l show -d 'Print all abbreviations'
complete -c abbr -f -n $__fish_abbr_not_add_cond -s l -l list -d 'Print all abbreviation names'
complete -c abbr -f -n $__fish_abbr_not_add_cond -l color -d 'When to colorize output' -xa 'always never auto'
complete -c abbr -f -n $__fish_abbr_not_add_cond -s h -l help -d Help
complete -c abbr -f -n $__fish_abbr_add_cond -s p -l position -a 'command anywhere' -d 'Expand only as a command, or anywhere' -x
complete -c abbr -f -n $__fish_abbr_add_cond -s f -l function -d 'Treat expansion argument as a fish function' -xa '(functions)'
complete -c abbr -f -n $__fish_abbr_add_cond -s r -l regex -d 'Match a regular expression' -x
complete -c abbr -f -n $__fish_abbr_add_cond -l set-cursor -d 'Position the cursor at % post-expansion'
complete -c abbr -f -n '__fish_seen_subcommand_from -s --show' -l color -d 'When to colorize output' -xa 'always never auto'


@@ -25,6 +25,7 @@ complete -c bind -s L -l list-modes -d 'Display a list of defined bind modes'
complete -c bind -s s -l silent -d 'Operate silently'
complete -c bind -l preset -d 'Operate on preset bindings'
complete -c bind -l user -d 'Operate on user bindings'
complete -c bind -l color -d 'When to colorize output' -xa 'always never auto'
complete -c bind -n '__fish_bind_has_keys (commandline -pcx)' -a '(bind --function-names)' -d 'Function name' -x


@@ -19,6 +19,7 @@ complete -c complete -l escape -d "Make -C escape special characters"
complete -c complete -s n -l condition -d "Completion only used if command has zero exit status" -x
complete -c complete -s w -l wraps -d "Inherit completions from specified command" -xa '(__fish_complete_command)'
complete -c complete -s k -l keep-order -d "Keep order of arguments instead of sorting alphabetically"
complete -c complete -l color -d "When to colorize output" -xa "always never auto"
# Deprecated options


@@ -15,10 +15,72 @@ function __fish_fastboot_list_partition_or_file
end
function __fish_fastboot_list_partition
set -l partitions boot bootloader cache cust dtbo metadata misc modem odm odm_dlkm oem product pvmfw radio recovery system system_ext userdata vbmeta vendor vendor_dlkm vmbeta_system
for i in $partitions
echo $i
end
printf %s "\
abl
aop
aop_config
apdp
bluetooth
boot
countrycode
cpucp
cpucp_dtb
crclist
dcp
ddr
devcfg
dsp
dtbo
featenabler
hyp
hyp_ac_config
idmanager
imagefv
init_boot
keymaster
logfs
metadata
misc
modem
modemfirmware
multiimgoem
multiimgqti
pdp
pdp_cdb
pvmfw
qtvm_dtbo
qupfw
recovery
rescue
secretkeeper
shrm
soccp
soccp_dcd
soccp_debug
sparsecrclist
spuservice
storsec
super
tme_config
tme_fw
tme_seq_patch
toolsfv
tz
tz_ac_config
tz_qti_config
uefi
uefisecapp
userdata
vbmeta
vbmeta_system
vendor_boot
vm-bootsys
xbl
xbl_ac_config
xbl_config
xbl_ramdump
xbl_sc_test_mode
"
end
complete -c fastboot -s h -l help -d 'Show this message'


@@ -19,3 +19,4 @@ complete -c functions -s D -l details -d "Display information about the function
complete -c functions -s v -l verbose -d "Print more output"
complete -c functions -s H -l handlers -d "Show event handlers"
complete -c functions -s t -l handlers-type -d "Show event handlers matching the given type" -x -a "signal variable exit job-id generic"
complete -c functions -l color -d 'When to colorize output' -x -a 'always never auto'


@@ -1070,8 +1070,17 @@ complete -f -c git -n '__fish_git_using_command fetch' -l refetch -d 'Re-fetch w
__fish_git_add_revision_completion -n '__fish_git_using_command fetch' -l negotiation-tip -d 'Only report commits reachable from these tips' -x
complete -f -c git -n '__fish_git_using_command fetch' -l negotiate-only -d "Don't fetch, only show commits in common with the server"
complete -f -c git -n '__fish_git_using_command fetch' -l filter -ra '(__fish_git_filters)' -d 'Request a subset of objects from server'
# TODO other options
complete -f -c git -n '__fish_git_using_command fetch' -s 4 -l ipv4 -d 'Use IPv4 addresses only'
complete -f -c git -n '__fish_git_using_command fetch' -s 6 -l ipv6 -d 'Use IPv6 addresses only'
complete -x -c git -n '__fish_git_using_command fetch' -l depth -d 'Limit fetching to specified depth'
complete -x -c git -n '__fish_git_using_command fetch' -l deepen -d 'Deepen history of shallow repository'
complete -x -c git -n '__fish_git_using_command fetch' -l shallow-since -d 'Deepen history after specified date'
complete -x -c git -n '__fish_git_using_command fetch' -l shallow-exclude -d 'Deepen history excluding branch'
complete -f -c git -n '__fish_git_using_command fetch' -l unshallow -d 'Convert shallow repository to complete one'
complete -f -c git -n '__fish_git_using_command fetch' -l update-shallow -d 'Accept refs that update .git/shallow'
complete -x -c git -n '__fish_git_using_command fetch' -l refmap -d 'Specify refspec to map the refs to remote-tracking branches'
complete -f -c git -n '__fish_git_using_command fetch' -l write-fetch-head -d 'Write FETCH_HEAD file (default)'
complete -f -c git -n '__fish_git_using_command fetch' -l no-write-fetch-head -d 'Do not write FETCH_HEAD file'
#### filter-branch
complete -f -c git -n __fish_git_needs_command -a filter-branch -d 'Rewrite branches'
@@ -1151,7 +1160,18 @@ complete -f -c git -n '__fish_git_using_command show-branch' -l no-color -d "Tur
complete -f -c git -n '__fish_git_using_command show-branch' -l merge-base -d "Determine merge bases for the given commits"
complete -f -c git -n '__fish_git_using_command show-branch' -l independent -d "Show which refs can't be reached from any other"
complete -f -c git -n '__fish_git_using_command show-branch' -l topics -d "Show only commits that are not on the first given branch"
# TODO options
complete -f -c git -n '__fish_git_using_command show-branch' -l all -d 'Show all refs under refs/heads and refs/remotes'
complete -f -c git -n '__fish_git_using_command show-branch' -s r -l remotes -d 'Show all refs under refs/remotes'
complete -f -c git -n '__fish_git_using_command show-branch' -l current -d 'Include the current branch to the list of revs'
complete -f -c git -n '__fish_git_using_command show-branch' -l topo-order -d 'Do not show commits in reverse chronological order'
complete -f -c git -n '__fish_git_using_command show-branch' -l date-order -d 'Show commits in chronological order'
complete -f -c git -n '__fish_git_using_command show-branch' -l sparse -d 'Show merges only reachable from one tip'
complete -x -c git -n '__fish_git_using_command show-branch' -l more -d 'Go N more commits beyond common ancestor'
complete -f -c git -n '__fish_git_using_command show-branch' -l list -d 'Show branches and their commits'
complete -f -c git -n '__fish_git_using_command show-branch' -l sha1-name -d 'Name commits with unique prefix of SHA-1'
complete -f -c git -n '__fish_git_using_command show-branch' -l no-name -d 'Do not show naming strings for each commit'
complete -x -c git -n '__fish_git_using_command show-branch' -l color -a 'always never auto' -d 'Color the status sign'
complete -f -c git -n '__fish_git_using_command show-branch' -l no-color -d 'Turn off colored output'
### add
complete -c git -n __fish_git_needs_command -a add -d 'Add file contents to the staging area'
@@ -1170,10 +1190,10 @@ complete -c git -n '__fish_git_using_command add' -l ignore-errors -d 'Ignore er
complete -c git -n '__fish_git_using_command add' -l ignore-missing -d 'Check if any of the given files would be ignored'
# Renames also show up as untracked + deleted, and to get git to show it as a rename _both_ need to be added.
# However, we can't do that as it is two tokens, so we don't need renamed here.
complete -f -c git -n '__fish_git_using_command add; and test "$(__fish_git config --get bash.showUntrackedFiles)" != 0' -a '(__fish_git_files modified untracked deleted unmerged modified-staged-deleted)'
complete -f -c git -n '__fish_git_using_command add; and test "$(__fish_git config --type bool --get status.showUntrackedFiles)" != false' -a '(__fish_git_files modified untracked deleted unmerged modified-staged-deleted)'
# If we have so many files that you disable untrackedfiles, let's add file completions,
# to avoid slurping megabytes of git output.
complete -F -c git -n '__fish_git_using_command add; and test "$(__fish_git config --get bash.showUntrackedFiles)" = 0' -a '(__fish_git_files modified deleted unmerged modified-staged-deleted)'
complete -F -c git -n '__fish_git_using_command add; and test "$(__fish_git config --type bool --get status.showUntrackedFiles)" = false' -a '(__fish_git_files modified deleted unmerged modified-staged-deleted)'
### am
complete -c git -n __fish_git_needs_command -a am -d 'Apply patches from a mailbox'
@@ -1201,6 +1221,7 @@ complete -f -c git -n '__fish_git_using_command am' -l ignore-date -d 'Treat aut
complete -f -c git -n '__fish_git_using_command am' -l skip -d 'Skip current patch'
complete -x -c git -n '__fish_git_using_command am' -s S -l gpg-sign -a '(type -q gpg && __fish_complete_gpg_key_id gpg)' -d 'Sign commits with gpg'
complete -f -c git -n '__fish_git_using_command am' -l no-gpg-sign -d 'Do not sign commits'
complete -f -c git -n '__fish_git_using_command am' -s n -l no-verify -d 'Do not run pre-applypatch and applypatch-msg hooks'
complete -f -c git -n '__fish_git_using_command am' -s r -l resolved -l continue -d 'Mark patch failures as resolved'
complete -x -c git -n '__fish_git_using_command am' -l resolvemsg -d 'Message to print after patch failure'
complete -f -c git -n '__fish_git_using_command am' -l abort -d 'Abort patch operation and restore branch'
@@ -1229,7 +1250,15 @@ complete -f -c git -n '__fish_git_using_command checkout' -l no-recurse-submodul
complete -f -c git -n '__fish_git_using_command checkout' -l progress -d 'Report progress even if not connected to a terminal'
complete -f -c git -n '__fish_git_using_command checkout' -l no-progress -d "Don't report progress"
complete -f -c git -n '__fish_git_using_command checkout' -s f -l force -d 'Switch even if working tree differs or unmerged files exist'
# TODO options
complete -f -c git -n '__fish_git_using_command checkout' -s q -l quiet -d 'Suppress feedback messages'
complete -f -c git -n '__fish_git_using_command checkout' -l detach -d 'Detach HEAD at named commit'
complete -f -c git -n '__fish_git_using_command checkout' -l guess -d 'Guess remote tracking branch (default)'
complete -f -c git -n '__fish_git_using_command checkout' -l no-guess -d 'Do not guess remote tracking branch'
complete -f -c git -n '__fish_git_using_command checkout' -l overlay -d 'Never remove files from working tree in checkout'
complete -f -c git -n '__fish_git_using_command checkout' -l no-overlay -d 'Remove files from working tree not in tree-ish'
complete -F -c git -n '__fish_git_using_command checkout' -l pathspec-from-file -d 'Read pathspec from file'
complete -f -c git -n '__fish_git_using_command checkout' -l pathspec-file-nul -d 'NUL separator for pathspec-from-file'
complete -f -c git -n '__fish_git_using_command checkout' -l ignore-skip-worktree-bits -d 'Check out all files including sparse entries'
### sparse-checkout
# `init` subcommand is deprecated
@@ -1281,7 +1310,12 @@ complete -f -c git -n __fish_git_needs_command -a archive -d 'Create an archive
complete -f -c git -n '__fish_git_using_command archive' -s l -l list -d "Show all available formats"
complete -f -c git -n '__fish_git_using_command archive' -s v -l verbose -d "Be verbose"
complete -f -c git -n '__fish_git_using_command archive' -l worktree-attributes -d "Look for attributes in .gitattributes files in the working tree as well"
# TODO options
complete -x -c git -n '__fish_git_using_command archive' -l format -a 'tar zip tar.gz tgz' -d 'Format of the resulting archive'
complete -F -c git -n '__fish_git_using_command archive' -s o -l output -d 'Write the archive to file instead of stdout'
complete -x -c git -n '__fish_git_using_command archive' -l prefix -d 'Prepend prefix to each pathname in the archive'
complete -F -c git -n '__fish_git_using_command archive' -l add-file -d 'Add non-tracked file to the archive'
complete -x -c git -n '__fish_git_using_command archive' -l remote -d 'Retrieve a tar archive from a remote repository'
complete -x -c git -n '__fish_git_using_command archive' -l exec -d 'Path to git-upload-archive on the remote side'
### bisect
complete -f -c git -n __fish_git_needs_command -a bisect -d 'Use binary search to find what introduced a bug'
@@ -1442,7 +1476,21 @@ complete -f -c git -n '__fish_git_using_command commit' -l no-signoff -d 'Do not
complete -f -c git -n '__fish_git_using_command commit' -s C -l reuse-message -d 'Reuse log message and authorship of an existing commit'
complete -f -c git -n '__fish_git_using_command commit' -s c -l reedit-message -d 'Like --reuse-message, but allow editing commit message'
complete -f -c git -n '__fish_git_using_command commit' -s e -l edit -d 'Further edit the message taken from -m, -C, or -F'
# TODO options
complete -f -c git -n '__fish_git_using_command commit' -l no-edit -d 'Do not edit the commit message'
complete -f -c git -n '__fish_git_using_command commit' -s q -l quiet -d 'Suppress commit summary message'
complete -f -c git -n '__fish_git_using_command commit' -l dry-run -d 'Show what would be committed without committing'
complete -f -c git -n '__fish_git_using_command commit' -l short -d 'Show short format output for dry-run'
complete -f -c git -n '__fish_git_using_command commit' -l porcelain -d 'Show porcelain format output for dry-run'
complete -f -c git -n '__fish_git_using_command commit' -l long -d 'Show long format output for dry-run (default)'
complete -f -c git -n '__fish_git_using_command commit' -s z -l null -d 'Terminate entries with NUL in dry-run'
complete -f -c git -n '__fish_git_using_command commit' -l status -d 'Include git status output in commit message template'
complete -f -c git -n '__fish_git_using_command commit' -l no-status -d 'Do not include git status output in commit message template'
complete -f -c git -n '__fish_git_using_command commit' -s i -l include -d 'Stage contents of given paths before committing'
complete -f -c git -n '__fish_git_using_command commit' -s o -l only -d 'Commit only from paths specified on command line'
complete -f -c git -n '__fish_git_using_command commit' -l trailer -d 'Add trailer to commit message'
complete -F -c git -n '__fish_git_using_command commit' -s t -l template -d 'Use file as commit message template'
complete -F -c git -n '__fish_git_using_command commit' -l pathspec-from-file -d 'Read pathspec from file'
complete -f -c git -n '__fish_git_using_command commit' -l pathspec-file-nul -d 'NUL separator for pathspec-from-file'
### count-objects
complete -f -c git -n __fish_git_needs_command -a count-objects -d 'Count number of objects and disk consumption'
@@ -1550,7 +1598,13 @@ complete -f -c git -n '__fish_git_using_command difftool' -l tool-help -d 'Print
complete -f -c git -n '__fish_git_using_command difftool' -l trust-exit-code -d 'Exit when an invoked diff tool returns a non-zero exit code'
complete -f -c git -n '__fish_git_using_command difftool' -s x -l extcmd -d 'Specify a custom command for viewing diffs'
complete -f -c git -n '__fish_git_using_command difftool' -l no-gui -d 'Overrides --gui setting'
# TODO options
complete -f -c git -n '__fish_git_using_command difftool' -s d -l dir-diff -d 'Copy modified files to a temporary location and perform dir diff'
complete -f -c git -n '__fish_git_using_command difftool' -l symlinks -d 'Use symlinks in dir-diff mode'
complete -f -c git -n '__fish_git_using_command difftool' -l no-symlinks -d 'Do not use symlinks in dir-diff mode'
complete -f -c git -n '__fish_git_using_command difftool' -l no-trust-exit-code -d 'Do not exit when diff tool returns non-zero'
complete -f -c git -n '__fish_git_using_command difftool' -s y -l no-prompt -d 'Do not prompt before launching diff tool'
complete -f -c git -n '__fish_git_using_command difftool' -l prompt -d 'Prompt before each invocation of diff tool'
### gc
complete -f -c git -n __fish_git_needs_command -a gc -d 'Collect garbage (unreachable commits etc)'
@@ -1603,13 +1657,29 @@ complete -f -c git -n '__fish_git_using_command grep' -l not -d 'Combine pattern
complete -f -c git -n '__fish_git_using_command grep' -l all-match -d 'Only match files that match all pattern expressions when multiple are given'
complete -f -c git -n '__fish_git_using_command grep' -s q -l quiet -d 'Just exit with status 0 when there is a match and with non-zero status when there isn\'t'
complete -c git -n '__fish_git_using_command grep' -n 'not contains -- -- (commandline -xpc)' -ka '(__fish_git_refs)'
# TODO options, including max-depth, h, open-files-in-pager, contexts, threads, file
complete -x -c git -n '__fish_git_using_command grep' -l max-depth -d 'Maximum depth of directory traversal'
complete -f -c git -n '__fish_git_using_command grep' -s h -d 'Do not output the filename for each match'
complete -f -c git -n '__fish_git_using_command grep' -s H -d 'Show filename for each match (default)'
complete -x -c git -n '__fish_git_using_command grep' -s O -l open-files-in-pager -d 'Open matching files in pager'
complete -x -c git -n '__fish_git_using_command grep' -s C -l context -d 'Show context lines before and after matches'
complete -x -c git -n '__fish_git_using_command grep' -s A -l after-context -d 'Show context lines after matches'
complete -x -c git -n '__fish_git_using_command grep' -s B -l before-context -d 'Show context lines before matches'
complete -x -c git -n '__fish_git_using_command grep' -l threads -d 'Number of grep worker threads to use'
complete -f -c git -n '__fish_git_using_command grep' -l break -d 'Print empty line between matches from different files'
complete -f -c git -n '__fish_git_using_command grep' -l heading -d 'Show filename above matches in that file'
complete -f -c git -n '__fish_git_using_command grep' -l untracked -d 'Search in untracked files'
complete -f -c git -n '__fish_git_using_command grep' -l no-index -d 'Search files in the current directory that is not managed by Git'
complete -f -c git -n '__fish_git_using_command grep' -l recurse-submodules -d 'Recursively search in each submodule'
### init
complete -f -c git -n __fish_git_needs_command -a init -d 'Create an empty git repository'
complete -f -c git -n '__fish_git_using_command init' -s q -l quiet -d 'Only print error and warning messages'
complete -f -c git -n '__fish_git_using_command init' -l bare -d 'Create a bare repository'
# TODO options
complete -F -c git -n '__fish_git_using_command init' -l template -d 'Directory from which templates will be used'
complete -F -c git -n '__fish_git_using_command init' -l separate-git-dir -d 'Create git dir at specified path'
complete -x -c git -n '__fish_git_using_command init' -l object-format -a 'sha1 sha256' -d 'Specify hash algorithm for objects'
complete -x -c git -n '__fish_git_using_command init' -s b -l initial-branch -d 'Use specified name for initial branch'
complete -x -c git -n '__fish_git_using_command init' -l shared -a 'false true umask group all world everybody' -d 'Specify that the repository is shared'
### shortlog
complete -c git -n __fish_git_needs_command -a shortlog -d 'Show commit shortlog'
@@ -1951,7 +2021,7 @@ complete -f -c git -n __fish_git_needs_command -a prune -d 'Prune unreachable ob
complete -f -c git -n '__fish_git_using_command prune' -s n -l dry-run -d 'Just report what it would remove'
complete -f -c git -n '__fish_git_using_command prune' -s v -l verbose -d 'Report all removed objects'
complete -f -c git -n '__fish_git_using_command prune' -l progress -d 'Show progress'
# TODO options
complete -x -c git -n '__fish_git_using_command prune' -l expire -d 'Only expire loose objects older than date'
### pull
complete -f -c git -n __fish_git_needs_command -a pull -d 'Fetch from and merge with another repo or branch'
@@ -1994,7 +2064,18 @@ complete -f -c git -n '__fish_git_using_command pull' -s r -l rebase -d 'Rebase
complete -f -c git -n '__fish_git_using_command pull' -l no-rebase -d 'Do not rebase the current branch on top of the upstream branch'
complete -f -c git -n '__fish_git_using_command pull' -l autostash -d 'Before starting rebase, stash local changes, and apply stash when done'
complete -f -c git -n '__fish_git_using_command pull' -l no-autostash -d 'Do not stash local changes before starting rebase'
# TODO other options
complete -f -c git -n '__fish_git_using_command pull' -l verify -d 'Allow the pre-merge and commit-msg hooks to run (default)'
complete -f -c git -n '__fish_git_using_command pull' -l no-verify -d 'Do not run pre-merge and commit-msg hooks'
complete -x -c git -n '__fish_git_using_command pull' -l upload-pack -d 'Path to git-upload-pack on remote'
complete -x -c git -n '__fish_git_using_command pull' -l depth -d 'Limit fetching to specified number of commits'
complete -x -c git -n '__fish_git_using_command pull' -l deepen -d 'Deepen history of shallow repository by specified commits'
complete -x -c git -n '__fish_git_using_command pull' -l shallow-since -d 'Deepen history after specified date'
complete -x -c git -n '__fish_git_using_command pull' -l shallow-exclude -d 'Deepen history excluding commits reachable from branch'
complete -f -c git -n '__fish_git_using_command pull' -l unshallow -d 'Convert shallow repository to complete one'
complete -f -c git -n '__fish_git_using_command pull' -l update-shallow -d 'Accept refs that update .git/shallow'
complete -x -c git -n '__fish_git_using_command pull' -s j -l jobs -d 'Number of parallel children for fetching'
complete -f -c git -n '__fish_git_using_command pull' -s 4 -l ipv4 -d 'Use IPv4 addresses only'
complete -f -c git -n '__fish_git_using_command pull' -s 6 -l ipv6 -d 'Use IPv6 addresses only'
### range-diff
complete -f -c git -n __fish_git_needs_command -a range-diff -d 'Compare two commit ranges'
@@ -2031,7 +2112,21 @@ complete -f -c git -n '__fish_git_using_command push' -s u -l set-upstream -d 'A
complete -f -c git -n '__fish_git_using_command push' -s q -l quiet -d 'Be quiet'
complete -f -c git -n '__fish_git_using_command push' -s v -l verbose -d 'Be verbose'
complete -f -c git -n '__fish_git_using_command push' -l progress -d 'Force progress status'
# TODO --recurse-submodules=check|on-demand
complete -f -c git -n '__fish_git_using_command push' -l verify -d 'Allow the pre-push hook to run (default)'
complete -f -c git -n '__fish_git_using_command push' -l no-verify -d 'Do not run the pre-push hook'
complete -x -c git -n '__fish_git_using_command push' -l recurse-submodules -a 'check on-demand only no' -d 'Control recursive pushing of submodules'
complete -f -c git -n '__fish_git_using_command push' -l signed -d 'GPG-sign the push'
complete -f -c git -n '__fish_git_using_command push' -l no-signed -d 'Do not GPG-sign the push'
complete -f -c git -n '__fish_git_using_command push' -l atomic -d 'Request atomic transaction on remote side'
complete -f -c git -n '__fish_git_using_command push' -l no-atomic -d 'Do not request atomic transaction'
complete -f -c git -n '__fish_git_using_command push' -l thin -d 'Spend extra cycles to minimize number of objects'
complete -f -c git -n '__fish_git_using_command push' -l no-thin -d 'Do not use thin pack'
complete -f -c git -n '__fish_git_using_command push' -s 4 -l ipv4 -d 'Use IPv4 addresses only'
complete -f -c git -n '__fish_git_using_command push' -s 6 -l ipv6 -d 'Use IPv6 addresses only'
complete -x -c git -n '__fish_git_using_command push' -s o -l push-option -d 'Transmit string to server'
complete -x -c git -n '__fish_git_using_command push' -l repo -d 'Override configured repository'
complete -x -c git -n '__fish_git_using_command push' -l receive-pack -d 'Path to git-receive-pack on remote'
complete -x -c git -n '__fish_git_using_command push' -l exec -d 'Same as --receive-pack'
### rebase
complete -f -c git -n __fish_git_needs_command -a rebase -d 'Reapply commit sequence on a new base'
@@ -2087,7 +2182,12 @@ __fish_git_add_revision_completion -n '__fish_git_using_command reset'
complete -f -c git -n '__fish_git_using_command reset' -n 'not contains -- -- (commandline -xpc)' -a '(__fish_git_files all-staged modified)'
complete -F -c git -n '__fish_git_using_command reset' -n 'contains -- -- (commandline -xpc)'
complete -f -c git -n '__fish_git_using_command reset' -n 'not contains -- -- (commandline -xpc)' -a '(__fish_git_reflog)' -d Reflog
# TODO options
complete -f -c git -n '__fish_git_using_command reset' -s q -l quiet -d 'Be quiet, only report errors'
complete -f -c git -n '__fish_git_using_command reset' -s p -l patch -d 'Interactively select hunks to reset'
complete -f -c git -n '__fish_git_using_command reset' -l merge -d 'Reset index and update files in working tree that differ'
complete -f -c git -n '__fish_git_using_command reset' -l keep -d 'Reset index but keep changes in working tree'
complete -F -c git -n '__fish_git_using_command reset' -l pathspec-from-file -d 'Read pathspec from file'
complete -f -c git -n '__fish_git_using_command reset' -l pathspec-file-nul -d 'NUL separator for pathspec-from-file'
### restore and switch
# restore options
@@ -2155,7 +2255,17 @@ complete -f -c git -n '__fish_git_using_command revert' -s n -l no-commit -d 'Ap
complete -f -c git -n '__fish_git_using_command revert' -s s -l signoff -d 'Add a Signed-off-by trailer at the end of the commit message'
complete -f -c git -n '__fish_git_using_command revert' -l rerere-autoupdate -d 'Allow the rerere mechanism to update the index automatically'
complete -f -c git -n '__fish_git_using_command revert' -l no-rerere-autoupdate -d 'Prevent the rerere mechanism from updating the index with auto-conflict resolution'
# TODO options
complete -f -c git -n '__fish_git_using_command revert' -l abort -d 'Cancel the operation and return to pre-sequence state'
complete -f -c git -n '__fish_git_using_command revert' -l continue -d 'Continue the operation after resolving conflicts'
complete -f -c git -n '__fish_git_using_command revert' -l quit -d 'Clear the sequencer state after a failed revert'
complete -f -c git -n '__fish_git_using_command revert' -l skip -d 'Skip the current commit and continue with the rest'
complete -f -c git -n '__fish_git_using_command revert' -s e -l edit -d 'Edit the commit message before committing'
complete -f -c git -n '__fish_git_using_command revert' -l no-edit -d 'Do not edit the commit message'
complete -x -c git -n '__fish_git_using_command revert' -s m -l mainline -d 'Select parent number for reverting merge commits'
complete -x -c git -n '__fish_git_using_command revert' -s S -l gpg-sign -a '(type -q gpg && __fish_complete_gpg_key_id gpg)' -d 'GPG-sign commits'
complete -f -c git -n '__fish_git_using_command revert' -l no-gpg-sign -d 'Do not GPG-sign commits'
complete -x -c git -n '__fish_git_using_command revert' -l strategy -d 'Use the given merge strategy'
complete -x -c git -n '__fish_git_using_command revert' -s X -l strategy-option -d 'Pass option to the merge strategy'
### rm
complete -c git -n __fish_git_needs_command -a rm -d 'Remove files from the working tree and/or staging area'
@@ -2167,7 +2277,8 @@ complete -c git -n '__fish_git_using_command rm' -s q -l quiet -d 'Be quiet'
complete -c git -n '__fish_git_using_command rm' -s f -l force -d 'Override the up-to-date check'
complete -c git -n '__fish_git_using_command rm' -s n -l dry-run -d 'Dry run'
complete -c git -n '__fish_git_using_command rm' -l sparse -d 'Allow updating index entries outside of the sparse-checkout cone'
# TODO options
complete -F -c git -n '__fish_git_using_command rm' -l pathspec-from-file -d 'Read pathspec from file'
complete -f -c git -n '__fish_git_using_command rm' -l pathspec-file-nul -d 'NUL separator for pathspec-from-file'
### status
complete -f -c git -n __fish_git_needs_command -a status -d 'Show the working tree status'
@@ -2181,7 +2292,13 @@ complete -f -c git -n '__fish_git_using_command status' -s v -l verbose -d 'Also
complete -f -c git -n '__fish_git_using_command status' -l no-ahead-behind -d 'Do not display detailed ahead/behind upstream-branch counts'
complete -f -c git -n '__fish_git_using_command status' -l renames -d 'Turn on rename detection regardless of user configuration'
complete -f -c git -n '__fish_git_using_command status' -l no-renames -d 'Turn off rename detection regardless of user configuration'
# TODO options
complete -f -c git -n '__fish_git_using_command status' -l ahead-behind -d 'Display detailed ahead/behind upstream-branch counts'
complete -f -c git -n '__fish_git_using_command status' -l long -d 'Give the output in the long-format (default)'
complete -f -c git -n '__fish_git_using_command status' -l show-stash -d 'Show the number of entries currently stashed'
complete -x -c git -n '__fish_git_using_command status' -l column -d 'Display untracked files in columns'
complete -f -c git -n '__fish_git_using_command status' -l no-column -d 'Do not display untracked files in columns'
complete -F -c git -n '__fish_git_using_command status' -l pathspec-from-file -d 'Read pathspec from file'
complete -f -c git -n '__fish_git_using_command status' -l pathspec-file-nul -d 'NUL separator for pathspec-from-file'
### stripspace
complete -f -c git -n __fish_git_needs_command -a stripspace -d 'Remove unnecessary whitespace'
@@ -2199,7 +2316,19 @@ complete -f -c git -n '__fish_git_using_command tag' -s f -l force -d 'Force ove
complete -f -c git -n '__fish_git_using_command tag' -s l -l list -d 'List tags'
complete -f -c git -n '__fish_git_using_command tag' -l contains -xka '(__fish_git_commits)' -d 'List tags that contain a commit'
complete -f -c git -n '__fish_git_using_command tag' -n '__fish_git_contains_opt -s d delete -s v verify -s f force' -ka '(__fish_git_tags)' -d Tag
# TODO options
complete -x -c git -n '__fish_git_using_command tag' -s m -l message -d 'Use the given tag message'
complete -F -c git -n '__fish_git_using_command tag' -s F -l file -d 'Read tag message from file'
complete -x -c git -n '__fish_git_using_command tag' -s u -l local-user -d 'Use this key to sign tag'
complete -x -c git -n '__fish_git_using_command tag' -l cleanup -a 'strip whitespace verbatim' -d 'How to clean up the tag message'
complete -f -c git -n '__fish_git_using_command tag' -l create-reflog -d 'Create reflog for the tag'
complete -f -c git -n '__fish_git_using_command tag' -l no-create-reflog -d 'Do not create reflog for the tag'
complete -x -c git -n '__fish_git_using_command tag' -l color -a 'always never auto' -d 'Respect any colors in format'
complete -f -c git -n '__fish_git_using_command tag' -l column -d 'Display tag listing in columns'
complete -f -c git -n '__fish_git_using_command tag' -l no-column -d 'Do not display tag listing in columns'
complete -x -c git -n '__fish_git_using_command tag' -l sort -d 'Sort tags based on the given key'
complete -f -c git -n '__fish_git_using_command tag' -l merged -d 'List tags whose commits are reachable from specified commit'
complete -f -c git -n '__fish_git_using_command tag' -l no-merged -d 'List tags whose commits are not reachable from specified commit'
complete -x -c git -n '__fish_git_using_command tag' -l points-at -d 'List tags of the given object'
### update-index
complete -c git -n __fish_git_needs_command -a update-index -d 'Register file contents in the working tree to the index'
@@ -2303,7 +2432,37 @@ complete -f -c git -n '__fish_git_using_command stash' -n '__fish_git_stash_usin
### config
complete -f -c git -n __fish_git_needs_command -a config -d 'Set and read git configuration variables'
# TODO options
complete -f -c git -n '__fish_git_using_command config' -l local -d 'Write to repository .git/config'
complete -f -c git -n '__fish_git_using_command config' -l global -d 'Write to global ~/.gitconfig'
complete -f -c git -n '__fish_git_using_command config' -l system -d 'Write to system-wide /etc/gitconfig'
complete -f -c git -n '__fish_git_using_command config' -l worktree -d 'Write to .git/config.worktree'
complete -F -c git -n '__fish_git_using_command config' -s f -l file -d 'Use given config file'
complete -F -c git -n '__fish_git_using_command config' -l blob -d 'Read config from given blob object'
complete -f -c git -n '__fish_git_using_command config' -s l -l list -d 'List all variables set in config'
complete -f -c git -n '__fish_git_using_command config' -s e -l edit -d 'Open config file in editor'
complete -f -c git -n '__fish_git_using_command config' -l get -d 'Get value for given key'
complete -f -c git -n '__fish_git_using_command config' -l get-all -d 'Get all values for a multi-valued key'
complete -f -c git -n '__fish_git_using_command config' -l get-regexp -d 'Get values for keys matching regex'
complete -f -c git -n '__fish_git_using_command config' -l get-urlmatch -d 'Get value for URL-specific key'
complete -f -c git -n '__fish_git_using_command config' -l add -d 'Add new line without altering existing values'
complete -f -c git -n '__fish_git_using_command config' -l unset -d 'Remove line matching key'
complete -f -c git -n '__fish_git_using_command config' -l unset-all -d 'Remove all lines matching key'
complete -f -c git -n '__fish_git_using_command config' -l replace-all -d 'Replace all matching lines'
complete -f -c git -n '__fish_git_using_command config' -l rename-section -d 'Rename given section'
complete -f -c git -n '__fish_git_using_command config' -l remove-section -d 'Remove given section'
complete -x -c git -n '__fish_git_using_command config' -l type -a 'bool int bool-or-int path expiry-date color' -d 'Ensure value is of given type'
complete -f -c git -n '__fish_git_using_command config' -l bool -d 'Value is true or false'
complete -f -c git -n '__fish_git_using_command config' -l int -d 'Value is a decimal number'
complete -f -c git -n '__fish_git_using_command config' -l bool-or-int -d 'Value is bool or int'
complete -f -c git -n '__fish_git_using_command config' -l path -d 'Value is a path'
complete -f -c git -n '__fish_git_using_command config' -s z -l null -d 'Terminate values with NUL byte'
complete -f -c git -n '__fish_git_using_command config' -l name-only -d 'Output only names of config variables'
complete -f -c git -n '__fish_git_using_command config' -l show-origin -d 'Show origin of config'
complete -f -c git -n '__fish_git_using_command config' -l show-scope -d 'Show scope of config'
complete -f -c git -n '__fish_git_using_command config' -l includes -d 'Respect include directives'
complete -f -c git -n '__fish_git_using_command config' -l no-includes -d 'Do not respect include directives'
complete -x -c git -n '__fish_git_using_command config' -l default -d 'Use default value when variable is missing'
complete -f -c git -n '__fish_git_using_command config' -a '(__fish_git_config_keys)' -d 'Config key'
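The `--type` and `--file` flags completed above compose; a quick way to observe the bool canonicalization without needing a repository is a throwaway config file (the file path and `section.flag` key are placeholders, assuming `git` is installed):

```shell
cfg=$(mktemp)                                             # throwaway config file, no repo needed
git config --file "$cfg" section.flag on                  # store a truthy spelling
git config --file "$cfg" --type bool --get section.flag   # prints "true"
```

Reading back with `--type bool` normalizes `on`, `yes`, and `1` to `true`, which is what the `add` completions in this file compare against.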
### format-patch
complete -f -c git -n __fish_git_needs_command -a format-patch -d 'Generate patch series to send upstream'
@@ -2386,7 +2545,7 @@ complete -f -c git -n '__fish_git_using_command clean' -s q -l quiet -d 'Be quie
complete -f -c git -n '__fish_git_using_command clean' -s d -d 'Remove untracked directories in addition to untracked files'
complete -f -c git -n '__fish_git_using_command clean' -s x -d 'Remove ignored files, as well'
complete -f -c git -n '__fish_git_using_command clean' -s X -d 'Remove only ignored files'
# TODO -e option
complete -x -c git -n '__fish_git_using_command clean' -s e -l exclude -d 'Add pattern to exclude from removal'
### git blame
complete -f -c git -n __fish_git_needs_command -a blame -d 'Show what last modified each line of a file'
@@ -2644,6 +2803,150 @@ function __fish_git_sort_keys
end
complete -f -c git -n "__fish_seen_subcommand_from $sortcommands" -l sort -d 'Sort results by' -a "(__fish_git_sort_keys)"
### Plumbing commands
### cat-file
complete -f -c git -n __fish_git_needs_command -a cat-file -d 'Provide content or type info for repository objects'
complete -f -c git -n '__fish_git_using_command cat-file' -s t -d 'Show object type'
complete -f -c git -n '__fish_git_using_command cat-file' -s s -d 'Show object size'
complete -f -c git -n '__fish_git_using_command cat-file' -s e -d 'Exit with zero if object exists and is valid'
complete -f -c git -n '__fish_git_using_command cat-file' -s p -d 'Pretty-print object content'
complete -f -c git -n '__fish_git_using_command cat-file' -l textconv -d 'Show content as transformed by a textconv filter'
complete -f -c git -n '__fish_git_using_command cat-file' -l filters -d 'Show content as transformed by filters'
complete -f -c git -n '__fish_git_using_command cat-file' -l batch -d 'Read objects from stdin and print info'
complete -f -c git -n '__fish_git_using_command cat-file' -l batch-check -d 'Read objects from stdin and print type/size info'
complete -f -c git -n '__fish_git_using_command cat-file' -l batch-all-objects -d 'Print info for all objects'
complete -f -c git -n '__fish_git_using_command cat-file' -l follow-symlinks -d 'Follow symlinks when using --batch'
### ls-remote
complete -f -c git -n __fish_git_needs_command -a ls-remote -d 'List references in a remote repository'
complete -f -c git -n '__fish_git_using_command ls-remote' -s h -l heads -d 'Limit to refs/heads'
complete -f -c git -n '__fish_git_using_command ls-remote' -s t -l tags -d 'Limit to refs/tags'
complete -f -c git -n '__fish_git_using_command ls-remote' -l refs -d 'Do not show peeled tags or pseudorefs'
complete -f -c git -n '__fish_git_using_command ls-remote' -s q -l quiet -d 'Do not print remote URL'
complete -x -c git -n '__fish_git_using_command ls-remote' -l upload-pack -d 'Path to git-upload-pack on remote'
complete -f -c git -n '__fish_git_using_command ls-remote' -l exit-code -d 'Exit with status 2 if no matching refs are found'
complete -f -c git -n '__fish_git_using_command ls-remote' -l get-url -d 'Print URL of remote'
complete -f -c git -n '__fish_git_using_command ls-remote' -l symref -d 'Show underlying ref for symbolic refs'
complete -x -c git -n '__fish_git_using_command ls-remote' -l sort -d 'Sort based on the given key'
complete -f -c git -n '__fish_git_using_command ls-remote' -a '(__fish_git_remotes)' -d Remote
### ls-tree
complete -f -c git -n __fish_git_needs_command -a ls-tree -d 'List contents of a tree object'
complete -f -c git -n '__fish_git_using_command ls-tree' -s d -d 'Only show trees'
complete -f -c git -n '__fish_git_using_command ls-tree' -s r -d 'Recurse into subtrees'
complete -f -c git -n '__fish_git_using_command ls-tree' -s t -d 'Show tree entries even when recursing'
complete -f -c git -n '__fish_git_using_command ls-tree' -l name-only -d 'Show only names'
complete -f -c git -n '__fish_git_using_command ls-tree' -l name-status -d 'Show only names (same as name-only)'
complete -f -c git -n '__fish_git_using_command ls-tree' -l full-name -d 'Show full path names'
complete -f -c git -n '__fish_git_using_command ls-tree' -l full-tree -d 'Do not limit listing to current directory'
complete -f -c git -n '__fish_git_using_command ls-tree' -s z -d 'NUL line termination on output'
complete -f -c git -n '__fish_git_using_command ls-tree' -l long -d 'Show object size'
complete -f -c git -n '__fish_git_using_command ls-tree' -l abbrev -d 'Show abbreviated object names'
complete -f -c git -n '__fish_git_using_command ls-tree' -a '(__fish_git_refs)' -d Ref
### show-ref
complete -f -c git -n __fish_git_needs_command -a show-ref -d 'List references in a local repository'
complete -f -c git -n '__fish_git_using_command show-ref' -l head -d 'Show HEAD reference'
complete -f -c git -n '__fish_git_using_command show-ref' -l heads -d 'Limit to refs/heads'
complete -f -c git -n '__fish_git_using_command show-ref' -l tags -d 'Limit to refs/tags'
complete -f -c git -n '__fish_git_using_command show-ref' -s d -l dereference -d 'Dereference tags'
complete -f -c git -n '__fish_git_using_command show-ref' -s s -l hash -d 'Only show SHA-1 hash'
complete -f -c git -n '__fish_git_using_command show-ref' -l verify -d 'Enable stricter reference checking'
complete -f -c git -n '__fish_git_using_command show-ref' -l abbrev -d 'Show abbreviated object names'
complete -f -c git -n '__fish_git_using_command show-ref' -s q -l quiet -d 'Do not print any results'
complete -f -c git -n '__fish_git_using_command show-ref' -l exclude-existing -d 'Filter refs from stdin, printing those that do not exist locally'
### symbolic-ref
complete -f -c git -n __fish_git_needs_command -a symbolic-ref -d 'Read, modify, delete symbolic refs'
complete -f -c git -n '__fish_git_using_command symbolic-ref' -s d -l delete -d 'Delete the symbolic ref'
complete -f -c git -n '__fish_git_using_command symbolic-ref' -s q -l quiet -d 'Do not issue error if ref is not a symbolic ref'
complete -f -c git -n '__fish_git_using_command symbolic-ref' -l short -d 'Shorten the ref name'
complete -x -c git -n '__fish_git_using_command symbolic-ref' -s m -d 'Update reflog with given reason'
### check-ignore
complete -f -c git -n __fish_git_needs_command -a check-ignore -d 'Debug gitignore / exclude files'
complete -f -c git -n '__fish_git_using_command check-ignore' -s q -l quiet -d 'Do not output anything, just set exit status'
complete -f -c git -n '__fish_git_using_command check-ignore' -s v -l verbose -d 'Show matching pattern for each file'
complete -f -c git -n '__fish_git_using_command check-ignore' -l stdin -d 'Read pathnames from stdin'
complete -f -c git -n '__fish_git_using_command check-ignore' -s z -d 'NUL line termination'
complete -f -c git -n '__fish_git_using_command check-ignore' -s n -l non-matching -d 'Show given paths which do not match any pattern'
complete -f -c git -n '__fish_git_using_command check-ignore' -l no-index -d 'Do not look in the index when undertaking checks'
### checkout-index
complete -c git -n __fish_git_needs_command -a checkout-index -d 'Copy files from index to working tree'
complete -f -c git -n '__fish_git_using_command checkout-index' -s a -l all -d 'Check out all files in the index'
complete -f -c git -n '__fish_git_using_command checkout-index' -s f -l force -d 'Force overwrite existing files'
complete -f -c git -n '__fish_git_using_command checkout-index' -s n -l no-create -d 'Do not create files that do not exist'
complete -f -c git -n '__fish_git_using_command checkout-index' -s q -l quiet -d 'Be quiet if files exist or are not in the index'
complete -f -c git -n '__fish_git_using_command checkout-index' -s u -l index -d 'Update stat information in the index'
complete -f -c git -n '__fish_git_using_command checkout-index' -s z -d 'NUL line termination'
complete -x -c git -n '__fish_git_using_command checkout-index' -l prefix -d 'Prefix to use when creating files'
complete -x -c git -n '__fish_git_using_command checkout-index' -l stage -a 'all 1 2 3' -d 'Which stage to check out'
complete -f -c git -n '__fish_git_using_command checkout-index' -l temp -d 'Write files to temporary files'
### commit-tree
complete -f -c git -n __fish_git_needs_command -a commit-tree -d 'Create a new commit object'
complete -x -c git -n '__fish_git_using_command commit-tree' -s p -d 'Parent commit object'
complete -x -c git -n '__fish_git_using_command commit-tree' -s m -d 'Commit message'
complete -F -c git -n '__fish_git_using_command commit-tree' -s F -d 'Read commit message from file'
complete -x -c git -n '__fish_git_using_command commit-tree' -s S -l gpg-sign -d 'GPG-sign commit'
complete -f -c git -n '__fish_git_using_command commit-tree' -l no-gpg-sign -d 'Do not GPG-sign commit'
### diff-index
complete -f -c git -n __fish_git_needs_command -a diff-index -d 'Compare tree to working tree or index'
complete -f -c git -n '__fish_git_using_command diff-index' -l cached -d 'Compare tree to index'
complete -f -c git -n '__fish_git_using_command diff-index' -s m -d 'Ignore changes in submodules'
complete -f -c git -n '__fish_git_using_command diff-index' -l raw -d 'Generate raw diff output'
complete -f -c git -n '__fish_git_using_command diff-index' -s p -l patch -d 'Generate patch'
complete -f -c git -n '__fish_git_using_command diff-index' -s q -l quiet -d 'Disable diff output, only set exit status'
complete -f -c git -n '__fish_git_using_command diff-index' -a '(__fish_git_refs)' -d Ref
### hash-object
complete -f -c git -n __fish_git_needs_command -a hash-object -d 'Compute object ID and optionally create an object'
complete -x -c git -n '__fish_git_using_command hash-object' -s t -a 'blob tree commit tag' -d 'Specify object type'
complete -f -c git -n '__fish_git_using_command hash-object' -s w -d 'Actually write object into object database'
complete -f -c git -n '__fish_git_using_command hash-object' -l stdin -d 'Read object from stdin'
complete -f -c git -n '__fish_git_using_command hash-object' -l stdin-paths -d 'Read file paths from stdin'
complete -F -c git -n '__fish_git_using_command hash-object' -l path -d 'Hash object as if it were at the given path'
complete -f -c git -n '__fish_git_using_command hash-object' -l no-filters -d 'Hash contents as-is, without any input filters'
complete -f -c git -n '__fish_git_using_command hash-object' -l literally -d 'Allow hashing any garbage'
### read-tree
complete -f -c git -n __fish_git_needs_command -a read-tree -d 'Read tree info into the index'
complete -f -c git -n '__fish_git_using_command read-tree' -s m -d 'Perform a merge'
complete -f -c git -n '__fish_git_using_command read-tree' -l reset -d 'Same as -m, but discard unmerged entries'
complete -f -c git -n '__fish_git_using_command read-tree' -s u -d 'Update working tree with merge result'
complete -f -c git -n '__fish_git_using_command read-tree' -s i -d 'Update only the index, leave working tree alone'
complete -f -c git -n '__fish_git_using_command read-tree' -s n -l dry-run -d 'Do not update index or working tree'
complete -f -c git -n '__fish_git_using_command read-tree' -s v -d 'Show progress'
complete -f -c git -n '__fish_git_using_command read-tree' -l trivial -d 'Restrict three-way merge to trivial cases'
complete -f -c git -n '__fish_git_using_command read-tree' -l aggressive -d 'Try harder to resolve trivial cases'
complete -x -c git -n '__fish_git_using_command read-tree' -l prefix -d 'Read tree into subdirectory'
complete -f -c git -n '__fish_git_using_command read-tree' -l index-output -d 'Write index to specified file'
complete -f -c git -n '__fish_git_using_command read-tree' -l empty -d 'Instead of reading tree object, empty the index'
complete -f -c git -n '__fish_git_using_command read-tree' -l no-sparse-checkout -d 'Disable sparse checkout support'
complete -f -c git -n '__fish_git_using_command read-tree' -a '(__fish_git_refs)' -d Ref
### update-ref
complete -f -c git -n __fish_git_needs_command -a update-ref -d 'Update object name stored in a ref safely'
complete -f -c git -n '__fish_git_using_command update-ref' -s d -d 'Delete the reference'
complete -f -c git -n '__fish_git_using_command update-ref' -l no-deref -d 'Do not dereference symbolic refs'
complete -x -c git -n '__fish_git_using_command update-ref' -s m -d 'Update reflog with given reason'
complete -f -c git -n '__fish_git_using_command update-ref' -l create-reflog -d 'Create a reflog'
complete -f -c git -n '__fish_git_using_command update-ref' -l stdin -d 'Read instructions from stdin'
complete -f -c git -n '__fish_git_using_command update-ref' -s z -d 'NUL-terminated format for stdin'
### verify-pack
complete -f -c git -n __fish_git_needs_command -a verify-pack -d 'Validate packed Git archive files'
complete -f -c git -n '__fish_git_using_command verify-pack' -s v -l verbose -d 'Show objects contained in pack'
complete -f -c git -n '__fish_git_using_command verify-pack' -s s -l stat-only -d 'Only show histogram of delta chain length'
### write-tree
complete -f -c git -n __fish_git_needs_command -a write-tree -d 'Create a tree object from the current index'
complete -f -c git -n '__fish_git_using_command write-tree' -l missing-ok -d 'Allow missing objects'
complete -x -c git -n '__fish_git_using_command write-tree' -l prefix -d 'Write a tree object for a subdirectory'
## Custom commands (git-* commands installed in the PATH)
complete -c git -n __fish_git_needs_command -a '(__fish_git_custom_commands)' -d 'Custom command'

View File

@@ -22,6 +22,8 @@ complete -c history -n '__fish_seen_subcommand_from search; or not __fish_seen_s
-s z -l null -d "Terminate entries with NUL character"
complete -c history -n '__fish_seen_subcommand_from search; or not __fish_seen_subcommand_from $__fish_history_all_commands' \
-s R -l reverse -d "Output the oldest results first" -x
complete -c history -n '__fish_seen_subcommand_from search; or not __fish_seen_subcommand_from $__fish_history_all_commands' \
-l color -d "When to colorize output" -xa "always never auto"
# We don't include a completion for the "save" subcommand because it should not be used
# interactively.


@@ -21,8 +21,10 @@ function __fish_print_make_targets --argument-names directory file
is_continuation = $0 ~ "^([^#]*[^#" bs_regex "])?(" bs_regex bs_regex ")*" bs_regex "$";
}' 2>/dev/null
else
# BSD make
make $makeflags -d g1 -rn >/dev/null 2>| awk -F, '/^#\*\*\* Input graph:/,/^$/ {if ($1 !~ "^#... ") {gsub(/# /,"",$1); print $1}}' 2>/dev/null
# FreeBSD/NetBSD
make $makeflags -V .ALLTARGETS 2>/dev/null | string split ' '
# OpenBSD
or make $makeflags -d g1 -rn 2>/dev/null | awk '/^[^#. \t].+:/ { gsub(/:.*/, "", $1); print $1 }'
end
end


@@ -1,4 +1,78 @@
complete -c signify -n __fish_seen_subcommand_from -s C -d 'Verify a signed checksum list'
complete -c signify -n __fish_seen_subcommand_from -s G -d 'Generate a new key pair'
complete -c signify -n __fish_seen_subcommand_from -s S -d 'Sign specified message'
complete -c signify -n __fish_seen_subcommand_from -s V -d 'Verify a signed message and sig'
# Tab completion for openbsd-signify
set -l subcommands -C -G -S -V
set -l subcommands_desc (echo -s \
-C\t"Verify a signed checksum list."\n \
-G\t"Generate a new key pair."\n \
-S\t"Sign the specified message file."\n \
-V\t"Verify the message and signature match."\n \
| string escape)
complete -c signify -f
complete -c signify \
-n "not __fish_seen_subcommand_from $subcommands" \
-a "$subcommands_desc"
complete -c signify -F \
-n '__fish_seen_subcommand_from -C'
complete -c signify -s c -f -r \
-d 'Specify the comment to be added during key generation' \
-n '__fish_seen_subcommand_from -G'
complete -c signify -s e -f \
-d 'Use embedded signatures' \
-n '__fish_seen_subcommand_from -S -V'
complete -c signify -s m -F -r \
-d 'Message file to sign or verify' \
-n '__fish_seen_subcommand_from -S -V'
# The -n option has two context-dependent meanings
complete -c signify -s n -f \
-d 'When generating a key pair, do not ask for a passphrase' \
-n '__fish_seen_argument -s G'
complete -c signify -s n -f \
-d 'When signing with -z, store a zero timestamp in the gzip header' \
-n '__fish_seen_subcommand_from -S && __fish_seen_argument -s z'
# This is deliberately split up to only add a description to the flag and not all its argument completions
complete -c signify -s p -f -k -r \
-a '(__fish_complete_suffix .pub)' \
-n '__fish_seen_subcommand_from -C -G -V'
complete -c signify -s p -f -k -r \
-d 'Public key produced by -G, and used by -V to check a signature' \
-n '__fish_seen_subcommand_from -C -G -V'
complete -c signify -s q -f \
-d 'Quiet mode. Suppress informational output' \
-n '__fish_seen_subcommand_from -C -V'
complete -c signify -s s -f -k -r \
-a '(__fish_complete_suffix .sec)' \
-n '__fish_seen_subcommand_from -G -S'
complete -c signify -s s -f -k -r \
-d 'Secret (private) key produced by -G, and used by -S to sign a message' \
-n '__fish_seen_subcommand_from -G -S'
complete -c signify -s t -f -r \
-a '(
set -l files /etc/signify/*
string replace -rf -- \'\\.pub$\' "" $files | string replace -r \'.*-\' ""
)' \
-n '__fish_seen_subcommand_from -C -V'
complete -c signify -s t -f -r \
-d 'When inferring a key to verify with, only use keys with this keytype suffix' \
-n '__fish_seen_subcommand_from -C -V'
complete -c signify -s x -f -k -r \
-a '(__fish_complete_suffix .sig)' \
-n '__fish_seen_subcommand_from -C -S -V'
complete -c signify -s x -f -k -r \
-d 'The signature file to create or verify. The default is message.sig' \
-n '__fish_seen_subcommand_from -C -S -V'
complete -c signify -s z -f \
-d 'Sign and verify gzip (1) archives, embed signature in the header' \
-n '__fish_seen_subcommand_from -S -V'


@@ -6,6 +6,7 @@ complete -c type -s p -l path -d "Print path to command, or nothing if name is n
complete -c type -s P -l force-path -d "Print path to command"
complete -c type -s q -l query -l quiet -d "Check if something exists without output"
complete -c type -s s -l short -d "Don't print function definition"
complete -c type -l color -d "When to colorize output" -xa "always never auto"
complete -c type -a "(builtin -n)" -d Builtin
complete -c type -a "(functions -n)" -d Function


@@ -14,7 +14,7 @@ function __fish_complete_directories -d "Complete directory prefixes" --argument
# If we have a --name=val option, we need to remove it,
# or the complete -C below would keep it, and then whatever complete
# called us would add it again (assuming it's in the current token)
set comp (commandline -ct | string replace -r -- '^-[^=]*=' '' $comp)
set comp (commandline -ct | string replace -r -- '^-[^=]*=' '')
end
# HACK: We call into the file completions by using an empty command


@@ -9,7 +9,7 @@ function __fish_migrate
return
end
# Create empty configuration directores if they do not already exist
# Create empty configuration directories if they do not already exist
test -e $__fish_config_dir/completions/ -a -e $__fish_config_dir/conf.d/ -a -e $__fish_config_dir/functions/ ||
mkdir -p $__fish_config_dir/{completions, conf.d, functions}
@@ -88,7 +88,7 @@ set --erase --universal fish_key_bindings"
echo 'See also the release notes (type `help relnotes`).'
set -Ue fish_key_bindings $theme_uvars
set -l sh (__fish_posix_shell)
eval "$sh -c 'sleep 7 # Please read above notice about universal variables' &"
eval "$sh -c 'sleep 7 # Please read above notice about universal variables' </dev/null &>/dev/null &"
end
end


@@ -321,10 +321,11 @@ function __fish_config_theme_choose
else
set desired_color_theme $fish_terminal_color_theme
if not set -q desired_color_theme[1]
echo >&2 "fish_config theme choose: internal error: \$fish_terminal_color_theme not yet initialized"
return 1
end
if not contains -- "[$desired_color_theme]" $theme_data
if test $scope = -U
echo >&2 "fish_config theme save: error: \$fish_terminal_color_theme not yet initialized"
return 1
end
else if not contains -- "[$desired_color_theme]" $theme_data
__fish_config_theme_choose_bad_color_theme $theme_name "$desired_color_theme" \$fish_terminal_color_theme = $desired_color_theme
echo >&2 "fish_config theme choose: hint: if your terminal does not report colors, pass --color-theme=light or --color-theme=dark when using color-theme-aware themes"
return 1
@@ -342,7 +343,7 @@ function __fish_config_theme_choose
set -l color_theme
string match -re -- (__fish_theme_variable_filter)'|^\[.*\]$' $theme_data |
while read -lat toks
if $theme_is_color_theme_aware
if $theme_is_color_theme_aware && set -q desired_color_theme[1]
for ct in $color_themes
if test "$toks" = [$ct]
set color_theme $ct
@@ -362,7 +363,11 @@ function __fish_config_theme_choose
set -eg $varname
end
if $override || not set -q $varname || string match -rq -- '--theme=.*' $$varname
set $scope $toks (test $scope != -U && echo --theme=$theme_name)
set $scope $toks[1] (
if not $theme_is_color_theme_aware || set -q desired_color_theme[1]
string join \n -- $toks[2..]
end
) (test $scope != -U && echo --theme=$theme_name)
end
end
if $override
@@ -376,31 +381,38 @@ function __fish_config_theme_choose
end
end
end
if test -n "$fish_terminal_color_theme" || not $need_hook
if set -q _flag_no_override[1]
__fish_apply_theme
else
__fish_override=true __fish_apply_theme
if not $need_hook || test -n "$fish_terminal_color_theme" ||
# comment to work around fish_indent bug
{
$theme_is_color_theme_aware && test -z "$fish_terminal_color_theme"
}
if not set -q _flag_no_override[1]
set -fx __fish_override true
end
__fish_apply_theme
end
end
function __fish_config_theme_canonicalize --no-scope-shadowing
# theme_name
# color_theme
if not path is (__fish_theme_dir)/$theme_name.theme
switch $theme_name
case 'fish default'
set theme_name default
case 'ayu Dark' 'ayu Light' 'ayu Mirage' \
'Base16 Default Dark' 'Base16 Default Light' 'Base16 Eighties' \
'Bay Cruise' Dracula Fairground 'Just a Touch' Lava \
'Mono Lace' 'Mono Smoke' \
None Nord 'Old School' Seaweed 'Snow Day' \
'Solarized Dark' 'Solarized Light' \
'Tomorrow Night Bright' 'Tomorrow Night' Tomorrow
set theme_name (string lower (string replace -a " " "-" $theme_name))
end
if path is (__fish_theme_dir)/$theme_name.theme
return
end
switch $theme_name
case 'fish default'
set theme_name default
case 'ayu Dark' 'ayu Light' 'ayu Mirage' \
'Base16 Default Dark' 'Base16 Default Light' 'Base16 Eighties' \
'Bay Cruise' Dracula Fairground 'Just a Touch' Lava \
'Mono Lace' 'Mono Smoke' \
None Nord 'Old School' Seaweed 'Snow Day' \
'Solarized Dark' 'Solarized Light' \
'Tomorrow Night Bright' 'Tomorrow Night' Tomorrow
if test $theme_name = Tomorrow
set color_theme light
end
set theme_name (string lower (string replace -a " " "-" $theme_name))
end
switch $theme_name
case \
@@ -408,8 +420,6 @@ function __fish_config_theme_canonicalize --no-scope-shadowing
base16-default-dark base16-default-light \
solarized-dark solarized-light
string match -rq -- '^(?<theme_name>.*)-(?<color_theme>dark|light)$' $theme_name
case tomorrow
set color_theme light
case tomorrow-night
set theme_name tomorrow
set color_theme dark


@@ -6,7 +6,7 @@ function history --description "display or manipulate interactive command histor
set -l cmd history
set -l options --exclusive 'c,e,p' --exclusive 'S,D,M,V,X'
set -a options h/help c/contains e/exact p/prefix
set -a options C/case-sensitive R/reverse z/null 't/show-time=?' 'n#max'
set -a options C/case-sensitive R/reverse z/null 't/show-time=?' 'n#max' 'color='
# The following options are deprecated and will be removed in the next major release.
# Note that they do not have usable short flags.
set -a options S-search D-delete M-merge V-save X-clear
@@ -22,9 +22,12 @@ function history --description "display or manipulate interactive command histor
set -l show_time
set -l max_count
set -l search_mode
set -l color_opt
set -q _flag_max
set max_count -n$_flag_max
set color_opt --color=$_flag_color
set -q _flag_with_time
and set -l _flag_show_time $_flag_with_time
if set -q _flag_show_time[1]
@@ -78,7 +81,7 @@ function history --description "display or manipulate interactive command histor
# If the user hasn't preconfigured less with the $LESS environment variable,
# we do so to have it behave like cat if output fits on one screen.
if not set -qx LESS
set -fx LESS --quit-if-one-screen
set -fx LESS --quit-if-one-screen --RAW-CONTROL-CHARS
# Also set --no-init for less < v530, see #8157.
if type -q less; and test (less --version | string match -r 'less (\d+)')[2] -lt 530 2>/dev/null
set LESS $LESS --no-init
@@ -87,9 +90,15 @@ function history --description "display or manipulate interactive command histor
not set -qx LV # ask the pager lv not to strip colors
and set -fx LV -c
builtin history search $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv | $pager
if contains -- "$color_opt" '' '--color=auto'
and test "$pager" = less
and string match -rq -- '^(-\w*R|--RAW-CONTROL-CHARS$)' $LESS
set color_opt --color=always
end
builtin history search $color_opt $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv | $pager
else
builtin history search $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
builtin history search $color_opt $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
end
case delete # interactively delete history
@@ -100,15 +109,15 @@ function history --description "display or manipulate interactive command histor
end
if test "$search_mode" = --exact
builtin history delete $search_mode $_flag_case_sensitive -- $searchterm
builtin history delete $color_opt $search_mode $_flag_case_sensitive -- $searchterm
builtin history save
return
end
# TODO: Fix this so that requesting history entries with a timestamp works:
# set -l found_items (builtin history search $search_mode $show_time -- $argv)
# set -l found_items (builtin history search $color_opt $search_mode $show_time -- $argv)
set -l found_items
set found_items (builtin history search $search_mode $_flag_case_sensitive --null -- $searchterm | string split0)
set found_items (builtin history search $color_opt $search_mode $_flag_case_sensitive --null -- $searchterm | string split0)
if set -q found_items[1]
set -l found_items_count (count $found_items)
for i in (seq $found_items_count)
@@ -132,7 +141,7 @@ function history --description "display or manipulate interactive command histor
if test "$choice" = all
printf "Deleting all matching entries!\n"
for item in $found_items
builtin history delete --exact --case-sensitive -- $item
builtin history delete $color_opt --exact --case-sensitive -- $item
end
builtin history save
return
@@ -173,15 +182,15 @@ function history --description "display or manipulate interactive command histor
echo Deleting choices: $choices
for x in $choices
printf "Deleting history entry %s: \"%s\"\n" $x $found_items[$x]
builtin history delete --exact --case-sensitive -- "$found_items[$x]"
builtin history delete $color_opt --exact --case-sensitive -- "$found_items[$x]"
end
builtin history save
end
case save # save our interactive command history to the persistent history
builtin history save $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
builtin history save $color_opt $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
case merge # merge the persistent interactive command history with our history
builtin history merge $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
builtin history merge $color_opt $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
case clear # clear the interactive command history
if test -n "$search_mode"
or set -q show_time[1]
@@ -197,13 +206,13 @@ function history --description "display or manipulate interactive command histor
read --local --prompt "echo 'Are you sure you want to clear history? (yes/no) '" choice
or return $status
if test "$choice" = yes
builtin history clear $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
builtin history clear $color_opt $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
and printf (_ "Command history cleared!\n")
else
printf (_ "You did not say 'yes' so I will not clear your command history\n")
end
case clear-session # clears only session
builtin history clear-session $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
builtin history clear-session $color_opt $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $argv
and printf (_ "Command history for session cleared!\n")
case append
set -l newitem $argv
@@ -212,7 +221,7 @@ function history --description "display or manipulate interactive command histor
or return $status
end
builtin history append $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $newitem
builtin history append $color_opt $search_mode $show_time $max_count $_flag_case_sensitive $_flag_reverse $_flag_null -- $newitem
case '*'
printf "%s: unexpected subcommand '%s'\n" $cmd $hist_cmd
return 2


@@ -5,7 +5,15 @@ body {
color: #222;
}
.prompt_demo, .prompt_demo_text, .data_table_row, .colorpicker_text_sample_tight, .colorpicker_text_sample, .history_text, pre, code, tt {
.prompt_demo,
.prompt_demo_text,
.data_table_row,
.colorpicker_text_sample_tight,
.colorpicker_text_sample,
.history_text,
pre,
code,
tt {
font-family: ui-monospace, SFMono-Regular, SF Mono, Menlo, Consolas, "Ubuntu Mono", "Hack", "Noto Sans Mono", Liberation Mono, monospace;
}
@@ -48,14 +56,14 @@ body {
.tab:hover,
#tab_contents .master_element:hover,
.color_scheme_choice_container:hover,
.prompt_choices_list > .ng-scope:hover
{
.prompt_choices_list > .ng-scope:hover {
background-color: #DDE;
}
#tab_parent{
#tab_parent {
border: red 1px;
}
#tab_parent :first-child {
border-top-left-radius: 8px;
border-left: none;
@@ -65,7 +73,8 @@ body {
border-top-right-radius: 8px;
}
.selected_tab, .selected_tab:hover {
.selected_tab,
.selected_tab:hover {
background-color: #eeeefa;
border-bottom: none;
}
@@ -139,30 +148,39 @@ body {
.detail_function .fish_color_autosuggestion {
color: #555;
}
.detail_function .fish_color_command {
color: #005fd7;
}
.detail_function .fish_color_param {
color: #00afff;
}
.detail_function .fish_color_redirection {
color: #00afff;
}
.detail_function .fish_color_comment {
color: #990000;
}
.detail_function .fish_color_error {
color: #ff0000;
}
.detail_function .fish_color_escape {
color: #00a6b2;
}
.detail_function .fish_color_operator {
color: #00a6b2;
}
.detail_function .fish_color_quote {
color: #999900;
}
.detail_function .fish_color_statement_terminator {
color: #009900;
}
@@ -221,6 +239,7 @@ body {
.master_element > br {
display: none;
}
.selected_master_elem > br {
display: inherit;
}
@@ -267,8 +286,7 @@ body {
padding-right: 10px;
}
.data_table_row {
}
.data_table_row {}
.data_table_cell {
padding-top: 5px;
@@ -625,6 +643,7 @@ button.delete_button:hover {
#parent {
margin-top: 0;
}
#tab_contents {
margin-bottom: 0;
}
@@ -632,27 +651,29 @@ button.delete_button:hover {
@media (prefers-color-scheme: dark) {
body {
background: linear-gradient(to top, #1f1f3f 0%,#051f3a 100%);
background: linear-gradient(to top, #1f1f3f 0%, #051f3a 100%);
color: #DDD;
}
#ancestor {
box-shadow: 0 0 5px 1px #000;
}
.tab {
background-color: black;
border: 1px solid #222;
border-top: none;
}
.tab:hover,
#tab_contents .master_element:hover,
.color_scheme_choice_container:hover,
.prompt_choices_list > .ng-scope:hover
{
.prompt_choices_list > .ng-scope:hover {
background-color: #223;
}
.selected_tab, .selected_tab:hover {
.selected_tab,
.selected_tab:hover {
background-color: #202028;
border-bottom: none;
}
@@ -660,15 +681,63 @@ button.delete_button:hover {
#tab_contents {
background-color: #202028;
}
.detail, .selected_master_elem {
.detail,
.selected_master_elem {
background-color: black;
}
input {
background-color: #222;
color: #AAA;
}
/* Fix .function-body background in dark mode */
.function-body {
background-color: black;
}
.detail_function .fish_color_autosuggestion {
color: #808080; /* brblack */
}
.detail_function .fish_color_command {
color: #c397d8;
}
.detail_function .fish_color_param {
color: #7aa6da;
}
.detail_function .fish_color_redirection {
color: #70c0b1;
}
.detail_function .fish_color_comment {
color: #e7c547;
}
.detail_function .fish_color_error {
color: #d54e53;
}
.detail_function .fish_color_escape {
color: #00a6b2;
}
.detail_function .fish_color_operator {
color: #00a6b2;
}
.detail_function .fish_color_quote {
color: #b9ca4a;
}
.detail_function .fish_color_statement_terminator {
color: #c397d8;
}
}
.print_only {
display: none;
}
}


@@ -3,7 +3,8 @@ body {
font-size: 8pt;
}
.tab, .print_hidden {
.tab,
.print_hidden {
display: none;
}
@@ -27,4 +28,4 @@ body {
#ancestor {
width: 100%;
box-shadow: none;
}
}


@@ -1,15 +1,14 @@
use std::{
collections::HashSet,
sync::{Mutex, MutexGuard},
sync::{LazyLock, Mutex, MutexGuard},
};
use crate::prelude::*;
use once_cell::sync::Lazy;
use crate::parse_constants::SourceRange;
use pcre2::utf32::Regex;
static ABBRS: Lazy<Mutex<AbbreviationSet>> = Lazy::new(|| Mutex::new(Default::default()));
static ABBRS: LazyLock<Mutex<AbbreviationSet>> = LazyLock::new(|| Mutex::new(Default::default()));
pub fn with_abbrs<R>(cb: impl FnOnce(&AbbreviationSet) -> R) -> R {
let abbrs_g = ABBRS.lock().unwrap();
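The change above (part of #12289) swaps `once_cell::sync::Lazy` for `std::sync::LazyLock`, which Rust 1.80 stabilized as a drop-in replacement. The migration is mechanical; here is a standalone sketch with a stand-in `AbbreviationSet` (the real type and accessor live in `src/abbrs.rs`):

```rust
use std::sync::{LazyLock, Mutex};

// Stand-in for fish's AbbreviationSet; the real type lives in src/abbrs.rs.
#[derive(Default)]
struct AbbreviationSet {
    entries: Vec<String>,
}

// Before: static ABBRS: Lazy<Mutex<AbbreviationSet>> = Lazy::new(...);
// After (Rust >= 1.80): std::sync::LazyLock, same lazy-init semantics.
static ABBRS: LazyLock<Mutex<AbbreviationSet>> =
    LazyLock::new(|| Mutex::new(Default::default()));

fn with_abbrs_mut<R>(cb: impl FnOnce(&mut AbbreviationSet) -> R) -> R {
    // First access runs the initializer; later accesses just lock the Mutex.
    let mut guard = ABBRS.lock().unwrap();
    cb(&mut guard)
}

fn main() {
    with_abbrs_mut(|set| set.entries.push("gco".to_string()));
    assert!(with_abbrs_mut(|set| set.entries.len()) >= 1);
    println!("ok");
}
```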


@@ -385,7 +385,8 @@ fn throwing_main() -> i32 {
set_libc_locales(/*log_ok=*/ false)
};
fish::localization::initialize_gettext();
#[cfg(feature = "localize-messages")]
fish::localization::initialize_localization();
// Enable debug categories set in FISH_DEBUG.
// This is in *addition* to the ones given via --debug.


@@ -1,7 +1,8 @@
use super::prelude::*;
use crate::abbrs::{self, Abbreviation, Position};
use crate::common::{EscapeStringStyle, escape, escape_string, valid_func_name};
use crate::common::{EscapeStringStyle, bytes2wcstring, escape, escape_string, valid_func_name};
use crate::env::{EnvMode, EnvStackSetResult};
use crate::highlight::highlight_and_colorize;
use crate::parser::ParserEnvSetMode;
use crate::re::{regex_make_anchored, to_boxed_chars};
use pcre2::utf32::{Regex, RegexBuilder};
@@ -22,6 +23,7 @@ struct Options {
position: Option<Position>,
set_cursor_marker: Option<WString>,
args: Vec<WString>,
color: ColorEnabled,
}
impl Options {
@@ -124,7 +126,7 @@ fn join(list: &[&wstr], sep: &wstr) -> WString {
}
// Print abbreviations in a fish-script friendly way.
fn abbr_show(streams: &mut IoStreams) -> BuiltinResult {
fn abbr_show(opts: &Options, streams: &mut IoStreams, parser: &Parser) -> BuiltinResult {
let style = EscapeStringStyle::Script(Default::default());
abbrs::with_abbrs(|abbrs| {
@@ -173,7 +175,15 @@ fn abbr_show(streams: &mut IoStreams) -> BuiltinResult {
));
}
result.push('\n');
streams.out.append(&result);
if opts.color.enabled(streams) {
streams.out.append(&bytes2wcstring(&highlight_and_colorize(
&result,
&parser.context(),
parser.vars(),
)));
} else {
streams.out.append(&result);
}
}
});
@@ -508,6 +518,7 @@ pub fn abbr(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
wopt(L!("global"), ArgType::NoArgument, 'g'),
wopt(L!("universal"), ArgType::NoArgument, 'U'),
wopt(L!("help"), ArgType::NoArgument, 'h'),
wopt(L!("color"), ArgType::RequiredArgument, COLOR_OPTION_CHAR),
];
let mut opts = Options::default();
@@ -614,6 +625,9 @@ pub fn abbr(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
builtin_unknown_option(parser, streams, cmd, argv[w.wopt_index - 1], false);
return Err(STATUS_INVALID_ARGS);
}
COLOR_OPTION_CHAR => {
opts.color = ColorEnabled::parse_from_opt(streams, cmd, w.woptarg.unwrap())?;
}
_ => {
panic!("unexpected retval from wgeopter.next()");
}
@@ -632,7 +646,7 @@ pub fn abbr(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
return abbr_add(&opts, streams);
};
if opts.show {
return abbr_show(streams);
return abbr_show(&opts, streams, parser);
};
if opts.list {
return abbr_list(&opts, streams);
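The new `--color` option above is parsed into a `ColorEnabled` value via `ColorEnabled::parse_from_opt`. That type's definition (including its tty-based `auto` resolution via `opts.color.enabled(streams)`) is not shown in this diff; as a rough sketch only, assuming the conventional `always`/`never`/`auto` triple — the enum and function names here are hypothetical stand-ins:

```rust
// Hypothetical stand-in for the ColorEnabled type referenced in the diff;
// fish's real definition and its isatty-based "auto" handling live elsewhere.
#[derive(Clone, Copy, Debug, Default, PartialEq)]
enum ColorEnabled {
    #[default]
    Auto,
    Always,
    Never,
}

fn parse_color_opt(arg: &str) -> Result<ColorEnabled, String> {
    match arg {
        "auto" => Ok(ColorEnabled::Auto),
        "always" => Ok(ColorEnabled::Always),
        "never" => Ok(ColorEnabled::Never),
        other => Err(format!("invalid --color argument: {other}")),
    }
}

fn main() {
    assert_eq!(parse_color_opt("always"), Ok(ColorEnabled::Always));
    assert!(parse_color_opt("sometimes").is_err());
    println!("ok");
}
```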


@@ -196,7 +196,7 @@ fn parse_exclusive_args(opts: &mut ArgParseCmdOpts, streams: &mut IoStreams) ->
}
// Store the set of exclusive flags for use when parsing the supplied set of arguments.
opts.exclusive_flag_sets.push(exclusive_set.to_vec());
opts.exclusive_flag_sets.push(exclusive_set.clone());
}
Ok(SUCCESS)
}
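The `.to_vec()` → `.clone()` change above is a small idiom fix: on a value that is already a `Vec`, `to_vec()` first derefs to a slice and copies from that view, while `clone()` invokes `Vec`'s own `Clone` impl directly. The results are equal either way, which a minimal sketch shows:

```rust
fn copy_both(exclusive_set: &Vec<String>) -> (Vec<String>, Vec<String>) {
    // to_vec() reaches the data through Deref<Target = [String]>;
    // clone() uses Vec's own Clone impl -- the idiomatic choice for a Vec.
    (exclusive_set.to_vec(), exclusive_set.clone())
}

fn main() {
    let set = vec!["a".to_string(), "b".to_string()];
    let (via_to_vec, via_clone) = copy_both(&set);
    assert_eq!(via_to_vec, via_clone);
    println!("ok");
}
```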


@@ -4,12 +4,11 @@
use crate::common::{
EscapeFlags, EscapeStringStyle, bytes2wcstring, escape, escape_string, valid_var_name,
};
use crate::highlight::{colorize, highlight_shell};
use crate::highlight::highlight_and_colorize;
use crate::input::{InputMappingSet, KeyNameStyle, input_function_get_names, input_mappings};
use crate::key::{
self, KEY_NAMES, Key, MAX_FUNCTION_KEY, Modifiers, char_to_symbol, function_key, parse_keys,
};
use crate::nix::isatty;
use std::sync::MutexGuard;
const DEFAULT_BIND_MODE: &wstr = L!("default");
@@ -32,6 +31,7 @@ struct Options {
mode: c_int,
bind_mode: WString,
sets_bind_mode: Option<WString>,
color: ColorEnabled,
}
impl Options {
@@ -49,6 +49,7 @@ fn new() -> Options {
mode: BIND_INSERT,
bind_mode: DEFAULT_BIND_MODE.to_owned(),
sets_bind_mode: None,
color: ColorEnabled::default(),
}
}
}
@@ -153,11 +154,12 @@ fn list_one(
}
out.push('\n');
if !streams.out_is_redirected && isatty(libc::STDOUT_FILENO) {
let mut colors = Vec::new();
highlight_shell(&out, &mut colors, &parser.context(), false, None);
let colored = colorize(&out, &colors, parser.vars());
streams.out.append(&bytes2wcstring(&colored));
if self.opts.color.enabled(streams) {
streams.out.append(&bytes2wcstring(&highlight_and_colorize(
&out,
&parser.context(),
parser.vars(),
)));
} else {
streams.out.append(&out);
}
@@ -370,8 +372,8 @@ fn insert(
if self.add(
seq,
&argv[optind + 1..],
self.opts.bind_mode.to_owned(),
self.opts.sets_bind_mode.to_owned(),
self.opts.bind_mode.clone(),
self.opts.sets_bind_mode.clone(),
self.opts.user,
streams,
) {
@@ -408,7 +410,7 @@ fn parse_cmd_opts(
) -> BuiltinResult {
let cmd = argv[0];
let short_options = L!("aehkKfM:Lm:s");
const long_options: &[WOption] = &[
let long_options: &[WOption] = &[
wopt(L!("all"), NoArgument, 'a'),
wopt(L!("erase"), NoArgument, 'e'),
wopt(L!("function-names"), NoArgument, 'f'),
@@ -421,9 +423,10 @@ fn parse_cmd_opts(
wopt(L!("sets-mode"), RequiredArgument, 'm'),
wopt(L!("silent"), NoArgument, 's'),
wopt(L!("user"), NoArgument, 'u'),
wopt(L!("color"), RequiredArgument, COLOR_OPTION_CHAR),
];
let mut check_mode_name = |mode_name: &wstr| -> Result<(), ErrorCode> {
let check_mode_name = |streams: &mut IoStreams, mode_name: &wstr| -> Result<(), ErrorCode> {
if !valid_var_name(mode_name) {
streams.err.append(&wgettext_fmt!(
BUILTIN_ERR_BIND_MODE,
@@ -457,13 +460,13 @@ fn parse_cmd_opts(
}
'M' => {
let applicable_mode = w.woptarg.unwrap();
check_mode_name(applicable_mode)?;
check_mode_name(streams, applicable_mode)?;
opts.bind_mode = applicable_mode.to_owned();
opts.bind_mode_given = true;
}
'm' => {
let new_mode = w.woptarg.unwrap();
check_mode_name(new_mode)?;
check_mode_name(streams, new_mode)?;
opts.sets_bind_mode = Some(new_mode.to_owned());
}
'p' => {
@@ -487,6 +490,9 @@ fn parse_cmd_opts(
builtin_unknown_option(parser, streams, cmd, argv[w.wopt_index - 1], true);
return Err(STATUS_INVALID_ARGS);
}
COLOR_OPTION_CHAR => {
opts.color = ColorEnabled::parse_from_opt(streams, cmd, w.woptarg.unwrap())?;
}
_ => {
panic!("unexpected retval from WGetopter")
}

View File

@@ -1,9 +1,7 @@
use super::prelude::*;
use crate::common::{ScopeGuard, UnescapeFlags, UnescapeStringStyle, unescape_string};
use crate::complete::{CompletionRequestOptions, complete_add_wrapper, complete_remove_wrapper};
use crate::highlight::colorize;
use crate::highlight::highlight_shell;
use crate::nix::isatty;
use crate::highlight::highlight_and_colorize;
use crate::operation_context::OperationContext;
use crate::parse_constants::ParseErrorList;
use crate::parse_util::parse_util_detect_errors_in_argument_list;
@@ -18,7 +16,6 @@
complete_remove, complete_remove_all,
},
};
use libc::STDOUT_FILENO;
// builtin_complete_* are a set of rather silly looping functions that make sure that all the proper
// combinations of complete_add or complete_remove get called. This is needed since complete allows
@@ -221,16 +218,20 @@ fn builtin_complete_remove(
}
}
fn builtin_complete_print(cmd: &wstr, streams: &mut IoStreams, parser: &Parser) {
fn builtin_complete_print(
cmd: &wstr,
streams: &mut IoStreams,
parser: &Parser,
color: ColorEnabled,
) {
let repr = complete_print(cmd);
// colorize if interactive
if !streams.out_is_redirected && isatty(STDOUT_FILENO) {
let mut colors = vec![];
highlight_shell(&repr, &mut colors, &parser.context(), false, None);
streams
.out
.append(&bytes2wcstring(&colorize(&repr, &colors, parser.vars())));
if color.enabled(streams) {
streams.out.append(&bytes2wcstring(&highlight_and_colorize(
&repr,
&parser.context(),
parser.vars(),
)));
} else {
streams.out.append(&repr);
}
@@ -259,9 +260,10 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
let mut wrap_targets = vec![];
let mut preserve_order = false;
let mut unescape_output = true;
let mut color = ColorEnabled::default();
const short_options: &wstr = L!("a:c:p:s:l:o:d:fFrxeuAn:C::w:hk");
const long_options: &[WOption] = &[
let short_options: &wstr = L!("a:c:p:s:l:o:d:fFrxeuAn:C::w:hk");
let long_options: &[WOption] = &[
wopt(L!("exclusive"), ArgType::NoArgument, 'x'),
wopt(L!("no-files"), ArgType::NoArgument, 'f'),
wopt(L!("force-files"), ArgType::NoArgument, 'F'),
@@ -282,6 +284,7 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
wopt(L!("help"), ArgType::NoArgument, 'h'),
wopt(L!("keep-order"), ArgType::NoArgument, 'k'),
wopt(L!("escape"), ArgType::NoArgument, OPT_ESCAPE),
wopt(L!("color"), ArgType::RequiredArgument, COLOR_OPTION_CHAR),
];
let mut have_x = false;
@@ -401,6 +404,9 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
builtin_unknown_option(parser, streams, cmd, argv[w.wopt_index - 1], true);
return Err(STATUS_INVALID_ARGS);
}
COLOR_OPTION_CHAR => {
color = ColorEnabled::parse_from_opt(streams, cmd, w.woptarg.unwrap())?;
}
_ => panic!("unexpected retval from WGetopter"),
}
}
@@ -584,10 +590,10 @@ pub fn complete(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) ->
// No arguments that would add or remove anything specified, so we print the definitions of
// all matching completions.
if cmd_to_complete.is_empty() {
builtin_complete_print(L!(""), streams, parser);
builtin_complete_print(L!(""), streams, parser, color);
} else {
for cmd in cmd_to_complete {
builtin_complete_print(&cmd, streams, parser);
builtin_complete_print(&cmd, streams, parser, color);
}
}
} else {

View File

@@ -157,7 +157,6 @@ pub fn fg(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Built
if job.is_stopped() {
handoff.save_tty_modes();
}
handoff.reclaim();
if resumed {
Ok(SUCCESS)
} else {

View File

@@ -11,7 +11,7 @@
use super::prelude::*;
use crate::ast::{self, AsNode, Ast, Kind, Leaf, Node, NodeVisitor, SourceRangeList, Traversal};
use crate::common::{
PROGRAM_NAME, UnescapeFlags, UnescapeStringStyle, bytes2wcstring, get_program_name,
PROGRAM_NAME, ReadExt, UnescapeFlags, UnescapeStringStyle, bytes2wcstring, get_program_name,
unescape_string, wcs2bytes,
};
use crate::env::EnvStack;
@@ -756,6 +756,10 @@ fn brace_is_continuation(&self, node: &dyn ast::Token) -> bool {
};
conj.decorator.is_some()
|| matches!(
self.traversal.parent(conj.as_node()).kind(),
Kind::IfClause(_) | Kind::WhileHeader(_)
)
}
fn visit_left_brace(&mut self, node: &dyn ast::Token) {
@@ -910,10 +914,9 @@ pub fn main() {
fn throwing_main() -> i32 {
// TODO: Duplicated with fish_key_reader
use crate::io::FdOutputStream;
use crate::io::IoChain;
use crate::io::OutputStream::Fd;
use libc::{STDERR_FILENO, STDIN_FILENO, STDOUT_FILENO};
use crate::fds::BorrowedFdFile;
use crate::io::{FdOutputStream, IoChain, OutputStream::Fd};
use libc::{STDERR_FILENO, STDOUT_FILENO};
topic_monitor_init();
threads::init();
@@ -922,12 +925,13 @@ fn throwing_main() -> i32 {
let mut err = Fd(FdOutputStream::new(STDERR_FILENO));
let io_chain = IoChain::new();
let mut streams = IoStreams::new(&mut out, &mut err, &io_chain);
streams.stdin_fd = STDIN_FILENO;
streams.stdin_file = Some(BorrowedFdFile::stdin());
// Safety: single-threaded.
unsafe {
set_libc_locales(/*log_ok=*/ false)
};
crate::localization::initialize_gettext();
#[cfg(feature = "localize-messages")]
crate::localization::initialize_localization();
env_init(None, true, false);
// Only set these here so you can't set them via the builtin.
@@ -940,15 +944,19 @@ fn throwing_main() -> i32 {
let args: Vec<WString> = std::env::args_os()
.map(|osstr| bytes2wcstring(osstr.as_bytes()))
.collect();
do_indent(&mut streams, args).builtin_status_code()
do_indent(None, &mut streams, args).builtin_status_code()
}
pub fn fish_indent(_parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> BuiltinResult {
pub fn fish_indent(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> BuiltinResult {
let args = args.iter_mut().map(|x| x.to_owned()).collect();
do_indent(streams, args)
do_indent(Some(parser), streams, args)
}
fn do_indent(streams: &mut IoStreams, args: Vec<WString>) -> BuiltinResult {
fn do_indent(
parser: Option<&Parser>,
streams: &mut IoStreams,
args: Vec<WString>,
) -> BuiltinResult {
// Types of output we support
#[derive(Eq, PartialEq)]
enum OutputType {
@@ -988,7 +996,11 @@ enum OutputType {
match c {
'P' => DUMP_PARSE_TREE.store(true),
'h' => {
print_help("fish_indent");
if let Some(parser) = parser {
builtin_print_help(parser, streams, L!("fish_indent"));
} else {
print_help("fish_indent");
}
return Ok(SUCCESS);
}
'v' => {
@@ -1042,18 +1054,24 @@ enum OutputType {
));
return Err(STATUS_CMD_ERROR);
}
use std::os::fd::FromRawFd;
let mut fd = unsafe { std::fs::File::from_raw_fd(streams.stdin_fd) };
let Some(stdin_file) = streams.stdin_file.as_mut() else {
let cmd = "fish_indent";
streams
.err
.append(&wgettext_fmt!("%s: stdin is closed\n", cmd));
return Err(STATUS_CMD_ERROR);
};
let mut buf = vec![];
match fd.read_to_end(&mut buf) {
match stdin_file.read_to_end_interruptible(&mut buf) {
Ok(_) => {}
Err(_) => {
// Don't close the fd
std::mem::forget(fd);
return Err(STATUS_CMD_ERROR);
Err(err) => {
return if err.kind() == std::io::ErrorKind::Interrupted {
Err(128 + libc::SIGINT)
} else {
Err(STATUS_CMD_ERROR)
};
}
}
std::mem::forget(fd);
src = bytes2wcstring(&buf);
} else {
let arg = args[i];
@@ -1250,7 +1268,7 @@ struct TokenRange {
}
let mut token_ranges: Vec<TokenRange> = vec![];
for (i, color) in colors.iter().cloned().enumerate() {
for (i, color) in colors.iter().copied().enumerate() {
let role = color.foreground;
// See if we can extend the last range.
if let Some(last) = token_ranges.last_mut() {

View File

@@ -172,6 +172,7 @@ fn setup_and_process_keys(
}
fn parse_flags(
parser: Option<&Parser>,
streams: &mut IoStreams,
args: Vec<WString>,
continuous_mode: &mut bool,
@@ -184,7 +185,6 @@ fn parse_flags(
wopt(L!("version"), ArgType::NoArgument, 'v'),
wopt(L!("verbose"), ArgType::NoArgument, 'V'),
];
let mut shim_args: Vec<&wstr> = args.iter().map(|s| s.as_ref()).collect();
let mut w = WGetopter::new(short_opts, long_opts, &mut shim_args);
while let Some(opt) = w.next_opt() {
@@ -193,7 +193,11 @@ fn parse_flags(
*continuous_mode = true;
}
'h' => {
print_help("fish_key_reader");
if let Some(parser) = parser {
builtin_print_help(parser, streams, L!("fish_key_reader"));
} else {
print_help("fish_key_reader");
}
return ControlFlow::Break(Ok(SUCCESS));
}
'v' => {
@@ -239,7 +243,7 @@ fn parse_flags(
}
pub fn fish_key_reader(
_parser: &Parser,
parser: &Parser,
streams: &mut IoStreams,
args: &mut [&wstr],
) -> BuiltinResult {
@@ -247,11 +251,17 @@ pub fn fish_key_reader(
let mut verbose = false;
let args = args.iter_mut().map(|x| x.to_owned()).collect();
if let ControlFlow::Break(s) = parse_flags(streams, args, &mut continuous_mode, &mut verbose) {
if let ControlFlow::Break(s) = parse_flags(
Some(parser),
streams,
args,
&mut continuous_mode,
&mut verbose,
) {
return s;
}
if streams.stdin_fd < 0 || !isatty(streams.stdin_fd) {
if streams.stdin_fd() < 0 || !isatty(streams.stdin_fd()) {
streams.err.appendln("Stdin must be attached to a tty.");
return Err(STATUS_CMD_ERROR);
}
@@ -261,7 +271,7 @@ pub fn fish_key_reader(
continuous_mode,
verbose,
// Won't be querying, so no timeout value needed.
InputEventQueue::new(streams.stdin_fd, None),
InputEventQueue::new(streams.stdin_fd(), None),
)
}
@@ -279,7 +289,8 @@ fn throwing_main() -> i32 {
set_interactive_session(true);
topic_monitor_init();
threads::init();
crate::localization::initialize_gettext();
#[cfg(feature = "localize-messages")]
crate::localization::initialize_localization();
env_init(None, true, false);
reader_init(false);
if let Some(features_var) = EnvStack::globals().get(L!("fish_features")) {
@@ -300,7 +311,7 @@ fn throwing_main() -> i32 {
.map(|osstr| bytes2wcstring(osstr.as_bytes()))
.collect();
if let ControlFlow::Break(s) =
parse_flags(&mut streams, args, &mut continuous_mode, &mut verbose)
parse_flags(None, &mut streams, args, &mut continuous_mode, &mut verbose)
{
return s.builtin_status_code();
}

View File

@@ -6,8 +6,7 @@
use crate::common::{EscapeFlags, EscapeStringStyle};
use crate::event::{self};
use crate::function;
use crate::highlight::colorize;
use crate::highlight::highlight_shell;
use crate::highlight::highlight_and_colorize;
use crate::parse_util::apply_indents;
use crate::parse_util::parse_util_compute_indents;
use crate::parser_keywords::parser_keywords_is_reserved;
@@ -25,6 +24,7 @@ struct FunctionsCmdOpts<'args> {
no_metadata: bool,
verbose: bool,
handlers: bool,
color: ColorEnabled,
handlers_type: Option<&'args wstr>,
description: Option<&'args wstr>,
}
@@ -46,6 +46,7 @@ struct FunctionsCmdOpts<'args> {
wopt(L!("verbose"), ArgType::NoArgument, 'v'),
wopt(L!("handlers"), ArgType::NoArgument, 'H'),
wopt(L!("handlers-type"), ArgType::RequiredArgument, 't'),
wopt(L!("color"), ArgType::RequiredArgument, COLOR_OPTION_CHAR),
];
/// Parses options to builtin function, populating opts.
@@ -79,6 +80,9 @@ fn parse_cmd_opts<'args>(
opts.handlers = true;
opts.handlers_type = Some(w.woptarg.unwrap());
}
COLOR_OPTION_CHAR => {
opts.color = ColorEnabled::parse_from_opt(streams, cmd, w.woptarg.unwrap())?;
}
':' => {
builtin_missing_argument(parser, streams, cmd, argv[w.wopt_index - 1], print_hints);
return Err(STATUS_INVALID_ARGS);
@@ -278,7 +282,7 @@ pub fn functions(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -
if opts.list || args.is_empty() {
let mut names = function::get_names(opts.show_hidden, parser.vars());
names.sort();
if streams.out_is_terminal() {
if opts.color.enabled(streams) {
let mut buff = WString::new();
let mut first: bool = true;
for name in names {
@@ -414,12 +418,12 @@ pub fn functions(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -
def = apply_indents(&def, &parse_util_compute_indents(&def));
}
if streams.out_is_terminal() {
let mut colors = vec![];
highlight_shell(&def, &mut colors, &parser.context(), false, None);
streams
.out
.append(&bytes2wcstring(&colorize(&def, &colors, parser.vars())));
if opts.color.enabled(streams) {
streams.out.append(&bytes2wcstring(&highlight_and_colorize(
&def,
&parser.context(),
parser.vars(),
)));
} else {
streams.out.append(&def);
}

View File

@@ -60,6 +60,7 @@ struct HistoryCmdOpts {
case_sensitive: bool,
null_terminate: bool,
reverse: bool,
color: ColorEnabled,
}
/// Note: Do not add new flags that represent subcommands. We're encouraging people to switch to
@@ -82,6 +83,7 @@ struct HistoryCmdOpts {
wopt(L!("clear"), ArgType::NoArgument, '\x04'),
wopt(L!("merge"), ArgType::NoArgument, '\x05'),
wopt(L!("reverse"), ArgType::NoArgument, 'R'),
wopt(L!("color"), ArgType::RequiredArgument, COLOR_OPTION_CHAR),
];
/// Remember the history subcommand and disallow selecting more than one history subcommand.
@@ -226,6 +228,9 @@ fn parse_cmd_opts(
}
w.remaining_text = L!("");
}
COLOR_OPTION_CHAR => {
opts.color = ColorEnabled::parse_from_opt(streams, cmd, w.woptarg.unwrap())?;
}
_ => {
panic!("unexpected retval from WGetopter");
}
@@ -278,6 +283,8 @@ pub fn history(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) ->
match opts.hist_cmd {
HistCmd::None | HistCmd::Search => {
if !history.search(
parser,
streams,
opts.search_type
.unwrap_or(history::SearchType::ContainsGlob),
args,
@@ -287,7 +294,7 @@ pub fn history(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) ->
opts.null_terminate,
opts.reverse,
&parser.context().cancel_checker,
streams,
opts.color.enabled(streams),
) {
status = Err(STATUS_CMD_ERROR);
}

View File

@@ -2,13 +2,12 @@
use crate::util::get_seeded_rng;
use crate::wutil;
use once_cell::sync::Lazy;
use rand::rngs::SmallRng;
use rand::{Rng, RngCore};
use std::sync::Mutex;
use std::sync::{LazyLock, Mutex};
static RNG: Lazy<Mutex<SmallRng>> =
Lazy::new(|| Mutex::new(get_seeded_rng(rand::rng().next_u64())));
static RNG: LazyLock<Mutex<SmallRng>> =
LazyLock::new(|| Mutex::new(get_seeded_rng(rand::rng().next_u64())));
pub fn random(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> BuiltinResult {
let cmd = argv[0];
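The `Lazy` → `LazyLock` swap in this hunk is API-compatible: the closure runs once, on first dereference. A minimal standalone sketch of the std replacement (the static's name and contents are made up for illustration):

```rust
use std::sync::{LazyLock, Mutex};

// std::sync::LazyLock (stable since Rust 1.80) is a drop-in
// replacement for once_cell::sync::Lazy: the initializer runs
// exactly once, the first time the static is dereferenced.
static COUNTER: LazyLock<Mutex<u64>> = LazyLock::new(|| Mutex::new(41));

fn main() {
    // First deref initializes; later derefs reuse the same value.
    *COUNTER.lock().unwrap() += 1;
    assert_eq!(*COUNTER.lock().unwrap(), 42);
    println!("ok");
}
```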

View File

@@ -604,7 +604,7 @@ pub fn read(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
validate_read_args(cmd, &mut opts, argv, parser, streams)?;
// stdin may have been explicitly closed
if streams.stdin_fd < 0 {
if streams.is_stdin_closed() {
streams
.err
.append(&wgettext_fmt!("%s: stdin is closed\n", cmd));
@@ -627,7 +627,7 @@ pub fn read(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
}
};
let stream_stdin_is_a_tty = isatty(streams.stdin_fd);
let stream_stdin_is_a_tty = streams.stdin_fd() >= 0 && isatty(streams.stdin_fd());
// Normally, we either consume a line of input or all available input. But if we are reading a
// line at a time, we need a middle ground where we only consume as many lines as we need to
@@ -646,7 +646,7 @@ pub fn read(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
opts.prompt.as_ref().unwrap(),
&opts.right_prompt,
&opts.commandline,
streams.stdin_fd,
streams.stdin_fd(),
);
} else if opts.nchars.is_none() && !stream_stdin_is_a_tty &&
// "one_line" is implemented as reading n-times to a new line,
@@ -655,7 +655,7 @@ pub fn read(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
!opts.one_line &&
(
streams.stdin_is_directly_redirected ||
unsafe {libc::lseek(streams.stdin_fd, 0, SEEK_CUR)} != -1)
unsafe {libc::lseek(streams.stdin_fd(), 0, SEEK_CUR)} != -1)
{
// We read in chunks when we either can seek (so we put the bytes back),
// or we have the bytes to ourselves (because it's directly redirected).
@@ -665,14 +665,18 @@ pub fn read(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> Bui
// You don't rewind VHS tapes before throwing them in the trash.
// TODO: Do this when nchars is set by seeking back.
exit_res = read_in_chunks(
streams.stdin_fd,
streams.stdin_fd(),
&mut buff,
opts.split_null,
!streams.stdin_is_directly_redirected,
);
} else {
exit_res =
read_one_char_at_a_time(streams.stdin_fd, &mut buff, opts.nchars, opts.split_null);
exit_res = read_one_char_at_a_time(
streams.stdin_fd(),
&mut buff,
opts.nchars,
opts.split_null,
);
}
if exit_res.is_err() {

View File

@@ -890,7 +890,7 @@ fn new_var_values(
// So do not use the given variable: we must re-fetch it.
// TODO: this races under concurrent execution.
if let Some(existing) = vars.get(varname) {
result = existing.as_list().to_owned();
existing.as_list().clone_into(&mut result);
}
if opts.prepend {
@@ -914,10 +914,12 @@ fn new_var_values_by_index(split: &SplitVar, argv: &[&wstr]) -> Vec<WString> {
// Inherit any existing values.
// Note unlike the append/prepend case, we start with a variable in the same scope as we are
// setting.
let mut result = vec![];
if let Some(var) = split.var.as_ref() {
result = var.as_list().to_owned();
}
let mut result = split
.var
.as_ref()
.map(EnvVar::as_list)
.unwrap_or_default()
.to_owned();
// For each (index, argument) pair, set the element in our `result` to the replacement string.
// Extend the list with empty strings as needed. The indexes are 1-based.
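The `clone_into` change above is an allocation-reuse micro-optimization: `to_owned()` always allocates a fresh vector, while `clone_into` can reuse the destination's existing buffer. A sketch with plain `String`s standing in for fish's `WString` (an assumption for illustration):

```rust
fn main() {
    let existing: &[String] = &["a".to_string(), "b".to_string()];
    // Pre-existing destination whose buffer clone_into may reuse,
    // where `result = existing.to_owned()` would drop it and
    // allocate anew.
    let mut result: Vec<String> = Vec::with_capacity(8);
    existing.clone_into(&mut result);
    assert_eq!(result, ["a", "b"]);
    println!("ok");
}
```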

View File

@@ -1,5 +1,6 @@
use super::prelude::*;
use crate::common::{Named, bytes2wcstring, escape, get_by_sorted_name};
use crate::fds::BorrowedFdFile;
use crate::io::OutputStream;
use crate::parse_constants::UNKNOWN_BUILTIN_ERR_MSG;
use crate::parse_util::parse_util_argument_is_help;
@@ -8,10 +9,7 @@
use crate::{builtins::*, wutil};
use errno::errno;
use fish_wchar::L;
use std::fs::File;
use std::io::{BufRead, BufReader, Read};
use std::os::fd::FromRawFd;
pub type BuiltinCmd = fn(&Parser, &mut IoStreams, &mut [&wstr]) -> BuiltinResult;
@@ -814,25 +812,26 @@ pub fn new(arg: Cow<'args, wstr>, want_newline: bool) -> Self {
/// A helper type for extracting arguments from either argv or stdin.
pub struct Arguments<'args, 'iter> {
/// The list of arguments passed to the string builtin.
args: &'iter [&'args wstr],
/// If using argv, index of the next argument to return.
argidx: &'iter mut usize,
split_behavior: SplitBehavior,
/// Buffer to store what we read with the BufReader
/// Is only here to avoid allocating every time
buffer: Vec<u8>,
/// If not using argv, we read with a buffer
reader: Option<BufReader<File>>,
source: ArgvSource<'args, 'iter>,
}
impl Drop for Arguments<'_, '_> {
fn drop(&mut self) {
if let Some(r) = self.reader.take() {
// we should not close stdin
std::mem::forget(r.into_inner());
}
}
/// Either the arguments from argv, or from stdin.
enum ArgvSource<'args, 'iter> {
/// Read arguments from argv.
Args {
// The list of arguments passed to the builtin.
args: &'iter [&'args wstr],
// Index of the next argument to return.
argidx: &'iter mut usize,
},
/// Read arguments from stdin (possibly redirected).
Stdin {
/// Reused storage for reading.
buffer: Vec<u8>,
/// The reader to read from.
reader: BufReader<BorrowedFdFile>,
},
}
impl<'args, 'iter> Arguments<'args, 'iter> {
@@ -842,20 +841,21 @@ pub fn new(
streams: &mut IoStreams,
chunk_size: usize,
) -> Self {
let reader = streams.stdin_is_directly_redirected.then(|| {
let stdin_fd = streams.stdin_fd;
assert!(stdin_fd >= 0, "should have a valid fd");
// safety: this should be a valid fd, and already open
let fd = unsafe { File::from_raw_fd(stdin_fd) };
BufReader::with_capacity(chunk_size, fd)
});
let source: ArgvSource = if !streams.stdin_is_directly_redirected {
ArgvSource::Args { args, argidx }
} else {
let stdin_file = streams
.stdin_file
.clone()
.expect("should have stdin if redirected");
ArgvSource::Stdin {
buffer: Vec::new(),
reader: BufReader::with_capacity(chunk_size, stdin_file),
}
};
Arguments {
args,
argidx,
split_behavior: SplitBehavior::Newline,
buffer: Vec::new(),
reader,
source,
}
}
@@ -864,9 +864,23 @@ pub fn with_split_behavior(mut self, split_behavior: SplitBehavior) -> Self {
self
}
/// Return the next argument by reading from argv ArgvSource.
fn get_arg_argv(&mut self) -> Option<InputValue<'args>> {
let ArgvSource::Args { args, argidx } = &mut self.source else {
panic!("Not reading from argv")
};
let arg = args.get(**argidx)?;
**argidx += 1;
let retval = InputValue::new(Cow::Borrowed(arg), /*want_newline=*/ true);
Some(retval)
}
/// Return the next argument by reading from stdin ArgvSource.
fn get_arg_stdin(&mut self) -> Option<InputValue<'args>> {
use SplitBehavior::*;
let reader = self.reader.as_mut().unwrap();
let ArgvSource::Stdin { reader, buffer } = &mut self.source else {
panic!("Not reading from stdin")
};
if self.split_behavior == InferNull {
// we must determine if the first `PATH_MAX` bytes contains a null.
@@ -882,9 +896,9 @@ fn get_arg_stdin(&mut self) -> Option<InputValue<'args>> {
// NOTE: C++ wrongly commented that read_blocked retries for EAGAIN
let num_bytes: usize = match self.split_behavior {
Newline => reader.read_until(b'\n', &mut self.buffer),
Null => reader.read_until(b'\0', &mut self.buffer),
Never => reader.read_to_end(&mut self.buffer),
Newline => reader.read_until(b'\n', buffer),
Null => reader.read_until(b'\0', buffer),
Never => reader.read_to_end(buffer),
_ => unreachable!(),
}
.ok()?;
@@ -895,7 +909,7 @@ fn get_arg_stdin(&mut self) -> Option<InputValue<'args>> {
}
// assert!(num_bytes == self.buffer.len());
let (end, want_newline) = match (&self.split_behavior, self.buffer.last()) {
let (end, want_newline) = match (&self.split_behavior, buffer.last()) {
// remove the newline — consumers do not expect it
(Newline, Some(b'\n')) => (num_bytes - 1, true),
// we are missing a trailing newline!
@@ -909,8 +923,8 @@ fn get_arg_stdin(&mut self) -> Option<InputValue<'args>> {
_ => unreachable!(),
};
let parsed = bytes2wcstring(&self.buffer[..end]);
self.buffer.clear();
let parsed = bytes2wcstring(&buffer[..end]);
buffer.clear();
Some(InputValue::new(Cow::Owned(parsed), want_newline))
}
@@ -925,19 +939,10 @@ impl<'args> Iterator for Arguments<'args, '_> {
type Item = InputValue<'args>;
fn next(&mut self) -> Option<Self::Item> {
if self.reader.is_some() {
return self.get_arg_stdin();
match &mut self.source {
ArgvSource::Args { .. } => self.get_arg_argv(),
ArgvSource::Stdin { .. } => self.get_arg_stdin(),
}
if *self.argidx >= self.args.len() {
return None;
}
let retval = InputValue::new(
Cow::Borrowed(self.args[*self.argidx]),
/*want_newline=*/ true,
);
*self.argidx += 1;
Some(retval)
}
}
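The refactor above replaces an `Option<BufReader>` sentinel with an explicit two-variant source enum, so `Iterator::next` dispatches on the variant instead of probing for `Some`. A simplified sketch of the same pattern (names and `&str` types are illustrative, not fish's actual API; stdin is mocked by a pre-built line iterator):

```rust
// Sketch of the ArgvSource pattern: one iterator, two backing
// sources, dispatched on an enum rather than an Option sentinel.
enum Source<'a> {
    Args { args: &'a [&'a str], idx: usize },
    Lines { data: std::vec::IntoIter<String> },
}

impl<'a> Iterator for Source<'a> {
    type Item = String;
    fn next(&mut self) -> Option<String> {
        match self {
            Source::Args { args, idx } => {
                let arg = args.get(*idx)?;
                *idx += 1;
                Some((*arg).to_string())
            }
            Source::Lines { data } => data.next(),
        }
    }
}

fn main() {
    let argv = ["a", "b"];
    let from_args: Vec<String> = Source::Args { args: &argv[..], idx: 0 }.collect();
    assert_eq!(from_args, ["a", "b"]);

    let from_stdin: Vec<String> = Source::Lines {
        data: vec!["x".to_string(), "y".to_string()].into_iter(),
    }
    .collect();
    assert_eq!(from_stdin, ["x", "y"]);
    println!("ok");
}
```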
@@ -1048,3 +1053,51 @@ pub fn builtin_break_continue(
};
Ok(SUCCESS)
}
/// Option character for --color flag
pub const COLOR_OPTION_CHAR: char = '\x10';
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum ColorEnabled {
#[default]
Auto,
Always,
Never,
}
impl TryFrom<&wstr> for ColorEnabled {
type Error = ();
fn try_from(s: &wstr) -> Result<Self, Self::Error> {
match s {
s if s == "auto" => Ok(ColorEnabled::Auto),
s if s == "always" => Ok(ColorEnabled::Always),
s if s == "never" => Ok(ColorEnabled::Never),
_ => Err(()),
}
}
}
impl ColorEnabled {
pub fn enabled(&self, streams: &crate::io::IoStreams) -> bool {
match self {
ColorEnabled::Always => true,
ColorEnabled::Never => false,
ColorEnabled::Auto => streams.out_is_terminal(),
}
}
pub fn parse_from_opt(
streams: &mut IoStreams,
cmd: &wstr,
arg: &wstr,
) -> Result<Self, ErrorCode> {
Self::try_from(arg).map_err(|()| {
streams.err.append(&wgettext_fmt!(
"%s: Invalid value for '--color' option: '%s'. Expected 'always', 'never', or 'auto'\n",
cmd,
arg
));
STATUS_INVALID_ARGS
})
}
}
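The tri-state logic of the new shared `--color` option can be sketched in plain Rust (using `&str` in place of fish's wide-string type, and a boolean in place of the `IoStreams` terminal check; both are assumptions for illustration):

```rust
// Minimal sketch of the tri-state --color option.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
enum ColorEnabled {
    #[default]
    Auto,
    Always,
    Never,
}

impl ColorEnabled {
    // Parse the --color argument; anything else is an error.
    fn parse(s: &str) -> Result<Self, ()> {
        match s {
            "auto" => Ok(ColorEnabled::Auto),
            "always" => Ok(ColorEnabled::Always),
            "never" => Ok(ColorEnabled::Never),
            _ => Err(()),
        }
    }

    // `Auto` defers to whether output goes to a terminal.
    fn enabled(self, out_is_terminal: bool) -> bool {
        match self {
            ColorEnabled::Always => true,
            ColorEnabled::Never => false,
            ColorEnabled::Auto => out_is_terminal,
        }
    }
}

fn main() {
    assert!(ColorEnabled::parse("always").unwrap().enabled(false));
    assert!(!ColorEnabled::parse("never").unwrap().enabled(true));
    assert!(ColorEnabled::parse("auto").unwrap().enabled(true));
    assert!(!ColorEnabled::parse("auto").unwrap().enabled(false));
    assert!(ColorEnabled::parse("bogus").is_err());
    println!("ok");
}
```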

View File

@@ -36,14 +36,14 @@ pub fn source(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> B
let optind = opts.optind;
if argc == optind || args[optind] == "-" {
if streams.stdin_fd < 0 {
if streams.is_stdin_closed() {
streams
.err
.append(&wgettext_fmt!("%s: stdin is closed\n", cmd));
return Err(STATUS_CMD_ERROR);
}
// Either a bare `source` which means to implicitly read from stdin or an explicit `-`.
if argc == optind && isatty(streams.stdin_fd) {
if argc == optind && isatty(streams.stdin_fd()) {
// Don't implicitly read from the terminal.
streams.err.append(&wgettext_fmt!(
"%s: missing filename argument or input redirection\n",
@@ -52,7 +52,7 @@ pub fn source(parser: &Parser, streams: &mut IoStreams, args: &mut [&wstr]) -> B
return Err(STATUS_CMD_ERROR);
}
func_filename = FilenameRef::new(L!("-").to_owned());
fd = streams.stdin_fd;
fd = streams.stdin_fd();
} else {
match wopen_cloexec(args[optind], OFlag::O_RDONLY, Mode::empty()) {
Ok(file) => {

View File

@@ -202,10 +202,6 @@ fn parse_cmd_opts(
streams: &mut IoStreams,
) -> BuiltinResult {
let cmd = args[0];
let mut args_read = Vec::with_capacity(args.len());
args_read.extend_from_slice(args);
let mut w = WGetopter::new(SHORT_OPTIONS, LONG_OPTIONS, args);
while let Some(c) = w.next_opt() {
match c {

View File

@@ -295,7 +295,7 @@ fn populate_captures_from_match<'a>(
// empty/null members so we're going to have to use an empty string as the
// sentinel value.
if let Some(m) = cg.as_ref().and_then(|cg| cg.name(&name.to_string())) {
if let Some(m) = cg.as_ref().and_then(|cg| cg.name(&name.clone())) {
captures.push(WString::from(m.as_bytes()));
} else if opts.all {
captures.push(WString::new());

View File

@@ -10,9 +10,9 @@ mod test_expressions {
Error, Options, file_id_for_path, fish_wcswidth, lwstat, waccess, wcstod::wcstod,
wcstoi_opts, wstat,
};
use once_cell::sync::Lazy;
use std::collections::HashMap;
use std::os::unix::prelude::*;
use std::sync::LazyLock;
#[derive(Copy, Clone, PartialEq, Eq)]
pub(super) enum Token {
@@ -164,7 +164,7 @@ fn isatty(&self, streams: &mut IoStreams) -> bool {
}
let bint = self.base as i32;
if bint == 0 {
match streams.stdin_fd {
match streams.stdin_fd() {
-1 => false,
fd => isatty(fd),
}
@@ -182,7 +182,7 @@ fn token_for_string(str: &wstr) -> Token {
TOKEN_INFOS.get(str).copied().unwrap_or(Token::Unknown)
}
static TOKEN_INFOS: Lazy<HashMap<&'static wstr, Token>> = Lazy::new(|| {
static TOKEN_INFOS: LazyLock<HashMap<&'static wstr, Token>> = LazyLock::new(|| {
let pairs = [
(L!(""), Token::Unknown),
(L!("!"), Token::UnaryBoolean(UnaryBooleanToken::Bang)),

View File

@@ -1,8 +1,7 @@
use super::prelude::*;
use crate::common::bytes2wcstring;
use crate::function;
use crate::highlight::{colorize, highlight_shell};
use crate::highlight::highlight_and_colorize;
use crate::parse_util::{apply_indents, parse_util_compute_indents};
use crate::path::{path_get_path, path_get_paths};
@@ -15,6 +14,7 @@ struct type_cmd_opts_t {
path: bool,
force_path: bool,
query: bool,
color: ColorEnabled,
}
pub fn r#type(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> BuiltinResult {
@@ -34,6 +34,7 @@ pub fn r#type(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> B
wopt(L!("force-path"), ArgType::NoArgument, 'P'),
wopt(L!("query"), ArgType::NoArgument, 'q'),
wopt(L!("quiet"), ArgType::NoArgument, 'q'),
wopt(L!("color"), ArgType::RequiredArgument, COLOR_OPTION_CHAR),
];
let mut w = WGetopter::new(shortopts, longopts, argv);
@@ -68,6 +69,9 @@ pub fn r#type(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> B
builtin_unknown_option(parser, streams, cmd, argv[w.wopt_index - 1], print_hints);
return Err(STATUS_INVALID_ARGS);
}
COLOR_OPTION_CHAR => {
opts.color = ColorEnabled::parse_from_opt(streams, cmd, w.woptarg.unwrap())?;
}
_ => {
panic!("unexpected retval from wgeopter.next()");
}
@@ -143,17 +147,12 @@ pub fn r#type(parser: &Parser, streams: &mut IoStreams, argv: &mut [&wstr]) -> B
def = apply_indents(&def, &parse_util_compute_indents(&def));
}
if streams.out_is_terminal() {
let mut color = vec![];
highlight_shell(
if opts.color.enabled(streams) {
streams.out.append(&bytes2wcstring(&highlight_and_colorize(
&def,
&mut color,
&parser.context(),
/*io_ok=*/ false,
/*cursor=*/ None,
);
let col = bytes2wcstring(&colorize(&def, &color, parser.vars()));
streams.out.append(&col);
parser.vars(),
)));
} else {
streams.out.append(&def);
}

View File

@@ -1,9 +1,8 @@
use std::cmp::Ordering;
use std::{cmp::Ordering, sync::LazyLock};
use libc::{RLIM_INFINITY, c_uint, rlim_t};
use nix::errno::Errno;
use nix::sys::resource::Resource as ResourceEnum;
use once_cell::sync::Lazy;
use crate::wutil::perror;
use fish_fallback::{fish_wcswidth, wcscasecmp};
@@ -434,7 +433,7 @@ fn new(
}
/// Array of resource_t structs, describing all known resource types.
static RESOURCE_ARR: Lazy<Box<[Resource]>> = Lazy::new(|| {
static RESOURCE_ARR: LazyLock<Box<[Resource]>> = LazyLock::new(|| {
let resources_info = [
(
limits::SBSIZE,

View File

@@ -20,15 +20,15 @@
use fish_fallback::fish_wcwidth;
use fish_wchar::{decode_byte_from_char, encode_byte_to_char};
use libc::{SIG_IGN, SIGTTOU, STDIN_FILENO};
use once_cell::sync::OnceCell;
use std::cell::{Cell, RefCell};
use std::env;
use std::ffi::{CStr, CString, OsString};
use std::io::Read;
use std::mem;
use std::ops::{Deref, DerefMut};
use std::os::unix::prelude::*;
use std::sync::atomic::{AtomicI32, AtomicU32, Ordering};
use std::sync::{Arc, MutexGuard};
use std::sync::{Arc, MutexGuard, OnceLock};
use std::time;
pub const BUILD_DIR: &str = env!("FISH_RESOLVED_BUILD_DIR");
@@ -1030,10 +1030,6 @@ pub fn get_omitted_newline_str() -> &'static str {
static OMITTED_NEWLINE_STR: AtomicRef<str> = AtomicRef::new(&"");
pub fn get_omitted_newline_width() -> usize {
OMITTED_NEWLINE_STR.load().len()
}
static OBFUSCATION_READ_CHAR: AtomicU32 = AtomicU32::new(0);
pub fn get_obfuscation_read_char() -> char {
@@ -1044,7 +1040,7 @@ pub fn get_obfuscation_read_char() -> char {
pub static PROFILING_ACTIVE: RelaxedAtomicBool = RelaxedAtomicBool::new(false);
/// Name of the current program. Should be set at startup. Used by the debug function.
pub static PROGRAM_NAME: OnceCell<&'static wstr> = OnceCell::new();
pub static PROGRAM_NAME: OnceLock<&'static wstr> = OnceLock::new();
pub fn get_program_name() -> &'static wstr {
PROGRAM_NAME.get().unwrap()
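Like the `Lazy` migration, the `OnceCell` → `OnceLock` swap here is a drop-in std replacement: set at most once, read from anywhere. A standalone sketch (the value set is illustrative):

```rust
use std::sync::OnceLock;

// std::sync::OnceLock (stable since Rust 1.70) replaces
// once_cell::sync::OnceCell for statics set once at startup.
static PROGRAM_NAME: OnceLock<&'static str> = OnceLock::new();

fn main() {
    PROGRAM_NAME.set("fish").unwrap();
    // A second set fails instead of overwriting.
    assert!(PROGRAM_NAME.set("other").is_err());
    assert_eq!(*PROGRAM_NAME.get().unwrap(), "fish");
    println!("ok");
}
```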
@@ -1237,6 +1233,23 @@ pub fn read_blocked(fd: RawFd, buf: &mut [u8]) -> nix::Result<usize> {
}
}
pub trait ReadExt {
/// Like [`std::io::Read::read_to_end`], but does not retry on EINTR.
fn read_to_end_interruptible(&mut self, buf: &mut Vec<u8>) -> std::io::Result<()>;
}
impl<T: Read + ?Sized> ReadExt for T {
fn read_to_end_interruptible(&mut self, buf: &mut Vec<u8>) -> std::io::Result<()> {
let mut chunk = [0_u8; 4096];
loop {
match self.read(&mut chunk)? {
0 => return Ok(()),
n => buf.extend_from_slice(&chunk[..n]),
}
}
}
}
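The point of `read_to_end_interruptible` is that `std::io::Read::read_to_end` silently retries on `ErrorKind::Interrupted`, so a signal delivered during a blocking read never surfaces to the caller. A standalone sketch of the same loop, with a mock reader standing in for fish's `BorrowedFdFile` (an assumption for illustration):

```rust
use std::io::{self, Read};

// Same shape as the trait method above: read in fixed-size chunks
// and let EINTR (ErrorKind::Interrupted) propagate instead of
// retrying as std's read_to_end does.
fn read_to_end_interruptible<R: Read + ?Sized>(r: &mut R, buf: &mut Vec<u8>) -> io::Result<()> {
    let mut chunk = [0_u8; 4096];
    loop {
        match r.read(&mut chunk)? {
            0 => return Ok(()),
            n => buf.extend_from_slice(&chunk[..n]),
        }
    }
}

// Mock reader: one chunk of data, then a simulated EINTR.
struct Interrupting {
    sent: bool,
}

impl Read for Interrupting {
    fn read(&mut self, buf: &mut [u8]) -> io::Result<usize> {
        if !self.sent {
            self.sent = true;
            // Caller's buffer is 4096 bytes, so 5 bytes always fit.
            buf[..5].copy_from_slice(b"hello");
            Ok(5)
        } else {
            Err(io::Error::new(io::ErrorKind::Interrupted, "EINTR"))
        }
    }
}

fn main() {
    let mut buf = Vec::new();
    let err = read_to_end_interruptible(&mut Interrupting { sent: false }, &mut buf).unwrap_err();
    // The partial read is kept and the interruption is reported.
    assert_eq!(buf, b"hello");
    assert_eq!(err.kind(), io::ErrorKind::Interrupted);
    println!("ok");
}
```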
/// Test if the string is a valid function name.
pub fn valid_func_name(name: &wstr) -> bool {
!(name.is_empty()

View File

@@ -4,7 +4,7 @@
mem,
ops::{Deref, DerefMut},
sync::{
Mutex, MutexGuard,
LazyLock, Mutex, MutexGuard,
atomic::{self, AtomicUsize},
},
time::{Duration, Instant},
@@ -55,7 +55,6 @@
};
use bitflags::bitflags;
use fish_wchar::WExt;
use once_cell::sync::Lazy;
// Completion description strings, mostly for different types of files, such as sockets, block
// devices, etc.
@@ -207,7 +206,7 @@ pub fn rank(&self) -> u32 {
/// If this completion replaces the entire token, prepend a prefix. Otherwise do nothing.
pub fn prepend_token_prefix(&mut self, prefix: &wstr) {
if self.flags.contains(CompleteFlags::REPLACES_TOKEN) {
if self.replaces_token() {
self.completion.insert_utfstr(0, prefix)
}
}
@@ -451,7 +450,7 @@ struct CompletionEntryIndex {
/// Completion "wrapper" support. The map goes from wrapping-command to wrapped-command-list.
type WrapperMap = HashMap<WString, Vec<WString>>;
static WRAPPER_MAP: Lazy<Mutex<WrapperMap>> = Lazy::new(|| Mutex::new(HashMap::new()));
static WRAPPER_MAP: LazyLock<Mutex<WrapperMap>> = LazyLock::new(|| Mutex::new(HashMap::new()));
/// Clear the [`CompleteFlags::AUTO_SPACE`] flag, and set [`CompleteFlags::NO_SPACE`] appropriately
/// depending on the suffix of the string.
@@ -606,8 +605,8 @@ struct Completer<'ctx> {
condition_cache: HashMap<WString, bool>,
}
static COMPLETION_AUTOLOADER: Lazy<Mutex<Autoload>> =
Lazy::new(|| Mutex::new(Autoload::new(L!("fish_complete_path"))));
static COMPLETION_AUTOLOADER: LazyLock<Mutex<Autoload>> =
LazyLock::new(|| Mutex::new(Autoload::new(L!("fish_complete_path"))));
impl<'ctx> Completer<'ctx> {
pub fn new(ctx: &'ctx OperationContext<'ctx>, flags: CompletionRequestOptions) -> Self {
@@ -2093,7 +2092,7 @@ fn escape_opening_brackets(completions: &mut [Completion], argument: &wstr) {
return;
};
for comp in completions {
if comp.flags.contains(CompleteFlags::REPLACES_TOKEN) {
if comp.replaces_token() {
continue;
}
comp.flags |= CompleteFlags::REPLACES_TOKEN;
@@ -2128,7 +2127,7 @@ fn mark_completions_duplicating_arguments(
let mut comp_str;
for comp in self.completions.get_list_mut() {
comp_str = comp.completion.clone();
if !comp.flags.contains(CompleteFlags::REPLACES_TOKEN) {
if !comp.replaces_token() {
comp_str.insert_utfstr(0, prefix);
}
if arg_strs.binary_search(&comp_str).is_ok() {
@@ -3000,7 +2999,7 @@ macro_rules! unique_completion_applies_as {
let completions = do_complete(L!("cat te"), CompletionRequestOptions::default());
assert_eq!(completions.len(), 1);
assert_eq!(completions[0].completion, L!("stfile"));
assert!(!(completions[0].flags.contains(CompleteFlags::REPLACES_TOKEN)));
assert!(!completions[0].replaces_token());
assert!(
!(completions[0]
.flags
@@ -3017,7 +3016,7 @@ macro_rules! unique_completion_applies_as {
let completions = do_complete(L!("cat testfile TE"), CompletionRequestOptions::default());
assert_eq!(completions.len(), 1);
assert_eq!(completions[0].completion, L!("testfile"));
assert!(completions[0].flags.contains(CompleteFlags::REPLACES_TOKEN));
assert!(completions[0].replaces_token());
assert!(
completions[0]
.flags


@@ -1,10 +1,10 @@
use crate::common::{BUILD_DIR, get_program_name};
use crate::{flog, flogf};
use fish_build_helper::workspace_root;
use once_cell::sync::OnceCell;
use std::ffi::OsStr;
use std::os::unix::ffi::OsStrExt;
use std::path::{Path, PathBuf};
use std::sync::OnceLock;
/// A struct of configuration directories, determined in main() that fish will optionally pass to
/// env_init.
@@ -163,7 +163,7 @@ pub enum FishPath {
LookUpInPath,
}
static FISH_PATH: OnceCell<FishPath> = OnceCell::new();
static FISH_PATH: OnceLock<FishPath> = OnceLock::new();
/// Get the absolute path to the fish executable itself
pub fn get_fish_path() -> &'static FishPath {


@@ -28,13 +28,12 @@
use crate::wutil::{fish_wcstol, wgetcwd};
use libc::{c_int, uid_t};
use once_cell::sync::{Lazy, OnceCell};
use std::collections::HashMap;
use std::ffi::CStr;
use std::mem::MaybeUninit;
use std::os::unix::prelude::*;
use std::path::PathBuf;
use std::sync::Arc;
use std::sync::{Arc, LazyLock, OnceLock};
/// Set when a universal variable has been modified but not yet been written to disk via sync().
static UVARS_LOCALLY_MODIFIED: RelaxedAtomicBool = RelaxedAtomicBool::new(false);
@@ -591,7 +590,7 @@ fn setup_user(global_exported_mode: EnvSetMode, vars: &EnvStack) {
}
}
pub(crate) static FALLBACK_PATH: Lazy<&[WString]> = Lazy::new(|| {
pub(crate) static FALLBACK_PATH: LazyLock<&[WString]> = LazyLock::new(|| {
// _CS_PATH: colon-separated paths to find POSIX utilities. Same as USER_CS_PATH.
let cs_path = libc::_CS_PATH;
@@ -624,7 +623,7 @@ fn setup_path(global_exported_mode: EnvSetMode) {
/// The originally inherited variables and their values.
/// This is a simple key->value map and not e.g. cut into paths.
pub static INHERITED_VARS: OnceCell<HashMap<WString, WString>> = OnceCell::new();
pub static INHERITED_VARS: OnceLock<HashMap<WString, WString>> = OnceLock::new();
pub fn env_init(paths: Option<&ConfigPaths>, do_uvars: bool, default_paths: bool) {
let vars = EnvStack::globals();


@@ -15,13 +15,13 @@
use crate::threads::{is_forked_child, is_main_thread};
use crate::wutil::fish_wcstol_radix;
use once_cell::sync::Lazy;
use std::cell::{RefCell, UnsafeCell};
use std::collections::HashSet;
use std::ffi::CString;
use std::marker::PhantomData;
use std::mem;
use std::ops::{Deref, DerefMut};
use std::sync::LazyLock;
#[cfg(not(target_has_atomic = "64"))]
use portable_atomic::AtomicU64;
@@ -294,7 +294,7 @@ fn next(&mut self) -> Option<EnvNodeRef> {
}
}
static GLOBAL_NODE: Lazy<EnvNodeRef> = Lazy::new(|| EnvNodeRef::new(false, None));
static GLOBAL_NODE: LazyLock<EnvNodeRef> = LazyLock::new(|| EnvNodeRef::new(false, None));
/// Recursive helper to snapshot a series of nodes.
fn copy_node_chain(node: &EnvNodeRef) -> EnvNodeRef {
@@ -595,7 +595,7 @@ fn export_array_needs_regeneration(&self) -> bool {
let mut cursor = self.export_array_generations.iter().fuse();
let mut mismatch = false;
self.enumerate_generations(|r#gen| {
if cursor.next().cloned() != Some(r#gen) {
if cursor.next().copied() != Some(r#gen) {
mismatch = true;
}
});


@@ -231,6 +231,7 @@ pub fn env_dispatch_var_change(milieu: VarChangeMilieu, key: &wstr, vars: &EnvSt
let suppress_repaint = milieu.is_repainting || !milieu.global_or_universal;
// We want to ignore variable changes until the dispatch table is explicitly initialized.
// TODO(MSRV>=1.94): Use std::sync::LazyLock. (LazyLock::get is stabilized in Rust 1.94)
if let Some(dispatch_table) = Lazy::get(&VAR_DISPATCH_TABLE) {
dispatch_table.dispatch(key, vars, suppress_repaint);
}


@@ -862,7 +862,7 @@ fn test_universal() {
let mut handles = Vec::new();
for i in 0..threads {
let path = test_path.to_owned();
let path = test_path.clone();
handles.push(std::thread::spawn(move || {
test_universal_helper(i, &path);
}));
@@ -873,7 +873,7 @@ fn test_universal() {
}
let mut uvars = EnvUniversal::new();
uvars.initialize_at_path(test_path.to_owned());
uvars.initialize_at_path(test_path.clone());
for i in 0..threads {
for j in 0..UVARS_PER_THREAD {
@@ -1037,11 +1037,11 @@ fn test_universal_callbacks() {
let mut uvars1 = EnvUniversal::new();
let mut uvars2 = EnvUniversal::new();
let mut callbacks = uvars1
.initialize_at_path(test_path.to_owned())
.initialize_at_path(test_path.clone())
.unwrap_or_default();
callbacks.append(
&mut uvars2
.initialize_at_path(test_path.to_owned())
.initialize_at_path(test_path.clone())
.unwrap_or_default(),
);
@@ -1134,7 +1134,7 @@ fn test_universal_ok_to_save() {
let mut uvars = EnvUniversal::new();
uvars
.initialize_at_path(test_path.to_owned())
.initialize_at_path(test_path.clone())
.unwrap_or_default();
assert!(!uvars.is_ok_to_save(), "Should not be OK to save");
uvars.sync();


@@ -14,8 +14,9 @@
use crate::env::{EnvMode, EnvSetMode, EnvStack, Environment, READ_BYTE_LIMIT, Statuses};
#[cfg(have_posix_spawn)]
use crate::env_dispatch::use_posix_spawn;
use crate::fds::make_fd_blocking;
use crate::fds::{PIPE_ERROR, make_autoclose_pipes, open_cloexec};
use crate::fds::{
BorrowedFdFile, PIPE_ERROR, make_autoclose_pipes, make_fd_blocking, open_cloexec,
};
use crate::flog::{flog, flogf};
use crate::fork_exec::PATH_BSHELL;
use crate::fork_exec::blocked_signals_for_job;
@@ -57,7 +58,7 @@
use std::io::{Read, Write};
use std::mem::MaybeUninit;
use std::num::NonZeroU32;
use std::os::fd::{AsRawFd, OwnedFd, RawFd};
use std::os::fd::{AsRawFd, FromRawFd, OwnedFd, RawFd};
use std::slice;
use std::sync::{
Arc, OnceLock,
@@ -251,7 +252,6 @@ pub fn exec_job(parser: &Parser, job: &Job, block_io: IoChain) -> bool {
if job.is_stopped() {
handoff.save_tty_modes();
}
handoff.reclaim();
true
}
@@ -1160,23 +1160,28 @@ fn get_performer_for_builtin(p: &Process, j: &Job, io_chain: &IoChain) -> Box<Pr
let err_io = io_chain.io_for_fd(STDERR_FILENO);
// Figure out what fd to use for the builtin's stdin.
let mut local_builtin_stdin = STDIN_FILENO;
let mut local_builtin_stdin = Some(BorrowedFdFile::stdin());
if let Some(inp) = io_chain.io_for_fd(STDIN_FILENO) {
// An fd of -1 is treated as closing stdin.
// Ignore fd redirections from an fd other than the
// standard ones. e.g. in source <&3 don't actually read from fd 3,
// which is internal to fish. We still respect this redirection in
// that we pass it on as a block IO to the code that source runs,
// and therefore this is not an error.
let ignore_redirect = inp.io_mode() == IoMode::Fd && inp.source_fd() >= 3;
if !ignore_redirect {
local_builtin_stdin = inp.source_fd();
let fd = inp.source_fd();
let ignore_redirect = fd >= 3 && inp.io_mode() == IoMode::Fd;
if fd == -1 {
local_builtin_stdin = None;
} else if !ignore_redirect {
// Safety: the fd may in principle be closed, but this only panics on negative values.
local_builtin_stdin = Some(unsafe { BorrowedFdFile::from_raw_fd(fd) });
}
}
// Populate our IoStreams. This is a bag of information for the builtin.
let mut streams = IoStreams::new(output_stream, errput_stream, &io_chain);
streams.job_group = job_group;
streams.stdin_fd = local_builtin_stdin;
streams.stdin_file = local_builtin_stdin;
streams.stdin_is_directly_redirected = stdin_is_directly_redirected;
streams.out_is_redirected = out_io.is_some();
streams.err_is_redirected = err_io.is_some();

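The stdin handling above replaces a raw fd (with `-1` as a "closed" sentinel) by `Option<BorrowedFdFile>`, where `None` models a closed stdin. A simplified sketch of the decision logic, with `is_fd_mode` standing in for the `inp.io_mode() == IoMode::Fd` check; the names here are illustrative, not fish's API:

```rust
/// `redirection` is `Some((source_fd, is_fd_mode))` when stdin is redirected.
/// Returns the fd the builtin should read from, or `None` for closed stdin.
fn builtin_stdin(redirection: Option<(i32, bool)>) -> Option<i32> {
    const STDIN_FILENO: i32 = 0;
    match redirection {
        None => Some(STDIN_FILENO),
        // An fd of -1 is treated as closing stdin.
        Some((-1, _)) => None,
        // Fd redirections from fd >= 3 are internal to fish: ignore them.
        Some((fd, true)) if fd >= 3 => Some(STDIN_FILENO),
        Some((fd, _)) => Some(fd),
    }
}

fn main() {
    assert_eq!(builtin_stdin(None), Some(0));
    assert_eq!(builtin_stdin(Some((-1, false))), None); // `<&-`
    assert_eq!(builtin_stdin(Some((5, true))), Some(0)); // internal fd
    assert_eq!(builtin_stdin(Some((5, false))), Some(5));
}
```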

@@ -297,9 +297,9 @@ pub fn expand_tilde(input: &mut WString, vars: &dyn Environment) {
}
}
/// Perform the opposite of tilde expansion on the string, which is modified in place.
pub fn replace_home_directory_with_tilde(s: &wstr, vars: &dyn Environment) -> WString {
let mut result = s.to_owned();
/// Perform the opposite of tilde expansion on the string.
pub fn replace_home_directory_with_tilde(s: impl Into<WString>, vars: &dyn Environment) -> WString {
let mut result = s.into();
// Only absolute paths get this treatment.
if result.starts_with(L!("/")) {
let mut home_directory = L!("~").to_owned();
@@ -737,7 +737,7 @@ fn expand_variables(
// here, so tmp < 1 means it's definitely not in.
// Note we are 1-based.
if item_index >= 1 && item_index <= all_var_items.len() {
var_item_list.push(all_var_items[item_index - 1].to_owned());
var_item_list.push(all_var_items[item_index - 1].clone());
}
}
}
@@ -1038,7 +1038,7 @@ pub fn expand_cmdsubst(
continue;
}
// -1 to convert from 1-based slice index to 0-based vector index.
sub_res2.push(sub_res[idx as usize - 1].to_owned());
sub_res2.push(sub_res[idx as usize - 1].clone());
}
sub_res = sub_res2;
}
@@ -2049,4 +2049,22 @@ fn test_abbreviations() {
assert_eq!(abbr_expand_1(L!("foo"), cmd), Some(L!("bar").into()));
}
#[test]
fn test_replace_home_directory_with_tilde() {
use super::replace_home_directory_with_tilde as rhdwt;
use crate::env::{EnvMode, EnvSetMode, EnvStack};
let vars = EnvStack::new();
vars.set_one(
L!("HOME"),
EnvSetMode::new(EnvMode::GLOBAL, false),
L!("/home/testuser").to_owned(),
);
assert_eq!(rhdwt("/home/testuser/", &vars), "~/");
assert_eq!(rhdwt("/home/testuser/Documents/", &vars), "~/Documents/");
assert_eq!(rhdwt("/home/testuser", &vars), "/home/testuser");
assert_eq!(rhdwt("/other/path/", &vars), "/other/path/");
assert_eq!(rhdwt("relative/path", &vars), "relative/path");
}
}

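The new `impl Into<WString>` signature lets callers pass either a borrowed or an owned string, avoiding a forced copy in the owned case. A sketch of the same pattern over plain `String`; the logic is simplified and `replace_prefix_with_tilde` is a stand-in name, not fish's function:

```rust
// Accepts &str (converted once, inside `into`) or String (moved, no copy).
fn replace_prefix_with_tilde(s: impl Into<String>, home: &str) -> String {
    let result = s.into();
    // Like the real function, only rewrite when the prefix is followed by '/'.
    if result.starts_with(home) && result[home.len()..].starts_with('/') {
        format!("~{}", &result[home.len()..])
    } else {
        result
    }
}

fn main() {
    let home = "/home/testuser";
    assert_eq!(replace_prefix_with_tilde("/home/testuser/Documents/", home), "~/Documents/");
    assert_eq!(replace_prefix_with_tilde("/home/testuser", home), "/home/testuser");
    assert_eq!(replace_prefix_with_tilde(String::from("/other/path/"), home), "/other/path/");
}
```

These cases mirror the `test_replace_home_directory_with_tilde` assertions added in the diff.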

@@ -10,6 +10,9 @@
use std::ffi::CStr;
use std::fs::File;
use std::io;
use std::mem::ManuallyDrop;
use std::ops::{Deref, DerefMut};
use std::os::fd::{AsRawFd, FromRawFd, IntoRawFd, OwnedFd};
use std::os::unix::prelude::*;
localizable_consts!(
@@ -240,12 +243,84 @@ pub fn make_fd_blocking(fd: RawFd) -> Result<(), io::Error> {
Ok(())
}
/// A helper type for a File that does not close on drop.
/// Note the underlying file is never dropped; this is equivalent to mem::forget.
pub struct BorrowedFdFile(ManuallyDrop<File>);
impl Deref for BorrowedFdFile {
type Target = File;
#[inline]
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl DerefMut for BorrowedFdFile {
#[inline]
fn deref_mut(&mut self) -> &mut Self::Target {
&mut self.0
}
}
impl FromRawFd for BorrowedFdFile {
// Note this does NOT take ownership.
unsafe fn from_raw_fd(fd: RawFd) -> Self {
Self(ManuallyDrop::new(unsafe { File::from_raw_fd(fd) }))
}
}
impl AsRawFd for BorrowedFdFile {
#[inline]
fn as_raw_fd(&self) -> RawFd {
self.0.as_raw_fd()
}
}
impl IntoRawFd for BorrowedFdFile {
#[inline]
fn into_raw_fd(self) -> RawFd {
ManuallyDrop::into_inner(self.0).into_raw_fd()
}
}
impl BorrowedFdFile {
/// Return a BorrowedFdFile from stdin.
pub fn stdin() -> Self {
unsafe { Self::from_raw_fd(libc::STDIN_FILENO) }
}
}
impl Clone for BorrowedFdFile {
// BorrowedFdFile may be cloned: this just shares the borrowed fd.
// It does NOT duplicate the underlying fd.
fn clone(&self) -> Self {
// Safety: just re-borrow the same fd.
unsafe { Self::from_raw_fd(self.as_raw_fd()) }
}
}
impl std::io::Read for BorrowedFdFile {
fn read(&mut self, buf: &mut [u8]) -> io::Result<usize> {
self.deref_mut().read(buf)
}
fn read_vectored(&mut self, bufs: &mut [io::IoSliceMut<'_>]) -> io::Result<usize> {
self.deref_mut().read_vectored(bufs)
}
fn read_to_end(&mut self, buf: &mut Vec<u8>) -> io::Result<usize> {
self.deref_mut().read_to_end(buf)
}
fn read_to_string(&mut self, buf: &mut String) -> io::Result<usize> {
self.deref_mut().read_to_string(buf)
}
}
#[cfg(test)]
mod tests {
use super::{FIRST_HIGH_FD, make_autoclose_pipes};
use super::{BorrowedFdFile, FIRST_HIGH_FD, make_autoclose_pipes};
use crate::tests::prelude::*;
use libc::{F_GETFD, FD_CLOEXEC};
use std::os::fd::AsRawFd;
use std::os::fd::{AsRawFd, FromRawFd};
#[test]
#[serial]
@@ -269,4 +344,18 @@ fn test_pipes() {
}
}
}
#[test]
fn test_borrowed_fd_file_does_not_close() {
let file = std::fs::File::open("/dev/null").unwrap();
let fd = file.as_raw_fd();
let borrowed = unsafe { BorrowedFdFile::from_raw_fd(fd) };
#[allow(clippy::drop_non_drop)]
drop(borrowed);
let flags = unsafe { libc::fcntl(fd, libc::F_GETFD, 0) };
assert!(flags >= 0);
drop(file);
let flags = unsafe { libc::fcntl(fd, libc::F_GETFD, 0) };
assert!(flags < 0);
}
}


@@ -14,10 +14,9 @@
use crate::parser_keywords::parser_keywords_is_reserved;
use crate::prelude::*;
use crate::wutil::dir_iter::DirIter;
use once_cell::sync::Lazy;
use std::collections::{HashMap, HashSet};
use std::num::NonZeroU32;
use std::sync::{Arc, Mutex};
use std::sync::{Arc, LazyLock, Mutex};
#[derive(Clone)]
pub struct FunctionProperties {
@@ -100,7 +99,7 @@ fn allow_autoload(&self, name: &wstr) -> bool {
}
/// The big set of all functions.
static FUNCTION_SET: Lazy<Mutex<FunctionSet>> = Lazy::new(|| {
static FUNCTION_SET: LazyLock<Mutex<FunctionSet>> = LazyLock::new(|| {
Mutex::new(FunctionSet {
funcs: HashMap::new(),
autoload_tombstones: HashSet::new(),


@@ -88,7 +88,7 @@ pub fn test_path(&self, token: &wstr, prefix: bool) -> bool {
is_potential_path(
&token,
prefix,
&[self.working_directory.to_owned()],
std::slice::from_ref(&self.working_directory),
self.ctx,
PathFlags {
expand_tilde: true,
@@ -129,17 +129,17 @@ pub fn test_cd_path(&self, token: &wstr, is_prefix: bool) -> FileTestResult {
}
}
// Test if the given string is a valid redirection target, given the mode.
// Note we return bool, because we never underline redirection targets.
pub fn test_redirection_target(&self, target: &wstr, mode: RedirectionMode) -> bool {
// Test if the given string is a valid redirection target, and if so, whether
// it is a path to an existing file.
pub fn test_redirection_target(&self, target: &wstr, mode: RedirectionMode) -> FileTestResult {
// Skip targets exceeding PATH_MAX. See #7837.
if target.len() > (PATH_MAX as usize) {
return false;
return Err(IsErr);
}
let mut target = target.to_owned();
if !expand_one(&mut target, ExpandFlags::FAIL_ON_CMDSUBST, self.ctx, None) {
// Could not be expanded.
return false;
return Err(IsErr);
}
// Ok, we successfully expanded our target. Now verify that it works with this
// redirection. We will probably need it as a path (but not in the case of fd
@@ -148,29 +148,33 @@ pub fn test_redirection_target(&self, target: &wstr, mode: RedirectionMode) -> b
match mode {
RedirectionMode::Fd => {
if target == "-" {
return true;
return Ok(IsFile(false));
}
match fish_wcstoi(&target) {
Ok(fd) => fd >= 0,
Err(_) => false,
Ok(fd) if fd >= 0 => Ok(IsFile(false)),
_ => Err(IsErr),
}
}
RedirectionMode::Input | RedirectionMode::TryInput => {
// Input redirections must have a readable non-directory.
// Note we color "try_input" files as errors if they are invalid,
// even though it's possible to execute these (replaced via /dev/null).
waccess(&target_path, libc::R_OK) == 0
if waccess(&target_path, libc::R_OK) == 0
&& wstat(&target_path).is_ok_and(|md| !md.file_type().is_dir())
{
Ok(IsFile(true))
} else {
Err(IsErr)
}
}
RedirectionMode::Overwrite | RedirectionMode::Append | RedirectionMode::NoClob => {
if string_suffixes_string(L!("/"), &target) {
// Redirections to things that are directories are definitely not
// allowed.
return false;
return Err(IsErr);
}
// Test whether the file exists, and whether it's writable (possibly after
// creating it). access() returns failure if the file does not exist.
// TODO: we do not need to compute file_exists for an 'overwrite' redirection.
let file_exists;
let file_is_writable;
match wstat(&target_path) {
@@ -206,7 +210,10 @@ pub fn test_redirection_target(&self, target: &wstr, mode: RedirectionMode) -> b
}
}
// NoClob means that we must not overwrite files that exist.
file_is_writable && !(file_exists && mode == RedirectionMode::NoClob)
if !file_is_writable || (mode == RedirectionMode::NoClob && file_exists) {
return Err(IsErr);
}
Ok(IsFile(file_exists))
}
}
}
@@ -546,57 +553,57 @@ fn test_redirections() {
// Normal redirection.
let result = tester.test_redirection_target(L!("file.txt"), RedirectionMode::Input);
assert!(result);
assert_eq!(result, Ok(IsFile(true)));
// Can't redirect from a missing file
let result = tester.test_redirection_target(L!("notfile.txt"), RedirectionMode::Input);
assert!(!result);
assert_eq!(result, Err(IsErr));
let result =
tester.test_redirection_target(L!("bogus_path/file.txt"), RedirectionMode::Input);
assert!(!result);
assert_eq!(result, Err(IsErr));
// Can't redirect from a directory.
let result = tester.test_redirection_target(L!("somedir"), RedirectionMode::Input);
assert!(!result);
assert_eq!(result, Err(IsErr));
// Can't redirect from an unreadable file.
#[cfg(not(cygwin))] // Can't mark a file write-only on MSYS; this may work on true Cygwin
{
fs::set_permissions(&file_path, Permissions::from_mode(0o200)).unwrap();
let result = tester.test_redirection_target(L!("file.txt"), RedirectionMode::Input);
assert!(!result);
assert_eq!(result, Err(IsErr));
fs::set_permissions(&file_path, Permissions::from_mode(0o600)).unwrap();
}
// try_input syntax highlighting reports an error even though the command will succeed.
let result = tester.test_redirection_target(L!("file.txt"), RedirectionMode::TryInput);
assert!(result);
assert_eq!(result, Ok(IsFile(true)));
let result = tester.test_redirection_target(L!("notfile.txt"), RedirectionMode::TryInput);
assert!(!result);
assert_eq!(result, Err(IsErr));
let result =
tester.test_redirection_target(L!("bogus_path/file.txt"), RedirectionMode::TryInput);
assert!(!result);
assert_eq!(result, Err(IsErr));
// Test write redirections.
// Overwrite an existing file.
let result = tester.test_redirection_target(L!("file.txt"), RedirectionMode::Overwrite);
assert!(result);
assert_eq!(result, Ok(IsFile(true)));
// Append to an existing file.
let result = tester.test_redirection_target(L!("file.txt"), RedirectionMode::Append);
assert!(result);
assert_eq!(result, Ok(IsFile(true)));
// Write to a missing file.
let result = tester.test_redirection_target(L!("newfile.txt"), RedirectionMode::Overwrite);
assert!(result);
assert_eq!(result, Ok(IsFile(false)));
// No-clobber write to existing file should fail.
let result = tester.test_redirection_target(L!("file.txt"), RedirectionMode::NoClob);
assert!(!result);
assert_eq!(result, Err(IsErr));
// No-clobber write to missing file should succeed.
let result = tester.test_redirection_target(L!("unique.txt"), RedirectionMode::NoClob);
assert!(result);
assert_eq!(result, Ok(IsFile(false)));
let write_modes = &[
RedirectionMode::Overwrite,
@@ -606,8 +613,9 @@ fn test_redirections() {
// Can't write to a directory.
for mode in write_modes {
assert!(
!tester.test_redirection_target(L!("somedir"), *mode),
assert_eq!(
tester.test_redirection_target(L!("somedir"), *mode),
Err(IsErr),
"Should not be able to write to a directory with mode {:?}",
mode
);
@@ -616,8 +624,9 @@ fn test_redirections() {
// Can't write without write permissions.
fs::set_permissions(&file_path, Permissions::from_mode(0o400)).unwrap(); // Read-only.
for mode in write_modes {
assert!(
!tester.test_redirection_target(L!("file.txt"), *mode),
assert_eq!(
tester.test_redirection_target(L!("file.txt"), *mode),
Err(IsErr),
"Should not be able to write to a read-only file with mode {:?}",
mode
);
@@ -629,8 +638,9 @@ fn test_redirections() {
{
fs::set_permissions(&dir_path, Permissions::from_mode(0o500)).unwrap(); // Read and execute, no write.
for mode in write_modes {
assert!(
!tester.test_redirection_target(L!("somedir/newfile.txt"), *mode),
assert_eq!(
tester.test_redirection_target(L!("somedir/newfile.txt"), *mode),
Err(IsErr),
"Should not be able to create/write in a read-only directory with mode {:?}",
mode
);
@@ -639,28 +649,82 @@ fn test_redirections() {
}
// Test fd redirections.
assert!(tester.test_redirection_target(L!("-"), RedirectionMode::Fd));
assert!(tester.test_redirection_target(L!("0"), RedirectionMode::Fd));
assert!(tester.test_redirection_target(L!("1"), RedirectionMode::Fd));
assert!(tester.test_redirection_target(L!("2"), RedirectionMode::Fd));
assert!(tester.test_redirection_target(L!("3"), RedirectionMode::Fd));
assert!(tester.test_redirection_target(L!("500"), RedirectionMode::Fd));
assert_eq!(
tester.test_redirection_target(L!("-"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
assert_eq!(
tester.test_redirection_target(L!("0"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
assert_eq!(
tester.test_redirection_target(L!("1"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
assert_eq!(
tester.test_redirection_target(L!("2"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
assert_eq!(
tester.test_redirection_target(L!("3"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
assert_eq!(
tester.test_redirection_target(L!("500"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
// We are base 10, despite the leading 0.
assert!(tester.test_redirection_target(L!("000"), RedirectionMode::Fd));
assert!(tester.test_redirection_target(L!("01"), RedirectionMode::Fd));
assert!(tester.test_redirection_target(L!("07"), RedirectionMode::Fd));
assert_eq!(
tester.test_redirection_target(L!("000"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
assert_eq!(
tester.test_redirection_target(L!("01"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
assert_eq!(
tester.test_redirection_target(L!("07"), RedirectionMode::Fd),
Ok(IsFile(false)),
);
// Invalid fd redirections.
assert!(!tester.test_redirection_target(L!("0x2"), RedirectionMode::Fd));
assert!(!tester.test_redirection_target(L!("0x3F"), RedirectionMode::Fd));
assert!(!tester.test_redirection_target(L!("0F"), RedirectionMode::Fd));
assert!(!tester.test_redirection_target(L!("-1"), RedirectionMode::Fd));
assert!(!tester.test_redirection_target(L!("-0009"), RedirectionMode::Fd));
assert!(!tester.test_redirection_target(L!("--"), RedirectionMode::Fd));
assert!(!tester.test_redirection_target(L!("derp"), RedirectionMode::Fd));
assert!(!tester.test_redirection_target(L!("123boo"), RedirectionMode::Fd));
assert!(!tester.test_redirection_target(L!("18446744073709551616"), RedirectionMode::Fd));
assert_eq!(
tester.test_redirection_target(L!("0x2"), RedirectionMode::Fd),
Err(IsErr),
);
assert_eq!(
tester.test_redirection_target(L!("0x3F"), RedirectionMode::Fd),
Err(IsErr),
);
assert_eq!(
tester.test_redirection_target(L!("0F"), RedirectionMode::Fd),
Err(IsErr),
);
assert_eq!(
tester.test_redirection_target(L!("-1"), RedirectionMode::Fd),
Err(IsErr),
);
assert_eq!(
tester.test_redirection_target(L!("-0009"), RedirectionMode::Fd),
Err(IsErr),
);
assert_eq!(
tester.test_redirection_target(L!("--"), RedirectionMode::Fd),
Err(IsErr),
);
assert_eq!(
tester.test_redirection_target(L!("derp"), RedirectionMode::Fd),
Err(IsErr),
);
assert_eq!(
tester.test_redirection_target(L!("123boo"), RedirectionMode::Fd),
Err(IsErr),
);
assert_eq!(
tester.test_redirection_target(L!("18446744073709551616"), RedirectionMode::Fd),
Err(IsErr),
);
}
#[test]

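`test_redirection_target` now returns `Result<IsFile, IsErr>` instead of `bool`, so the highlighter learns both whether the target is valid and whether it names an existing file (used for `valid_path` underlining). A reduced sketch of the `RedirectionMode::Fd` branch, using `str::parse` in place of `fish_wcstoi` and simplified result types:

```rust
#[derive(Debug, PartialEq)]
struct IsFile(bool); // payload: does the target name an existing file?
#[derive(Debug, PartialEq)]
struct IsErr;

// Fd targets: "-" and non-negative decimal fds are valid, but they never
// name a file on disk, hence IsFile(false).
fn test_fd_target(target: &str) -> Result<IsFile, IsErr> {
    if target == "-" {
        return Ok(IsFile(false));
    }
    match target.parse::<i32>() {
        Ok(fd) if fd >= 0 => Ok(IsFile(false)),
        _ => Err(IsErr),
    }
}

fn main() {
    assert_eq!(test_fd_target("-"), Ok(IsFile(false)));
    assert_eq!(test_fd_target("07"), Ok(IsFile(false))); // base 10, leading 0 ok
    assert_eq!(test_fd_target("0x2"), Err(IsErr));
    assert_eq!(test_fd_target("-1"), Err(IsErr));
}
```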

@@ -106,6 +106,22 @@ pub fn highlight_shell(
*color = highlighter.highlight();
}
pub fn highlight_and_colorize(
text: &wstr,
ctx: &OperationContext<'_>,
vars: &dyn Environment,
) -> Vec<u8> {
let mut colors = Vec::new();
highlight_shell(
text,
&mut colors,
ctx,
/*io_ok=*/ false,
/*cursor=*/ None,
);
colorize(text, &colors, vars)
}
/// highlight_color_resolver_t resolves highlight specs (like "a command") to actual RGB colors.
/// It maintains a cache with no invalidation mechanism. The lifetime of these should typically be
/// one screen redraw.
@@ -951,24 +967,29 @@ fn visit_redirection(&mut self, redir: &Redirection) {
}
// No command substitution, so we can highlight the target file or fd. For example,
// disallow redirections into a non-existent directory.
let target_is_valid = if !self.io_still_ok() {
let (role, file_exists) = if !self.io_still_ok() {
// I/O is disallowed, so we don't have much hope of catching anything but gross
// errors. Assume it's valid.
true
(HighlightRole::redirection, false)
} else if contains_pending_variable(&self.pending_variables, &target) {
true
// Target uses a variable defined by the current commandline. Assume it's valid.
(HighlightRole::redirection, false)
} else {
// Validate the redirection target..
self.file_tester.test_redirection_target(&target, oper.mode)
};
self.color_node(
&redir.target,
HighlightSpec::with_fg(if target_is_valid {
HighlightRole::redirection
if let Ok(IsFile(file_exists)) =
self.file_tester.test_redirection_target(&target, oper.mode)
{
(HighlightRole::redirection, file_exists)
} else {
HighlightRole::error
}),
);
(HighlightRole::error, false)
}
};
self.color_node(&redir.target, HighlightSpec::with_fg(role));
if file_exists {
for i in redir.target.source_range().as_usize() {
self.color_array[i].valid_path = true;
}
}
}
fn visit_variable_assignment(&mut self, varas: &VariableAssignment) {
@@ -1376,6 +1397,9 @@ macro_rules! validate {
let mut param_valid_path = HighlightSpec::with_fg(HighlightRole::param);
param_valid_path.valid_path = true;
let mut redirection_valid_path = HighlightSpec::with_fg(HighlightRole::redirection);
redirection_valid_path.valid_path = true;
let saved_flag = future_feature_flags::test(FeatureFlag::AmpersandNoBgInToken);
future_feature_flags::set(FeatureFlag::AmpersandNoBgInToken, true);
let _restore_saved_flag = ScopeGuard::new((), |_| {
@@ -1513,7 +1537,7 @@ macro_rules! validate {
("param1", fg(HighlightRole::param)),
// Input redirection.
("<", fg(HighlightRole::redirection)),
("/bin/echo", fg(HighlightRole::redirection)),
("/dev/null", redirection_valid_path),
// Output redirection to a valid fd.
("1>&2", fg(HighlightRole::redirection)),
// Output redirection to an invalid fd.


@@ -2,7 +2,7 @@
use std::{
fs::File,
io::{Read, Write},
io::Read,
ops::{Deref, DerefMut},
os::fd::AsRawFd,
time::{SystemTime, UNIX_EPOCH},
@@ -283,27 +283,30 @@ fn try_from(region: MmapRegion) -> std::io::Result<Self> {
}
}
/// Append a history item to a buffer, in preparation for outputting it to the history file.
pub fn append_history_item_to_buffer(item: &HistoryItem, buffer: &mut Vec<u8>) {
assert!(item.should_write_to_disk(), "Item should not be persisted");
impl HistoryItem {
/// Write this history item to some writer.
pub fn write_to(&self, writer: &mut impl std::io::Write) -> std::io::Result<()> {
assert!(self.should_write_to_disk(), "Item should not be persisted");
let mut cmd = wcs2bytes(item.str());
escape_yaml_fish_2_0(&mut cmd);
buffer.extend(b"- cmd: ");
buffer.extend(&cmd);
buffer.push(b'\n');
writeln!(buffer, " when: {}", time_to_seconds(item.timestamp())).unwrap();
let mut cmd = wcs2bytes(self.str());
escape_yaml_fish_2_0(&mut cmd);
writer.write_all(b"- cmd: ")?;
writer.write_all(&cmd)?;
writer.write_all(b"\n")?;
writeln!(writer, " when: {}", time_to_seconds(self.timestamp()))?;
let paths = item.get_required_paths();
if !paths.is_empty() {
writeln!(buffer, " paths:").unwrap();
for path in paths {
let mut path = wcs2bytes(path);
escape_yaml_fish_2_0(&mut path);
buffer.extend(b" - ");
buffer.extend(&path);
buffer.push(b'\n');
let paths = self.get_required_paths();
if !paths.is_empty() {
writeln!(writer, " paths:")?;
for path in paths {
let mut path = wcs2bytes(path);
escape_yaml_fish_2_0(&mut path);
writer.write_all(b" - ")?;
writer.write_all(&path)?;
writer.write_all(b"\n")?;
}
}
Ok(())
}
}

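Moving serialization from a `Vec<u8>` buffer to `write_to(&self, writer: &mut impl std::io::Write)` means the same code can target a `Vec`, a `File`, or a `BufWriter`, and I/O errors propagate with `?` instead of `unwrap`. A reduced sketch of the writer-based shape, with field names simplified and YAML escaping omitted:

```rust
use std::io::Write;

// Emits the "- cmd: ... / when: ..." shape from the diff, minus
// escape_yaml_fish_2_0 and the optional paths list.
fn write_item(cmd: &str, when: u64, writer: &mut impl Write) -> std::io::Result<()> {
    writer.write_all(b"- cmd: ")?;
    writer.write_all(cmd.as_bytes())?;
    writer.write_all(b"\n")?;
    writeln!(writer, "  when: {when}")
}

fn main() -> std::io::Result<()> {
    let mut buf = Vec::new();
    write_item("echo hi", 1234, &mut buf)?;
    assert_eq!(buf, b"- cmd: echo hi\n  when: 1234\n");
    Ok(())
}
```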

@@ -29,7 +29,7 @@
collections::{BTreeMap, HashMap, HashSet},
ffi::CString,
fs::File,
io::{BufRead, Read, Write},
io::{BufRead, BufWriter, Read, Write},
mem::MaybeUninit,
num::NonZeroUsize,
ops::ControlFlow,
@@ -50,12 +50,14 @@
fds::wopen_cloexec,
flog::{flog, flogf},
fs::fsync,
history::file::{HistoryFile, RawHistoryFile, append_history_item_to_buffer},
highlight::highlight_and_colorize,
history::file::{HistoryFile, RawHistoryFile},
io::IoStreams,
localization::wgettext_fmt,
operation_context::{EXPANSION_LIMIT_BACKGROUND, OperationContext},
parse_constants::{ParseTreeFlags, StatementDecoration},
parse_util::{parse_util_detect_errors, parse_util_unescape_wildcards},
parser::Parser,
path::{path_get_config, path_get_data, path_is_valid},
prelude::*,
threads::assert_is_background_thread,
@@ -109,16 +111,6 @@ pub enum SearchDirection {
pub const VACUUM_FREQUENCY: usize = 25;
/// Output the contents `buffer` to `file` and clear the `buffer`.
fn flush_to_file(buffer: &mut Vec<u8>, file: &mut File, min_size: usize) -> std::io::Result<()> {
if buffer.is_empty() || buffer.len() < min_size {
return Ok(());
}
file.write_all(buffer)?;
buffer.clear();
Ok(())
}
struct TimeProfiler {
what: &'static str,
start: SystemTime,
@@ -289,7 +281,7 @@ fn merge(&mut self, item: &HistoryItem) -> bool {
// Ok, merge this item.
self.creation_timestamp = self.creation_timestamp.max(item.creation_timestamp);
if self.required_paths.len() < item.required_paths.len() {
self.required_paths = item.required_paths.clone();
self.required_paths.clone_from(&item.required_paths);
}
true
}
@@ -297,6 +289,13 @@ fn merge(&mut self, item: &HistoryItem) -> bool {
static HISTORIES: Mutex<BTreeMap<WString, Arc<History>>> = Mutex::new(BTreeMap::new());
/// When deleting, whether the deletion should be only for this session or for all sessions.
#[derive(Clone, Copy, PartialEq, Eq)]
enum DeletionScope {
SessionOnly,
AllSessions,
}
struct HistoryImpl {
/// The name of this list. Used for picking a suitable filename and for switching modes.
name: WString,
@@ -311,10 +310,8 @@ struct HistoryImpl {
has_pending_item: bool, // false
/// Whether we should disable saving to the file for a time.
disable_automatic_save_counter: u32, // 0
/// Deleted item contents.
/// Boolean describes if it should be deleted only in this session or in all
/// (used in deduplication).
deleted_items: HashMap<WString, bool>,
/// Deleted item contents, and the scope of the deletion.
deleted_items: HashMap<WString, DeletionScope>,
/// The history file contents.
file_contents: Option<HistoryFile>,
/// The file ID of the history file.
@@ -456,7 +453,7 @@ fn compact_new_items(&mut self) {
continue;
}
if !seen.insert(item.contents.to_owned()) {
if !seen.insert(item.contents.clone()) {
// This item was not inserted because it was already in the set, so delete the item at
// this index.
self.new_items.remove(idx);
@@ -511,23 +508,21 @@ fn rewrite_to_temporary_file(
let Some(old_item) = local_file.decode_item(offset) else {
continue;
};
// If old item is newer than session always erase if in deleted.
if old_item.timestamp() > self.boundary_timestamp {
if old_item.is_empty() || self.deleted_items.contains_key(old_item.str()) {
continue;
}
lru.add_item(old_item);
} else {
// If old item is older and in deleted items don't erase if added by
// clear_session.
if old_item.is_empty() || self.deleted_items.get(old_item.str()) == Some(&false)
{
continue;
}
// Add this old item.
lru.add_item(old_item);
if old_item.is_empty() {
continue;
}
// Check if this item should be deleted.
if let Some(&scope) = self.deleted_items.get(old_item.str()) {
// If old item is newer than session always erase if in deleted.
// If old item is older and in deleted items don't erase if added by clear_session.
let delete = old_item.timestamp() > self.boundary_timestamp
|| scope == DeletionScope::AllSessions;
if delete {
continue;
}
}
lru.add_item(old_item);
}
}
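The deletion decision consolidated in the rewrite loop above can be sketched standalone (a simplified model; `should_erase` and the integer timestamps are invented for illustration, not fish's actual API):

```rust
#[derive(Clone, Copy, PartialEq, Eq)]
enum DeletionScope {
    SessionOnly,
    AllSessions,
}

/// Returns true when an old item should be skipped (i.e. erased) during rewrite.
fn should_erase(item_ts: u64, boundary_ts: u64, scope: Option<DeletionScope>) -> bool {
    match scope {
        // Items newer than the session boundary are always erased when deleted;
        // older items are erased only for all-session deletions.
        Some(s) => item_ts > boundary_ts || s == DeletionScope::AllSessions,
        None => false,
    }
}

fn main() {
    assert!(should_erase(10, 5, Some(DeletionScope::SessionOnly))); // newer: erase
    assert!(!should_erase(3, 5, Some(DeletionScope::SessionOnly))); // older, session-only: keep
    assert!(should_erase(3, 5, Some(DeletionScope::AllSessions))); // older, all-sessions: erase
    assert!(!should_erase(10, 5, None)); // not deleted: keep
}
```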
@@ -551,30 +546,12 @@ fn rewrite_to_temporary_file(
/// Default buffer size for flushing to the history file.
const HISTORY_OUTPUT_BUFFER_SIZE: usize = 64 * 1024;
// Write them out.
let mut err = None;
let mut buffer = Vec::with_capacity(HISTORY_OUTPUT_BUFFER_SIZE + 128);
let mut buffer = BufWriter::with_capacity(HISTORY_OUTPUT_BUFFER_SIZE + 128, dst);
for item in items {
append_history_item_to_buffer(&item, &mut buffer);
if let Err(e) = flush_to_file(&mut buffer, dst, HISTORY_OUTPUT_BUFFER_SIZE) {
err = Some(e);
break;
}
}
if err.is_none() {
if let Err(e) = flush_to_file(&mut buffer, dst, 0) {
err = Some(e);
}
}
if let Some(err) = err {
flog!(
history_file,
"Error writing to temporary history file:",
err
);
Err(err)
} else {
Ok(())
item.write_to(&mut buffer)?;
}
buffer.flush()?;
Ok(())
}
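`BufWriter` now handles the buffering that `flush_to_file` used to do by hand, and lets `?` propagate errors instead of manual bookkeeping; a minimal sketch of the same pattern, writing to a `Vec` instead of the history file (`write_items` is an invented stand-in):

```rust
use std::io::{BufWriter, Write};

fn write_items(dst: impl Write, items: &[&str]) -> std::io::Result<()> {
    // BufWriter batches small writes into large ones; errors propagate via `?`.
    let mut buffer = BufWriter::with_capacity(64 * 1024, dst);
    for item in items {
        buffer.write_all(item.as_bytes())?;
        buffer.write_all(b"\n")?;
    }
    buffer.flush()?;
    Ok(())
}

fn main() {
    let mut out = Vec::new();
    write_items(&mut out, &["echo a", "echo b"]).unwrap();
    assert_eq!(out, b"echo a\necho b\n");
}
```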
/// Saves history by rewriting the file.
@@ -587,7 +564,15 @@ fn save_internal_via_rewrite(&mut self, history_path: &wstr) -> std::io::Result<
let rewrite =
|old_file: &File, tmp_file: &mut File| -> std::io::Result<PotentialUpdate<()>> {
self.rewrite_to_temporary_file(old_file, tmp_file)?;
let result = self.rewrite_to_temporary_file(old_file, tmp_file);
if let Err(err) = result {
flog!(
history_file,
"Error writing to temporary history file:",
err
);
return Err(err);
}
Ok(PotentialUpdate {
do_save: true,
data: (),
@@ -646,19 +631,20 @@ fn save_internal_via_appending(&mut self, history_path: &wstr) -> std::io::Resul
// So far so good. Write all items at or after first_unwritten_new_item_index. Note that we
// write even a pending item - pending items are ignored by history within the command
// itself, but should still be written to the file.
// Use a small buffer size for appending, we usually only have 1 item
// Use a small buffer size for appending, as we usually only have 1 item.
// Buffer everything and then write it all at once to avoid tearing writes (O_APPEND).
let mut buffer = Vec::new();
let mut new_first_index = self.first_unwritten_new_item_index;
while new_first_index < self.new_items.len() {
let item = &self.new_items[new_first_index];
if item.should_write_to_disk() {
append_history_item_to_buffer(item, &mut buffer);
// Can't error writing to a buffer.
item.write_to(&mut buffer).unwrap();
}
// We wrote or skipped this item, hooray.
new_first_index += 1;
}
flush_to_file(&mut buffer, locked_history_file.get_mut(), 0)?;
locked_history_file.get_mut().write_all(&buffer)?;
fsync(locked_history_file.get())?;
self.first_unwritten_new_item_index = new_first_index;
@@ -808,7 +794,8 @@ fn is_empty(&mut self) -> bool {
/// Remove a history item.
fn remove(&mut self, str_to_remove: &wstr) {
// Add to our list of deleted items.
self.deleted_items.insert(str_to_remove.to_owned(), false);
self.deleted_items
.insert(str_to_remove.to_owned(), DeletionScope::AllSessions);
for idx in (0..self.new_items.len()).rev() {
let matched = self.new_items[idx].str() == str_to_remove;
@@ -856,7 +843,8 @@ fn clear(&mut self) {
/// Clears only the current session.
fn clear_session(&mut self) {
for item in &self.new_items {
self.deleted_items.insert(item.str().to_owned(), true);
self.deleted_items
.insert(item.str().to_owned(), DeletionScope::SessionOnly);
}
self.new_items.clear();
@@ -1356,6 +1344,8 @@ pub fn save(&self) {
#[allow(clippy::too_many_arguments)]
pub fn search(
self: &Arc<Self>,
parser: &Parser,
streams: &mut IoStreams,
search_type: SearchType,
search_args: &[&wstr],
show_time_format: Option<&str>,
@@ -1364,7 +1354,7 @@ pub fn search(
null_terminate: bool,
reverse: bool,
cancel_check: &CancelChecker,
streams: &mut IoStreams,
color_enabled: bool,
) -> bool {
let mut remaining = max_items;
let mut collected = Vec::new();
@@ -1376,7 +1366,17 @@ pub fn search(
return ControlFlow::Break(());
}
remaining -= 1;
let formatted_record = format_history_record(item, show_time_format, null_terminate);
let mut formatted_record =
format_history_record(item, show_time_format, null_terminate);
if color_enabled {
formatted_record = bytes2wcstring(&highlight_and_colorize(
&formatted_record,
&parser.context(),
parser.vars(),
));
}
if reverse {
// We need to collect this for later.
collected.push(formatted_record);


@@ -10,10 +10,9 @@
use crate::prelude::*;
use crate::reader::{Reader, reader_reset_interrupted};
use crate::threads::assert_is_main_thread;
use once_cell::sync::Lazy;
use std::mem;
use std::sync::{
Mutex, MutexGuard,
LazyLock, Mutex, MutexGuard,
atomic::{AtomicU32, Ordering},
};
@@ -228,8 +227,8 @@ pub struct InputMappingSet {
/// Access the singleton input mapping set.
pub fn input_mappings() -> MutexGuard<'static, InputMappingSet> {
static INPUT_MAPPINGS: Lazy<Mutex<InputMappingSet>> =
Lazy::new(|| Mutex::new(InputMappingSet::default()));
static INPUT_MAPPINGS: LazyLock<Mutex<InputMappingSet>> =
LazyLock::new(|| Mutex::new(InputMappingSet::default()));
INPUT_MAPPINGS.lock().unwrap()
}
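The `once_cell::sync::Lazy` → `std::sync::LazyLock` changes in this release follow the standard pattern for lazily-initialized statics, stable since Rust 1.80; a minimal standalone sketch:

```rust
use std::sync::{LazyLock, Mutex, MutexGuard};

// std::sync::LazyLock replaces once_cell::sync::Lazy: the closure runs once,
// on first access, and the result is cached for the life of the program.
static COUNTER: LazyLock<Mutex<u32>> = LazyLock::new(|| Mutex::new(0));

fn counter() -> MutexGuard<'static, u32> {
    COUNTER.lock().unwrap()
}

fn main() {
    *counter() += 1;
    *counter() += 1;
    assert_eq!(*counter(), 2);
}
```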
@@ -509,7 +508,7 @@ fn next_is_char(
return self.next_is_char(style, Key::from_raw(key.codepoint), true);
} else if actual_seq
.get(self.subidx + 1)
.cloned()
.copied()
.map(|c| Key::from_single_char(c).codepoint)
== Some(key.codepoint)
{


@@ -1,7 +1,9 @@
use crate::builtins::shared::{STATUS_CMD_ERROR, STATUS_CMD_OK, STATUS_READ_TOO_MUCH};
use crate::common::{EMPTY_STRING, bytes2wcstring, wcs2bytes};
use crate::fd_monitor::{Callback, FdMonitor, FdMonitorItemId};
use crate::fds::{PIPE_ERROR, make_autoclose_pipes, make_fd_nonblocking, wopen_cloexec};
use crate::fds::{
BorrowedFdFile, PIPE_ERROR, make_autoclose_pipes, make_fd_nonblocking, wopen_cloexec,
};
use crate::flog::{flog, flogf, should_flog};
use crate::nix::isatty;
use crate::path::path_apply_working_directory;
@@ -16,11 +18,10 @@
use libc::{EAGAIN, EINTR, ENOENT, ENOTDIR, EPIPE, EWOULDBLOCK, STDOUT_FILENO};
use nix::fcntl::OFlag;
use nix::sys::stat::Mode;
use once_cell::sync::Lazy;
use std::fs::File;
use std::io;
use std::os::fd::{AsFd, AsRawFd, BorrowedFd, OwnedFd, RawFd};
use std::sync::{Arc, Mutex, MutexGuard};
use std::sync::{Arc, LazyLock, Mutex, MutexGuard};
/// separated_buffer_t represents a buffer of output from commands, prepared to be turned into a
/// variable. For example, command substitutions output into one of these. Most commands just
@@ -859,9 +860,9 @@ pub struct IoStreams<'a> {
pub out: &'a mut OutputStream,
pub err: &'a mut OutputStream,
// fd representing stdin. This is not closed by the destructor.
// Note: if stdin is explicitly closed by `<&-` then this is -1!
pub stdin_fd: RawFd,
// File representing stdin.
// Note: if stdin is explicitly closed by `<&-` then this is None!
pub stdin_file: Option<BorrowedFdFile>,
// Whether stdin is "directly redirected," meaning it is the recipient of a pipe (foo | cmd) or
// direct redirection (cmd < foo.txt). An "indirect redirection" would be e.g.
@@ -896,7 +897,7 @@ pub fn new(
IoStreams {
out,
err,
stdin_fd: -1,
stdin_file: None,
stdin_is_directly_redirected: false,
out_is_piped: false,
err_is_piped: false,
@@ -906,9 +907,23 @@ pub fn new(
job_group: None,
}
}
pub fn out_is_terminal(&self) -> bool {
!self.out_is_redirected && isatty(STDOUT_FILENO)
}
/// Return the fd for stdin, or -1 if stdin is closed.
pub fn stdin_fd(&self) -> RawFd {
self.stdin_file.as_ref().map_or(-1, |f| f.as_raw_fd())
}
/// Return whether stdin is closed.
/// This is "closed in the fish sense" - i.e. `<&-` has been used.
/// This does not handle the case where a closed stdin was inherited - in that case
/// we'll have a stdin_fd of 0 and we'll just get syscall errors when we try to use it.
pub fn is_stdin_closed(&self) -> bool {
self.stdin_file.is_none()
}
}
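The `Option<BorrowedFdFile>` representation and its `-1` compatibility shim above can be modeled with a plain `Option<RawFd>` (a simplified sketch, not the real `IoStreams` type; `Streams` is invented for illustration):

```rust
use std::os::fd::RawFd;

struct Streams {
    // None models stdin explicitly closed with `<&-`.
    stdin_fd: Option<RawFd>,
}

impl Streams {
    /// Returns -1 when stdin was closed, mirroring the old RawFd-based API.
    fn stdin_fd(&self) -> RawFd {
        self.stdin_fd.map_or(-1, |fd| fd)
    }

    /// Closed "in the fish sense", i.e. via `<&-`.
    fn is_stdin_closed(&self) -> bool {
        self.stdin_fd.is_none()
    }
}

fn main() {
    let open = Streams { stdin_fd: Some(0) };
    let closed = Streams { stdin_fd: None };
    assert_eq!(open.stdin_fd(), 0);
    assert_eq!(closed.stdin_fd(), -1);
    assert!(closed.is_stdin_closed());
}
```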
/// File redirection error message.
@@ -919,7 +934,7 @@ pub fn out_is_terminal(&self) -> bool {
const OPEN_MASK: Mode = Mode::from_bits_truncate(0o666);
/// Provide the fd monitor used for background fillthread operations.
static FD_MONITOR: Lazy<FdMonitor> = Lazy::new(FdMonitor::new);
static FD_MONITOR: LazyLock<FdMonitor> = LazyLock::new(FdMonitor::new);
pub fn fd_monitor() -> &'static FdMonitor {
&FD_MONITOR


@@ -3,15 +3,14 @@
//! Works like the killring in emacs and readline. The killring is cut and paste with a memory of
//! previous cuts.
use once_cell::sync::Lazy;
use std::collections::VecDeque;
use std::sync::Mutex;
use std::sync::{LazyLock, Mutex};
use crate::prelude::*;
struct KillRing(VecDeque<WString>);
static KILL_RING: Lazy<Mutex<KillRing>> = Lazy::new(|| Mutex::new(KillRing::new()));
static KILL_RING: LazyLock<Mutex<KillRing>> = LazyLock::new(|| Mutex::new(KillRing::new()));
impl KillRing {
/// Create a new killring.

src/localization/gettext.rs Normal file

@@ -0,0 +1,214 @@
use fish_wchar::{L, WString, wstr};
use std::sync::{LazyLock, Mutex};
/// Use this function to localize a message.
/// The [`MaybeStatic`] wrapper type allows avoiding allocating and leaking a new [`wstr`] when no
/// localization is found and the input is returned, but as a static reference.
fn gettext(message: MaybeStatic) -> &'static wstr {
use std::collections::HashMap;
#[cfg(not(feature = "localize-messages"))]
type NarrowMessage = ();
#[cfg(feature = "localize-messages")]
type NarrowMessage = &'static str;
let message_wstr = match message {
MaybeStatic::Static(s) => s,
MaybeStatic::Local(s) => s,
};
static MESSAGE_TO_NARROW: LazyLock<Mutex<HashMap<&'static wstr, NarrowMessage>>> =
LazyLock::new(|| Mutex::new(HashMap::default()));
let mut message_to_narrow = MESSAGE_TO_NARROW.lock().unwrap();
if !message_to_narrow.contains_key(message_wstr) {
let message_wstr: &'static wstr = match message {
MaybeStatic::Static(s) => s,
MaybeStatic::Local(l) => wstr::from_char_slice(Box::leak(l.as_char_slice().into())),
};
#[cfg(not(feature = "localize-messages"))]
let message_str = ();
#[cfg(feature = "localize-messages")]
let message_str = Box::leak(message_wstr.to_string().into_boxed_str());
message_to_narrow.insert(message_wstr, message_str);
}
let (message_static_wstr, message_str) = message_to_narrow.get_key_value(message_wstr).unwrap();
#[cfg(not(feature = "localize-messages"))]
let () = message_str;
#[cfg(feature = "localize-messages")]
{
if let Some(localized_str) = fish_gettext::gettext(message_str) {
static LOCALIZATION_TO_WIDE: LazyLock<Mutex<HashMap<&'static str, &'static wstr>>> =
LazyLock::new(|| Mutex::new(HashMap::default()));
let mut localizations_to_wide = LOCALIZATION_TO_WIDE.lock().unwrap();
if !localizations_to_wide.contains_key(localized_str) {
let localization_wstr =
Box::leak(WString::from_str(localized_str).into_boxed_utfstr());
localizations_to_wide.insert(localized_str, localization_wstr);
}
return localizations_to_wide.get(localized_str).unwrap();
}
}
// No localization found.
message_static_wstr
}
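The leak-and-cache interning that `gettext` relies on can be shown with plain `String`s (a simplified sketch; `intern` is an invented name, and the real code interns `wstr` keys instead):

```rust
use std::collections::HashMap;
use std::sync::{LazyLock, Mutex};

// Interning via Box::leak: each distinct message is leaked exactly once and
// the 'static reference is cached, so repeated lookups allocate nothing.
fn intern(s: &str) -> &'static str {
    static CACHE: LazyLock<Mutex<HashMap<String, &'static str>>> =
        LazyLock::new(|| Mutex::new(HashMap::new()));
    let mut cache = CACHE.lock().unwrap();
    if let Some(&cached) = cache.get(s) {
        return cached;
    }
    let leaked: &'static str = Box::leak(s.to_string().into_boxed_str());
    cache.insert(s.to_string(), leaked);
    leaked
}

fn main() {
    let a = intern("hello");
    let b = intern("hello");
    // Equal inputs resolve to the same leaked allocation.
    assert!(std::ptr::eq(a, b));
    assert_eq!(a, "hello");
}
```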
/// A type that can be either a static or local string.
enum MaybeStatic<'a> {
Static(&'static wstr),
Local(&'a wstr),
}
/// A string which can be localized.
/// The wrapped string itself is the original, unlocalized version.
/// Use [`LocalizableString::localize`] to obtain the localized version.
///
/// Do not construct this type directly.
/// For string literals defined in fish's Rust sources,
/// use the macros defined in this file.
/// For strings defined elsewhere, use [`LocalizableString::from_external_source`].
/// Use this function with caution. If the string is not extracted into the gettext PO files from
/// which fish obtains localizations, localization will not work.
#[derive(Debug, Clone)]
pub enum LocalizableString {
Static(&'static wstr),
Owned(WString),
}
impl LocalizableString {
/// Create a [`LocalizableString`] from a string which is not from fish's own Rust sources.
/// Localizations will only work if this string is extracted into the localization files some
/// other way.
pub fn from_external_source(s: WString) -> Self {
Self::Owned(s)
}
/// Get the localization of a [`LocalizableString`].
/// If the original string is empty, an empty `wstr` is returned,
/// instead of the gettext metadata.
pub fn localize(&self) -> &'static wstr {
match self {
Self::Static(s) => {
if s.is_empty() {
L!("")
} else {
gettext(MaybeStatic::Static(s))
}
}
Self::Owned(s) => {
if s.is_empty() {
L!("")
} else {
gettext(MaybeStatic::Local(s))
}
}
}
}
}
impl std::fmt::Display for LocalizableString {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.localize())
}
}
/// This macro takes a string literal and produces a [`LocalizableString`].
/// The essential part is the invocation of the proc macro,
/// which ensures that the string gets extracted for localization.
#[macro_export]
#[cfg(feature = "gettext-extract")]
macro_rules! localizable_string {
($string:literal) => {
$crate::localization::LocalizableString::Static(widestring::utf32str!(
fish_gettext_extraction::gettext_extract!($string)
))
};
}
#[macro_export]
#[cfg(not(feature = "gettext-extract"))]
macro_rules! localizable_string {
($string:literal) => {
$crate::localization::LocalizableString::Static(widestring::utf32str!($string))
};
}
pub use localizable_string;
/// Macro for declaring string consts which should be localized.
#[macro_export]
macro_rules! localizable_consts {
(
$(
$(#[$attr:meta])*
$vis:vis
$name:ident
$string:literal
)*
) => {
$(
$(#[$attr])*
$vis const $name: $crate::localization::LocalizableString =
localizable_string!($string);
)*
};
}
pub use localizable_consts;
/// Takes a string literal or a [`LocalizableString`].
/// Given a string literal, it is extracted for localization.
/// Returns a possibly localized `&'static wstr`.
#[macro_export]
macro_rules! wgettext {
(
$string:literal
) => {
localizable_string!($string).localize()
};
(
$string:expr // format string (LocalizableString)
) => {
$string.localize()
};
}
pub use wgettext;
/// Like wgettext, but applies a sprintf format string.
/// The result is a WString.
#[macro_export]
macro_rules! wgettext_fmt {
(
$string:literal // format string
$(, $args:expr)* // list of expressions
$(,)? // optional trailing comma
) => {
$crate::wutil::sprintf!(
localizable_string!($string).localize(),
$($args),*
)
};
(
$string:expr // format string (LocalizableString)
$(, $args:expr)* // list of expressions
$(,)? // optional trailing comma
) => {
$crate::wutil::sprintf!($string.localize(), $($args),*)
};
}
pub use wgettext_fmt;
#[cfg(test)]
mod tests {
use super::*;
use crate::tests::prelude::*;
#[test]
#[serial]
fn test_unlocalized() {
let _cleanup = test_init();
let abc_str = LocalizableString::from_external_source(WString::from("abc"));
let s: &'static wstr = wgettext!(abc_str);
assert_eq!(s, "abc");
let static_str = LocalizableString::from_external_source(WString::from("static"));
let s2: &'static wstr = wgettext!(static_str);
assert_eq!(s2, "static");
}
}


@@ -1,405 +1,26 @@
use std::sync::Mutex;
mod gettext;
pub use gettext::{
LocalizableString, localizable_consts, localizable_string, wgettext, wgettext_fmt,
};
#[cfg(feature = "localize-messages")]
mod settings;
#[cfg(feature = "localize-messages")]
pub use settings::{
list_available_languages, status_language, unset_from_status_language_builtin, update_from_env,
update_from_status_language_builtin,
};
#[cfg(feature = "localize-messages")]
use crate::env::{EnvStack, Environment};
use crate::prelude::*;
use once_cell::sync::Lazy;
#[cfg(feature = "localize-messages")]
fn get_message_locale(vars: &EnvStack) -> Option<(fish_gettext::LocaleVariable, String)> {
use fish_gettext::LocaleVariable;
let get = |var_str: &wstr, var: LocaleVariable| {
vars.get_unless_empty(var_str)
.map(|val| (var, val.as_string().to_string()))
};
get(L!("LC_ALL"), LocaleVariable::LC_ALL)
.or_else(|| get(L!("LC_MESSAGES"), LocaleVariable::LC_MESSAGES))
.or_else(|| get(L!("LANG"), LocaleVariable::LANG))
}
#[cfg(feature = "localize-messages")]
fn get_language_var(vars: &EnvStack) -> Option<Vec<String>> {
let langs = vars.get_unless_empty(L!("LANGUAGE"))?;
let langs = langs.as_list();
let filtered_langs: Vec<String> = langs
.iter()
.filter(|lang| !lang.is_empty())
.map(|lang| lang.to_string())
.collect();
if filtered_langs.is_empty() {
return None;
}
Some(filtered_langs)
}
/// Call this when one of `LANGUAGE`, `LC_ALL`, `LC_MESSAGES`, `LANG` changes.
/// Updates internal state such that the correct localizations will be used in subsequent
/// localization requests.
///
/// For deciding how to localize, the following is done:
///
/// 1. If the language precedence was set via `status language`, env vars are ignored.
/// 2. Check the first non-empty value of the env vars `LC_ALL`, `LC_MESSAGES`, `LANG`. If it
/// starts with `C` we consider this a C locale and disable localization.
/// 3. Otherwise, the value of the `LANGUAGE` env var is used, if non-empty. This allows specifying
/// multiple languages, with languages specified first taking precedence, e.g.
/// `LANGUAGE=zh_TW:zh_CN:pt_BR`
/// 4. Otherwise, the first non-empty value of the env vars `LC_ALL`, `LC_MESSAGES`, `LANG` is
/// used. This can only specify a single language, e.g. `LANG=de_AT.UTF-8`.
/// There, we normalize locale names by stripping off the suffix, leaving only the `ll_CC` part.
/// 5. Otherwise, localization will not happen.
///
/// If users specify `ll_CC` as a language and we don't have a catalog for this language, but we
/// have one for `ll`, that will be used instead. If users specify `ll` (without specifying a
/// language variant), which we discourage, and we don't have a catalog for `ll`, but we do have
/// one for `ll_CC`, that will be used as a fallback. If we have multiple `ll_*` catalogs, all of
/// them will be used, in arbitrary order.
#[cfg(feature = "localize-messages")]
pub fn update_from_env(vars: &EnvStack) {
fish_gettext::update_from_env(get_message_locale(vars), get_language_var(vars));
}
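The precedence described in the doc comment above (`LC_ALL`, then `LC_MESSAGES`, then `LANG`) is a first-non-empty chain, as in `get_message_locale`; a simplified sketch using plain strings (`message_locale` is an invented stand-in):

```rust
// First non-empty of LC_ALL, LC_MESSAGES, LANG wins, mirroring the doc comment.
fn message_locale(
    lc_all: &str,
    lc_messages: &str,
    lang: &str,
) -> Option<(&'static str, String)> {
    let pick = |name: &'static str, val: &str| {
        (!val.is_empty()).then(|| (name, val.to_string()))
    };
    pick("LC_ALL", lc_all)
        .or_else(|| pick("LC_MESSAGES", lc_messages))
        .or_else(|| pick("LANG", lang))
}

fn main() {
    assert_eq!(
        message_locale("", "de_DE", "en_US"),
        Some(("LC_MESSAGES", "de_DE".to_string()))
    );
    assert_eq!(
        message_locale("C", "de_DE", ""),
        Some(("LC_ALL", "C".to_string()))
    );
    assert_eq!(message_locale("", "", ""), None);
}
```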
#[cfg(feature = "localize-messages")]
fn append_space_separated_list<S: AsRef<str>>(
string: &mut WString,
list: impl IntoIterator<Item = S>,
) {
for lang in list.into_iter() {
string.push(' ');
string.push_utfstr(&crate::common::escape(
WString::from_str(lang.as_ref()).as_utfstr(),
));
}
}
#[cfg(feature = "localize-messages")]
pub struct SetLanguageLints<'a> {
duplicates: Vec<&'a str>,
non_existing: Vec<&'a str>,
}
#[cfg(feature = "localize-messages")]
impl<'a> From<fish_gettext::SetLanguageLints<'a>> for SetLanguageLints<'a> {
fn from(lints: fish_gettext::SetLanguageLints<'a>) -> Self {
Self {
duplicates: lints.duplicates,
non_existing: lints.non_existing,
}
}
}
#[cfg(feature = "localize-messages")]
impl<'a> SetLanguageLints<'a> {
pub fn display_duplicates(&self) -> WString {
let mut result = WString::new();
if self.duplicates.is_empty() {
return result;
}
result.push_utfstr(wgettext!("Language specifiers appear repeatedly:"));
append_space_separated_list(&mut result, &self.duplicates);
result.push('\n');
result
}
pub fn display_non_existing(&self) -> WString {
let mut result = WString::new();
if self.non_existing.is_empty() {
return result;
}
result.push_utfstr(wgettext!("No catalogs available for language specifiers:"));
append_space_separated_list(&mut result, &self.non_existing);
result.push('\n');
result
}
pub fn display_all(&self) -> WString {
let mut result = WString::new();
result.push_utfstr(&self.display_duplicates());
result.push_utfstr(&self.display_non_existing());
result
}
}
/// Call this when the `status language` builtin should update the language precedence.
/// `langs` should be the list of languages the precedence should be set to.
#[cfg(feature = "localize-messages")]
pub fn update_from_status_language_builtin<'a, 'b: 'a, S: AsRef<str> + 'a>(
langs: &'b [S],
) -> SetLanguageLints<'a> {
fish_gettext::update_from_status_language_builtin(langs).into()
}
#[cfg(feature = "localize-messages")]
pub fn unset_from_status_language_builtin(vars: &EnvStack) {
fish_gettext::unset_from_status_language_builtin(
get_message_locale(vars),
get_language_var(vars),
);
}
#[cfg(feature = "localize-messages")]
pub fn status_language() -> WString {
use fish_gettext::LanguagePrecedenceOrigin;
let localization_state = fish_gettext::status_language();
let mut result = WString::new();
localizable_consts!(
LANGUAGE_LIST_VARIABLE_ORIGIN "%s variable"
);
let origin_string = match localization_state.precedence_origin {
LanguagePrecedenceOrigin::Default => wgettext!("default").to_owned(),
LanguagePrecedenceOrigin::LocaleVariable(var) => {
wgettext_fmt!(LANGUAGE_LIST_VARIABLE_ORIGIN, var.as_str())
}
LanguagePrecedenceOrigin::LanguageEnvVar => {
wgettext_fmt!(LANGUAGE_LIST_VARIABLE_ORIGIN, "LANGUAGE")
}
LanguagePrecedenceOrigin::StatusLanguage => {
wgettext_fmt!("%s command", "`status language set`")
}
};
result.push_utfstr(&wgettext_fmt!(
"Active languages (source: %s):",
origin_string
));
append_space_separated_list(&mut result, &localization_state.language_precedence);
result.push('\n');
result
}
#[cfg(feature = "localize-messages")]
pub fn list_available_languages() -> WString {
let mut languages = WString::new();
for lang in fish_gettext::list_available_languages() {
languages.push_str(lang);
languages.push('\n');
}
languages
}
#[cfg(not(feature = "localize-messages"))]
pub fn initialize_gettext() {}
/// This function only exists to provide a way for initializing gettext before an [`EnvStack`] is
/// This function only exists to provide a way for initializing gettext before an `EnvStack` is
/// available. Without this, early error messages cannot be localized.
#[cfg(feature = "localize-messages")]
pub fn initialize_gettext() {
let vars = EnvStack::new();
env_stack_set_from_env!(vars, "LANGUAGE");
env_stack_set_from_env!(vars, "LC_ALL");
env_stack_set_from_env!(vars, "LC_MESSAGES");
env_stack_set_from_env!(vars, "LANG");
pub fn initialize_localization() {
use crate::env::EnvStack;
use fish_wchar::L;
fish_gettext::update_from_env(get_message_locale(&vars), get_language_var(&vars));
}
/// Use this function to localize a message.
/// The [`MaybeStatic`] wrapper type allows avoiding allocating and leaking a new [`wstr`] when no
/// localization is found and the input is returned, but as a static reference.
fn gettext(message: MaybeStatic) -> &'static wstr {
use std::collections::HashMap;
#[cfg(not(feature = "localize-messages"))]
type NarrowMessage = ();
#[cfg(feature = "localize-messages")]
type NarrowMessage = &'static str;
let message_wstr = match message {
MaybeStatic::Static(s) => s,
MaybeStatic::Local(s) => s,
};
static MESSAGE_TO_NARROW: Lazy<Mutex<HashMap<&'static wstr, NarrowMessage>>> =
Lazy::new(|| Mutex::new(HashMap::default()));
let mut message_to_narrow = MESSAGE_TO_NARROW.lock().unwrap();
if !message_to_narrow.contains_key(message_wstr) {
let message_wstr: &'static wstr = match message {
MaybeStatic::Static(s) => s,
MaybeStatic::Local(l) => wstr::from_char_slice(Box::leak(l.as_char_slice().into())),
};
#[cfg(not(feature = "localize-messages"))]
let message_str = ();
#[cfg(feature = "localize-messages")]
let message_str = Box::leak(message_wstr.to_string().into_boxed_str());
message_to_narrow.insert(message_wstr, message_str);
}
let (message_static_wstr, message_str) = message_to_narrow.get_key_value(message_wstr).unwrap();
#[cfg(not(feature = "localize-messages"))]
let () = message_str;
#[cfg(feature = "localize-messages")]
{
if let Some(localized_str) = fish_gettext::gettext(message_str) {
static LOCALIZATION_TO_WIDE: Lazy<Mutex<HashMap<&'static str, &'static wstr>>> =
Lazy::new(|| Mutex::new(HashMap::default()));
let mut localizations_to_wide = LOCALIZATION_TO_WIDE.lock().unwrap();
if !localizations_to_wide.contains_key(localized_str) {
let localization_wstr =
Box::leak(WString::from_str(localized_str).into_boxed_utfstr());
localizations_to_wide.insert(localized_str, localization_wstr);
}
return localizations_to_wide.get(localized_str).unwrap();
}
}
// No localization found.
message_static_wstr
}
/// A type that can be either a static or local string.
enum MaybeStatic<'a> {
Static(&'static wstr),
Local(&'a wstr),
}
/// A string which can be localized.
/// The wrapped string itself is the original, unlocalized version.
/// Use [`LocalizableString::localize`] to obtain the localized version.
///
/// Do not construct this type directly.
/// For string literals defined in fish's Rust sources,
/// use the macros defined in this file.
/// For strings defined elsewhere, use [`LocalizableString::from_external_source`].
/// Use this function with caution. If the string is not extracted into the gettext PO files from
/// which fish obtains localizations, localization will not work.
#[derive(Debug, Clone)]
pub enum LocalizableString {
Static(&'static wstr),
Owned(WString),
}
impl LocalizableString {
/// Create a [`LocalizableString`] from a string which is not from fish's own Rust sources.
/// Localizations will only work if this string is extracted into the localization files some
/// other way.
pub fn from_external_source(s: WString) -> Self {
Self::Owned(s)
}
/// Get the localization of a [`LocalizableString`].
/// If the original string is empty, an empty `wstr` is returned,
/// instead of the gettext metadata.
pub fn localize(&self) -> &'static wstr {
match self {
Self::Static(s) => {
if s.is_empty() {
L!("")
} else {
gettext(MaybeStatic::Static(s))
}
}
Self::Owned(s) => {
if s.is_empty() {
L!("")
} else {
gettext(MaybeStatic::Local(s))
}
}
}
}
}
impl std::fmt::Display for LocalizableString {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.localize())
}
}
/// This macro takes a string literal and produces a [`LocalizableString`].
/// The essential part is the invocation of the proc macro,
/// which ensures that the string gets extracted for localization.
#[macro_export]
#[cfg(feature = "gettext-extract")]
macro_rules! localizable_string {
($string:literal) => {
$crate::localization::LocalizableString::Static(widestring::utf32str!(
fish_gettext_extraction::gettext_extract!($string)
))
};
}
#[macro_export]
#[cfg(not(feature = "gettext-extract"))]
macro_rules! localizable_string {
($string:literal) => {
$crate::localization::LocalizableString::Static(widestring::utf32str!($string))
};
}
pub use localizable_string;
/// Macro for declaring string consts which should be localized.
#[macro_export]
macro_rules! localizable_consts {
(
$(
$(#[$attr:meta])*
$vis:vis
$name:ident
$string:literal
)*
) => {
$(
$(#[$attr])*
$vis const $name: $crate::localization::LocalizableString =
localizable_string!($string);
)*
};
}
pub use localizable_consts;
/// Takes a string literal or a [`LocalizableString`].
/// Given a string literal, it is extracted for localization.
/// Returns a possibly localized `&'static wstr`.
#[macro_export]
macro_rules! wgettext {
(
$string:literal
) => {
localizable_string!($string).localize()
};
(
$string:expr // format string (LocalizableString)
) => {
$string.localize()
};
}
pub use wgettext;
/// Like wgettext, but applies a sprintf format string.
/// The result is a WString.
#[macro_export]
macro_rules! wgettext_fmt {
(
$string:literal // format string
$(, $args:expr)* // list of expressions
$(,)? // optional trailing comma
) => {
$crate::wutil::sprintf!(
localizable_string!($string).localize(),
$($args),*
)
};
(
$string:expr // format string (LocalizableString)
$(, $args:expr)* // list of expressions
$(,)? // optional trailing comma
) => {
$crate::wutil::sprintf!($string.localize(), $($args),*)
};
}
pub use wgettext_fmt;
#[cfg(test)]
mod tests {
use super::*;
use crate::tests::prelude::*;
#[test]
#[serial]
fn test_unlocalized() {
let _cleanup = test_init();
let abc_str = LocalizableString::from_external_source(WString::from("abc"));
let s: &'static wstr = wgettext!(abc_str);
assert_eq!(s, "abc");
let static_str = LocalizableString::from_external_source(WString::from("static"));
let s2: &'static wstr = wgettext!(static_str);
assert_eq!(s2, "static");
}
let env = EnvStack::new();
env_stack_set_from_env!(env, "LANGUAGE");
env_stack_set_from_env!(env, "LC_ALL");
env_stack_set_from_env!(env, "LC_MESSAGES");
env_stack_set_from_env!(env, "LANG");
update_from_env(&env);
}


@@ -0,0 +1,413 @@
use super::{localizable_consts, localizable_string, wgettext, wgettext_fmt};
use crate::env::{EnvStack, Environment};
use fish_wchar::{L, WString, wstr};
use std::collections::{HashMap, HashSet};
use std::sync::{LazyLock, Mutex};
#[derive(PartialEq, Eq, Clone, Copy)]
enum LanguagePrecedenceOrigin {
Default,
LocaleVariable(LocaleVariable),
LanguageEnvVar,
StatusLanguage,
}
#[derive(PartialEq, Eq, Clone, Copy)]
enum LocaleVariable {
#[allow(clippy::upper_case_acronyms)]
LANG,
#[allow(non_camel_case_types)]
LC_MESSAGES,
#[allow(non_camel_case_types)]
LC_ALL,
}
impl LocaleVariable {
fn as_language_precedence_origin(&self) -> LanguagePrecedenceOrigin {
LanguagePrecedenceOrigin::LocaleVariable(*self)
}
fn as_str(&self) -> &'static str {
match self {
Self::LANG => "LANG",
Self::LC_MESSAGES => "LC_MESSAGES",
Self::LC_ALL => "LC_ALL",
}
}
}
impl std::fmt::Display for LocaleVariable {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.as_str())
}
}
struct LocalizationVariables {
message_locale: Option<(LocaleVariable, String)>,
language: Option<Vec<String>>,
}
impl LocalizationVariables {
fn get_message_locale(env: &EnvStack) -> Option<(LocaleVariable, String)> {
let get = |var_str: &wstr, var: LocaleVariable| {
env.get_unless_empty(var_str)
.map(|val| (var, val.as_string().to_string()))
};
get(L!("LC_ALL"), LocaleVariable::LC_ALL)
.or_else(|| get(L!("LC_MESSAGES"), LocaleVariable::LC_MESSAGES))
.or_else(|| get(L!("LANG"), LocaleVariable::LANG))
}
fn get_language_var(env: &EnvStack) -> Option<Vec<String>> {
let langs = env.get_unless_empty(L!("LANGUAGE"))?;
let langs = langs.as_list();
let filtered_langs: Vec<String> = langs
.iter()
.filter(|lang| !lang.is_empty())
.map(|lang| lang.to_string())
.collect();
if filtered_langs.is_empty() {
return None;
}
Some(filtered_langs)
}
fn from_env(env: &EnvStack) -> Self {
Self {
message_locale: Self::get_message_locale(env),
language: Self::get_language_var(env),
}
}
}
fn append_space_separated_list<S: AsRef<str>>(
string: &mut WString,
list: impl IntoIterator<Item = S>,
) {
for lang in list.into_iter() {
string.push(' ');
string.push_utfstr(&crate::common::escape(
WString::from_str(lang.as_ref()).as_utfstr(),
));
}
}
pub struct SetLanguageLints<'a> {
duplicates: Vec<&'a str>,
non_existing: Vec<&'a str>,
}
impl<'a> SetLanguageLints<'a> {
fn display_duplicates(&self) -> WString {
let mut result = WString::new();
if self.duplicates.is_empty() {
return result;
}
result.push_utfstr(wgettext!("Language specifiers appear repeatedly:"));
append_space_separated_list(&mut result, &self.duplicates);
result.push('\n');
result
}
fn display_non_existing(&self) -> WString {
let mut result = WString::new();
if self.non_existing.is_empty() {
return result;
}
result.push_utfstr(wgettext!("No catalogs available for language specifiers:"));
append_space_separated_list(&mut result, &self.non_existing);
result.push('\n');
result
}
pub fn display_all(&self) -> WString {
let mut result = WString::new();
result.push_utfstr(&self.display_duplicates());
result.push_utfstr(&self.display_non_existing());
result
}
}
struct LocalizationState {
precedence_origin: LanguagePrecedenceOrigin,
}
impl LocalizationState {
fn new() -> Self {
Self {
precedence_origin: LanguagePrecedenceOrigin::Default,
}
}
/// Tries to find catalogs for `language`.
/// `language` must be an ISO 639 language code, optionally followed by an underscore and an ISO
/// 3166 country/territory code.
/// Uses the catalog with the exact same name as `language` if it exists.
/// If a country code is present (`ll_CC`), only the catalog named `ll` will be considered as a fallback.
/// If no country code is present (`ll`), all catalogs whose names start with `ll_` will be used in
/// arbitrary order.
fn find_best_matches<'a, 'b: 'a, L: Copy>(
language: &str,
available_languages: &'a HashMap<&'b str, L>,
) -> Vec<(&'b str, L)> {
// Try the exact name first.
// If there already is a catalog with that exact name, return it.
if let Some((&lang_str, &lang_value)) = available_languages.get_key_value(language) {
return vec![(lang_str, lang_value)];
}
let language_without_country_code =
language.split_once('_').map_or(language, |(ll, _cc)| ll);
if language == language_without_country_code {
// We have `ll` format. In this case, try to find any catalog whose name starts with `ll_`.
// Note that it is important to include the underscore in the pattern, otherwise `ll` might
// fall back to `llx_CC`, where `llx` is a 3-letter language identifier.
let ll_prefix = format!("{language}_");
let mut lang_catalogs = vec![];
for (&lang_str, &localization_lang) in available_languages.iter() {
if lang_str.starts_with(&ll_prefix) {
lang_catalogs.push((lang_str, localization_lang));
}
}
lang_catalogs
} else {
// If `language` contained a country code, we only try to fall back to a catalog
// without a country code.
if let Some((&lang_str, &lang_value)) =
available_languages.get_key_value(language_without_country_code)
{
vec![(lang_str, lang_value)]
} else {
vec![]
}
}
}
fn update_from_env(&mut self, localization_vars: LocalizationVariables) {
// Do not override values set via `status language`.
if self.precedence_origin == LanguagePrecedenceOrigin::StatusLanguage {
return;
}
if let Some((precedence_origin, locale)) = &localization_vars.message_locale {
// Regular locale names start with lowercase letters (`ll_CC`, followed by some suffix).
// The C and POSIX locales are special and are often used to disable localization.
// Their names are upper-case, but variants with suffixes (`C.UTF-8`) exist.
// To ensure that such variants are accounted for, we match on prefixes of the
// locale name.
// https://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap07.html#tag_07_02
fn is_c_locale(locale: &str) -> bool {
locale.starts_with('C') || locale.starts_with("POSIX")
}
if is_c_locale(locale) {
self.precedence_origin =
LanguagePrecedenceOrigin::LocaleVariable(*precedence_origin);
fish_gettext::set_language_precedence(&[]);
return;
}
}
let (precedence_origin, language_list) = if let Some(list) = localization_vars.language {
(LanguagePrecedenceOrigin::LanguageEnvVar, list)
} else if let Some((precedence_origin, locale)) = &localization_vars.message_locale {
// Strip off encoding and modifier. (We always expect UTF-8 and don't support modifiers.)
let normalized_name = locale
.split_once(|c: char| !(c.is_ascii_alphabetic() || c == '_'))
.map_or(locale.as_str(), |(lang_name, _)| lang_name)
.to_owned();
// At this point, the normalized_name should have the shape `ll` or `ll_CC`.
(
precedence_origin.as_language_precedence_origin(),
vec![normalized_name],
)
} else {
(LanguagePrecedenceOrigin::Default, vec![])
};
fn update_precedence<'a, 'b: 'a, LocalizationLanguage: Copy + 'a>(
language_list: &[String],
get_available_languages: fn() -> &'a HashMap<&'b str, LocalizationLanguage>,
set_language_precedence: fn(&[LocalizationLanguage]),
) {
let available_langs = get_available_languages();
let mut seen_languages = HashSet::new();
let language_precedence: Vec<_> = language_list
.iter()
.flat_map(|lang| LocalizationState::find_best_matches(lang, available_langs))
.filter(|&(lang_str, _)| seen_languages.insert(lang_str))
.map(|(_, localization_lang)| localization_lang)
.collect();
set_language_precedence(&language_precedence);
}
update_precedence(
&language_list,
fish_gettext::get_available_languages,
fish_gettext::set_language_precedence,
);
self.precedence_origin = precedence_origin;
}
fn update_from_status_language_builtin<'a, 'b: 'a, S: AsRef<str> + 'a>(
&mut self,
langs: &'b [S],
) -> SetLanguageLints<'a> {
let mut seen_in_input = HashSet::new();
let mut unique_lang_strs = vec![];
let mut duplicates = vec![];
for lang in langs {
let lang = lang.as_ref();
if seen_in_input.insert(lang) {
unique_lang_strs.push(lang);
} else {
duplicates.push(lang)
}
}
let mut all_available_langs = HashSet::new();
fn update_precedence<'a, 'b, 'c: 'a + 'b, LocalizationLanguage: Copy + 'a>(
unique_lang_strs: &[&str],
get_available_languages: fn() -> &'a HashMap<&'c str, LocalizationLanguage>,
set_language_precedence: fn(&[LocalizationLanguage]),
all_available_langs: &'b mut HashSet<&'c str>,
) {
let available_langs = get_available_languages();
for &lang in available_langs.keys() {
all_available_langs.insert(lang);
}
let mut existing_langs = vec![];
for lang in unique_lang_strs {
if let Some((&lang_str, &lang_value)) = available_langs.get_key_value(lang) {
existing_langs.push((lang_str, lang_value));
}
}
let mut seen = HashSet::new();
let unique_langs: Vec<_> = existing_langs
.into_iter()
.filter(|&(lang, _)| seen.insert(lang))
.map(|(_, localization_lang)| localization_lang)
.collect();
set_language_precedence(&unique_langs);
}
update_precedence(
&unique_lang_strs,
fish_gettext::get_available_languages,
fish_gettext::set_language_precedence,
&mut all_available_langs,
);
self.precedence_origin = LanguagePrecedenceOrigin::StatusLanguage;
let mut seen_non_existing = HashSet::new();
let non_existing: Vec<&str> = langs
.iter()
.map(|lang| lang.as_ref())
.filter(|&lang| !all_available_langs.contains(lang) && seen_non_existing.insert(lang))
.collect();
SetLanguageLints {
duplicates,
non_existing,
}
}
}
/// Stores the current localization state.
/// `precedence_origin` records where the active language precedence was taken from
/// (the default, a locale environment variable, the `LANGUAGE` variable, or `status language`).
///
/// This struct should be updated when the relevant variables change or `status language` is used
/// to modify the localization state.
static LOCALIZATION_STATE: LazyLock<Mutex<LocalizationState>> =
LazyLock::new(|| Mutex::new(LocalizationState::new()));
/// Call this when one of `LANGUAGE`, `LC_ALL`, `LC_MESSAGES`, `LANG` changes.
/// Updates internal state such that the correct localizations will be used in subsequent
/// localization requests.
///
/// For deciding how to localize, the following is done:
///
/// 1. If the language precedence was set via `status language`, env vars are ignored.
/// 2. Check the first non-empty value of the env vars `LC_ALL`, `LC_MESSAGES`, `LANG`. If it
/// starts with `C` or `POSIX` we consider this a C locale and disable localization.
/// 3. Otherwise, the value of the `LANGUAGE` env var is used, if non-empty. This allows specifying
/// multiple languages, with languages specified first taking precedence, e.g.
/// `LANGUAGE=zh_TW:zh_CN:pt_BR`
/// 4. Otherwise, the first non-empty value of the env vars `LC_ALL`, `LC_MESSAGES`, `LANG` is
/// used. This can only specify a single language, e.g. `LANG=de_AT.UTF-8`.
/// In this case, the locale name is normalized by stripping off the suffix, leaving only the `ll_CC` part.
/// 5. Otherwise, localization will not happen.
///
/// If users specify `ll_CC` as a language and we don't have a catalog for this language, but we
/// have one for `ll`, that will be used instead. If users specify `ll` (without specifying a
/// language variant), which we discourage, and we don't have a catalog for `ll`, but we do have
/// one for `ll_CC`, that will be used as a fallback. If we have multiple `ll_*` catalogs, all of
/// them will be used, in arbitrary order.
pub fn update_from_env(env: &EnvStack) {
let mut localization_state = LOCALIZATION_STATE.lock().unwrap();
localization_state.update_from_env(LocalizationVariables::from_env(env));
}
/// Call this when the `status language` builtin should update the language precedence.
/// `langs` should be the list of languages the precedence should be set to.
pub fn update_from_status_language_builtin<'a, 'b: 'a, S: AsRef<str> + 'a>(
langs: &'b [S],
) -> SetLanguageLints<'a> {
let mut localization_state = LOCALIZATION_STATE.lock().unwrap();
localization_state.update_from_status_language_builtin(langs)
}
pub fn unset_from_status_language_builtin(env: &EnvStack) {
let mut localization_state = LOCALIZATION_STATE.lock().unwrap();
localization_state.precedence_origin = LanguagePrecedenceOrigin::Default;
localization_state.update_from_env(LocalizationVariables::from_env(env));
}
pub fn status_language() -> WString {
let localization_state = LOCALIZATION_STATE.lock().unwrap();
let mut result = WString::new();
localizable_consts!(
LANGUAGE_LIST_VARIABLE_ORIGIN "%s variable"
);
let origin_string = match localization_state.precedence_origin {
LanguagePrecedenceOrigin::Default => wgettext!("default").to_owned(),
LanguagePrecedenceOrigin::LocaleVariable(var) => {
wgettext_fmt!(LANGUAGE_LIST_VARIABLE_ORIGIN, var.as_str())
}
LanguagePrecedenceOrigin::LanguageEnvVar => {
wgettext_fmt!(LANGUAGE_LIST_VARIABLE_ORIGIN, "LANGUAGE")
}
LanguagePrecedenceOrigin::StatusLanguage => {
wgettext_fmt!("%s command", "`status language set`")
}
};
result.push_utfstr(&wgettext_fmt!(
"Active languages (source: %s):",
origin_string
));
let gettext_language_precedence = fish_gettext::get_language_precedence();
append_space_separated_list(&mut result, &gettext_language_precedence);
result.push('\n');
result
}
pub fn list_available_languages() -> WString {
let mut language_set = HashSet::new();
fn add_languages<'a, 'b: 'a, LocalizationLanguage: 'a>(
language_set: &mut HashSet<&'b str>,
get_available_languages: fn() -> &'a HashMap<&'b str, LocalizationLanguage>,
) {
for &lang in get_available_languages().keys() {
language_set.insert(lang);
}
}
add_languages(&mut language_set, fish_gettext::get_available_languages);
let mut language_list = Vec::from_iter(language_set);
language_list.sort();
let mut languages = WString::new();
for lang in language_list {
languages.push_str(lang);
languages.push('\n');
}
languages
}

View File

@@ -1,5 +1,6 @@
//! Pager support.
use std::borrow::Cow;
use std::collections::HashMap;
use std::collections::hash_map::Entry;
@@ -108,7 +109,7 @@ pub struct Pager {
// then we definitely need to re-render.
have_unrendered_completions: bool,
prefix: WString,
prefix: Cow<'static, wstr>,
highlight_prefix: bool,
// The text of the search field.
@@ -390,8 +391,12 @@ fn completion_info_passes_filter(&self, info: &PagerComp) -> bool {
// Match against the completion strings.
for candidate in &info.comp {
if string_fuzzy_match_string(needle, &(self.prefix.clone() + &candidate[..]), false)
.is_some()
if string_fuzzy_match_string(
needle,
&(self.prefix.clone().into_owned() + &candidate[..]),
false,
)
.is_some()
{
return true;
}
@@ -429,7 +434,7 @@ fn completion_print(
let effective_selected_idx = self.visual_selected_completion_index(rows, cols);
for row in row_start..row_stop {
for (col, col_width) in width_by_column.iter().cloned().enumerate() {
for (col, col_width) in width_by_column.iter().copied().enumerate() {
let idx = col * rows + row;
if lst.len() <= idx {
continue;
@@ -639,7 +644,7 @@ pub fn set_completions(&mut self, raw_completions: &[Completion], enable_refilte
self.unfiltered_completion_infos = process_completions_into_infos(raw_completions);
// Maybe join them.
if self.prefix == "-" {
if *self.prefix == "-" {
join_completions(&mut self.unfiltered_completion_infos);
}
@@ -656,8 +661,8 @@ pub fn set_completions(&mut self, raw_completions: &[Completion], enable_refilte
}
// Sets the prefix.
pub fn set_prefix(&mut self, pref: &wstr, highlight: bool /* = true */) {
self.prefix = pref.to_owned();
pub fn set_prefix(&mut self, prefix: Cow<'static, wstr>, highlight: bool /* = true */) {
self.prefix = prefix;
self.highlight_prefix = highlight;
}
@@ -995,7 +1000,7 @@ pub fn is_empty(&self) -> bool {
pub fn clear(&mut self) {
self.unfiltered_completion_infos.clear();
self.completion_infos.clear();
self.prefix.clear();
self.prefix = Cow::Borrowed(L!(""));
self.highlight_prefix = false;
self.selected_completion_idx = None;
self.fully_disclosed = false;
@@ -1283,6 +1288,7 @@ mod tests {
use crate::termsize::Termsize;
use crate::tests::prelude::*;
use crate::wcstringutil::StringFuzzyMatch;
use std::borrow::Cow;
use std::num::NonZeroU16;
#[test]
@@ -1484,7 +1490,7 @@ macro_rules! validate {
StringFuzzyMatch::exact_match(),
CompleteFlags::default(),
)];
pager.set_prefix(L!("{\\\n"), false); // }
pager.set_prefix(Cow::Borrowed(L!("{\\\n")), false); // }
pager.set_completions(&c4s, true);
validate!(&mut pager, 30, L!("{\\␊Hello")); // }
}

View File

@@ -1,11 +1,13 @@
use std::{
panic::{UnwindSafe, set_hook, take_hook},
sync::atomic::{AtomicBool, Ordering},
sync::{
OnceLock,
atomic::{AtomicBool, Ordering},
},
time::Duration,
};
use libc::STDIN_FILENO;
use once_cell::sync::OnceCell;
use crate::{
common::{get_program_name, read_blocked},
@@ -13,7 +15,7 @@
threads::{asan_maybe_exit, is_main_thread},
};
pub static AT_EXIT: OnceCell<Box<dyn Fn() + Send + Sync>> = OnceCell::new();
pub static AT_EXIT: OnceLock<Box<dyn Fn() + Send + Sync>> = OnceLock::new();
pub fn panic_handler(main: impl FnOnce() -> i32 + UnwindSafe) -> ! {
// The isatty() check will stop us from hanging in most fish tests, but not those

View File

@@ -453,7 +453,7 @@ fn infinite_recursive_statement_in_job_list<'b>(
});
if infinite_recursive_statement.is_some() {
*out_func_name = forbidden_function_name.to_owned();
forbidden_function_name.clone_into(out_func_name);
}
// may be none
@@ -1865,7 +1865,7 @@ fn setup_group(&self, ctx: &OperationContext<'_>, j: &mut Job) {
.is_some_and(|job_group| job_group.has_job_id() || !j.wants_job_id())
&& !j.is_initially_background()
{
j.group = ctx.job_group.clone();
j.group.clone_from(&ctx.job_group);
return;
}

View File

@@ -10,11 +10,11 @@
use crate::wutil::{normalize_path, path_normalize_for_cd, waccess, wdirname, wstat};
use errno::{Errno, errno, set_errno};
use libc::{EACCES, ENOENT, ENOTDIR, F_OK, X_OK};
use once_cell::sync::Lazy;
use std::ffi::OsStr;
use std::io::ErrorKind;
use std::mem::MaybeUninit;
use std::os::unix::prelude::*;
use std::sync::LazyLock;
/// Returns the user configuration directory for fish. If the directory or one of its parents
/// doesn't exist, they are first created.
@@ -24,7 +24,7 @@
pub fn path_get_config() -> Option<WString> {
let dir = get_config_directory();
if dir.success() {
Some(dir.path.to_owned())
Some(dir.path.clone())
} else {
None
}
@@ -40,7 +40,7 @@ pub fn path_get_config() -> Option<WString> {
pub fn path_get_data() -> Option<WString> {
let dir = get_data_directory();
if dir.success() {
Some(dir.path.to_owned())
Some(dir.path.clone())
} else {
None
}
@@ -57,7 +57,7 @@ pub fn path_get_data() -> Option<WString> {
pub fn path_get_cache() -> Option<WString> {
let dir = get_cache_directory();
if dir.success() {
Some(dir.path.to_owned())
Some(dir.path.clone())
} else {
None
}
@@ -729,20 +729,20 @@ pub fn path_remoteness(path: &wstr) -> DirRemoteness {
}
fn get_data_directory() -> &'static BaseDirectory {
static DIR: Lazy<BaseDirectory> =
Lazy::new(|| make_base_directory(L!("XDG_DATA_HOME"), L!("/.local/share/fish")));
static DIR: LazyLock<BaseDirectory> =
LazyLock::new(|| make_base_directory(L!("XDG_DATA_HOME"), L!("/.local/share/fish")));
&DIR
}
fn get_cache_directory() -> &'static BaseDirectory {
static DIR: Lazy<BaseDirectory> =
Lazy::new(|| make_base_directory(L!("XDG_CACHE_HOME"), L!("/.cache/fish")));
static DIR: LazyLock<BaseDirectory> =
LazyLock::new(|| make_base_directory(L!("XDG_CACHE_HOME"), L!("/.cache/fish")));
&DIR
}
fn get_config_directory() -> &'static BaseDirectory {
static DIR: Lazy<BaseDirectory> =
Lazy::new(|| make_base_directory(L!("XDG_CONFIG_HOME"), L!("/.config/fish")));
static DIR: LazyLock<BaseDirectory> =
LazyLock::new(|| make_base_directory(L!("XDG_CONFIG_HOME"), L!("/.config/fish")));
&DIR
}

View File

@@ -28,7 +28,6 @@
SIGINT, SIGKILL, SIGPIPE, SIGQUIT, SIGSEGV, SIGSYS, SIGTTOU, STDOUT_FILENO, WCONTINUED,
WEXITSTATUS, WIFCONTINUED, WIFEXITED, WIFSIGNALED, WIFSTOPPED, WNOHANG, WTERMSIG, WUNTRACED,
};
use once_cell::sync::Lazy;
#[cfg(not(target_has_atomic = "64"))]
use portable_atomic::AtomicU64;
use std::cell::{Cell, Ref, RefCell, RefMut};
@@ -40,7 +39,7 @@
#[cfg(target_has_atomic = "64")]
use std::sync::atomic::AtomicU64;
use std::sync::atomic::{AtomicU8, Ordering};
use std::sync::{Arc, Mutex, OnceLock};
use std::sync::{Arc, LazyLock, Mutex, OnceLock};
/// Types of processes.
#[derive(Default)]
@@ -1615,8 +1614,8 @@ fn process_clean_after_marking(parser: &Parser, interactive: bool) -> bool {
pub fn have_proc_stat() -> bool {
// Check for /proc/self/stat to see if we are running with Linux-style procfs.
static HAVE_PROC_STAT_RESULT: Lazy<bool> =
Lazy::new(|| fs::metadata("/proc/self/stat").is_ok());
static HAVE_PROC_STAT_RESULT: LazyLock<bool> =
LazyLock::new(|| fs::metadata("/proc/self/stat").is_ok());
*HAVE_PROC_STAT_RESULT
}

View File

@@ -24,7 +24,6 @@
};
use nix::fcntl::OFlag;
use nix::sys::stat::Mode;
use once_cell::sync::Lazy;
#[cfg(not(target_has_atomic = "64"))]
use portable_atomic::AtomicU64;
use std::borrow::Cow;
@@ -43,7 +42,7 @@
#[cfg(target_has_atomic = "64")]
use std::sync::atomic::AtomicU64;
use std::sync::atomic::{AtomicI32, AtomicU8, AtomicU32, Ordering};
use std::sync::{Arc, Mutex, MutexGuard};
use std::sync::{Arc, LazyLock, Mutex, MutexGuard, OnceLock};
use std::time::{Duration, Instant};
use errno::{Errno, errno};
@@ -175,17 +174,16 @@ enum ExitState {
static EXIT_STATE: AtomicU8 = AtomicU8::new(ExitState::None as u8);
pub static SHELL_MODES: Lazy<Mutex<libc::termios>> =
Lazy::new(|| Mutex::new(unsafe { std::mem::zeroed() }));
pub static SHELL_MODES: LazyLock<Mutex<libc::termios>> =
LazyLock::new(|| Mutex::new(unsafe { std::mem::zeroed() }));
/// The valid terminal modes on startup.
/// Warning: this is read from the SIGTERM handler! Hence the raw global.
static TERMINAL_MODE_ON_STARTUP: once_cell::sync::OnceCell<libc::termios> =
once_cell::sync::OnceCell::new();
static TERMINAL_MODE_ON_STARTUP: OnceLock<libc::termios> = OnceLock::new();
/// Mode we use to execute programs.
static TTY_MODES_FOR_EXTERNAL_CMDS: Lazy<Mutex<libc::termios>> =
Lazy::new(|| Mutex::new(unsafe { std::mem::zeroed() }));
static TTY_MODES_FOR_EXTERNAL_CMDS: LazyLock<Mutex<libc::termios>> =
LazyLock::new(|| Mutex::new(unsafe { std::mem::zeroed() }));
static RUN_COUNT: AtomicU64 = AtomicU64::new(0);
@@ -560,6 +558,12 @@ enum JumpPrecision {
To,
}
#[derive(Copy, Clone, Eq, PartialEq)]
enum CompletionAction {
ShownAmbiguous,
InsertedUnique,
}
/// readline_loop_state_t encapsulates the state used in a readline loop.
struct ReadlineLoopState {
/// The last command that was executed.
@@ -569,10 +573,7 @@ struct ReadlineLoopState {
yank_len: usize,
/// If the last "complete" readline command has inserted text into the command line.
complete_did_insert: bool,
/// List of completions.
comp: Vec<Completion>,
completion_action: Option<CompletionAction>,
/// Whether the loop has finished, due to reaching the character limit or through executing a
/// command.
@@ -587,8 +588,7 @@ fn new() -> Self {
Self {
last_cmd: None,
yank_len: 0,
complete_did_insert: true,
comp: vec![],
completion_action: None,
finished: false,
nchars: None,
}
@@ -2472,7 +2472,6 @@ fn eval_bind_cmd(&mut self, cmd: &wstr) {
self.parser.eval(cmd, &IoChain::new());
self.parser.set_last_statuses(last_statuses);
scoped_tty.reclaim();
}
/// Run a sequence of commands from an input binding.
@@ -2558,7 +2557,7 @@ fn handle_char_event(&mut self, injected_event: Option<CharEvent>) -> ControlFlo
if self.reset_loop_state {
self.reset_loop_state = false;
self.rls_mut().last_cmd = None;
self.rls_mut().complete_did_insert = false;
self.rls_mut().completion_action = None;
}
// Perhaps update the termsize. This is cheap if it has not changed.
reader_update_termsize(self.parser);
@@ -2920,7 +2919,7 @@ fn handle_readline_command(&mut self, c: ReadlineCmd) {
// but never complete{,_and_search})
//
// Also paging is already cancelled above.
if self.rls().complete_did_insert
if self.rls().completion_action == Some(CompletionAction::InsertedUnique)
&& matches!(
self.rls().last_cmd,
Some(rl::Complete | rl::CompleteAndSearch)
@@ -2969,8 +2968,7 @@ fn handle_readline_command(&mut self, c: ReadlineCmd) {
return;
}
if self.is_navigating_pager_contents()
|| (!self.rls().comp.is_empty()
&& !self.rls().complete_did_insert
|| (self.rls().completion_action == Some(CompletionAction::ShownAmbiguous)
&& self.rls().last_cmd == Some(rl::Complete))
{
// The user typed complete more than once in a row. If we are not yet fully
@@ -2994,7 +2992,6 @@ fn handle_readline_command(&mut self, c: ReadlineCmd) {
let mut tty = TtyHandoff::new(reader_save_screen_state);
tty.disable_tty_protocols();
self.compute_and_apply_completions(c);
tty.reclaim();
}
}
rl::PagerToggleSearch => {
@@ -3295,7 +3292,7 @@ fn handle_readline_command(&mut self, c: ReadlineCmd) {
self.history_pager = Some(0..1);
// Update the pager data.
self.pager.set_search_field_shown(true);
self.pager.set_prefix(L!(""), false);
self.pager.set_prefix(Cow::Borrowed(L!("")), false);
// Update the search field, which triggers the actual history search.
let search_string = if !self.history_search.active()
|| self.history_search.search_string().is_empty()
@@ -5988,14 +5985,10 @@ fn reader_run_command(parser: &Parser, cmd: &wstr) -> EvalRes {
// Provide values for `status current-command` and `status current-commandline`
if !ft.is_empty() {
parser.libdata_mut().status_vars.command = ft.to_owned();
parser.libdata_mut().status_vars.command = ft.clone();
parser.libdata_mut().status_vars.commandline = cmd.to_owned();
// Also provide a value for the deprecated fish 2.0 $_ variable
parser.set_one(
L!("_"),
ParserEnvSetMode::new(EnvMode::GLOBAL),
ft.to_owned(),
);
parser.set_one(L!("_"), ParserEnvSetMode::new(EnvMode::GLOBAL), ft.clone());
}
reader_write_title(cmd, parser, true);
@@ -6116,19 +6109,20 @@ fn add_to_history(&mut self) {
// Remove ephemeral items - even if the text is empty.
self.history.remove_ephemeral_items();
if !text.is_empty() {
// Mark this item as ephemeral if should_add_to_history says no (#615).
let mode = if !self.should_add_to_history(&text) {
PersistenceMode::Ephemeral
} else if in_private_mode(self.vars()) {
PersistenceMode::Memory
} else {
PersistenceMode::Disk
};
self.history
.add_pending_with_file_detection(&text, &self.parser.variables, mode);
if text.is_empty() {
return;
}
// Mark this item as ephemeral if should_add_to_history says no (#615).
let mode = if !self.should_add_to_history(&text) {
PersistenceMode::Ephemeral
} else if in_private_mode(self.vars()) {
PersistenceMode::Memory
} else {
PersistenceMode::Disk
};
self.history
.add_pending_with_file_detection(&text, &self.parser.variables, mode);
}
/// Check if we have background jobs that we have not warned about.
@@ -6519,15 +6513,6 @@ fn reader_can_replace(s: &wstr, flags: CompleteFlags) -> bool {
.any(|c| matches!(c, '$' | '*' | '?' | '(' | '{' | '}' | ')'))
}
/// Determine the best (lowest) match rank for a set of completions.
fn get_best_rank(comp: &[Completion]) -> u32 {
let mut best_rank = u32::MAX;
for c in comp {
best_rank = best_rank.min(c.rank());
}
best_rank
}
impl<'a> Reader<'a> {
/// Compute completions and update the pager and/or commandline as needed.
fn compute_and_apply_completions(&mut self, c: ReadlineCmd) {
@@ -6590,8 +6575,7 @@ fn compute_and_apply_completions(&mut self, c: ReadlineCmd) {
return;
}
ExpandResultCode::ok => {
self.rls_mut().comp.clear();
self.rls_mut().complete_did_insert = false;
self.rls_mut().completion_action = None;
self.push_edit(
EditableLineTag::Commandline,
Edit::new(token_range, wc_expanded),
@@ -6604,12 +6588,11 @@ fn compute_and_apply_completions(&mut self, c: ReadlineCmd) {
// up to the end of the token we're completing.
let cmdsub = &el.text()[cmdsub_range.start..token_range.end];
let (comp, _needs_load) = complete(
let (mut comp, _needs_load) = complete(
cmdsub,
CompletionRequestOptions::normal(),
&self.parser.context(),
);
self.rls_mut().comp = comp;
let el = &self.command_line;
// User-supplied completions may have changed the commandline - prevent buffer
@@ -6618,23 +6601,22 @@ fn compute_and_apply_completions(&mut self, c: ReadlineCmd) {
token_range.end = std::cmp::min(token_range.end, el.text().len());
// Munge our completions.
sort_and_prioritize(
&mut self.rls_mut().comp,
CompletionRequestOptions::default(),
);
sort_and_prioritize(&mut comp, CompletionRequestOptions::default());
let el = &self.command_line;
// Record our cycle_command_line.
self.cycle_command_line = el.text().to_owned();
self.cycle_cursor_pos = token_range.end;
self.rls_mut().complete_did_insert = self.handle_completions(token_range);
let inserted_unique = self.handle_completions(token_range, comp);
self.rls_mut().completion_action = if inserted_unique {
Some(CompletionAction::InsertedUnique)
} else {
(!self.pager.is_empty()).then_some(CompletionAction::ShownAmbiguous)
};
// Show the search field if requested and if we printed a list of completions.
if c == ReadlineCmd::CompleteAndSearch
&& !self.rls().complete_did_insert
&& !self.pager.is_empty()
{
if c == ReadlineCmd::CompleteAndSearch && !inserted_unique && !self.pager.is_empty() {
self.pager.set_search_field_shown(true);
self.select_completion_in_direction(SelectionMotion::Next, false);
}
@@ -6643,7 +6625,7 @@ fn compute_and_apply_completions(&mut self, c: ReadlineCmd) {
fn try_insert(&mut self, c: Completion, tok: &wstr, token_range: Range<usize>) {
// If this is a replacement completion, check that we know how to replace it, e.g. that
// the token doesn't contain evil operators like {}.
if !c.flags.contains(CompleteFlags::REPLACES_TOKEN) || reader_can_replace(tok, c.flags) {
if !c.replaces_token() || reader_can_replace(tok, c.flags) {
self.completion_insert(
&c.completion,
token_range.end,
@@ -6668,11 +6650,30 @@ fn try_insert(&mut self, c: Completion, tok: &wstr, token_range: Range<usize>) {
/// \param token_end the position after the token to complete
///
/// Return true if we inserted text into the command line, false if we did not.
fn handle_completions(&mut self, token_range: Range<usize>) -> bool {
fn handle_completions(&mut self, token_range: Range<usize>, mut comp: Vec<Completion>) -> bool {
let tok = self.command_line.text()[token_range.clone()].to_owned();
let comp = &self.rls().comp;
// Check trivial cases.
comp.retain({
let best_rank = comp.iter().map(|c| c.rank()).min().unwrap_or(u32::MAX);
move |c| {
// Ignore completions with a less suitable match rank than the best.
assert!(c.rank() >= best_rank);
c.rank() == best_rank
}
});
// Determine whether we are going to replace the token or not. If any commands of the best
// rank do not require replacement, then ignore all those that want to use replacement.
let will_replace_token = comp.iter().all(|c| c.replaces_token());
comp.retain(|c| !c.replaces_token() || reader_can_replace(&tok, c.flags));
for c in &mut comp {
if !will_replace_token && c.replaces_token() {
c.flags |= CompleteFlags::SUPPRESS_PAGER_PREFIX;
}
}
let len = comp.len();
if len == 0 {
// No suitable completions found, flash screen and return.
@@ -6684,85 +6685,27 @@ fn handle_completions(&mut self, token_range: Range<usize>) -> bool {
return false;
} else if len == 1 {
// Exactly one suitable completion found - insert it.
let c = &comp[0];
self.try_insert(c.clone(), &tok, token_range);
return true;
}
let best_rank = get_best_rank(comp);
// Determine whether we are going to replace the token or not. If any commands of the best
// rank do not require replacement, then ignore all those that want to use replacement.
let mut will_replace_token = true;
for c in comp {
if c.rank() <= best_rank && !c.flags.contains(CompleteFlags::REPLACES_TOKEN) {
will_replace_token = false;
break;
}
}
// Decide which completions survived. There may be a lot of them; it would be nice if we could
// figure out how to avoid copying them here.
let mut surviving_completions = vec![];
let mut all_matches_exact_or_prefix = true;
for c in comp {
// Ignore completions with a less suitable match rank than the best.
if c.rank() > best_rank {
continue;
}
// Only use completions that match replace_token.
let completion_replaces_token = c.flags.contains(CompleteFlags::REPLACES_TOKEN);
let replaces_only_due_to_case_mismatch = {
c.flags.contains(CompleteFlags::REPLACES_TOKEN)
&& c.r#match.is_exact_or_prefix()
&& !matches!(c.r#match.case_fold, CaseSensitivity::Sensitive)
};
if completion_replaces_token != will_replace_token {
// Keep smart/samecase results even if we prefer not to replace the token.
if will_replace_token || !replaces_only_due_to_case_mismatch {
continue;
}
}
// Don't use completions that want to replace, if we cannot replace them.
if completion_replaces_token && !reader_can_replace(&tok, c.flags) {
continue;
}
all_matches_exact_or_prefix &= c.r#match.is_exact_or_prefix();
let mut completion = c.clone();
if replaces_only_due_to_case_mismatch && !will_replace_token {
completion.flags |= CompleteFlags::SUPPRESS_PAGER_PREFIX;
}
surviving_completions.push(completion);
}
if surviving_completions.len() == 1 {
// After sorting and stuff only one completion is left, use it.
//
// TODO: This happens when smartcase kicks in, e.g.
// the token is "cma" and the options are "cmake/" and "CMakeLists.txt"
// it would be nice if we could figure
// out how to use it more.
let c = std::mem::take(&mut surviving_completions[0]);
let c = std::mem::take(&mut comp[0]);
self.try_insert(c, &tok, token_range);
return true;
}
let mut use_prefix = false;
let mut common_prefix = L!("").to_owned();
let mut common_prefix = L!("");
let all_matches_exact_or_prefix = comp.iter().all(|c| c.r#match.is_exact_or_prefix());
assert!(will_replace_token || all_matches_exact_or_prefix);
if all_matches_exact_or_prefix {
// Try to find a common prefix to insert among the surviving completions.
let mut flags = CompleteFlags::empty();
let mut prefix_is_partial_completion = false;
let mut first = true;
for c in &surviving_completions {
for c in &comp {
if c.flags.contains(CompleteFlags::SUPPRESS_PAGER_PREFIX) {
continue;
}
if first {
// First entry, use the whole string.
common_prefix = c.completion.clone();
common_prefix = &c.completion;
flags = c.flags;
first = false;
} else {
@@ -6777,7 +6720,7 @@ fn handle_completions(&mut self, token_range: Range<usize>) -> bool {
}
// idx is now the length of the new common prefix.
common_prefix.truncate(idx);
common_prefix = common_prefix.slice_to(idx);
prefix_is_partial_completion = true;
// Early out if we decide there's no common prefix.
@@ -6801,7 +6744,7 @@ fn handle_completions(&mut self, token_range: Range<usize>) -> bool {
flags |= CompleteFlags::NO_SPACE;
}
self.completion_insert(
&common_prefix,
common_prefix,
token_range.end,
flags,
/*is_unique=*/ false,
@@ -6811,48 +6754,53 @@ fn handle_completions(&mut self, token_range: Range<usize>) -> bool {
}
}
// Print the completion list.
let prefix = if will_replace_token && !use_prefix {
Cow::Borrowed(L!(""))
} else {
let mut prefix = WString::new();
let full = if will_replace_token {
common_prefix.to_owned()
} else {
tok + common_prefix
};
if full.len() <= PREFIX_MAX_LEN {
prefix = full;
} else {
// Collapse parent directories and append end of string
prefix.push(get_ellipsis_char());
let truncated = &full[full.len() - PREFIX_MAX_LEN..];
let (i, last_component) = truncated.split('/').enumerate().last().unwrap();
if i == 0 {
// No path separators were found in the common prefix, so we can't collapse
// any further
prefix.push_utfstr(&truncated);
} else {
// Discard any parent directories and include what's left
prefix.push('/');
prefix.push_utfstr(last_component);
};
}
Cow::Owned(prefix)
};
if use_prefix {
for c in &mut surviving_completions {
let common_prefix_len = common_prefix.len();
for c in &mut comp {
if c.flags.contains(CompleteFlags::SUPPRESS_PAGER_PREFIX) {
// Keep replacement semantics and the original prefix so these completions can
// fix casing when selected.
continue;
}
c.flags &= !CompleteFlags::REPLACES_TOKEN;
c.completion.replace_range(0..common_prefix.len(), L!(""));
c.completion.replace_range(0..common_prefix_len, L!(""));
}
}
// Print the completion list.
let mut prefix = WString::new();
if will_replace_token || !all_matches_exact_or_prefix {
if use_prefix {
prefix.push_utfstr(&common_prefix);
}
} else if tok.len() + common_prefix.len() <= PREFIX_MAX_LEN {
prefix.push_utfstr(&tok);
prefix.push_utfstr(&common_prefix);
} else {
// Collapse parent directories and append end of string
prefix.push(get_ellipsis_char());
let full = tok + &common_prefix[..];
let truncated = &full[full.len() - PREFIX_MAX_LEN..];
let (i, last_component) = truncated.split('/').enumerate().last().unwrap();
if i == 0 {
// No path separators were found in the common prefix, so we can't collapse
// any further
prefix.push_utfstr(&truncated);
} else {
// Discard any parent directories and include what's left
prefix.push('/');
prefix.push_utfstr(last_component);
};
}
// Update the pager data.
self.pager.set_prefix(&prefix, true);
self.pager.set_completions(&surviving_completions, true);
self.pager.set_prefix(prefix, true);
self.pager.set_completions(&comp, true);
// Modify the command line to reflect the new pager.
self.pager_selection_changed();
false
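
The truncation above keeps the pager prefix readable by collapsing parent directories behind an ellipsis once the prefix exceeds `PREFIX_MAX_LEN`. A minimal standalone sketch of that logic, using plain `&str` instead of `wstr` and an assumed limit of 16 (the real constant and helper name differ):

```rust
// Hypothetical re-sketch of the prefix-collapsing step: if the full
// prefix is too long, keep an ellipsis plus the last path component;
// if the truncated tail has no '/', keep the tail as-is.
const PREFIX_MAX_LEN: usize = 16; // assumed limit for illustration
const ELLIPSIS: char = '…';

fn collapse_prefix(full: &str) -> String {
    let chars: Vec<char> = full.chars().collect();
    if chars.len() <= PREFIX_MAX_LEN {
        return full.to_string();
    }
    // Take only the last PREFIX_MAX_LEN characters.
    let truncated: String = chars[chars.len() - PREFIX_MAX_LEN..].iter().collect();
    let mut prefix = String::new();
    prefix.push(ELLIPSIS);
    match truncated.rsplit_once('/') {
        // Discard any parent directories and include what's left.
        Some((_, last_component)) => {
            prefix.push('/');
            prefix.push_str(last_component);
        }
        // No path separators found, so we can't collapse any further.
        None => prefix.push_str(&truncated),
    }
    prefix
}

fn main() {
    assert_eq!(collapse_prefix("src/main.rs"), "src/main.rs");
    assert_eq!(collapse_prefix("/very/long/path/to/some/file.rs"), "…/file.rs");
}
```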


@@ -22,8 +22,8 @@
use libc::{ONLCR, STDERR_FILENO, STDOUT_FILENO};
use crate::common::{
get_ellipsis_char, get_omitted_newline_str, get_omitted_newline_width,
has_working_tty_timestamps, shell_modes, wcs2bytes, write_loop,
get_ellipsis_char, get_omitted_newline_str, has_working_tty_timestamps, shell_modes, wcs2bytes,
write_loop,
};
use crate::env::Environment;
use crate::flog::{flog, flogf};
@@ -687,10 +687,12 @@ fn abandon_line_string(screen_width: Option<usize>) -> Vec<u8> {
let mut abandon_line_string = Vec::with_capacity(screen_width + 32);
let omitted_newline_str = get_omitted_newline_str();
// Do the PROMPT_SP hack.
// Don't need to check for fish_wcwidth errors; this is done when setting up
// omitted_newline_char in common.rs.
let non_space_width = get_omitted_newline_width();
let non_space_width = omitted_newline_str.chars().count();
// We do `>` rather than `>=` because the code below might require one extra space.
if screen_width > non_space_width {
if use_terminfo() {
@@ -732,13 +734,13 @@ fn abandon_line_string(screen_width: Option<usize>) -> Vec<u8> {
abandon_line_string.write_command(EnterDimMode);
}
abandon_line_string.extend_from_slice(get_omitted_newline_str().as_bytes());
abandon_line_string.extend_from_slice(omitted_newline_str.as_bytes());
abandon_line_string.write_command(ExitAttributeMode);
abandon_line_string.extend(repeat_n(b' ', screen_width - non_space_width));
}
abandon_line_string.push(b'\r');
abandon_line_string.extend_from_slice(get_omitted_newline_str().as_bytes());
abandon_line_string.extend_from_slice(omitted_newline_str.as_bytes());
// Now we are certainly on a new line. But we may have dropped the omitted newline char on
// it. So append enough spaces to overwrite the omitted newline char, and then clear all the
// spaces from the new line.
@@ -1978,7 +1980,7 @@ fn compute_layout(
);
let left_prompt_width = left_prompt_layout.last_line_width;
let mut right_prompt_width = right_prompt_layout.last_line_width;
let right_prompt_width = right_prompt_layout.last_line_width;
// Get the width of the first line, and if there is more than one line.
let first_command_line_width: usize = line_at_cursor(commandline_before_suggestion, 0)
@@ -2014,24 +2016,14 @@ fn compute_layout(
..Default::default()
};
// Hide the right prompt if it doesn't fit on the first line.
if left_prompt_width + first_command_line_width + right_prompt_width < screen_width {
result.right_prompt = right_prompt;
} else {
right_prompt_width = 0;
}
// Now we should definitely fit.
assert!(left_prompt_width + right_prompt_width <= screen_width);
// Track each logical line from the autosuggestion so we can determine how much of it fits
// on screen. We allow the lines to soft wrap naturally and we only truncate vertically if
// we would exceed the screen height.
let cursor_y = left_prompt_layout.line_starts.len() - 1
+ commandline_before_suggestion
.chars()
.filter(|&c| c == '\n')
.count();
let commandline_before_suggestion_lines = commandline_before_suggestion
.chars()
.filter(|&c| c == '\n')
.count();
let cursor_y = left_prompt_layout.line_starts.len() - 1 + commandline_before_suggestion_lines;
let mut suggestion_lines = vec![];
@@ -2060,7 +2052,8 @@ fn compute_layout(
+ commandline_before_suggestion
.chars()
.rposition(|c| c == '\n')
.map_or(right_prompt_width, indent_width)
.map(indent_width)
.unwrap_or_default()
} else {
indent_width(suggestion_start - "\n".len())
};
@@ -2120,6 +2113,24 @@ fn consumed_lines_or_truncated_suggestion(
let mut autosuggestion = WString::new();
let mut displayed_len = 0;
{
// Hide the right prompt if it doesn't fit on the first line.
let first_command_line_suggestion_width = if commandline_before_suggestion_lines == 0 {
suggestion_lines.first().map_or(0, |line| {
line.chars().map(wcwidth_rendered_min_0).sum::<usize>()
})
} else {
0
};
if left_prompt_width
+ first_command_line_width
+ first_command_line_suggestion_width
+ right_prompt_width
<= screen_width
{
result.right_prompt = right_prompt;
}
}
for (line_idx, autosuggestion_line) in suggestion_lines.iter().enumerate() {
if line_idx != 0 {
autosuggestion.push('\n');
@@ -2419,20 +2430,19 @@ fn test_prompt_truncation() {
fn test_compute_layout() {
macro_rules! validate {
(
(
$screen_width:expr,
$left_untrunc_prompt:literal,
$right_untrunc_prompt:literal,
$commandline_before_suggestion:literal,
$autosuggestion_str:literal,
$commandline_after_suggestion:literal
)
-> (
$left_prompt:literal,
$left_prompt_space:expr,
$right_prompt:literal,
$autosuggestion:literal $(,)?
)
(
$screen_width:expr,
$left_untrunc_prompt:literal,
$right_untrunc_prompt:literal,
$commandline_before_suggestion:literal,
$autosuggestion_str:literal,
$commandline_after_suggestion:literal
) -> (
$left_prompt:literal,
$left_prompt_space:expr,
$right_prompt:literal,
$autosuggestion:literal $(,)?
)
) => {{
let full_commandline = L!($commandline_before_suggestion).to_owned()
+ L!($autosuggestion_str)
@@ -2480,7 +2490,7 @@ macro_rules! validate {
) -> (
"left>",
5,
"<right",
"",
" autosuggesTION",
)
);
@@ -2602,7 +2612,7 @@ macro_rules! validate {
) -> (
"left>",
5,
"",
"<RIGHT",
"and AUTOSUGGESTION",
)
);
@@ -2612,7 +2622,7 @@ macro_rules! validate {
) -> (
"left>",
5,
"",
"<RIGHT",
"AUTOSUGGESTION",
)
);
@@ -2622,7 +2632,7 @@ macro_rules! validate {
) -> (
"left>",
5,
"",
"<RIGHT",
"utosuggestion sofT WRAP",
)
);


@@ -11,8 +11,10 @@
use crate::tty_handoff::{safe_deactivate_tty_protocols, safe_mark_tty_invalid};
use crate::wutil::{fish_wcstoi, perror};
use errno::{errno, set_errno};
use once_cell::sync::Lazy;
use std::sync::atomic::{AtomicI32, Ordering};
use std::sync::{
LazyLock,
atomic::{AtomicI32, Ordering},
};
/// Store the "main" pid. This allows us to reliably determine if we are in a forked child.
static MAIN_PID: AtomicI32 = AtomicI32::new(0);
@@ -278,7 +280,7 @@ pub fn signal_handle(sig: Signal) {
sigaction(sig, &act, std::ptr::null_mut());
}
pub static signals_to_default: Lazy<libc::sigset_t> = Lazy::new(|| {
pub static signals_to_default: LazyLock<libc::sigset_t> = LazyLock::new(|| {
let mut set = MaybeUninit::uninit();
unsafe { libc::sigemptyset(set.as_mut_ptr()) };
for data in SIGNAL_TABLE.iter() {

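The hunks above replace `once_cell::sync::Lazy` with `std::sync::LazyLock`, stabilized in Rust 1.80, keeping the initializer closure unchanged. A minimal sketch of the pattern with a hypothetical static (the real `signals_to_default` builds a `libc::sigset_t`):

```rust
// once_cell::sync::Lazy -> std::sync::LazyLock: same shape, no extra dependency.
use std::sync::LazyLock;

// Hypothetical stand-in for a lazily built table like signals_to_default.
static SIGNAL_NAMES: LazyLock<Vec<&'static str>> =
    LazyLock::new(|| vec!["SIGINT", "SIGTERM", "SIGHUP"]);

fn main() {
    // First deref runs the initializer; later derefs reuse the cached value.
    assert_eq!(SIGNAL_NAMES.len(), 3);
    assert_eq!(SIGNAL_NAMES[0], "SIGINT");
}
```

Note that `Lazy::get` has no stable `LazyLock` counterpart until Rust 1.94, which is why `src/env_dispatch.rs` keeps the `once_cell` variant.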

@@ -9,7 +9,6 @@
use crate::text_face::{TextFace, TextStyling, UnderlineStyle};
use crate::threads::MainThread;
use bitflags::bitflags;
use once_cell::sync::OnceCell;
use std::cell::{RefCell, RefMut};
use std::env;
use std::ffi::{CStr, CString};
@@ -17,9 +16,8 @@
use std::os::fd::RawFd;
use std::os::unix::ffi::OsStrExt;
use std::path::PathBuf;
use std::sync::Arc;
use std::sync::Mutex;
use std::sync::atomic::{AtomicU8, Ordering};
use std::sync::{Arc, Mutex, OnceLock};
bitflags! {
#[derive(Copy, Clone, Default)]
@@ -397,7 +395,7 @@ fn osc_133_prompt_start(out: &mut impl Output) -> bool {
if !future_feature_flags::test(FeatureFlag::MarkPrompt) {
return false;
}
static TEST_BALLOON: OnceCell<()> = OnceCell::new();
static TEST_BALLOON: OnceLock<()> = OnceLock::new();
if TEST_BALLOON.set(()).is_ok() {
write_to_output!(out, "\x1b]133;A;click_events=1\x1b\\");
} else {

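The `TEST_BALLOON` change above also shows the "do this exactly once" idiom now built on `std::sync::OnceLock` (stable since Rust 1.70): `set(())` succeeds only for the first caller. A self-contained sketch, with a hypothetical wrapper function:

```rust
use std::sync::OnceLock;

// Returns true exactly once, on the first call; mirrors the
// TEST_BALLOON pattern used for one-time escape-sequence emission.
fn announce_once() -> bool {
    static TEST_BALLOON: OnceLock<()> = OnceLock::new();
    // `set` returns Ok only for the caller that initialized the cell.
    TEST_BALLOON.set(()).is_ok()
}

fn main() {
    assert!(announce_once());  // first call wins
    assert!(!announce_once()); // subsequent calls are no-ops
}
```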

@@ -9,16 +9,16 @@
use crate::topic_monitor::topic_monitor_init;
use crate::wutil::wgetcwd;
use crate::{env::EnvStack, proc::proc_init};
use once_cell::sync::OnceCell;
use std::cell::RefCell;
use std::collections::HashMap;
use std::env::set_current_dir;
use std::path::PathBuf;
use std::sync::OnceLock;
pub use serial_test::serial;
pub fn test_init() -> impl ScopeGuarding<Target = ()> {
static DONE: OnceCell<()> = OnceCell::new();
static DONE: OnceLock<()> = OnceLock::new();
DONE.get_or_init(|| {
// If we are building with `cargo build` and have built with `cmake`, this might not
// yet exist.


@@ -20,19 +20,21 @@
use crate::wutil::{perror, wcstoi};
use fish_wchar::ToWString;
use libc::{EINVAL, ENOTTY, EPERM, STDIN_FILENO, WNOHANG};
use once_cell::sync::OnceCell;
use std::mem::MaybeUninit;
use std::sync::atomic::{AtomicBool, AtomicPtr, Ordering};
use std::sync::{
OnceLock,
atomic::{AtomicBool, AtomicPtr, Ordering},
};
/// Whether kitty keyboard protocol support is present in the TTY.
static KITTY_KEYBOARD_SUPPORTED: OnceCell<bool> = OnceCell::new();
static KITTY_KEYBOARD_SUPPORTED: OnceLock<bool> = OnceLock::new();
/// Set that the TTY supports the kitty keyboard protocol.
pub fn maybe_set_kitty_keyboard_capability() {
KITTY_KEYBOARD_SUPPORTED.get_or_init(|| true);
}
pub(crate) static SCROLL_CONTENT_UP_SUPPORTED: OnceCell<bool> = OnceCell::new();
pub(crate) static SCROLL_CONTENT_UP_SUPPORTED: OnceLock<bool> = OnceLock::new();
pub(crate) const SCROLL_CONTENT_UP_TERMINFO_CODE: &str = "indn";
// Get the support capability for kitty keyboard protocol.
@@ -47,10 +49,10 @@ pub fn maybe_set_scroll_content_up_capability() {
});
}
pub static TERMINAL_OS_NAME: OnceCell<Option<WString>> = OnceCell::new();
pub static TERMINAL_OS_NAME: OnceLock<Option<WString>> = OnceLock::new();
pub(crate) const XTGETTCAP_QUERY_OS_NAME: &str = "query-os-name";
pub static XTVERSION: OnceCell<WString> = OnceCell::new();
pub static XTVERSION: OnceLock<WString> = OnceLock::new();
pub fn xtversion() -> Option<&'static wstr> {
XTVERSION.get().as_ref().map(|s| s.as_utfstr())
@@ -65,7 +67,7 @@ pub enum TtyQuirks {
// Running in iTerm2 before 3.5.12, which causes issues when using the kitty keyboard protocol.
PreKittyIterm2,
// Whether we are running under tmux.
Tmux,
Tmux((u32, u32)),
// Whether we are running under WezTerm.
Wezterm,
}
@@ -80,8 +82,8 @@ fn detect(vars: &dyn Environment, xtversion: &wstr) -> Self {
PreCsiMidnightCommander
} else if get_iterm2_version(xtversion).is_some_and(|v| v < (3, 5, 12)) {
PreKittyIterm2
} else if xtversion.starts_with(L!("tmux ")) {
Tmux
} else if let Some(version) = get_tmux_version(xtversion) {
Tmux(version)
} else if xtversion.starts_with(L!("WezTerm ")) {
Wezterm
} else {
@@ -180,12 +182,18 @@ fn get_protocols(self) -> TtyProtocolsSet {
let mut off_chain = vec![];
// Enable focus reporting under tmux
if self == TtyQuirks::Tmux {
if matches!(self, TtyQuirks::Tmux(_)) {
on_chain.push(DecsetFocusReporting);
off_chain.push(DecrstFocusReporting);
}
on_chain.extend_from_slice(&[DecsetBracketedPaste, DecsetColorThemeReporting]);
off_chain.extend_from_slice(&[DecrstBracketedPaste, DecrstColorThemeReporting]);
on_chain.push(DecsetBracketedPaste);
off_chain.push(DecrstBracketedPaste);
if !matches!(
self, TtyQuirks::Tmux(version) if version < (3, 7)
) {
on_chain.push(DecsetColorThemeReporting);
off_chain.push(DecrstColorThemeReporting);
}
let on_chain = || on_chain.clone().into_iter();
let off_chain = || off_chain.clone().into_iter();
@@ -349,14 +357,12 @@ pub struct TtyHandoff {
// The job group which owns the tty, or empty if none.
owner: Option<JobGroupRef>,
// Whether terminal protocols were initially enabled.
// reclaim() restores the state to this.
// Restored on drop.
tty_protocols_initial: bool,
// The state of terminal protocols that we set.
// Note we track this separately from TTY_PROTOCOLS_ACTIVE. We undo the changes
// we make.
tty_protocols_applied: bool,
// Whether reclaim was called, restoring the tty to its pre-scoped value.
reclaimed: bool,
// Called after writing to the TTY.
on_write: fn(),
}
@@ -368,7 +374,6 @@ pub fn new(on_write: fn()) -> Self {
owner: None,
tty_protocols_initial: protocols_active,
tty_protocols_applied: protocols_active,
reclaimed: false,
on_write,
}
}
@@ -400,40 +405,6 @@ pub fn to_job_group(&mut self, jg: &JobGroupRef) {
}
}
/// Reclaim the tty if we transferred it.
pub fn reclaim(mut self) {
self.reclaim_impl()
}
/// Release the tty, meaning no longer restore anything in Drop - similar to `mem::forget`.
pub fn release(mut self) {
self.reclaimed = true;
}
/// Implementation of reclaim, factored out for use in Drop.
fn reclaim_impl(&mut self) {
assert!(!self.reclaimed, "Terminal already reclaimed");
self.reclaimed = true;
if self.owner.is_some() {
flog!(proc_pgroup, "fish reclaiming terminal");
if unsafe { libc::tcsetpgrp(STDIN_FILENO, libc::getpgrp()) } == -1 {
flog!(
warning,
"Could not return shell to foreground:",
errno::errno()
);
perror("tcsetpgrp");
}
self.owner = None;
}
// Restore the terminal protocols. Note this does nothing if they were unchanged.
if self.tty_protocols_initial {
self.enable_tty_protocols();
} else {
self.disable_tty_protocols();
}
}
/// Save the current tty modes into the owning job group, if we are transferred.
pub fn save_tty_modes(&mut self) {
if let Some(ref mut owner) = self.owner {
@@ -591,11 +562,25 @@ fn try_transfer(jg: &JobGroup) -> bool {
}
}
/// The destructor will assert if reclaim() has not been called.
impl Drop for TtyHandoff {
fn drop(&mut self) {
if !self.reclaimed {
self.reclaim_impl();
if self.owner.is_some() {
flog!(proc_pgroup, "fish reclaiming terminal");
if unsafe { libc::tcsetpgrp(STDIN_FILENO, libc::getpgrp()) } == -1 {
flog!(
warning,
"Could not return shell to foreground:",
errno::errno()
);
perror("tcsetpgrp");
}
self.owner = None;
}
// Restore the terminal protocols. Note this does nothing if they were unchanged.
if self.tty_protocols_initial {
self.enable_tty_protocols();
} else {
self.disable_tty_protocols();
}
}
}
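The `TtyHandoff` change above moves the restore logic from an explicit, assert-guarded `reclaim()` into `Drop`, so the saved state is restored unconditionally when the handoff goes out of scope. A reduced sketch of that restore-on-drop pattern, with hypothetical names (`Handoff`, `PROTOCOLS_ACTIVE`) standing in for the real tty-protocol state:

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// Hypothetical stand-in for the global "protocols active" flag.
static PROTOCOLS_ACTIVE: AtomicBool = AtomicBool::new(false);

struct Handoff {
    // State observed at construction; Drop restores it.
    initial: bool,
}

impl Handoff {
    fn new() -> Self {
        Handoff { initial: PROTOCOLS_ACTIVE.load(Ordering::Relaxed) }
    }
    fn enable(&self) {
        PROTOCOLS_ACTIVE.store(true, Ordering::Relaxed);
    }
}

impl Drop for Handoff {
    fn drop(&mut self) {
        // Restore the initial state; no explicit reclaim() call required,
        // and forgetting to call one can no longer trigger an assert.
        PROTOCOLS_ACTIVE.store(self.initial, Ordering::Relaxed);
    }
}

fn main() {
    {
        let h = Handoff::new();
        h.enable();
        assert!(PROTOCOLS_ACTIVE.load(Ordering::Relaxed));
    } // `h` dropped here; state restored
    assert!(!PROTOCOLS_ACTIVE.load(Ordering::Relaxed));
}
```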
@@ -604,15 +589,24 @@ fn drop(&mut self) {
fn get_iterm2_version(xtversion: &wstr) -> Option<(u32, u32, u32)> {
// TODO split_once
let mut xtversion = xtversion.split(' ');
let name = xtversion.next().unwrap();
let version = xtversion.next()?;
if name != "iTerm2" {
if xtversion.next().unwrap() != "iTerm2" {
return None;
}
let mut parts = version.split('.');
let mut version = xtversion.next()?.split('.');
Some((
wcstoi(parts.next()?).ok()?,
wcstoi(parts.next()?).ok()?,
wcstoi(parts.next()?).ok()?,
wcstoi(version.next()?).ok()?,
wcstoi(version.next()?).ok()?,
wcstoi(version.next()?).ok()?,
))
}
// If we are running under tmux, get the version as a tuple of (major, minor).
fn get_tmux_version(xtversion: &wstr) -> Option<(u32, u32)> {
// TODO split_once
let mut xtversion = xtversion.split(' ');
if xtversion.next().unwrap() != "tmux" {
return None;
}
let mut version = xtversion.next()?.split('.');
Some((wcstoi(version.next()?).ok()?, wcstoi(version.next()?).ok()?))
}
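This parsed version feeds the quirk gating above: color theme reporting stays disabled under `TtyQuirks::Tmux(version)` when `version < (3, 7)`. A hedged re-sketch of the parser on plain `&str` instead of `wstr` (using `split_once` and `str::parse` in place of the `wcstoi` helper):

```rust
// Parse an XTVERSION reply like "tmux 3.7" into (major, minor);
// returns None for other terminals or malformed replies.
fn get_tmux_version(xtversion: &str) -> Option<(u32, u32)> {
    let (name, version) = xtversion.split_once(' ')?;
    if name != "tmux" {
        return None;
    }
    let mut parts = version.split('.');
    Some((parts.next()?.parse().ok()?, parts.next()?.parse().ok()?))
}

fn main() {
    assert_eq!(get_tmux_version("tmux 3.7"), Some((3, 7)));
    assert_eq!(get_tmux_version("WezTerm 20240203"), None);
    // Tuple comparison is lexicographic, so the gate reads naturally:
    let v = get_tmux_version("tmux 3.5").unwrap();
    assert!(v < (3, 7)); // below 3.7: keep color theme reporting off
}
```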

Some files were not shown because too many files have changed in this diff.