Compare commits

...

128 Commits

Author SHA1 Message Date
calixteman
c00591c1b6
Merge pull request #20623 from calixteman/bug2014080
In tagged pdfs, TH can be either a column header or a row header (bug 2014080)
2026-02-06 16:42:22 +01:00
Calixte Denizet
280a02150e
In tagged pdfs, TH can be either a column header or a row header (bug 2014080) 2026-02-06 10:04:13 +01:00
calixteman
58ac273f1f
Merge pull request #20503 from andriivitiv/Fix-Worker-was-terminated-error
Fix `Worker was terminated` error when loading is cancelled
2026-02-06 09:59:05 +01:00
calixteman
b92bdf80a2
Merge pull request #20628 from calixteman/bug2014399
Cap the max canvas dimensions in order to avoid downscaling large images in the worker (bug 2014399)
2026-02-06 09:42:04 +01:00
Tim van der Meij
f302323c7e
Merge pull request #20627 from Snuffleupagus/ChunkedStream-onReceiveData-rm-copy
Improve progress reporting in `ChunkedStreamManager`, and prevent unnecessary data copy in `ChunkedStream.prototype.onReceiveData`
2026-02-05 21:27:42 +01:00
Tim van der Meij
a0f3528053
Merge pull request #20624 from calixteman/bug2013793
Flush the text content chunk only on real font changes (bug 2013793)
2026-02-05 21:14:49 +01:00
Calixte Denizet
ff42c0bd50
Cap the max canvas dimensions in order to avoid downscaling large images in the worker (bug 2014399) 2026-02-05 20:25:36 +01:00
Jonas Jenwald
b3cd042ded Prevent unnecessary data copy in ChunkedStream.prototype.onReceiveData
This method is only invoked via `ChunkedStreamManager.prototype.sendRequest`, which currently returns data in `Uint8Array` format (since it potentially combines multiple `ArrayBuffer`s).
Hence we end up doing a short-lived, but still completely unnecessary, data copy[1] in `ChunkedStream.prototype.onReceiveData` when handling range requests. In practice this is unlikely to be a big problem by default, given that streaming is used and the (low) value of the `rangeChunkSize` API-option. (However, in custom PDF.js deployments it might affect things more.)

Given that no data copy is better than a short lived one, let's fix this small oversight and add non-production `assert`s to keep it working as intended.
This way we also improve consistency, since all other streaming and range request methods (see e.g. `BasePDFStream` and related code) only return `ArrayBuffer` data.

---
[1] Remember that `new Uint8Array(arrayBuffer)` only creates a view of the underlying `arrayBuffer`, whereas `new Uint8Array(typedArray)` actually creates a copy of the `typedArray`.
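
For reference, a tiny standalone snippet (not pdf.js code) showing the view-versus-copy difference described in the footnote:

    const buffer = new ArrayBuffer(4);
    const original = new Uint8Array(buffer);

    // Passing an ArrayBuffer creates a *view*: no data is copied, and writes
    // through one view are visible through the other.
    const view = new Uint8Array(buffer);
    original[0] = 42;
    console.log(view[0]); // 42

    // Passing a typed array creates a *copy*: new memory is allocated and the
    // data is duplicated, so later writes to the original are not reflected.
    const copy = new Uint8Array(original);
    original[0] = 7;
    console.log(copy[0]); // still 42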
2026-02-05 16:16:36 +01:00
Jonas Jenwald
01deb085f8 Improve progress reporting in the ChunkedStreamManager
Currently there are two small bugs, which have existed for around a decade, in the `loaded` property that's sent via the "DocProgress" message from the `ChunkedStreamManager.prototype.onReceiveData` method.

 - When the entire PDF has loaded the `loaded` property can become larger than the `total` property, which obviously doesn't make sense.
   This happens whenever the size of the PDF is *not* a multiple of the `rangeChunkSize` API-option, which is a very common situation.

 - When streaming is being used, the `loaded` property can become smaller than the actually loaded amount of data.
   This happens whenever the size of a streamed chunk is *not* a multiple of the `rangeChunkSize` API-option, which is a common situation.
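
A rough illustration of the first point, assuming (as a simplification, not the actual implementation) that `loaded` was derived from the number of received chunks:

    const rangeChunkSize = 65536; // default value of the API-option
    const total = 150000;         // PDF size, *not* a multiple of rangeChunkSize

    // Counting whole chunks over-reports once the final (partial) chunk arrives.
    const loadedChunks = Math.ceil(total / rangeChunkSize); // 3
    const loaded = loadedChunks * rangeChunkSize;           // 196608 > total

    // Clamping against the document size avoids reporting loaded > total.
    console.log(Math.min(loaded, total)); // 150000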
2026-02-05 16:04:45 +01:00
calixteman
222a24c623
Merge pull request #20622 from calixteman/bug2014167
Let the toggle button in the alt-text dialog download (resp. delete) the model and enable (resp. disable) alt-text guessing (bug 2014167)
2026-02-04 14:13:05 +01:00
calixteman
1e0ba4dfec
Merge pull request #20621 from calixteman/bug2013899
Avoid having to download the model when toggling the button in the alt-text image settings dialog (bug 2013899)
2026-02-04 14:12:27 +01:00
calixteman
22b97d1741
Flush the text content chunk only on real font changes (bug 2013793) 2026-02-03 23:11:31 +01:00
Calixte Denizet
ea993bfc1b
Let the toggle button in the alt-text dialog download (resp. delete) the model and enable (resp. disable) alt-text guessing (bug 2014167) 2026-02-03 20:27:06 +01:00
Calixte Denizet
c7bea3b342
Avoid having to download the model when toggling the button in the alt-text image settings dialog (bug 2013899) 2026-02-03 19:26:33 +01:00
calixteman
1c12b07726
Merge pull request #20613 from calixteman/ccittfax_pdfium
Use the ccittfax decoder from pdfium
2026-02-02 15:07:53 +01:00
calixteman
88c2051698
Use the ccittfax decoder from pdfium
The decoder is a dependency of the jbig2 one and is already
included in pdf.js, so we just need to wire it up.
It improves the performance of documents using ccittfax images.
2026-02-02 11:10:32 +01:00
Jonas Jenwald
bfd17b2586
Merge pull request #20615 from Snuffleupagus/transport-onProgress
Report loading progress "automatically" when using the `PDFDataTransportStream` class, and remove the `PDFDataRangeTransport.prototype.onDataProgress` method
2026-02-01 22:36:43 +01:00
Jonas Jenwald
d152e92185
Merge pull request #20614 from Snuffleupagus/BasePDFStream-url
Change all relevant `BasePDFStream` implementations to take an actual `URL` instance
2026-02-01 22:13:28 +01:00
Tim van der Meij
f4326e17c4
Merge pull request #20610 from calixteman/brotli
Add support for Brotli decompression
2026-02-01 20:41:06 +01:00
Tim van der Meij
3f21efc942
Merge pull request #20607 from Snuffleupagus/rm-web-interfaces
Replace the various interfaces in `web/interfaces.js` with proper classes
2026-02-01 20:31:13 +01:00
Tim van der Meij
8eb9340fa7
Merge pull request #20617 from timvandermeij/bump
Bump the stable version in `pdfjs.config`
2026-02-01 20:23:40 +01:00
Tim van der Meij
031f633236
Bump the stable version in pdfjs.config 2026-02-01 20:20:49 +01:00
Jonas Jenwald
6509fdb1d6 Assert that PDFFetchStream is only used with HTTP(S) URLs
Note how `getDocument` checks the protocol, via the `isValidFetchUrl` helper, before attempting to use the `PDFFetchStream` implementation.
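Such a check boils down to inspecting the parsed protocol; an illustrative sketch (the hypothetical `looksLikeFetchUrl` below is not the actual `isValidFetchUrl` helper):

    // Illustrative only: accept a URL for fetch-based loading when it parses
    // and resolves to an HTTP(S) protocol, and reject anything else.
    function looksLikeFetchUrl(url, baseUrl = undefined) {
      try {
        const { protocol } = new URL(url, baseUrl);
        return protocol === "http:" || protocol === "https:";
      } catch {
        return false; // Not a parsable URL at all.
      }
    }

    console.log(looksLikeFetchUrl("https://example.com/file.pdf")); // true
    console.log(looksLikeFetchUrl("file:///tmp/file.pdf"));         // false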
2026-02-01 18:21:27 +01:00
Jonas Jenwald
586e85888b Change all relevant BasePDFStream implementations to take an actual URL instance
Currently this code expects a "url string", rather than a proper `URL` instance, which seems completely unnecessary now. The explanation for this is, as so often is the case, "historical reasons" since a lot of this code predates the general availability of `URL`.
2026-02-01 18:21:13 +01:00
Jonas Jenwald
76dabeddb3 Limit the Math.sumPrecise polyfill to non-MOZCENTRAL builds
After https://bugzilla.mozilla.org/show_bug.cgi?id=1985121 this functionality is now guaranteed to be available in Firefox.
Unfortunately general browser support is still somewhat lacking; see https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Math/sumPrecise#browser_compatibility
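
A feature-check guard is enough to limit such a polyfill to environments that lack the method; the fallback below is only an illustrative sketch (Neumaier compensated summation), not the pdf.js polyfill, and it does not provide the full-precision guarantee of the native method:

    if (typeof Math.sumPrecise !== "function") {
      // Illustrative fallback: compensated (Neumaier) summation.
      Math.sumPrecise = function (values) {
        let sum = 0,
          compensation = 0;
        for (const value of values) {
          const t = sum + value;
          compensation +=
            Math.abs(sum) >= Math.abs(value) ? sum - t + value : value - t + sum;
          sum = t;
        }
        return sum + compensation;
      };
    }

    console.log(Math.sumPrecise([1e20, 0.1, -1e20])); // 0.1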

Also, while unrelated, use the `MathClamp` helper in the `applyOpacity` function.
2026-02-01 18:20:29 +01:00
Jonas Jenwald
d25f13d1fd Report loading progress "automatically" when using the PDFDataTransportStream class, and remove the PDFDataRangeTransport.prototype.onDataProgress method
This is consistent with the other `BasePDFStream` implementations, and simplifies the API surface of the `PDFDataRangeTransport` class (note the changes in the viewer).
Given that the `onDataProgress` method was changed to a no-op this won't affect third-party users, assuming there even are any since this code was written specifically for the Firefox PDF Viewer.
2026-02-01 18:20:19 +01:00
Jonas Jenwald
aa4d0f7c07 Temporarily disable typestest in GitHub Actions 2026-02-01 17:56:07 +01:00
Jonas Jenwald
023af46186 Replace the IRenderableView interface with an abstract RenderableView class
This should help reduce the maintenance burden of the code, since you no longer need to remember to update separate code when touching the different page/thumbnail classes.
2026-02-01 17:56:06 +01:00
Jonas Jenwald
839c257f87 Replace the IDownloadManager interface with an abstract BaseDownloadManager class
This should help reduce the maintenance burden of the code, since you no longer need to remember to update separate code when touching the different `DownloadManager` classes.
2026-02-01 17:56:03 +01:00
Jonas Jenwald
ff7f87fc21 Replace the IPDFPrintServiceFactory interface with an abstract BasePrintServiceFactory class
This should help reduce the maintenance burden of the code, since you no longer need to remember to update separate code when touching the different `PDFPrintServiceFactory` classes.
2026-02-01 17:53:45 +01:00
Jonas Jenwald
50a12e3e67 Remove the IL10n interface
Given that we either use the `L10n` class directly or extend it via `GenericL10n`, it should no longer be necessary to keep the interface-definition.
This should help reduce the maintenance burden of the code, since you no longer need to remember to update separate code when touching the `L10n` class.
2026-02-01 17:53:45 +01:00
Jonas Jenwald
b517b5c597 Remove the IPDFLinkService interface
Given that `SimpleLinkService` now extends the regular `PDFLinkService` class, see PR 18013, it should no longer be necessary to keep the interface-definition.
This should help reduce the maintenance burden of the code, since you no longer need to remember to update separate code when touching the `PDFLinkService` class.
2026-02-01 17:53:45 +01:00
Tim van der Meij
384c6208b2
Merge pull request #20565 from kairosci/fix-bug-20557
fix: Fix mailto links truncated at dash
2026-02-01 17:34:34 +01:00
Tim van der Meij
e4cd3176ab
Merge pull request #20602 from Snuffleupagus/BasePDFStream-2
Replace the `IPDFStream`, `IPDFStreamReader`, and `IPDFStreamRangeReader` interfaces with proper base classes
2026-02-01 16:53:17 +01:00
Jonas Jenwald
ecb09d62fc Add the current loading percentage to the onPassword callback
The percentage calculation is currently "spread out" across various viewer functionality, which we can avoid by having the API handle that instead.

Also, remove the `this.#lastProgress` special-case[1] and just register a "normal" `fullReader.onProgress` callback unconditionally. Once `headersReady` is resolved the callback can simply be removed when not needed, since the "worst" thing that could theoretically happen is that the loadingBar (in the viewer) updates sooner this way. In practice though, since `fullReader.read` cannot return data until `headersReady` is resolved, this change is not actually observable in the API.

---

[1] This was added in PR 8617, close to a decade ago, but it's not obvious to me that it was ever necessary to implement it that way.
2026-01-31 16:33:58 +01:00
calixteman
43273fde27
Add support for Brotli decompression
For now, `BrotliDecode` hasn't been specified yet, but it should be in the
near future.
So, when possible, we use the native `DecompressionStream` API
with "brotli" as argument.
If that fails, or if we have to decompress in a sync context, we fall back
to `BrotliStream`, which is a pure JS implementation (see README in external/brotli).
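
A hedged sketch of that strategy (illustrative only; the pure-JS decoder is injected here as a parameter rather than naming the bundled implementation):

    async function decompressBrotli(compressedBytes, jsFallback) {
      try {
        // Throws synchronously if the "brotli" format isn't supported yet.
        const ds = new DecompressionStream("brotli");
        const response = new Response(
          new Blob([compressedBytes]).stream().pipeThrough(ds)
        );
        return new Uint8Array(await response.arrayBuffer());
      } catch {
        // Unsupported format (or a decode failure): use the JS implementation.
        return jsFallback(compressedBytes);
      }
    }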
2026-01-31 16:25:53 +01:00
Jonas Jenwald
4ca205bac3 Add an abstract BasePDFStreamRangeReader class, that all the old IPDFStreamRangeReader implementations inherit from
Given that there are no fewer than *five* different, but very similar, implementations, this helps reduce code duplication and simplifies maintenance.
2026-01-30 14:15:39 +01:00
Jonas Jenwald
54d8c5e7b4 Add an abstract BasePDFStreamReader class, that all the old IPDFStreamReader implementations inherit from
Given that there are no fewer than *five* different, but very similar, implementations, this helps reduce code duplication and simplifies maintenance.

Also, remove the `rangeChunkSize` not defined checks in all the relevant stream-constructor implementations.
Note how the API, since some time, always validates *and* provides that parameter when creating a `BasePDFStreamReader`-instance.
2026-01-30 14:15:39 +01:00
Jonas Jenwald
4a8fb4dde1 Add an abstract BasePDFStream class, that all the old IPDFStream implementations inherit from
Given that there are no fewer than *five* different, but very similar, implementations, this helps reduce code duplication and simplifies maintenance.

Also, spotted during rebasing, pass the `enableHWA` option "correctly" (i.e. as part of the existing `transportParams`) to the `WorkerTransport`-class to keep the constructor simpler.
2026-01-30 14:15:39 +01:00
Jonas Jenwald
a80f10ff1a Remove the onProgress callback from the IPDFStreamRangeReader interface
Note how there's *nowhere* in the code-base where the `IPDFStreamRangeReader.prototype.onProgress` callback is actually being set and used; however, the loadingBar (in the viewer) still works just fine since loading progress is already reported via:
 - The `ChunkedStreamManager` instance respectively the `getPdfManager` function, through the use of a "DocProgress" message, on the worker-thread.
 - A `IPDFStreamReader.prototype.onProgress` callback, on the main-thread.

Furthermore, it would definitely *not* be a good idea to add any `IPDFStreamRangeReader.prototype.onProgress` callbacks since they only include the `loaded`-property which would trigger the "indeterminate" loadingBar (in the viewer).

Looking briefly at the history of this code it's not clear, at least to me, when this became unused; however, it's probably close to a decade ago.
2026-01-30 14:15:39 +01:00
Jonas Jenwald
05b78ce03c Stop registering an onProgress callback on the PDFWorkerStreamRangeReader-instance, in the ChunkedStreamManager class
Given that nothing in the `PDFWorkerStreamRangeReader` class attempts to invoke the `onProgress` callback, this is effectively dead code now.
Looking briefly at the history of this code it's not clear, at least to me, when this became unused; however, it's probably close to a decade ago.

Finally, note also how progress is already being reported through the `ChunkedStreamManager.prototype.onReceiveData` method.
2026-01-30 14:15:38 +01:00
Jonas Jenwald
987265720e Remove the unused IPDFStreamRangeReader.prototype.isStreamingSupported getter
This getter was only invoked from `src/display/network.js` and `src/core/chunked_stream.js`, however in both cases it's hardcoded to `false` and thus isn't actually needed.
This originated in PR 6879, close to a decade ago, for a potential TODO which was never implemented and it ought to be OK to just simplify this now.
2026-01-30 14:15:38 +01:00
Jonas Jenwald
62d5408cf0 Stop tracking progressiveDataLength in the ChunkedStreamManager class
Currently this property is essentially "duplicated", so let's instead use the identical one that's available on the `ChunkedStream` instance.
2026-01-30 14:15:38 +01:00
Jonas Jenwald
814df09e21
Merge pull request #20603 from Snuffleupagus/createChromiumPrefsSchema
Improve preferences building, and generate the `preferences_schema.json` file for the Chromium addon
2026-01-30 14:12:27 +01:00
Jonas Jenwald
d4fbae06d9
Merge pull request #20605 from Snuffleupagus/rm-util-global-tests
Remove unit-tests for global `ReadableStream` and `URL`
2026-01-30 14:06:15 +01:00
Jonas Jenwald
1370950843 Remove unnecessary IIFEs when setting the compatParams
This really isn't necessary, and it's just a left-over from before the code was moved into the current file.

Also, spotted during rebasing, use the existing "locale" hash-parameter in integration-tests rather than adding a duplicate one for testing.
2026-01-30 13:31:16 +01:00
Jonas Jenwald
5d02076313 Add tests (and CI) to ensure that preference generation works correctly for all relevant build-targets
Given that previous patches reduced the number of build-targets running this code, ensure that it's still tested sufficiently.
2026-01-30 13:31:13 +01:00
Jonas Jenwald
2a83f955b0 Make getDefaultPreferences a synchronous function, to simplify the build scripts 2026-01-30 13:26:19 +01:00
Jonas Jenwald
06cf7dd7b0 Stop pre-building the preference defaults, to simplify the build scripts
This is a left-over from before the introduction of `AppOptions`, but is no longer necessary now.
2026-01-30 13:26:19 +01:00
Jonas Jenwald
35e78f7f11 Generate the preferences_schema.json file, for the Chromium addon, during building
This avoids the hassle of having to manually update that file when adding/modifying preferences in the viewer.
Updating the preferences-metadata should now only be something that the Chromium addon maintainer has to do.
2026-01-30 13:26:16 +01:00
Jonas Jenwald
a2909f9b66 List postcss-values-parser as an import alias in the ESLint config
This is similar to how other packages are handled, note e.g. the `fluent` ones.
2026-01-30 10:26:23 +01:00
Jonas Jenwald
9c903a0ebc Remove unit-tests for global ReadableStream and URL
These unit-tests were added many years ago, when this functionality wasn't generally available and we still bundled polyfills.
Since they are both available everywhere nowadays, see [here](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) and [here](https://developer.mozilla.org/en-US/docs/Web/API/URL), these unit-tests no longer make sense.
2026-01-30 10:16:21 +01:00
Tim van der Meij
471adfd023
Merge pull request #20596 from Snuffleupagus/FileSpec-fixes
Simplify the `FileSpec` class, and remove no longer needed polyfills
2026-01-29 22:03:38 +01:00
Tim van der Meij
7cdd03ad9b
Merge pull request #20595 from Snuffleupagus/NetworkManager-simplify
Simplify the `NetworkManager` class, and inline it in the  `PDFNetworkStream` class
2026-01-29 21:56:58 +01:00
Tim van der Meij
c0572c1c8f
Merge pull request #20594 from Snuffleupagus/Node-ReadableStream
[Node.js] Don't abort the full request for local PDF files smaller than two range requests, and use standard `ReadableStream`s
2026-01-29 21:48:43 +01:00
Tim van der Meij
2cef80d05b
Merge pull request #20598 from calixteman/fix_sidebar_resize
Fix the sidebar resizer accessibility
2026-01-29 21:42:30 +01:00
Jonas Jenwald
5b368dd58a Remove the Uint8Array.prototype.toHex(), Uint8Array.prototype.toBase64(), and Uint8Array.fromBase64() polyfills
(During rebasing of the previous patches I happened to look at the polyfills and noticed that this one could be removed now.)

See:
 - https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array/toHex#browser_compatibility
 - https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array/toBase64#browser_compatibility
 - https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array/fromBase64#browser_compatibility

Note that technically this functionality can still be disabled via a preference in Firefox, however that's slated for removal in [bug 1985120](https://bugzilla.mozilla.org/show_bug.cgi?id=1985120).
Looking at the Firefox source-code, see https://searchfox.org/firefox-main/search?q=array.tobase64%28%29&path=&case=false&regexp=false, you can see that it's already being used *unconditionally* elsewhere in the browser hence removing the polyfills ought to be fine (since toggling the preference would break other parts of the browser).
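
For reference, the now-native methods in action (run in an environment where they ship unflagged):

    const bytes = new Uint8Array([72, 101, 108, 108, 111]); // "Hello"

    console.log(bytes.toHex());    // "48656c6c6f"
    console.log(bytes.toBase64()); // "SGVsbG8="

    const roundTripped = Uint8Array.fromBase64("SGVsbG8=");
    console.log(String.fromCharCode(...roundTripped)); // "Hello"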
2026-01-29 17:27:43 +01:00
Jonas Jenwald
247ee02299 Remove the Promise.try() polyfill
(During rebasing of the previous patches I happened to look at the polyfills and noticed that this one could be removed now.)

Note:
 - https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/try#browser_compatibility
 - https://bugzilla.mozilla.org/show_bug.cgi?id=1928493
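
For reference, `Promise.try` wraps a callback so that a synchronous throw surfaces as a rejection instead of escaping the promise chain:

    function mayThrowSynchronously() {
      throw new Error("boom");
    }

    Promise.try(mayThrowSynchronously).catch(err => {
      console.log("caught:", err.message); // "caught: boom"
    });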
2026-01-29 17:25:46 +01:00
Jonas Jenwald
b3f35b6007 Return the rawFilename as-is even if it's empty, from FileSpec.prototype.serializable
It's more correct to return the `rawFilename` as-is, and limit the fallback for empty filenames to only the `filename` property.
2026-01-29 17:25:44 +01:00
Calixte Denizet
956c2a051a
Fix the sidebar resizer accessibility
Its width was a bit wrong because of its box-sizing property, which was causing some issues when resizing it with the keyboard.
And for the thumbnails sidebar, the tabindex and some ARIA properties were missing.
2026-01-27 21:41:11 +01:00
calixteman
5505201930
Merge pull request #20582 from calixteman/reorg_save
Add a manage button in the thumbnail view in order to save an edited pdf (bug 2010830)
2026-01-26 19:25:50 +01:00
Calixte Denizet
dd6a0c6cf4
Add a manage button in the thumbnail view in order to save an edited pdf (bug 2010830) 2026-01-26 18:09:07 +01:00
calixteman
07a4aab246
Merge pull request #20587 from calixteman/reorg_simplify_mapping
Refactor the page mapping code a bit in order to be able to support deleting/copying pages
2026-01-26 17:43:49 +01:00
Calixte Denizet
806133379e
Refactor the page mapping code a bit in order to be able to support deleting/copying pages 2026-01-26 16:53:52 +01:00
calixteman
48df8a5ea2
Merge pull request #20586 from marco-c/commentundo
Bug 1999154 - Add the ability to undo comment deletion
2026-01-26 15:52:15 +01:00
calixteman
ab7d388ccb
Merge pull request #20591 from calixteman/reorg_fix_drag_width
Fix the drag marker dimensions in the thumbnails view
2026-01-26 09:00:01 +01:00
Calixte Denizet
f2ac669ee4
Fix the drag marker dimensions in the thumbnails view
STR:
 - open outline view;
 - switch to thumbnails one;
 - start to drag a thumbnail.
2026-01-25 21:49:17 +01:00
calixteman
001058abb2
Merge pull request #20593 from calixteman/decompress_content_stream
Use DecompressionStream in async code
2026-01-25 21:27:16 +01:00
Alessio Attilio
50f2d4db65 fix: allow hyphens in mailto link auto-detection (bug 20557)
Modified the regex in web/autolinker.js to explicitly allow hyphens (-) in
the domain part of email addresses, while maintaining the exclusion of
other punctuation. This fixes mailto links like user@uni-city.tld being
truncated at the hyphen.

Fixes #20557
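
An illustrative sketch of the kind of change described (a simplified matcher, not the actual regex in web/autolinker.js): the domain character class explicitly includes `-` while still excluding other punctuation:

    const emailRegex = /\b[\w.+-]+@[\w-]+(?:\.[\w-]+)+\b/g;

    const text = "Contact user@uni-city.tld or admin@example.org.";
    console.log(text.match(emailRegex));
    // [ "user@uni-city.tld", "admin@example.org" ]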
2026-01-25 17:20:14 +01:00
calixteman
9f660be8a2
Use DecompressionStream in async code
Usually, content stream or fonts are compressed using FlateDecode.
So use the DecompressionStream API to decompress those streams
in the async code path.
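
A minimal sketch of that async path (illustrative, not the pdf.js code): FlateDecode data is zlib-wrapped deflate, which maps to the "deflate" format of `DecompressionStream`:

    async function inflate(compressedBytes) {
      const reader = new Blob([compressedBytes])
        .stream()
        .pipeThrough(new DecompressionStream("deflate"))
        .getReader();

      // Collect the decompressed chunks and concatenate them.
      const chunks = [];
      let length = 0;
      for (;;) {
        const { value, done } = await reader.read();
        if (done) {
          break;
        }
        chunks.push(value);
        length += value.length;
      }
      const result = new Uint8Array(length);
      let offset = 0;
      for (const chunk of chunks) {
        result.set(chunk, offset);
        offset += chunk.length;
      }
      return result;
    }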
2026-01-25 14:22:19 +01:00
Jonas Jenwald
640a3106d5 Remove caching/shadowing from the FileSpec getters, and simplify the code
Given that only the `FileSpec.prototype.serializable` getter is ever invoked from "outside" of the class, and only once per `FileSpec`-instance, the caching/shadowing isn't actually necessary.

Furthermore the `_contentRef`-caching wasn't actually correct, since it ended up storing a `BaseStream`-instance and those should *generally* never be cached.
(Since calling `BaseStream.prototype.getBytes()` more than once, without resetting the stream in between, will return an empty TypedArray after the first time.)
2026-01-25 13:16:29 +01:00
Jonas Jenwald
84b5866853 Reduce duplication in the pickPlatformItem helper function
Also, tweak code/comment used when handling "GoToR" destinations.
2026-01-25 13:16:06 +01:00
Jonas Jenwald
eaf605d720 Move the NetworkManager functionality into the PDFNetworkStream class
The `NetworkManager` is very old code at this point, and it predates the introduction of the streaming functionality by many years.
To simplify things, especially with upcoming re-factoring patches, let's move this functionality into private (and semi-private) methods in the `PDFNetworkStream` class to avoid having to deal with too many different scopes.
2026-01-25 12:41:33 +01:00
Jonas Jenwald
4d9301fceb Simplify the NetworkManager class
Store pending requests in a `WeakMap`, which allows working directly with the `XMLHttpRequest`-data and removes the need for a couple of methods.

Simplify the `PDFNetworkStreamRangeRequestReader.prototype._onDone` method, since after removing `moz-chunked-arraybuffer` support (see PR 10678) it's never called without an argument.

Stop sending the `begin`-property to the `onDone`-callbacks since it's unused. Originally this code was shared with the Firefox PDF Viewer, many years ago, and then that property mattered.

Re-factor the `status` validation, to avoid checking for a "bad" range-request response unconditionally.
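
A hedged sketch of the `WeakMap` pattern mentioned above (illustrative only, not the actual `PDFNetworkStream` code): keying per-request state on the `XMLHttpRequest` object itself lets the event handlers look the state up directly, and the entry disappears together with the request:

    const pendingRequests = new WeakMap();

    function startRangeRequest(url, begin, end, onDone) {
      const xhr = new XMLHttpRequest();
      pendingRequests.set(xhr, { begin, end, onDone });

      xhr.open("GET", url);
      xhr.responseType = "arraybuffer";
      xhr.setRequestHeader("Range", `bytes=${begin}-${end - 1}`);
      xhr.onload = () => {
        // The per-request state is reachable directly from the event target.
        const state = pendingRequests.get(xhr);
        pendingRequests.delete(xhr);
        state.onDone(new Uint8Array(xhr.response));
      };
      xhr.send(null);
      return xhr;
    }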
2026-01-25 12:41:20 +01:00
Jonas Jenwald
663d4cd6e7 Use standard ReadableStreams in the src/display/node_stream.js code
Thanks to newer Node.js functionality, see https://nodejs.org/api/stream.html#streamreadabletowebstreamreadable-options, we can use standard `ReadableStream`s which help to significantly shorten and simplify the code.

For older Node.js versions we use the `node-readable-to-web-readable-stream` package, see https://www.npmjs.com/package/node-readable-to-web-readable-stream, to get the same functionality.
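
For reference, a minimal example of the Node.js conversion (the file path is just a placeholder):

    import { createReadStream } from "node:fs";
    import { Readable } from "node:stream";

    // Wrap a classic Node.js Readable in a standard (WHATWG) ReadableStream.
    const nodeStream = createReadStream("document.pdf");
    const webStream = Readable.toWeb(nodeStream);

    const reader = webStream.getReader();
    const { value, done } = await reader.read();
    console.log(done ? "empty file" : `first chunk: ${value.length} bytes`);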
2026-01-25 12:34:47 +01:00
Jonas Jenwald
45294d31cb In Node.js, don't abort the full request for local PDF files smaller than two range requests
This follows the behaviour used with both the Fetch API and `XMLHttpRequest`, compare with the `validateRangeRequestCapabilities` helper function.
2026-01-25 12:34:35 +01:00
Tim van der Meij
95f62f3b33
Merge pull request #20580 from calixteman/reorg_link_outline
Fix links and outline after reorganizing a pdf
2026-01-23 20:49:20 +01:00
Tim van der Meij
23c7b1ac8e
Merge pull request #20583 from calixteman/simplify_menu
Hide the menu container by changing its visibility
2026-01-23 20:41:08 +01:00
Tim van der Meij
bfa44af327
Merge pull request #20554 from dgiessing/patch-1
Update image pattern in gulpfile to accommodate missing images
2026-01-23 20:22:23 +01:00
Tim van der Meij
109ea59fbc
Merge pull request #20588 from mozilla/dependabot/npm_and_yarn/lodash-4.17.23
Bump lodash from 4.17.21 to 4.17.23
2026-01-23 20:16:58 +01:00
David Giessing
0aa4fc6af8
fix: Update image pattern in gulpfile to accommodate missing images 2026-01-22 23:01:46 +01:00
Marco Castelluccio
bfdcaadf7c
Use kbUndo when possible 2026-01-22 13:12:38 +01:00
Marco Castelluccio
bdc9323b15
Hide comment popup after redo action 2026-01-22 13:12:13 +01:00
dependabot[bot]
3270c4a504
Bump lodash from 4.17.21 to 4.17.23
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.21 to 4.17.23.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.21...4.17.23)

---
updated-dependencies:
- dependency-name: lodash
  dependency-version: 4.17.23
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-21 23:03:09 +00:00
Marco Castelluccio
84d15dc453
Restore date too 2026-01-21 17:06:13 +01:00
Marco Castelluccio
d9f67bd8ee
Bug 1999154 - Add the ability to undo comment deletion 2026-01-21 15:15:40 +01:00
Calixte Denizet
a4f4d460ca
Hide the menu container by changing its visibility
This way, the dimensions of the menu container don't depend on its visibility.
This patch fixes a few keyboard issues I noticed when reading:
https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA/Reference/Roles/menu_role#keyboard_interactions
2026-01-20 14:42:57 +01:00
calixteman
6a4a3b060d
Merge pull request #20475 from calixteman/organize_pages
Add the possibility to order the pages in an extracted pdf (bug 1997379)
2026-01-19 19:08:32 +01:00
calixteman
ce296d8d42
Add the possibility to order the pages in an extracted pdf (bug 1997379)
or in a merged one.
2026-01-19 18:58:23 +01:00
Calixte Denizet
43bd7fa738
Fix links and outline after reorganizing a pdf 2026-01-19 17:38:17 +01:00
calixteman
8188c87461
Merge pull request #20572 from calixteman/issue20571
Avoid exception after having moved an annotation
2026-01-18 21:06:50 +01:00
calixteman
8abfd9a797
Avoid exception after having moved an annotation
It fixes #20571.
2026-01-18 21:01:40 +01:00
calixteman
f04deeeddf
Merge pull request #20577 from calixteman/update_search
The 'find in page' feature must work correctly after the pages have been reorganized (bug 2010814)
2026-01-18 20:57:32 +01:00
Calixte Denizet
3a20ea75b9
The 'find in page' feature must work correctly after the pages have been reorganized (bug 2010814) 2026-01-18 20:54:15 +01:00
Tim van der Meij
fef0cb1a6f
Merge pull request #20578 from calixteman/bug2010820
Select the dropped thumbnail (bug 2010820)
2026-01-18 13:03:47 +01:00
Calixte Denizet
eb014a36cc
Select the dropped thumbnail (bug 2010820) 2026-01-16 19:54:14 +01:00
calixteman
67673ea274
Merge pull request #20559 from calixteman/new_sidebar2
Add the possibility to drag & drop some thumbnails in the pages view (bug 2009573)
2026-01-14 22:13:52 +01:00
Calixte Denizet
5e89981282
Add the possibility to drag & drop some thumbnails in the pages view
The goal is to be able to reorganize the pages in a pdf.
2026-01-14 21:04:38 +01:00
calixteman
cbcb6279ad
Merge pull request #20569 from calixteman/fix_caret_dark_mode
Make sure the caret is black in dark mode when in caret browsing mode
2026-01-14 16:57:05 +01:00
calixteman
6612d7afaf
Merge pull request #20567 from calixteman/bug2009627
Hide the text in the text layer associated with MathML elements (bug 2009627)
2026-01-14 14:19:37 +01:00
Calixte Denizet
cffd54e9c6
Hide the text in the text layer associated with MathML elements (bug 2009627)
The bug was supposed to be fixed by #20471, but here the pdf contains some annotations.
When those annotations are added to the DOM, the struct tree has to be rendered but without
the text layer (because of asynchronicity).
So this patch makes sure that the modifications in the text layer are done once the
layer is rendered.
2026-01-13 20:37:52 +01:00
Tim van der Meij
f40ab1a3f8
Merge pull request #20570 from calixteman/no_contents_image_stream
Don't use content streams which have an image format
2026-01-13 20:27:59 +01:00
Calixte Denizet
b5ed988267
Don't use content streams which have an image format
The original bug has been filed in the mupdf bug tracker:
https://bugs.ghostscript.com/show_bug.cgi?id=709033

The attached pdf can be opened in Chrome but not in Acrobat.
2026-01-13 18:39:17 +01:00
Calixte Denizet
a362a24779
Make sure the caret is black in dark mode when in caret browsing mode 2026-01-13 18:10:11 +01:00
Tim van der Meij
a5010f99e7
Merge pull request #20566 from calixteman/update_jbig2
Update jbig2 decoder (pdfium@3c679253a9e17c10be696d345c63636b18b7f925)
2026-01-11 20:23:45 +01:00
calixteman
1f69a3f6af
Update jbig2 decoder (pdfium@3c679253a9e17c10be696d345c63636b18b7f925)
See a40f47cbc4
for details.
2026-01-11 13:39:46 +01:00
calixteman
6f5d5ac6d8
Merge pull request #20552 from calixteman/issue20529
Add some tests for the JBIG2 js decoder
2026-01-09 16:45:09 +01:00
calixteman
1dd6649b7d
Add some tests for the JBIG2 js decoder
It fixes #20529.

The files come from:
https://github.com/SerenityOS/serenity/tree/master/Tests/LibGfx/test-inputs/jbig2/

Thank you to Nico Weber for offering these test files.
2026-01-07 22:05:05 +01:00
calixteman
3532ac39d8
Merge pull request #20551 from calixteman/bug2004951_part2
Don't add an aria-label on MathML elements in the struct tree (bug 2004951)
2026-01-06 22:37:46 +01:00
Calixte Denizet
da463f2da9
Don't add an aria-label on MathML elements in the struct tree (bug 2004951) 2026-01-06 21:38:42 +01:00
Tim van der Meij
91fa05d2e8
Merge pull request #20550 from calixteman/bug2004951_part1
Aria-hide artifacts in the text layer (bug 2004951)
2026-01-06 20:29:25 +01:00
Calixte Denizet
0ef085e23b
Aria-hide artifacts in the text layer (bug 2004951) 2026-01-05 16:57:01 +01:00
Tim van der Meij
a939f12391
Merge pull request #20549 from timvandermeij/talos-revert
Revert "Remove some files from talos tests because they aren't available on webarchive"
2026-01-04 19:50:34 +01:00
Tim van der Meij
adde05b530
Revert "Remove some files from talos tests because they aren't available on webarchive"
This reverts commit fbce8bf829cba78281ea93e845c524725ff2c3ff. The file
is available in the Web Archive again.

Fixes #20528.
2026-01-04 19:29:26 +01:00
Tim van der Meij
54d84799c2
Merge pull request #20548 from calixteman/jbig2wasm_failure
Fix wasm url issue for the jbig2 decoder
2026-01-04 19:14:37 +01:00
calixteman
eab33828a9
Fix wasm url issue for the jbig2 decoder
and add a test for jbig2 decoding with the js decoder.
2026-01-04 00:08:59 +01:00
Tim van der Meij
cb36cbdc13
Merge pull request #20547 from timvandermeij/updates
Update dependencies and translations to the most recent versions
2026-01-03 23:32:02 +01:00
calixteman
cb28250fbe
Merge pull request #20546 from calixteman/jbig2wasm
Use the PDFium JBig2 decoder compiled into wasm
2026-01-03 22:08:12 +01:00
calixteman
98c1955bd4
Use the PDFium JBig2 decoder compiled into wasm
The decoder is ~4x faster than the JS decoder on large images.
2026-01-03 22:05:14 +01:00
Tim van der Meij
5c06beae07
Update translations to the most recent versions 2026-01-03 20:43:08 +01:00
Tim van der Meij
687cd848e0
Upgrade eslint-plugin-perfectionist to version 5.2.0
This is a major version bump, but the changelog at
https://github.com/azat-io/eslint-plugin-perfectionist/releases/v5.0.0
doesn't indicate any breaking changes that should impact us.
2026-01-03 20:42:04 +01:00
Tim van der Meij
13e070a8a3
Upgrade globals to version 17.0.0
This is a major version bump, but the changelog at
https://github.com/sindresorhus/globals/releases/v17.0.0
doesn't indicate any breaking changes that should impact us.
2026-01-03 20:40:39 +01:00
Tim van der Meij
2838e161b8
Update dependencies to the most recent versions 2026-01-03 20:38:22 +01:00
calixteman
eccbcbe5eb
Merge pull request #20515 from calixteman/issue20513_2
Get glyph contours when stroking using a pattern
2025-12-29 11:32:28 +01:00
calixteman
424c7989aa
Get glyph contours when stroking using a pattern
Fix issue #20513 (second part).
2025-12-28 22:55:59 +01:00
Tim van der Meij
430b8a9e90
Merge pull request #20540 from timvandermeij/bump
Bump the stable version in `pdfjs.config`
2025-12-28 20:08:27 +01:00
Tim van der Meij
f385fd9783
Bump the stable version in pdfjs.config 2025-12-28 20:05:00 +01:00
Andrii Vitiv
9677798ba0
Fix Worker was terminated error when loading is cancelled
Fixes https://github.com/mozilla/pdf.js/issues/11595, where cancelling loading with `loadingTask.destroy()` before it finishes throws a `Worker was terminated` error that CANNOT be caught.

When the worker is terminated, an error is thrown here:

6c746260a9/src/core/worker.js (L374)

Then `onFailure` runs, in which we throw again via `ensureNotTerminated()`. However, this second error is never caught (and cannot be), resulting in console spam.

There is no need to throw any additional errors since the termination is already reported [here](6c746260a9/src/core/worker.js (L371-L373)), and `onFailure` is supposed to handle errors, not throw them.
2025-12-14 18:15:10 +02:00
260 changed files with 8955 additions and 2958 deletions


@ -30,8 +30,5 @@ jobs:
- name: Run lint
run: npx gulp lint
- name: Run lint-chromium
run: npx gulp lint-chromium
- name: Run lint-mozcentral
run: npx gulp lint-mozcentral

31
.github/workflows/prefs_tests.yml vendored Normal file

@ -0,0 +1,31 @@
name: Prefs tests
on: [push, pull_request]
permissions:
contents: read
jobs:
test:
name: Test
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
node-version: [lts/*]
steps:
- name: Checkout repository
uses: actions/checkout@v6
with:
fetch-depth: 0
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v6
with:
node-version: ${{ matrix.node-version }}
- name: Install dependencies
run: npm ci
- name: Run prefs tests
run: npx gulp prefstest


@ -28,4 +28,4 @@ jobs:
run: npm ci
- name: Run types tests
run: npx gulp typestest
run: npx gulp types


@ -31,10 +31,12 @@ export default [
"**/docs/",
"**/node_modules/",
"external/bcmaps/",
"external/brotli/",
"external/builder/fixtures/",
"external/builder/fixtures_babel/",
"external/openjpeg/",
"external/qcms/",
"external/jbig2/",
"external/quickjs/",
"test/stats/results/",
"test/tmp/",
@ -116,6 +118,7 @@ export default [
"web",
"fluent-bundle",
"fluent-dom",
"postcss-values-parser",
// See https://github.com/firebase/firebase-admin-node/discussions/1359.
"eslint-plugin-perfectionist",
],


@ -46,19 +46,13 @@ const PDFViewerApplication = {
* @returns {Promise} - Returns the promise, which is resolved when document
* is opened.
*/
open(params) {
async open(params) {
if (this.pdfLoadingTask) {
// We need to destroy already opened document
return this.close().then(
function () {
// ... and repeat the open() call.
return this.open(params);
}.bind(this)
);
// We need to destroy already opened document.
await this.close();
}
const url = params.url;
const self = this;
const { url } = params;
this.setTitleUsingUrl(url);
// Loading document.
@ -70,24 +64,22 @@ const PDFViewerApplication = {
});
this.pdfLoadingTask = loadingTask;
loadingTask.onProgress = function (progressData) {
self.progress(progressData.loaded / progressData.total);
};
loadingTask.onProgress = evt => this.progress(evt.percent);
return loadingTask.promise.then(
function (pdfDocument) {
pdfDocument => {
// Document loaded, specifying document for the viewer.
self.pdfDocument = pdfDocument;
self.pdfViewer.setDocument(pdfDocument);
self.pdfLinkService.setDocument(pdfDocument);
self.pdfHistory.initialize({
this.pdfDocument = pdfDocument;
this.pdfViewer.setDocument(pdfDocument);
this.pdfLinkService.setDocument(pdfDocument);
this.pdfHistory.initialize({
fingerprint: pdfDocument.fingerprints[0],
});
self.loadingBar.hide();
self.setTitleUsingMetadata(pdfDocument);
this.loadingBar.hide();
this.setTitleUsingMetadata(pdfDocument);
},
function (reason) {
reason => {
let key = "pdfjs-loading-error";
if (reason instanceof pdfjsLib.InvalidPDFException) {
key = "pdfjs-invalid-file-error";
@ -96,10 +88,10 @@ const PDFViewerApplication = {
? "pdfjs-missing-file-error"
: "pdfjs-unexpected-response-error";
}
self.l10n.get(key).then(msg => {
self.error(msg, { message: reason?.message });
this.l10n.get(key).then(msg => {
this.error(msg, { message: reason.message });
});
self.loadingBar.hide();
this.loadingBar.hide();
}
);
},
@ -109,9 +101,9 @@ const PDFViewerApplication = {
* @returns {Promise} - Returns the promise, which is resolved when all
* destruction is completed.
*/
close() {
async close() {
if (!this.pdfLoadingTask) {
return Promise.resolve();
return;
}
const promise = this.pdfLoadingTask.destroy();
@ -128,7 +120,7 @@ const PDFViewerApplication = {
}
}
return promise;
await promise;
},
get loadingBar() {
@ -152,48 +144,36 @@ const PDFViewerApplication = {
this.setTitle(title);
},
setTitleUsingMetadata(pdfDocument) {
const self = this;
pdfDocument.getMetadata().then(function (data) {
const info = data.info,
metadata = data.metadata;
self.documentInfo = info;
self.metadata = metadata;
async setTitleUsingMetadata(pdfDocument) {
const { info, metadata } = await pdfDocument.getMetadata();
this.documentInfo = info;
this.metadata = metadata;
// Provides some basic debug information
console.log(
"PDF " +
pdfDocument.fingerprints[0] +
" [" +
info.PDFFormatVersion +
" " +
(info.Producer || "-").trim() +
" / " +
(info.Creator || "-").trim() +
"]" +
" (PDF.js: " +
(pdfjsLib.version || "-") +
")"
);
// Provides some basic debug information
console.log(
`PDF ${pdfDocument.fingerprints[0]} [${info.PDFFormatVersion} ` +
`${(metadata?.get("pdf:producer") || info.Producer || "-").trim()} / ` +
`${(metadata?.get("xmp:creatortool") || info.Creator || "-").trim()}` +
`] (PDF.js: ${pdfjsLib.version || "?"} [${pdfjsLib.build || "?"}])`
);
let pdfTitle;
if (metadata && metadata.has("dc:title")) {
const title = metadata.get("dc:title");
// Ghostscript sometimes returns 'Untitled', so prevent setting the
// title to 'Untitled.
if (title !== "Untitled") {
pdfTitle = title;
}
let pdfTitle;
if (metadata && metadata.has("dc:title")) {
const title = metadata.get("dc:title");
// Ghostscript sometimes returns 'Untitled', so prevent setting the
// title to 'Untitled.
if (title !== "Untitled") {
pdfTitle = title;
}
}
if (!pdfTitle && info && info.Title) {
pdfTitle = info.Title;
}
if (!pdfTitle && info && info.Title) {
pdfTitle = info.Title;
}
if (pdfTitle) {
self.setTitle(pdfTitle + " - " + document.title);
}
});
if (pdfTitle) {
this.setTitle(pdfTitle + " - " + document.title);
}
},
setTitle: function pdfViewSetTitle(title) {
@ -223,8 +203,7 @@ const PDFViewerApplication = {
console.error(`${message}\n\n${moreInfoText.join("\n")}`);
},
progress: function pdfViewProgress(level) {
const percent = Math.round(level * 100);
progress(percent) {
// Updating the bar if value increases.
if (percent > this.loadingBar.percent || isNaN(percent)) {
this.loadingBar.percent = percent;


@ -1,252 +0,0 @@
{
"type": "object",
"properties": {
"viewerCssTheme": {
"title": "Theme",
"description": "The theme to use.\n0 = Use system theme.\n1 = Light theme.\n2 = Dark theme.",
"type": "integer",
"enum": [0, 1, 2],
"default": 2
},
"showPreviousViewOnLoad": {
"description": "DEPRECATED. Set viewOnLoad to 1 to disable showing the last page/position on load.",
"type": "boolean",
"default": true
},
"viewOnLoad": {
"title": "View position on load",
"description": "The position in the document upon load.\n -1 = Default (uses OpenAction if available, otherwise equal to `viewOnLoad = 0`).\n 0 = The last viewed page/position.\n 1 = The initial page/position.",
"type": "integer",
"enum": [-1, 0, 1],
"default": 0
},
"defaultZoomDelay": {
"title": "Default zoom delay",
"description": "Delay (in ms) to wait before redrawing the canvas.",
"type": "integer",
"default": 400
},
"defaultZoomValue": {
"title": "Default zoom level",
"description": "Default zoom level of the viewer. Accepted values: 'auto', 'page-actual', 'page-width', 'page-height', 'page-fit', or a zoom level in percents.",
"type": "string",
"pattern": "|auto|page-actual|page-width|page-height|page-fit|[0-9]+\\.?[0-9]*(,[0-9]+\\.?[0-9]*){0,2}",
"default": ""
},
"sidebarViewOnLoad": {
"title": "Sidebar state on load",
"description": "Controls the state of the sidebar upon load.\n -1 = Default (uses PageMode if available, otherwise the last position if available/enabled).\n 0 = Do not show sidebar.\n 1 = Show thumbnails in sidebar.\n 2 = Show document outline in sidebar.\n 3 = Show attachments in sidebar.",
"type": "integer",
"enum": [-1, 0, 1, 2, 3],
"default": -1
},
"enableHandToolOnLoad": {
"description": "DEPRECATED. Set cursorToolOnLoad to 1 to enable the hand tool by default.",
"type": "boolean",
"default": false
},
"enableHWA": {
"title": "Enable hardware acceleration",
"description": "Whether to enable hardware acceleration.",
"type": "boolean",
"default": true
},
"enableAltText": {
"type": "boolean",
"default": false
},
"enableGuessAltText": {
"type": "boolean",
"default": true
},
"enableAltTextModelDownload": {
"type": "boolean",
"default": true
},
"enableNewAltTextWhenAddingImage": {
"type": "boolean",
"default": true
},
"altTextLearnMoreUrl": {
"type": "string",
"default": ""
},
"commentLearnMoreUrl": {
"type": "string",
"default": ""
},
"enableSignatureEditor": {
"type": "boolean",
"default": false
},
"enableUpdatedAddImage": {
"type": "boolean",
"default": false
},
"cursorToolOnLoad": {
"title": "Cursor tool on load",
"description": "The cursor tool that is enabled upon load.\n 0 = Text selection tool.\n 1 = Hand tool.",
"type": "integer",
"enum": [0, 1],
"default": 0
},
"pdfBugEnabled": {
"title": "Enable debugging tools",
"description": "Whether to enable debugging tools.",
"type": "boolean",
"default": false
},
"enableScripting": {
"title": "Enable active content (JavaScript) in PDFs",
"type": "boolean",
"description": "Whether to allow execution of active content (JavaScript) by PDF files.",
"default": false
},
"enableHighlightFloatingButton": {
"type": "boolean",
"default": false
},
"highlightEditorColors": {
"type": "string",
"default": "yellow=#FFFF98,green=#53FFBC,blue=#80EBFF,pink=#FFCBE6,red=#FF4F5F,yellow_HCM=#FFFFCC,green_HCM=#53FFBC,blue_HCM=#80EBFF,pink_HCM=#F6B8FF,red_HCM=#C50043"
},
"disableRange": {
"title": "Disable range requests",
"description": "Whether to disable range requests (not recommended).",
"type": "boolean",
"default": false
},
"disableStream": {
"title": "Disable streaming for requests",
"description": "Whether to disable streaming for requests (not recommended).",
"type": "boolean",
"default": false
},
"disableAutoFetch": {
"type": "boolean",
"default": false
},
"disableFontFace": {
"title": "Disable @font-face",
"description": "Whether to disable @font-face and fall back to canvas rendering (this is more resource-intensive).",
"type": "boolean",
"default": false
},
"disableTextLayer": {
"description": "DEPRECATED. Set textLayerMode to 0 to disable the text selection layer by default.",
"type": "boolean",
"default": false
},
"textLayerMode": {
"title": "Text layer mode",
"description": "Controls if the text layer is enabled, and the selection mode that is used.\n 0 = Disabled.\n 1 = Enabled.",
"type": "integer",
"enum": [0, 1],
"default": 1
},
"externalLinkTarget": {
"title": "External links target window",
"description": "Controls how external links will be opened.\n 0 = default.\n 1 = replaces current window.\n 2 = new window/tab.\n 3 = parent.\n 4 = in top window.",
"type": "integer",
"enum": [0, 1, 2, 3, 4],
"default": 0
},
"disablePageLabels": {
"type": "boolean",
"default": false
},
"disablePageMode": {
"description": "DEPRECATED.",
"type": "boolean",
"default": false
},
"disableTelemetry": {
"title": "Disable telemetry",
"type": "boolean",
"description": "Whether to prevent the extension from reporting the extension and browser version to the extension developers.",
"default": false
},
"annotationMode": {
"type": "integer",
"enum": [0, 1, 2, 3],
"default": 2
},
"annotationEditorMode": {
"type": "integer",
"enum": [-1, 0, 3, 15],
"default": 0
},
"capCanvasAreaFactor": {
"type": "integer",
"default": 200
},
"enablePermissions": {
"type": "boolean",
"default": false
},
"enableXfa": {
"type": "boolean",
"default": true
},
"historyUpdateUrl": {
"type": "boolean",
"default": false
},
"ignoreDestinationZoom": {
"title": "Ignore the zoom argument in destinations",
"description": "When enabled it will maintain the currently active zoom level, rather than letting the PDF document modify it, when navigating to internal destinations.",
"type": "boolean",
"default": false
},
"enablePrintAutoRotate": {
"title": "Automatically rotate printed pages",
"description": "When enabled, landscape pages are rotated when printed.",
"type": "boolean",
"default": true
},
"scrollModeOnLoad": {
"title": "Scroll mode on load",
"description": "Controls how the viewer scrolls upon load.\n -1 = Default (uses the last position if available/enabled).\n 3 = Page scrolling.\n 0 = Vertical scrolling.\n 1 = Horizontal scrolling.\n 2 = Wrapped scrolling.",
"type": "integer",
"enum": [-1, 0, 1, 2, 3],
"default": -1
},
"spreadModeOnLoad": {
"title": "Spread mode on load",
"description": "Whether the viewer should join pages into spreads upon load.\n -1 = Default (uses the last position if available/enabled).\n 0 = No spreads.\n 1 = Odd spreads.\n 2 = Even spreads.",
"type": "integer",
"enum": [-1, 0, 1, 2],
"default": -1
},
"forcePageColors": {
"description": "When enabled, the pdf rendering will use the high contrast mode colors",
"type": "boolean",
"default": false
},
"pageColorsBackground": {
"description": "The color is a string as defined in CSS. Its goal is to help improve readability in high contrast mode",
"type": "string",
"default": "Canvas"
},
"pageColorsForeground": {
"description": "The color is a string as defined in CSS. Its goal is to help improve readability in high contrast mode",
"type": "string",
"default": "CanvasText"
},
"enableAutoLinking": {
"description": "Enable creation of hyperlinks from text that look like URLs.",
"type": "boolean",
"default": true
},
"enableComment": {
"description": "Enable creation of comment annotations.",
"type": "boolean",
"default": false
},
"enableOptimizedPartialRendering": {
"description": "Enable tracking of PDF operations to optimize partial rendering.",
"type": "boolean",
"default": false
}
}
}

19
external/brotli/LICENSE_BROTLI vendored Normal file

@ -0,0 +1,19 @@
Copyright (c) 2009, 2010, 2013-2016 by the Brotli Authors.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

8
external/brotli/README.md vendored Normal file

@ -0,0 +1,8 @@
## Release
In order to get the file `decoder.js`:
* `gulp release-brotli --hash` followed by the git hash of the revision.
## Licensing
[brotli](https://github.com/google/brotli/) is under [MIT License](https://github.com/google/brotli/blob/master/LICENSE)

2466
external/brotli/decode.js vendored Normal file

File diff suppressed because one or more lines are too long

270
external/chromium/prefs.mjs vendored Normal file

@ -0,0 +1,270 @@
/* Copyright 2026 Mozilla Foundation
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
const prefsMetadata = {
annotationEditorMode: {
enum: [-1, 0, 3, 15],
},
annotationMode: {
enum: [0, 1, 2, 3],
},
cursorToolOnLoad: {
title: "Cursor tool on load",
description:
"The cursor tool that is enabled upon load.\n 0 = Text selection tool.\n 1 = Hand tool.",
enum: [0, 1],
},
defaultZoomDelay: {
title: "Default zoom delay",
description: "Delay (in ms) to wait before redrawing the canvas.",
},
defaultZoomValue: {
title: "Default zoom level",
description:
"Default zoom level of the viewer. Accepted values: 'auto', 'page-actual', 'page-width', 'page-height', 'page-fit', or a zoom level in percents.",
pattern:
"|auto|page-actual|page-width|page-height|page-fit|[0-9]+\\.?[0-9]*(,[0-9]+\\.?[0-9]*){0,2}",
},
disableFontFace: {
title: "Disable @font-face",
description:
"Whether to disable @font-face and fall back to canvas rendering (this is more resource-intensive).",
},
disableRange: {
title: "Disable range requests",
description: "Whether to disable range requests (not recommended).",
},
disableStream: {
title: "Disable streaming for requests",
description: "Whether to disable streaming for requests (not recommended).",
},
disableTelemetry: {
title: "Disable telemetry",
description:
"Whether to prevent the extension from reporting the extension and browser version to the extension developers.",
},
enableAutoLinking: {
description: "Enable creation of hyperlinks from text that look like URLs.",
},
enableComment: {
description: "Enable creation of comment annotations.",
},
enableHWA: {
title: "Enable hardware acceleration",
description: "Whether to enable hardware acceleration.",
},
enableOptimizedPartialRendering: {
description:
"Enable tracking of PDF operations to optimize partial rendering.",
},
enablePrintAutoRotate: {
title: "Automatically rotate printed pages",
description: "When enabled, landscape pages are rotated when printed.",
},
enableScripting: {
title: "Enable active content (JavaScript) in PDFs",
description:
"Whether to allow execution of active content (JavaScript) by PDF files.",
},
externalLinkTarget: {
title: "External links target window",
description:
"Controls how external links will be opened.\n 0 = default.\n 1 = replaces current window.\n 2 = new window/tab.\n 3 = parent.\n 4 = in top window.",
enum: [0, 1, 2, 3, 4],
},
forcePageColors: {
description:
"When enabled, the pdf rendering will use the high contrast mode colors",
},
ignoreDestinationZoom: {
title: "Ignore the zoom argument in destinations",
description:
"When enabled it will maintain the currently active zoom level, rather than letting the PDF document modify it, when navigating to internal destinations.",
},
pageColorsBackground: {
description:
"The color is a string as defined in CSS. Its goal is to help improve readability in high contrast mode",
},
pageColorsForeground: {
description:
"The color is a string as defined in CSS. Its goal is to help improve readability in high contrast mode",
},
pdfBugEnabled: {
title: "Enable debugging tools",
description: "Whether to enable debugging tools.",
},
scrollModeOnLoad: {
title: "Scroll mode on load",
description:
"Controls how the viewer scrolls upon load.\n -1 = Default (uses the last position if available/enabled).\n 0 = Vertical scrolling.\n 1 = Horizontal scrolling.\n 2 = Wrapped scrolling.\n 3 = Page scrolling.",
enum: [-1, 0, 1, 2, 3],
},
sidebarViewOnLoad: {
title: "Sidebar state on load",
description:
"Controls the state of the sidebar upon load.\n -1 = Default (uses PageMode if available, otherwise the last position if available/enabled).\n 0 = Do not show sidebar.\n 1 = Show thumbnails in sidebar.\n 2 = Show document outline in sidebar.\n 3 = Show attachments in sidebar.",
enum: [-1, 0, 1, 2, 3],
},
spreadModeOnLoad: {
title: "Spread mode on load",
description:
"Whether the viewer should join pages into spreads upon load.\n -1 = Default (uses the last position if available/enabled).\n 0 = No spreads.\n 1 = Odd spreads.\n 2 = Even spreads.",
enum: [-1, 0, 1, 2],
},
textLayerMode: {
title: "Text layer mode",
description:
"Controls if the text layer is enabled, and the selection mode that is used.\n 0 = Disabled.\n 1 = Enabled.",
enum: [0, 1],
},
viewerCssTheme: {
title: "Theme",
description:
"The theme to use.\n0 = Use system theme.\n1 = Light theme.\n2 = Dark theme.",
enum: [0, 1, 2],
},
viewOnLoad: {
title: "View position on load",
description:
"The position in the document upon load.\n -1 = Default (uses OpenAction if available, otherwise equal to `viewOnLoad = 0`).\n 0 = The last viewed page/position.\n 1 = The initial page/position.",
enum: [-1, 0, 1],
},
};
// Deprecated keys are allowed in the managed preferences file.
// The code maintainer is responsible for adding migration logic to
// extensions/chromium/options/migration.js and web/chromecom.js .
const deprecatedPrefs = {
disablePageMode: {
description: "DEPRECATED.",
type: "boolean",
default: false,
},
disableTextLayer: {
description:
"DEPRECATED. Set textLayerMode to 0 to disable the text selection layer by default.",
type: "boolean",
default: false,
},
enableHandToolOnLoad: {
description:
"DEPRECATED. Set cursorToolOnLoad to 1 to enable the hand tool by default.",
type: "boolean",
default: false,
},
showPreviousViewOnLoad: {
description:
"DEPRECATED. Set viewOnLoad to 1 to disable showing the last page/position on load.",
type: "boolean",
default: true,
},
};
function buildPrefsSchema(prefs) {
const properties = Object.create(null);
for (const name in prefs) {
const pref = prefs[name];
let type = typeof pref;
switch (type) {
case "boolean":
case "string":
break;
case "number":
type = "integer";
break;
default:
throw new Error(`Invalid type (${type}) for "${name}"-preference.`);
}
const metadata = prefsMetadata[name];
if (metadata) {
let numMetadataKeys = 0;
// Do some (very basic) validation of the metadata.
for (const key in metadata) {
const entry = metadata[key];
switch (key) {
case "default":
case "type":
throw new Error(
`Invalid key (${key}) in metadata for "${name}"-preference.`
);
case "description":
if (entry.startsWith("DEPRECATED.")) {
throw new Error(
`The \`description\` of the "${name}"-preference cannot begin with "DEPRECATED."`
);
}
break;
}
numMetadataKeys++;
}
if (numMetadataKeys === 0) {
throw new Error(
`No metadata for "${name}"-preference, remove the entry.`
);
}
}
properties[name] = {
type,
default: pref,
...metadata,
};
}
for (const name in prefsMetadata) {
if (!properties[name]) {
// Do *not* throw here, since keeping the metadata up-to-date should be
// the responsibility of the CHROMIUM-addon maintainer.
console.error(
`The "${name}"-preference was removed, add it to \`deprecatedPrefs\` instead.\n`
);
}
}
for (const name in deprecatedPrefs) {
const entry = deprecatedPrefs[name];
if (properties[name]) {
throw new Error(
`The "${name}"-preference should not be listed as deprecated.`
);
}
if (!entry.description?.startsWith("DEPRECATED.")) {
throw new Error(
`The \`description\` of the deprecated "${name}"-preference must begin with "DEPRECATED."`
);
}
for (const key of ["default", "type"]) {
if (key in entry) {
continue;
}
throw new Error(
`A \`${key}\` entry must be provided for the deprecated "${name}"-preference.`
);
}
properties[name] = entry;
}
return {
type: "object",
properties,
};
}
export { buildPrefsSchema };
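A minimal usage sketch of the exported helper (illustrative only; the preference names and values below are assumptions, not the project's actual defaults). `buildPrefsSchema` infers a JSON-schema type from each default value, merges in any matching `prefsMetadata` entry, and always appends the `deprecatedPrefs` entries:

import { buildPrefsSchema } from "./external/chromium/prefs.mjs";

// Hypothetical subset of the defaults; the real build feeds the full object
// returned by getDefaultPreferences("chromium/") in the gulpfile.
const examplePrefs = {
  viewerCssTheme: 0,        // number  -> schema type "integer"
  disablePageLabels: false, // boolean -> schema type "boolean"
};

const schema = buildPrefsSchema(examplePrefs);
// Roughly:
// {
//   type: "object",
//   properties: {
//     viewerCssTheme: {
//       type: "integer", default: 0,
//       title: "Theme", description: "The theme to use. ...", enum: [0, 1, 2],
//     },
//     disablePageLabels: { type: "boolean", default: false },
//     disableTextLayer: { ... }, // every deprecatedPrefs entry is included
//   },
// }
// Note: passing only a subset also logs a console.error for every
// prefsMetadata entry whose preference is missing, by design.
console.log(JSON.stringify(schema, null, 2));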

196
external/jbig2/LICENSE_JBIG2 vendored Normal file
View File

@ -0,0 +1,196 @@
// Copyright 2014 The PDFium Authors
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are
// met:
//
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above
// copyright notice, this list of conditions and the following disclaimer
// in the documentation and/or other materials provided with the
// distribution.
// * Neither the name of Google Inc. nor the names of its
// contributors may be used to endorse or promote products derived from
// this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Apache License
Version 2.0, January 2004
https://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

13
external/jbig2/LICENSE_PDFJS_JBIG2 vendored Normal file
View File

@ -0,0 +1,13 @@
Copyright 2026 Mozilla Foundation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

12
external/jbig2/README.md vendored Normal file
View File

@ -0,0 +1,12 @@
## Build
In order to generate the file `jbig2.js`:
* git clone https://github.com/mozilla/pdf.js.jbig2/
* the build requires a [Docker](https://www.docker.com/) setup; then:
* `node build.js -C` to build the Docker image
* `node build.js -co /pdf.js/external/jbig2/` to compile the decoder
## Licensing
[PDFium](https://pdfium.googlesource.com/pdfium/) is licensed under [Apache-2.0](https://pdfium.googlesource.com/pdfium/+/main/LICENSE)
and [pdf.js.jbig2](https://github.com/mozilla/pdf.js.jbig2/) is released under the [Apache-2.0](https://github.com/mozilla/pdf.js.jbig2/blob/main/LICENSE) license, so `jbig2.js` is released under the [Apache-2.0](https://github.com/mozilla/pdf.js.jbig2/blob/main/LICENSE) license as well.

3
external/jbig2/jbig2.js vendored Normal file

File diff suppressed because one or more lines are too long

BIN
external/jbig2/jbig2.wasm vendored Normal file

Binary file not shown.

View File

@ -13,8 +13,6 @@
* limitations under the License.
*/
// TODO: Remove the exception below once someone figures out how to fix it.
// eslint-disable-next-line import/no-unresolved
import { parse, registerWalkers, Root } from "postcss-values-parser";
import { isString } from "stylelint/lib/utils/validateTypes.mjs";
import stylelint from "stylelint";

View File

@ -20,7 +20,9 @@ import {
import { exec, execSync, spawn, spawnSync } from "child_process";
import autoprefixer from "autoprefixer";
import babel from "@babel/core";
import { buildPrefsSchema } from "./external/chromium/prefs.mjs";
import crypto from "crypto";
import { finished } from "stream/promises";
import fs from "fs";
import gulp from "gulp";
import hljs from "highlight.js";
@ -67,6 +69,7 @@ const GH_PAGES_DIR = BUILD_DIR + "gh-pages/";
const DIST_DIR = BUILD_DIR + "dist/";
const TYPES_DIR = BUILD_DIR + "types/";
const TMP_DIR = BUILD_DIR + "tmp/";
const PREFSTEST_DIR = BUILD_DIR + "prefstest/";
const TYPESTEST_DIR = BUILD_DIR + "typestest/";
const COMMON_WEB_FILES = [
"web/images/*.{png,svg,gif}",
@ -229,7 +232,7 @@ function createWebpackAlias(defines) {
libraryAlias["display-fetch_stream"] = "src/display/fetch_stream.js";
libraryAlias["display-network"] = "src/display/network.js";
viewerAlias["web-download_manager"] = "web/download_manager.js";
viewerAlias["web-download_manager"] = "web/chromecom.js";
viewerAlias["web-external_services"] = "web/chromecom.js";
viewerAlias["web-null_l10n"] = "web/l10n.js";
viewerAlias["web-preferences"] = "web/chromecom.js";
@ -283,7 +286,6 @@ function createWebpackConfig(
disableVersionInfo = false,
disableSourceMaps = false,
disableLicenseHeader = false,
defaultPreferencesDir = null,
} = {}
) {
const versionInfo = !disableVersionInfo
@ -294,9 +296,6 @@ function createWebpackConfig(
BUNDLE_VERSION: versionInfo.version,
BUNDLE_BUILD: versionInfo.commit,
TESTING: defines.TESTING ?? process.env.TESTING === "true",
DEFAULT_PREFERENCES: defaultPreferencesDir
? getDefaultPreferences(defaultPreferencesDir)
: {},
DEFAULT_FTL: defines.GENERIC ? getDefaultFtl() : "",
};
const licenseHeaderLibre = fs
@ -452,51 +451,6 @@ function getVersionJSON() {
return JSON.parse(fs.readFileSync(BUILD_DIR + "version.json").toString());
}
function checkChromePreferencesFile(chromePrefsPath, webPrefs) {
const chromePrefs = JSON.parse(fs.readFileSync(chromePrefsPath).toString());
const chromePrefsKeys = Object.keys(chromePrefs.properties).filter(key => {
const description = chromePrefs.properties[key].description;
// Deprecated keys are allowed in the managed preferences file.
// The code maintainer is responsible for adding migration logic to
// extensions/chromium/options/migration.js and web/chromecom.js .
return !description?.startsWith("DEPRECATED.");
});
let ret = true;
// Verify that every entry in webPrefs is also in preferences_schema.json.
for (const [key, value] of Object.entries(webPrefs)) {
if (!chromePrefsKeys.includes(key)) {
// Note: this would also reject keys that are present but marked as
// DEPRECATED. A key should not be marked as DEPRECATED if it is still
// listed in webPrefs.
ret = false;
console.log(
`Warning: ${chromePrefsPath} does not contain an entry for pref: ${key}`
);
} else if (chromePrefs.properties[key].default !== value) {
ret = false;
console.log(
`Warning: not the same values (for "${key}"): ` +
`${chromePrefs.properties[key].default} !== ${value}`
);
}
}
// Verify that preferences_schema.json does not contain entries that are not
// in webPrefs (app_options.js).
for (const key of chromePrefsKeys) {
if (!(key in webPrefs)) {
ret = false;
console.log(
`Warning: ${chromePrefsPath} contains an unrecognized pref: ${key}. ` +
`Remove it, or prepend "DEPRECATED. " and add migration logic to ` +
`extensions/chromium/options/migration.js and web/chromecom.js.`
);
}
}
return ret;
}
function createMainBundle(defines) {
const mainFileConfig = createWebpackConfig(defines, {
filename: defines.MINIFIED ? "pdf.min.mjs" : "pdf.mjs",
@ -590,36 +544,24 @@ function createWorkerBundle(defines) {
}
function createWebBundle(defines, options) {
const viewerFileConfig = createWebpackConfig(
defines,
{
filename: "viewer.mjs",
library: {
type: "module",
},
const viewerFileConfig = createWebpackConfig(defines, {
filename: "viewer.mjs",
library: {
type: "module",
},
{
defaultPreferencesDir: options.defaultPreferencesDir,
}
);
});
return gulp
.src("./web/viewer.js", { encoding: false })
.pipe(webpack2Stream(viewerFileConfig));
}
function createGVWebBundle(defines, options) {
const viewerFileConfig = createWebpackConfig(
defines,
{
filename: "viewer-geckoview.mjs",
library: {
type: "module",
},
const viewerFileConfig = createWebpackConfig(defines, {
filename: "viewer-geckoview.mjs",
library: {
type: "module",
},
{
defaultPreferencesDir: options.defaultPreferencesDir,
}
);
});
return gulp
.src("./web/viewer-geckoview.js", { encoding: false })
.pipe(webpack2Stream(viewerFileConfig));
@ -696,6 +638,10 @@ function createWasmBundle() {
base: "external/qcms",
encoding: false,
}),
gulp.src(["external/jbig2/*.wasm", "external/jbig2/LICENSE_*"], {
base: "external/jbig2",
encoding: false,
}),
]);
}
@ -733,8 +679,7 @@ function getTempFile(prefix, suffix) {
function runTests(testsName, { bot = false, xfaOnly = false } = {}) {
return new Promise((resolve, reject) => {
console.log();
console.log("### Running " + testsName + " tests");
console.log("\n### Running " + testsName + " tests");
const PDF_TEST = process.env.PDF_TEST || "test_manifest.json";
let forceNoChrome = false;
@ -819,8 +764,7 @@ function collectArgs(options, args) {
}
function makeRef(done, bot) {
console.log();
console.log("### Creating reference images");
console.log("\n### Creating reference images");
let forceNoChrome = false;
const args = ["test.mjs", "--masterMode"];
@ -870,9 +814,30 @@ gulp.task("default", function (done) {
done();
});
function createBuildNumber(done) {
gulp.task("release-brotli", async function (done) {
const hashIndex = process.argv.indexOf("--hash");
if (hashIndex === -1 || hashIndex + 1 >= process.argv.length) {
throw new Error('Missing "--hash <commit-hash>" argument.');
}
console.log();
console.log("### Getting extension build number");
console.log("### Getting Brotli js file for release");
const OUTPUT_DIR = "./external/brotli/";
const hash = process.argv[hashIndex + 1];
const url = `https://raw.githubusercontent.com/google/brotli/${hash}/js/decode.js`;
const outputPath = OUTPUT_DIR + "decode.js";
const res = await fetch(url);
const fileStream = fs.createWriteStream(outputPath, { flags: "w" });
await finished(stream.Readable.fromWeb(res.body).pipe(fileStream));
fileStream.end();
console.log(`Brotli js file saved to: ${outputPath}`);
done();
});
function createBuildNumber(done) {
console.log("\n### Getting extension build number");
exec(
"git log --format=oneline " + config.baseVersion + "..",
@ -917,8 +882,7 @@ function createBuildNumber(done) {
}
function buildDefaultPreferences(defines, dir) {
console.log();
console.log("### Building default preferences");
console.log(`\n### Building default preferences (${dir})`);
const bundleDefines = {
...defines,
@ -944,12 +908,14 @@ function buildDefaultPreferences(defines, dir) {
.pipe(gulp.dest(DEFAULT_PREFERENCES_DIR + dir));
}
async function parseDefaultPreferences(dir) {
console.log();
console.log("### Parsing default preferences");
function getDefaultPreferences(dir) {
console.log(`\n### Parsing default preferences (${dir})`);
// eslint-disable-next-line no-unsanitized/method
const { AppOptions, OptionKind } = await import(
const require = process
.getBuiltinModule("module")
.createRequire(import.meta.url);
const { AppOptions, OptionKind } = require(
"./" + DEFAULT_PREFERENCES_DIR + dir + "app_options.mjs"
);
@ -958,20 +924,9 @@ async function parseDefaultPreferences(dir) {
/* defaultOnly = */ true
);
if (Object.keys(prefs).length === 0) {
throw new Error("No default preferences found.");
throw new Error(`No default preferences found in "${dir}".`);
}
fs.writeFileSync(
DEFAULT_PREFERENCES_DIR + dir + "default_preferences.json",
JSON.stringify(prefs)
);
}
function getDefaultPreferences(dir) {
const str = fs
.readFileSync(DEFAULT_PREFERENCES_DIR + dir + "default_preferences.json")
.toString();
return JSON.parse(str);
return prefs;
}
function getDefaultFtl() {
@ -992,8 +947,7 @@ function getDefaultFtl() {
gulp.task("locale", function () {
const VIEWER_LOCALE_OUTPUT = "web/locale/";
console.log();
console.log("### Building localization files");
console.log("\n### Building localization files");
fs.rmSync(VIEWER_LOCALE_OUTPUT, { recursive: true, force: true });
fs.mkdirSync(VIEWER_LOCALE_OUTPUT, { recursive: true });
@ -1041,8 +995,7 @@ gulp.task("cmaps", async function () {
const CMAP_INPUT = "external/cmaps";
const VIEWER_CMAP_OUTPUT = "external/bcmaps";
console.log();
console.log("### Building cmaps");
console.log("\n### Building cmaps");
// Testing a file that is usually present.
if (!checkFile(CMAP_INPUT + "/UniJIS-UCS2-H")) {
@ -1103,11 +1056,7 @@ function buildGeneric(defines, dir) {
createMainBundle(defines).pipe(gulp.dest(dir + "build")),
createWorkerBundle(defines).pipe(gulp.dest(dir + "build")),
createSandboxBundle(defines).pipe(gulp.dest(dir + "build")),
createWebBundle(defines, {
defaultPreferencesDir: defines.SKIP_BABEL
? "generic/"
: "generic-legacy/",
}).pipe(gulp.dest(dir + "web")),
createWebBundle(defines).pipe(gulp.dest(dir + "web")),
gulp
.src(COMMON_WEB_FILES, { base: "web/", encoding: false })
.pipe(gulp.dest(dir + "web")),
@ -1151,17 +1100,10 @@ gulp.task(
"locale",
function scriptingGeneric() {
const defines = { ...DEFINES, GENERIC: true };
return ordered([
buildDefaultPreferences(defines, "generic/"),
createTemporaryScriptingBundle(defines),
]);
},
async function prefsGeneric() {
await parseDefaultPreferences("generic/");
return createTemporaryScriptingBundle(defines);
},
function createGeneric() {
console.log();
console.log("### Creating generic viewer");
console.log("\n### Creating generic viewer");
const defines = { ...DEFINES, GENERIC: true };
return buildGeneric(defines, GENERIC_DIR);
@ -1178,17 +1120,10 @@ gulp.task(
"locale",
function scriptingGenericLegacy() {
const defines = { ...DEFINES, GENERIC: true, SKIP_BABEL: false };
return ordered([
buildDefaultPreferences(defines, "generic-legacy/"),
createTemporaryScriptingBundle(defines),
]);
},
async function prefsGenericLegacy() {
await parseDefaultPreferences("generic-legacy/");
return createTemporaryScriptingBundle(defines);
},
function createGenericLegacy() {
console.log();
console.log("### Creating generic (legacy) viewer");
console.log("\n### Creating generic (legacy) viewer");
const defines = { ...DEFINES, GENERIC: true, SKIP_BABEL: false };
return buildGeneric(defines, GENERIC_LEGACY_DIR);
@ -1199,16 +1134,7 @@ gulp.task(
function buildComponents(defines, dir) {
fs.rmSync(dir, { recursive: true, force: true });
const COMPONENTS_IMAGES = [
"web/images/annotation-*.svg",
"web/images/loading-icon.gif",
"web/images/altText_*.svg",
"web/images/editor-toolbar-*.svg",
"web/images/messageBar_*.svg",
"web/images/toolbarButton-{editorHighlight,menuArrow}.svg",
"web/images/cursor-*.svg",
"web/images/comment-*.svg",
];
const COMPONENTS_IMAGES = ["web/images/*.svg", "web/images/*.gif"];
return ordered([
createComponentsBundle(defines).pipe(gulp.dest(dir)),
@ -1232,8 +1158,7 @@ function buildComponents(defines, dir) {
gulp.task(
"components",
gulp.series(createBuildNumber, function createComponents() {
console.log();
console.log("### Creating generic components");
console.log("\n### Creating generic components");
const defines = { ...DEFINES, COMPONENTS: true, GENERIC: true };
return buildComponents(defines, COMPONENTS_DIR);
@ -1243,8 +1168,7 @@ gulp.task(
gulp.task(
"components-legacy",
gulp.series(createBuildNumber, function createComponentsLegacy() {
console.log();
console.log("### Creating generic (legacy) components");
console.log("\n### Creating generic (legacy) components");
const defines = {
...DEFINES,
COMPONENTS: true,
@ -1259,8 +1183,7 @@ gulp.task(
gulp.task(
"image_decoders",
gulp.series(createBuildNumber, function createImageDecoders() {
console.log();
console.log("### Creating image decoders");
console.log("\n### Creating image decoders");
const defines = { ...DEFINES, GENERIC: true, IMAGE_DECODERS: true };
return createImageDecodersBundle(defines).pipe(
@ -1272,8 +1195,7 @@ gulp.task(
gulp.task(
"image_decoders-legacy",
gulp.series(createBuildNumber, function createImageDecodersLegacy() {
console.log();
console.log("### Creating (legacy) image decoders");
console.log("\n### Creating (legacy) image decoders");
const defines = {
...DEFINES,
GENERIC: true,
@ -1307,17 +1229,10 @@ gulp.task(
"locale",
function scriptingMinified() {
const defines = { ...DEFINES, MINIFIED: true, GENERIC: true };
return ordered([
buildDefaultPreferences(defines, "minified/"),
createTemporaryScriptingBundle(defines),
]);
},
async function prefsMinified() {
await parseDefaultPreferences("minified/");
return createTemporaryScriptingBundle(defines);
},
function createMinified() {
console.log();
console.log("### Creating minified viewer");
console.log("\n### Creating minified viewer");
const defines = { ...DEFINES, MINIFIED: true, GENERIC: true };
return buildMinified(defines, MINIFIED_DIR);
@ -1337,17 +1252,10 @@ gulp.task(
GENERIC: true,
SKIP_BABEL: false,
};
return ordered([
buildDefaultPreferences(defines, "minified-legacy/"),
createTemporaryScriptingBundle(defines),
]);
},
async function prefsMinifiedLegacy() {
await parseDefaultPreferences("minified-legacy/");
return createTemporaryScriptingBundle(defines);
},
function createMinifiedLegacy() {
console.log();
console.log("### Creating minified (legacy) viewer");
console.log("\n### Creating minified (legacy) viewer");
const defines = {
...DEFINES,
MINIFIED: true,
@ -1361,6 +1269,8 @@ gulp.task(
);
function createDefaultPrefsFile() {
console.log("\n### Building mozilla-central preferences file");
const defaultFileName = "PdfJsDefaultPrefs.js",
overrideFileName = "PdfJsOverridePrefs.js";
const licenseHeader = fs.readFileSync("./src/license_header.js").toString();
@ -1399,12 +1309,8 @@ gulp.task(
const defines = { ...DEFINES, MOZCENTRAL: true };
return buildDefaultPreferences(defines, "mozcentral/");
},
async function prefsMozcentral() {
await parseDefaultPreferences("mozcentral/");
},
function createMozcentral() {
console.log();
console.log("### Building mozilla-central extension");
console.log("\n### Building mozilla-central extension");
const defines = { ...DEFINES, MOZCENTRAL: true };
const gvDefines = { ...defines, GECKOVIEW: true };
@ -1438,12 +1344,12 @@ gulp.task(
createWorkerBundle(defines).pipe(
gulp.dest(MOZCENTRAL_CONTENT_DIR + "build")
),
createWebBundle(defines, { defaultPreferencesDir: "mozcentral/" }).pipe(
createWebBundle(defines).pipe(
gulp.dest(MOZCENTRAL_CONTENT_DIR + "web")
),
createGVWebBundle(gvDefines).pipe(
gulp.dest(MOZCENTRAL_CONTENT_DIR + "web")
),
createGVWebBundle(gvDefines, {
defaultPreferencesDir: "mozcentral/",
}).pipe(gulp.dest(MOZCENTRAL_CONTENT_DIR + "web")),
gulp
.src(MOZCENTRAL_WEB_FILES, { base: "web/", encoding: false })
.pipe(gulp.dest(MOZCENTRAL_CONTENT_DIR + "web")),
@ -1495,6 +1401,18 @@ gulp.task(
)
);
function createChromiumPrefsSchema() {
console.log("\n### Building Chromium preferences file");
const prefs = getDefaultPreferences("chromium/");
const chromiumPrefs = buildPrefsSchema(prefs);
return createStringSource(
"preferences_schema.json",
JSON.stringify(chromiumPrefs, null, 2)
);
}
gulp.task(
"chromium",
gulp.series(
@ -1507,12 +1425,8 @@ gulp.task(
createTemporaryScriptingBundle(defines),
]);
},
async function prefsChromium() {
await parseDefaultPreferences("chromium/");
},
function createChromium() {
console.log();
console.log("### Building Chromium extension");
console.log("\n### Building Chromium extension");
const defines = { ...DEFINES, CHROME: true, SKIP_BABEL: false };
const CHROME_BUILD_DIR = BUILD_DIR + "/chromium/",
@ -1538,7 +1452,7 @@ gulp.task(
createSandboxBundle(defines).pipe(
gulp.dest(CHROME_BUILD_CONTENT_DIR + "build")
),
createWebBundle(defines, { defaultPreferencesDir: "chromium/" }).pipe(
createWebBundle(defines).pipe(
gulp.dest(CHROME_BUILD_CONTENT_DIR + "web")
),
gulp
@ -1587,22 +1501,19 @@ gulp.task(
.pipe(replace(/\bPDFJSSCRIPT_VERSION\b/g, version))
.pipe(gulp.dest(CHROME_BUILD_DIR)),
gulp
.src(
[
"extensions/chromium/**/*.{html,js,css,png}",
"extensions/chromium/preferences_schema.json",
],
{ base: "extensions/chromium/", encoding: false }
)
.src(["extensions/chromium/**/*.{html,js,css,png}"], {
base: "extensions/chromium/",
encoding: false,
})
.pipe(gulp.dest(CHROME_BUILD_DIR)),
createChromiumPrefsSchema().pipe(gulp.dest(CHROME_BUILD_DIR)),
]);
}
)
);
gulp.task("jsdoc", function (done) {
console.log();
console.log("### Generating documentation (JSDoc)");
console.log("\n### Generating documentation (JSDoc)");
fs.rmSync(JSDOC_BUILD_DIR, { recursive: true, force: true });
fs.mkdirSync(JSDOC_BUILD_DIR, { recursive: true });
@ -1675,9 +1586,6 @@ function buildLib(defines, dir) {
BUNDLE_VERSION: versionInfo.version,
BUNDLE_BUILD: versionInfo.commit,
TESTING: defines.TESTING ?? process.env.TESTING === "true",
DEFAULT_PREFERENCES: getDefaultPreferences(
defines.SKIP_BABEL ? "lib/" : "lib-legacy/"
),
DEFAULT_FTL: getDefaultFtl(),
};
@ -1696,6 +1604,8 @@ function buildLib(defines, dir) {
gulp.src("test/unit/*.js", { base: ".", encoding: false }),
gulp.src("external/openjpeg/*.js", { base: "openjpeg/", encoding: false }),
gulp.src("external/qcms/*.js", { base: "qcms/", encoding: false }),
gulp.src("external/jbig2/*.js", { base: "jbig2/", encoding: false }),
gulp.src("external/brotli/*.js", { base: "brotli/", encoding: false }),
]);
return buildLibHelper(bundleDefines, inputStream, dir);
@ -1707,13 +1617,7 @@ gulp.task(
createBuildNumber,
function scriptingLib() {
const defines = { ...DEFINES, GENERIC: true, LIB: true };
return ordered([
buildDefaultPreferences(defines, "lib/"),
createTemporaryScriptingBundle(defines),
]);
},
async function prefsLib() {
await parseDefaultPreferences("lib/");
return createTemporaryScriptingBundle(defines);
},
function createLib() {
const defines = { ...DEFINES, GENERIC: true, LIB: true };
@ -1737,13 +1641,7 @@ gulp.task(
LIB: true,
SKIP_BABEL: false,
};
return ordered([
buildDefaultPreferences(defines, "lib-legacy/"),
createTemporaryScriptingBundle(defines),
]);
},
async function prefsLibLegacy() {
await parseDefaultPreferences("lib-legacy/");
return createTemporaryScriptingBundle(defines);
},
function createLibLegacy() {
const defines = {
@ -1926,6 +1824,49 @@ gulp.task(
)
);
gulp.task(
"prefstest",
gulp.series(
setTestEnv,
function genericPrefs() {
const defines = { ...DEFINES, GENERIC: true };
return buildDefaultPreferences(defines, "generic/");
},
function genericLegacyPrefs() {
const defines = { ...DEFINES, GENERIC: true, SKIP_BABEL: false };
return buildDefaultPreferences(defines, "generic-legacy/");
},
function chromiumPrefs() {
const defines = { ...DEFINES, CHROME: true, SKIP_BABEL: false };
return buildDefaultPreferences(defines, "chromium/");
},
function mozcentralPrefs() {
const defines = { ...DEFINES, MOZCENTRAL: true };
return buildDefaultPreferences(defines, "mozcentral/");
},
function checkPrefs() {
console.log("\n### Checking preference generation");
// Check that the preferences were correctly generated,
// for all the relevant builds.
for (const dir of [
"generic/",
"generic-legacy/",
"chromium/",
"mozcentral/",
]) {
getDefaultPreferences(dir);
}
// Check that all the relevant files can be generated.
return ordered([
createChromiumPrefsSchema().pipe(gulp.dest(PREFSTEST_DIR)),
createDefaultPrefsFile().pipe(gulp.dest(PREFSTEST_DIR)),
]);
}
)
);
gulp.task(
"typestest",
gulp.series(
@ -1959,8 +1900,7 @@ gulp.task(
);
function createBaseline(done) {
console.log();
console.log("### Creating baseline environment");
console.log("\n### Creating baseline environment");
const baselineCommit = process.env.BASELINE;
if (!baselineCommit) {
@ -2021,8 +1961,7 @@ gulp.task(
);
gulp.task("lint", function (done) {
console.log();
console.log("### Linting JS/CSS/JSON/SVG/HTML files");
console.log("\n### Linting JS/CSS/JSON/SVG/HTML files");
// Ensure that we lint the Firefox specific *.jsm files too.
const esLintOptions = [
@ -2102,8 +2041,7 @@ gulp.task("lint", function (done) {
gulp.task(
"lint-mozcentral",
gulp.series("mozcentral", function runLintMozcentral(done) {
console.log();
console.log("### Checking mozilla-central files");
console.log("\n### Checking mozilla-central files");
const styleLintOptions = [
"../../node_modules/stylelint/bin/stylelint.mjs",
@ -2128,39 +2066,6 @@ gulp.task(
})
);
gulp.task(
"lint-chromium",
gulp.series(
function scriptingLintChromium() {
const defines = {
...DEFINES,
CHROME: true,
SKIP_BABEL: false,
TESTING: false,
};
return buildDefaultPreferences(defines, "lint-chromium/");
},
async function prefsLintChromium() {
await parseDefaultPreferences("lint-chromium/");
},
function runLintChromium(done) {
console.log();
console.log("### Checking supplemental Chromium files");
if (
!checkChromePreferencesFile(
"extensions/chromium/preferences_schema.json",
getDefaultPreferences("lint-chromium/")
)
) {
done(new Error("chromium/preferences_schema is not in sync."));
return;
}
done();
}
)
);
gulp.task("dev-wasm", function () {
const VIEWER_WASM_OUTPUT = "web/wasm/";
@ -2180,8 +2085,7 @@ gulp.task(
});
},
function createDevSandbox() {
console.log();
console.log("### Building development sandbox");
console.log("\n### Building development sandbox");
const defines = { ...DEFINES, GENERIC: true, TESTING: true };
const sandboxDir = BUILD_DIR + "dev-sandbox/";
@ -2207,7 +2111,7 @@ gulp.task(
},
function watchWasm() {
gulp.watch(
["external/openjpeg/*", "external/qcms/*"],
["external/openjpeg/*", "external/qcms/*", "external/jbig2/*"],
{ ignoreInitial: false },
gulp.series("dev-wasm")
);
@ -2225,8 +2129,7 @@ gulp.task(
);
},
async function createServer() {
console.log();
console.log("### Starting local server");
console.log("\n### Starting local server");
let port = 8888;
const i = process.argv.indexOf("--port");
@ -2247,8 +2150,7 @@ gulp.task(
);
gulp.task("clean", function (done) {
console.log();
console.log("### Cleaning up project builds");
console.log("\n### Cleaning up project builds");
fs.rmSync(BUILD_DIR, { recursive: true, force: true });
done();
@ -2257,8 +2159,7 @@ gulp.task("clean", function (done) {
gulp.task("importl10n", async function () {
const { downloadL10n } = await import("./external/importL10n/locales.mjs");
console.log();
console.log("### Importing translations from mozilla-central");
console.log("\n### Importing translations from mozilla-central");
if (!fs.existsSync(L10N_DIR)) {
fs.mkdirSync(L10N_DIR);
@ -2267,8 +2168,7 @@ gulp.task("importl10n", async function () {
});
function ghPagesPrepare() {
console.log();
console.log("### Creating web site");
console.log("\n### Creating web site");
fs.rmSync(GH_PAGES_DIR, { recursive: true, force: true });
@ -2369,7 +2269,8 @@ function packageJson() {
bugs: DIST_BUGS_URL,
license: DIST_LICENSE,
optionalDependencies: {
"@napi-rs/canvas": "^0.1.84",
"@napi-rs/canvas": "^0.1.88",
"node-readable-to-web-readable-stream": "^0.4.2",
},
browser: {
canvas: false,
@ -2536,8 +2437,7 @@ gulp.task(
gulp.task(
"mozcentralbaseline",
gulp.series(createBaseline, function createMozcentralBaseline(done) {
console.log();
console.log("### Creating mozcentral baseline environment");
console.log("\n### Creating mozcentral baseline environment");
// Create a mozcentral build.
fs.rmSync(BASELINE_DIR + BUILD_DIR, { recursive: true, force: true });
@ -2574,8 +2474,7 @@ gulp.task(
"mozcentral",
"mozcentralbaseline",
function createMozcentralDiff(done) {
console.log();
console.log("### Creating mozcentral diff");
console.log("\n### Creating mozcentral diff");
// Create the diff between the current mozcentral build and the
// baseline mozcentral build, which both exist at this point.
@ -2617,14 +2516,12 @@ gulp.task(
);
gulp.task("externaltest", function (done) {
console.log();
console.log("### Running test-fixtures.js");
console.log("\n### Running test-fixtures.js");
safeSpawnSync("node", ["external/builder/test-fixtures.mjs"], {
stdio: "inherit",
});
console.log();
console.log("### Running test-fixtures_babel.js");
console.log("\n### Running test-fixtures_babel.js");
safeSpawnSync("node", ["external/builder/test-fixtures_babel.mjs"], {
stdio: "inherit",
});

View File

@ -167,10 +167,10 @@ pdfjs-printing-not-ready = Warnowanje: PDF njejo se za śišćanje dopołnje zac
## Tooltips and alt text for side panel toolbar buttons
pdfjs-toggle-sidebar-button =
.title = Bócnicu pokazaś/schowaś
.title = Bocnicu pokazaś/schowaś
pdfjs-toggle-sidebar-notification-button =
.title = Bocnicu pśešaltowaś (dokument rozrědowanje/pśipiski/warstwy wopśimujo)
pdfjs-toggle-sidebar-button-label = Bócnicu pokazaś/schowaś
pdfjs-toggle-sidebar-button-label = Bocnicu pokazaś/schowaś
pdfjs-document-outline-button =
.title = Dokumentowe naraźenje pokazaś (dwójne kliknjenje, aby se wšykne zapiski pokazali/schowali)
pdfjs-document-outline-button-label = Dokumentowa struktura
@ -389,9 +389,9 @@ pdfjs-editor-comments-sidebar-title =
*[other] { $count } komentarow
}
pdfjs-editor-comments-sidebar-close-button =
.title = Bócnicu zacyniś
.aria-label = Bócnicu zacyniś
pdfjs-editor-comments-sidebar-close-button-label = Bócnicu zacyniś
.title = Bocnicu zacyniś
.aria-label = Bocnicu zacyniś
pdfjs-editor-comments-sidebar-close-button-label = Bocnicu zacyniś
# Instructional copy to add a comment by selecting text or an annotation.
pdfjs-editor-comments-sidebar-no-comments1 = Wiźiśo něco wobspomnjeśa gódnego? Wuzwigniśo to a zawóstajśo komentar.
pdfjs-editor-comments-sidebar-no-comments-link = Dalšne informacije

View File

@ -532,15 +532,6 @@ pdfjs-editor-alt-text-settings-automatic-title = Automatic alt text
pdfjs-editor-alt-text-settings-create-model-button-label = Create alt text automatically
pdfjs-editor-alt-text-settings-create-model-description = Suggests descriptions to help people who can’t see the image or when the image doesn’t load.
# Variables:
# $totalSize (Number) - the total size (in MB) of the AI model.
pdfjs-editor-alt-text-settings-download-model-label = Alt text AI model ({ $totalSize } MB)
pdfjs-editor-alt-text-settings-ai-model-description = Runs locally on your device so your data stays private. Required for automatic alt text.
pdfjs-editor-alt-text-settings-delete-model-button = Delete
pdfjs-editor-alt-text-settings-download-model-button = Download
pdfjs-editor-alt-text-settings-downloading-model-button = Downloading…
pdfjs-editor-alt-text-settings-editor-title = Alt text editor
pdfjs-editor-alt-text-settings-show-dialog-button-label = Show alt text editor right away when adding an image
pdfjs-editor-alt-text-settings-show-dialog-description = Helps you make sure all your images have alt text.
@ -561,6 +552,7 @@ pdfjs-editor-undo-bar-message-freetext = Text removed
pdfjs-editor-undo-bar-message-ink = Drawing removed
pdfjs-editor-undo-bar-message-stamp = Image removed
pdfjs-editor-undo-bar-message-signature = Signature removed
pdfjs-editor-undo-bar-message-comment = Comment removed
# Variables:
# $count (Number) - the number of removed annotations.
pdfjs-editor-undo-bar-message-multiple =

View File

@ -628,6 +628,11 @@ pdfjs-editor-edit-comment-dialog-text-input =
.placeholder = Scomence a scrivi…
pdfjs-editor-edit-comment-dialog-cancel-button = Anule
## Edit a comment button in the editor toolbar
pdfjs-editor-add-comment-button =
.title = Zonte coment
## Main menu for adding/removing signatures
pdfjs-editor-delete-signature-button1 =

View File

@ -318,6 +318,12 @@ pdfjs-editor-signature-button-label = Қолтаңбаны қосу
## Default editor aria labels
# “Highlight” is a noun, the string is used on the editor for highlights.
pdfjs-editor-highlight-editor =
.aria-label = Ерекшелеу түзеткіші
# “Drawing” is a noun, the string is used on the editor for drawings.
pdfjs-editor-ink-editor =
.aria-label = Сурет салу түзеткіші
# Used when a signature editor is selected/hovered.
# Variables:
# $description (String) - a string describing/labeling the signature.
@ -380,6 +386,8 @@ pdfjs-editor-comments-sidebar-close-button =
.title = Бүйір панелін жабу
.aria-label = Бүйір панелін жабу
pdfjs-editor-comments-sidebar-close-button-label = Бүйір панелін жабу
# Instructional copy to add a comment by selecting text or an annotation.
pdfjs-editor-comments-sidebar-no-comments1 = Назар аударарлық бірдеңе көрдіңіз бе? Оны ерекшелеп, түсіндірме қалдырыңыз.
pdfjs-editor-comments-sidebar-no-comments-link = Көбірек білу
## Alt-text dialog
@ -513,6 +521,7 @@ pdfjs-editor-alt-text-settings-close-button = Жабу
## Accessibility labels (announced by screen readers) for objects added to the editor.
pdfjs-editor-highlight-added-alert = Ерекшелеу қосылды
pdfjs-editor-freetext-added-alert = Мәтін қосылды
pdfjs-editor-ink-added-alert = Сызба қосылды
pdfjs-editor-stamp-added-alert = Сурет қосылды
@ -586,6 +595,8 @@ pdfjs-editor-add-signature-save-checkbox = Қолтаңбаны сақтау
pdfjs-editor-add-signature-save-warning-message = Сақталған 5 қолтаңбаның шегіне жеттіңіз. Көбірек сақтау үшін біреуін алып тастаңыз.
pdfjs-editor-add-signature-image-upload-error-title = Суретті жүктеп жіберу мүмкін емес.
pdfjs-editor-add-signature-image-upload-error-description = Желі байланысын тексеріңіз немесе басқа бейнені қолданып көріңіз.
pdfjs-editor-add-signature-image-no-data-error-title = Бұл суретті қолтаңбаға түрлендіру мүмкін емес
pdfjs-editor-add-signature-image-no-data-error-description = Басқа суретті жүктеп салып көріңіз.
pdfjs-editor-add-signature-error-close-button = Жабу
## Dialog buttons
@ -594,10 +605,27 @@ pdfjs-editor-add-signature-cancel-button = Бас тарту
pdfjs-editor-add-signature-add-button = Қосу
pdfjs-editor-edit-signature-update-button = Жаңарту
## Comment popup
pdfjs-editor-edit-comment-popup-button-label = Түсіндірмені түзету
pdfjs-editor-edit-comment-popup-button =
.title = Түсіндірмені түзету
pdfjs-editor-delete-comment-popup-button-label = Түсіндірмені өшіру
pdfjs-editor-delete-comment-popup-button =
.title = Түсіндірмені өшіру
pdfjs-show-comment-button =
.title = Түсіндірмені көрсету
## Edit a comment dialog
# An existing comment is edited
pdfjs-editor-edit-comment-dialog-title-when-editing = Түсіндірмені түзету
pdfjs-editor-edit-comment-dialog-save-button-when-editing = Жаңарту
# No existing comment
pdfjs-editor-edit-comment-dialog-title-when-adding = Түсіндірмені қосу
pdfjs-editor-edit-comment-dialog-save-button-when-adding = Қосу
pdfjs-editor-edit-comment-dialog-text-input =
.placeholder = Теріп бастаңыз…
pdfjs-editor-edit-comment-dialog-cancel-button = Бас тарту
## Edit a comment button in the editor toolbar

View File

@ -309,6 +309,10 @@ pdfjs-editor-signature-add-signature-button-label = പുതിയ ഒപ്പ
# $description (String) - a string describing/labeling the signature.
pdfjs-editor-add-saved-signature-button =
.title = കരുതിവച്ച ഒപ്പു് : { $description }
pdfjs-editor-comments-sidebar-close-button =
.title = അണിവക്കം അടയ്ക്കുക
.aria-label = അണിവക്കം അടയ്ക്കുക
pdfjs-editor-comments-sidebar-close-button-label = അണിവക്കം അടയ്ക്കുക
## Alt-text dialog

View File

@ -70,10 +70,10 @@ pdfjs-page-rotate-ccw-button =
.title = Повернуть против часовой стрелки
pdfjs-page-rotate-ccw-button-label = Повернуть против часовой стрелки
pdfjs-cursor-text-select-tool-button =
.title = Включить Инструмент «Выделение текста»
.title = Включить инструмент «Выделение текста»
pdfjs-cursor-text-select-tool-button-label = Инструмент «Выделение текста»
pdfjs-cursor-hand-tool-button =
.title = Включить Инструмент «Рука»
.title = Включить инструмент «Рука»
pdfjs-cursor-hand-tool-button-label = Инструмент «Рука»
pdfjs-scroll-page-button =
.title = Использовать прокрутку страниц
@ -363,7 +363,7 @@ pdfjs-editor-free-highlight-thickness-input = Толщина
pdfjs-editor-free-highlight-thickness-title =
.title = Изменить толщину при выделении элементов, кроме текста
pdfjs-editor-add-signature-container =
.aria-label = Управление подписями и сохраненные подписи
.aria-label = Управление подписями и сохранённые подписи
pdfjs-editor-signature-add-signature-button =
.title = Добавить новую подпись
pdfjs-editor-signature-add-signature-button-label = Добавить новую подпись

466
package-lock.json generated
View File

@ -15,12 +15,12 @@
"@fluent/dom": "^0.10.2",
"@metalsmith/layouts": "^3.0.0",
"@metalsmith/markdown": "^1.10.0",
"@napi-rs/canvas": "^0.1.84",
"@types/node": "^25.0.1",
"autoprefixer": "^10.4.22",
"@napi-rs/canvas": "^0.1.88",
"@types/node": "^25.0.3",
"autoprefixer": "^10.4.23",
"babel-loader": "^10.0.0",
"cached-iterable": "^0.3.0",
"caniuse-lite": "^1.0.30001760",
"caniuse-lite": "^1.0.30001762",
"core-js": "^3.47.0",
"eslint": "^9.39.2",
"eslint-config-prettier": "^10.1.8",
@ -28,10 +28,10 @@
"eslint-plugin-jasmine": "^4.2.2",
"eslint-plugin-json": "^4.0.1",
"eslint-plugin-no-unsanitized": "^4.1.4",
"eslint-plugin-perfectionist": "^4.15.1",
"eslint-plugin-perfectionist": "^5.2.0",
"eslint-plugin-prettier": "^5.5.4",
"eslint-plugin-unicorn": "^62.0.0",
"globals": "^16.5.0",
"globals": "^17.0.0",
"gulp": "^5.0.1",
"gulp-cli": "^3.1.0",
"gulp-postcss": "^10.0.0",
@ -44,6 +44,7 @@
"jstransformer-nunjucks": "^1.2.0",
"metalsmith": "^2.6.3",
"metalsmith-html-relative": "^2.0.9",
"node-readable-to-web-readable-stream": "^0.4.2",
"ordered-read-streams": "^2.0.0",
"pngjs": "^7.0.0",
"postcss": "^8.5.6",
@ -52,7 +53,7 @@
"postcss-nesting": "^13.0.2",
"postcss-values-parser": "^7.0.0",
"prettier": "^3.7.4",
"puppeteer": "^24.33.0",
"puppeteer": "^24.34.0",
"stylelint": "^16.26.1",
"stylelint-prettier": "^5.0.3",
"svglint": "^4.1.2",
@ -61,7 +62,7 @@
"ttest": "^4.0.0",
"typescript": "^5.9.3",
"vinyl": "^3.0.1",
"webpack": "^5.103.0",
"webpack": "^5.104.1",
"webpack-stream": "^7.0.0",
"yargs": "^18.0.0"
},
@ -2377,9 +2378,9 @@
}
},
"node_modules/@napi-rs/canvas": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas/-/canvas-0.1.84.tgz",
"integrity": "sha512-88FTNFs4uuiFKP0tUrPsEXhpe9dg7za9ILZJE08pGdUveMIDeana1zwfVkqRHJDPJFAmGY3dXmJ99dzsy57YnA==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas/-/canvas-0.1.88.tgz",
"integrity": "sha512-/p08f93LEbsL5mDZFQ3DBxcPv/I4QG9EDYRRq1WNlCOXVfAHBTHMSVMwxlqG/AtnSfUr9+vgfN7MKiyDo0+Weg==",
"dev": true,
"license": "MIT",
"workspaces": [
@ -2388,23 +2389,28 @@
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
},
"optionalDependencies": {
"@napi-rs/canvas-android-arm64": "0.1.84",
"@napi-rs/canvas-darwin-arm64": "0.1.84",
"@napi-rs/canvas-darwin-x64": "0.1.84",
"@napi-rs/canvas-linux-arm-gnueabihf": "0.1.84",
"@napi-rs/canvas-linux-arm64-gnu": "0.1.84",
"@napi-rs/canvas-linux-arm64-musl": "0.1.84",
"@napi-rs/canvas-linux-riscv64-gnu": "0.1.84",
"@napi-rs/canvas-linux-x64-gnu": "0.1.84",
"@napi-rs/canvas-linux-x64-musl": "0.1.84",
"@napi-rs/canvas-win32-x64-msvc": "0.1.84"
"@napi-rs/canvas-android-arm64": "0.1.88",
"@napi-rs/canvas-darwin-arm64": "0.1.88",
"@napi-rs/canvas-darwin-x64": "0.1.88",
"@napi-rs/canvas-linux-arm-gnueabihf": "0.1.88",
"@napi-rs/canvas-linux-arm64-gnu": "0.1.88",
"@napi-rs/canvas-linux-arm64-musl": "0.1.88",
"@napi-rs/canvas-linux-riscv64-gnu": "0.1.88",
"@napi-rs/canvas-linux-x64-gnu": "0.1.88",
"@napi-rs/canvas-linux-x64-musl": "0.1.88",
"@napi-rs/canvas-win32-arm64-msvc": "0.1.88",
"@napi-rs/canvas-win32-x64-msvc": "0.1.88"
}
},
"node_modules/@napi-rs/canvas-android-arm64": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-android-arm64/-/canvas-android-arm64-0.1.84.tgz",
"integrity": "sha512-pdvuqvj3qtwVryqgpAGornJLV6Ezpk39V6wT4JCnRVGy8I3Tk1au8qOalFGrx/r0Ig87hWslysPpHBxVpBMIww==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-android-arm64/-/canvas-android-arm64-0.1.88.tgz",
"integrity": "sha512-KEaClPnZuVxJ8smUWjV1wWFkByBO/D+vy4lN+Dm5DFH514oqwukxKGeck9xcKJhaWJGjfruGmYGiwRe//+/zQQ==",
"cpu": [
"arm64"
],
@ -2416,12 +2422,16 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-darwin-arm64": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-darwin-arm64/-/canvas-darwin-arm64-0.1.84.tgz",
"integrity": "sha512-A8IND3Hnv0R6abc6qCcCaOCujTLMmGxtucMTZ5vbQUrEN/scxi378MyTLtyWg+MRr6bwQJ6v/orqMS9datIcww==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-darwin-arm64/-/canvas-darwin-arm64-0.1.88.tgz",
"integrity": "sha512-Xgywz0dDxOKSgx3eZnK85WgGMmGrQEW7ZLA/E7raZdlEE+xXCozobgqz2ZvYigpB6DJFYkqnwHjqCOTSDGlFdg==",
"cpu": [
"arm64"
],
@ -2433,12 +2443,16 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-darwin-x64": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-darwin-x64/-/canvas-darwin-x64-0.1.84.tgz",
"integrity": "sha512-AUW45lJhYWwnA74LaNeqhvqYKK/2hNnBBBl03KRdqeCD4tKneUSrxUqIv8d22CBweOvrAASyKN3W87WO2zEr/A==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-darwin-x64/-/canvas-darwin-x64-0.1.88.tgz",
"integrity": "sha512-Yz4wSCIQOUgNucgk+8NFtQxQxZV5NO8VKRl9ePKE6XoNyNVC8JDqtvhh3b3TPqKK8W5p2EQpAr1rjjm0mfBxdg==",
"cpu": [
"x64"
],
@ -2450,12 +2464,16 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-arm-gnueabihf": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm-gnueabihf/-/canvas-linux-arm-gnueabihf-0.1.84.tgz",
"integrity": "sha512-8zs5ZqOrdgs4FioTxSBrkl/wHZB56bJNBqaIsfPL4ZkEQCinOkrFF7xIcXiHiKp93J3wUtbIzeVrhTIaWwqk+A==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm-gnueabihf/-/canvas-linux-arm-gnueabihf-0.1.88.tgz",
"integrity": "sha512-9gQM2SlTo76hYhxHi2XxWTAqpTOb+JtxMPEIr+H5nAhHhyEtNmTSDRtz93SP7mGd2G3Ojf2oF5tP9OdgtgXyKg==",
"cpu": [
"arm"
],
@ -2467,12 +2485,16 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-arm64-gnu": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm64-gnu/-/canvas-linux-arm64-gnu-0.1.84.tgz",
"integrity": "sha512-i204vtowOglJUpbAFWU5mqsJgH0lVpNk/Ml4mQtB4Lndd86oF+Otr6Mr5KQnZHqYGhlSIKiU2SYnUbhO28zGQA==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm64-gnu/-/canvas-linux-arm64-gnu-0.1.88.tgz",
"integrity": "sha512-7qgaOBMXuVRk9Fzztzr3BchQKXDxGbY+nwsovD3I/Sx81e+sX0ReEDYHTItNb0Je4NHbAl7D0MKyd4SvUc04sg==",
"cpu": [
"arm64"
],
@ -2484,12 +2506,16 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-arm64-musl": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm64-musl/-/canvas-linux-arm64-musl-0.1.84.tgz",
"integrity": "sha512-VyZq0EEw+OILnWk7G3ZgLLPaz1ERaPP++jLjeyLMbFOF+Tr4zHzWKiKDsEV/cT7btLPZbVoR3VX+T9/QubnURQ==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm64-musl/-/canvas-linux-arm64-musl-0.1.88.tgz",
"integrity": "sha512-kYyNrUsHLkoGHBc77u4Unh067GrfiCUMbGHC2+OTxbeWfZkPt2o32UOQkhnSswKd9Fko/wSqqGkY956bIUzruA==",
"cpu": [
"arm64"
],
@ -2501,12 +2527,16 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-riscv64-gnu": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-riscv64-gnu/-/canvas-linux-riscv64-gnu-0.1.84.tgz",
"integrity": "sha512-PSMTh8DiThvLRsbtc/a065I/ceZk17EXAATv9uNvHgkgo7wdEfTh2C3aveNkBMGByVO3tvnvD5v/YFtZL07cIg==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-riscv64-gnu/-/canvas-linux-riscv64-gnu-0.1.88.tgz",
"integrity": "sha512-HVuH7QgzB0yavYdNZDRyAsn/ejoXB0hn8twwFnOqUbCCdkV+REna7RXjSR7+PdfW0qMQ2YYWsLvVBT5iL/mGpw==",
"cpu": [
"riscv64"
],
@ -2518,12 +2548,16 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-x64-gnu": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-x64-gnu/-/canvas-linux-x64-gnu-0.1.84.tgz",
"integrity": "sha512-N1GY3noO1oqgEo3rYQIwY44kfM11vA0lDbN0orTOHfCSUZTUyiYCY0nZ197QMahZBm1aR/vYgsWpV74MMMDuNA==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-x64-gnu/-/canvas-linux-x64-gnu-0.1.88.tgz",
"integrity": "sha512-hvcvKIcPEQrvvJtJnwD35B3qk6umFJ8dFIr8bSymfrSMem0EQsfn1ztys8ETIFndTwdNWJKWluvxztA41ivsEw==",
"cpu": [
"x64"
],
@ -2535,12 +2569,16 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-x64-musl": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-x64-musl/-/canvas-linux-x64-musl-0.1.84.tgz",
"integrity": "sha512-vUZmua6ADqTWyHyei81aXIt9wp0yjeNwTH0KdhdeoBb6azHmFR8uKTukZMXfLCC3bnsW0t4lW7K78KNMknmtjg==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-x64-musl/-/canvas-linux-x64-musl-0.1.88.tgz",
"integrity": "sha512-eSMpGYY2xnZSQ6UxYJ6plDboxq4KeJ4zT5HaVkUnbObNN6DlbJe0Mclh3wifAmquXfrlgTZt6zhHsUgz++AK6g==",
"cpu": [
"x64"
],
@ -2552,12 +2590,37 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-win32-arm64-msvc": {
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-win32-arm64-msvc/-/canvas-win32-arm64-msvc-0.1.88.tgz",
"integrity": "sha512-qcIFfEgHrchyYqRrxsCeTQgpJZ/GqHiqPcU/Fvw/ARVlQeDX1VyFH+X+0gCR2tca6UJrq96vnW+5o7buCq+erA==",
"cpu": [
"arm64"
],
"dev": true,
"license": "MIT",
"optional": true,
"os": [
"win32"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-win32-x64-msvc": {
"version": "0.1.84",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-win32-x64-msvc/-/canvas-win32-x64-msvc-0.1.84.tgz",
"integrity": "sha512-YSs8ncurc1xzegUMNnQUTYrdrAuaXdPMOa+iYYyAxydOtg0ppV386hyYMsy00Yip1NlTgLCseRG4sHSnjQx6og==",
"version": "0.1.88",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-win32-x64-msvc/-/canvas-win32-x64-msvc-0.1.88.tgz",
"integrity": "sha512-ROVqbfS4QyZxYkqmaIBBpbz/BQvAR+05FXM5PAtTYVc0uyY8Y4BHJSMdGAaMf6TdIVRsQsiq+FG/dH9XhvWCFQ==",
"cpu": [
"x64"
],
@ -2569,6 +2632,10 @@
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@nodelib/fs.scandir": {
@ -2857,9 +2924,9 @@
"license": "MIT"
},
"node_modules/@types/node": {
"version": "25.0.1",
"resolved": "https://registry.npmjs.org/@types/node/-/node-25.0.1.tgz",
"integrity": "sha512-czWPzKIAXucn9PtsttxmumiQ9N0ok9FrBwgRWrwmVLlp86BrMExzvXRLFYRJ+Ex3g6yqj+KuaxfX1JTgV2lpfg==",
"version": "25.0.3",
"resolved": "https://registry.npmjs.org/@types/node/-/node-25.0.3.tgz",
"integrity": "sha512-W609buLVRVmeW693xKfzHeIV6nJGGz98uCPfeXI1ELMLXVeKYZ9m15fAMSaUPBHYLGFsVRcMmSCksQOrZV9BYA==",
"dev": true,
"license": "MIT",
"dependencies": {
@ -2896,14 +2963,14 @@
}
},
"node_modules/@typescript-eslint/project-service": {
"version": "8.44.1",
"resolved": "https://registry.npmjs.org/@typescript-eslint/project-service/-/project-service-8.44.1.tgz",
"integrity": "sha512-ycSa60eGg8GWAkVsKV4E6Nz33h+HjTXbsDT4FILyL8Obk5/mx4tbvCNsLf9zret3ipSumAOG89UcCs/KRaKYrA==",
"version": "8.51.0",
"resolved": "https://registry.npmjs.org/@typescript-eslint/project-service/-/project-service-8.51.0.tgz",
"integrity": "sha512-Luv/GafO07Z7HpiI7qeEW5NW8HUtZI/fo/kE0YbtQEFpJRUuR0ajcWfCE5bnMvL7QQFrmT/odMe8QZww8X2nfQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"@typescript-eslint/tsconfig-utils": "^8.44.1",
"@typescript-eslint/types": "^8.44.1",
"@typescript-eslint/tsconfig-utils": "^8.51.0",
"@typescript-eslint/types": "^8.51.0",
"debug": "^4.3.4"
},
"engines": {
@ -2918,14 +2985,14 @@
}
},
"node_modules/@typescript-eslint/scope-manager": {
"version": "8.44.1",
"resolved": "https://registry.npmjs.org/@typescript-eslint/scope-manager/-/scope-manager-8.44.1.tgz",
"integrity": "sha512-NdhWHgmynpSvyhchGLXh+w12OMT308Gm25JoRIyTZqEbApiBiQHD/8xgb6LqCWCFcxFtWwaVdFsLPQI3jvhywg==",
"version": "8.51.0",
"resolved": "https://registry.npmjs.org/@typescript-eslint/scope-manager/-/scope-manager-8.51.0.tgz",
"integrity": "sha512-JhhJDVwsSx4hiOEQPeajGhCWgBMBwVkxC/Pet53EpBVs7zHHtayKefw1jtPaNRXpI9RA2uocdmpdfE7T+NrizA==",
"dev": true,
"license": "MIT",
"dependencies": {
"@typescript-eslint/types": "8.44.1",
"@typescript-eslint/visitor-keys": "8.44.1"
"@typescript-eslint/types": "8.51.0",
"@typescript-eslint/visitor-keys": "8.51.0"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
@ -2936,9 +3003,9 @@
}
},
"node_modules/@typescript-eslint/tsconfig-utils": {
"version": "8.44.1",
"resolved": "https://registry.npmjs.org/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.44.1.tgz",
"integrity": "sha512-B5OyACouEjuIvof3o86lRMvyDsFwZm+4fBOqFHccIctYgBjqR3qT39FBYGN87khcgf0ExpdCBeGKpKRhSFTjKQ==",
"version": "8.51.0",
"resolved": "https://registry.npmjs.org/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.51.0.tgz",
"integrity": "sha512-Qi5bSy/vuHeWyir2C8u/uqGMIlIDu8fuiYWv48ZGlZ/k+PRPHtaAu7erpc7p5bzw2WNNSniuxoMSO4Ar6V9OXw==",
"dev": true,
"license": "MIT",
"engines": {
@ -2953,9 +3020,9 @@
}
},
"node_modules/@typescript-eslint/types": {
"version": "8.44.1",
"resolved": "https://registry.npmjs.org/@typescript-eslint/types/-/types-8.44.1.tgz",
"integrity": "sha512-Lk7uj7y9uQUOEguiDIDLYLJOrYHQa7oBiURYVFqIpGxclAFQ78f6VUOM8lI2XEuNOKNB7XuvM2+2cMXAoq4ALQ==",
"version": "8.51.0",
"resolved": "https://registry.npmjs.org/@typescript-eslint/types/-/types-8.51.0.tgz",
"integrity": "sha512-TizAvWYFM6sSscmEakjY3sPqGwxZRSywSsPEiuZF6d5GmGD9Gvlsv0f6N8FvAAA0CD06l3rIcWNbsN1e5F/9Ag==",
"dev": true,
"license": "MIT",
"engines": {
@ -2967,22 +3034,21 @@
}
},
"node_modules/@typescript-eslint/typescript-estree": {
"version": "8.44.1",
"resolved": "https://registry.npmjs.org/@typescript-eslint/typescript-estree/-/typescript-estree-8.44.1.tgz",
"integrity": "sha512-qnQJ+mVa7szevdEyvfItbO5Vo+GfZ4/GZWWDRRLjrxYPkhM+6zYB2vRYwCsoJLzqFCdZT4mEqyJoyzkunsZ96A==",
"version": "8.51.0",
"resolved": "https://registry.npmjs.org/@typescript-eslint/typescript-estree/-/typescript-estree-8.51.0.tgz",
"integrity": "sha512-1qNjGqFRmlq0VW5iVlcyHBbCjPB7y6SxpBkrbhNWMy/65ZoncXCEPJxkRZL8McrseNH6lFhaxCIaX+vBuFnRng==",
"dev": true,
"license": "MIT",
"dependencies": {
"@typescript-eslint/project-service": "8.44.1",
"@typescript-eslint/tsconfig-utils": "8.44.1",
"@typescript-eslint/types": "8.44.1",
"@typescript-eslint/visitor-keys": "8.44.1",
"@typescript-eslint/project-service": "8.51.0",
"@typescript-eslint/tsconfig-utils": "8.51.0",
"@typescript-eslint/types": "8.51.0",
"@typescript-eslint/visitor-keys": "8.51.0",
"debug": "^4.3.4",
"fast-glob": "^3.3.2",
"is-glob": "^4.0.3",
"minimatch": "^9.0.4",
"semver": "^7.6.0",
"ts-api-utils": "^2.1.0"
"tinyglobby": "^0.2.15",
"ts-api-utils": "^2.2.0"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
@ -3022,9 +3088,9 @@
}
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/semver": {
"version": "7.7.2",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.7.2.tgz",
"integrity": "sha512-RF0Fw+rO5AMf9MAyaRXI4AV0Ulj5lMHqVxxdSgiVbixSCXoEmmX/jk0CuJw4+3SqroYO9VoUh+HcuJivvtJemA==",
"version": "7.7.3",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.7.3.tgz",
"integrity": "sha512-SdsKMrI9TdgjdweUSR9MweHA4EJ8YxHn8DFaDisvhVlUOe4BF1tLD7GAj0lIqWVl+dPb/rExr0Btby5loQm20Q==",
"dev": true,
"license": "ISC",
"bin": {
@ -3035,16 +3101,16 @@
}
},
"node_modules/@typescript-eslint/utils": {
"version": "8.44.1",
"resolved": "https://registry.npmjs.org/@typescript-eslint/utils/-/utils-8.44.1.tgz",
"integrity": "sha512-DpX5Fp6edTlocMCwA+mHY8Mra+pPjRZ0TfHkXI8QFelIKcbADQz1LUPNtzOFUriBB2UYqw4Pi9+xV4w9ZczHFg==",
"version": "8.51.0",
"resolved": "https://registry.npmjs.org/@typescript-eslint/utils/-/utils-8.51.0.tgz",
"integrity": "sha512-11rZYxSe0zabiKaCP2QAwRf/dnmgFgvTmeDTtZvUvXG3UuAdg/GU02NExmmIXzz3vLGgMdtrIosI84jITQOxUA==",
"dev": true,
"license": "MIT",
"dependencies": {
"@eslint-community/eslint-utils": "^4.7.0",
"@typescript-eslint/scope-manager": "8.44.1",
"@typescript-eslint/types": "8.44.1",
"@typescript-eslint/typescript-estree": "8.44.1"
"@typescript-eslint/scope-manager": "8.51.0",
"@typescript-eslint/types": "8.51.0",
"@typescript-eslint/typescript-estree": "8.51.0"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
@ -3059,13 +3125,13 @@
}
},
"node_modules/@typescript-eslint/visitor-keys": {
"version": "8.44.1",
"resolved": "https://registry.npmjs.org/@typescript-eslint/visitor-keys/-/visitor-keys-8.44.1.tgz",
"integrity": "sha512-576+u0QD+Jp3tZzvfRfxon0EA2lzcDt3lhUbsC6Lgzy9x2VR4E+JUiNyGHi5T8vk0TV+fpJ5GLG1JsJuWCaKhw==",
"version": "8.51.0",
"resolved": "https://registry.npmjs.org/@typescript-eslint/visitor-keys/-/visitor-keys-8.51.0.tgz",
"integrity": "sha512-mM/JRQOzhVN1ykejrvwnBRV3+7yTKK8tVANVN3o1O0t0v7o+jqdVu9crPy5Y9dov15TJk/FTIgoUGHrTOVL3Zg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@typescript-eslint/types": "8.44.1",
"@typescript-eslint/types": "8.51.0",
"eslint-visitor-keys": "^4.2.1"
},
"engines": {
@ -3726,9 +3792,9 @@
}
},
"node_modules/autoprefixer": {
"version": "10.4.22",
"resolved": "https://registry.npmjs.org/autoprefixer/-/autoprefixer-10.4.22.tgz",
"integrity": "sha512-ARe0v/t9gO28Bznv6GgqARmVqcWOV3mfgUPn9becPHMiD3o9BwlRgaeccZnwTpZ7Zwqrm+c1sUSsMxIzQzc8Xg==",
"version": "10.4.23",
"resolved": "https://registry.npmjs.org/autoprefixer/-/autoprefixer-10.4.23.tgz",
"integrity": "sha512-YYTXSFulfwytnjAPlw8QHncHJmlvFKtczb8InXaAx9Q0LbfDnfEYDE55omerIJKihhmU61Ft+cAOSzQVaBUmeA==",
"dev": true,
"funding": [
{
@ -3746,10 +3812,9 @@
],
"license": "MIT",
"dependencies": {
"browserslist": "^4.27.0",
"caniuse-lite": "^1.0.30001754",
"browserslist": "^4.28.1",
"caniuse-lite": "^1.0.30001760",
"fraction.js": "^5.3.4",
"normalize-range": "^0.1.2",
"picocolors": "^1.1.1",
"postcss-value-parser": "^4.2.0"
},
@ -3986,9 +4051,9 @@
"license": "MIT"
},
"node_modules/baseline-browser-mapping": {
"version": "2.8.32",
"resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.8.32.tgz",
"integrity": "sha512-OPz5aBThlyLFgxyhdwf/s2+8ab3OvT7AdTNvKHBwpXomIYeXqpUUuT8LrdtxZSsWJ4R4CU1un4XGh5Ez3nlTpw==",
"version": "2.9.11",
"resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.9.11.tgz",
"integrity": "sha512-Sg0xJUNDU1sJNGdfGWhVHX0kkZ+HWcvmVymJbj6NSgZZmW/8S9Y2HQ5euytnIgakgxN6papOAWiwDo1ctFDcoQ==",
"dev": true,
"license": "Apache-2.0",
"bin": {
@ -3996,9 +4061,9 @@
}
},
"node_modules/basic-ftp": {
"version": "5.0.5",
"resolved": "https://registry.npmjs.org/basic-ftp/-/basic-ftp-5.0.5.tgz",
"integrity": "sha512-4Bcg1P8xhUuqcii/S0Z9wiHIrQVPMermM1any+MX5GeGD7faD3/msQUDGLol9wOcz4/jbg/WJnGqoJF6LiBdtg==",
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/basic-ftp/-/basic-ftp-5.1.0.tgz",
"integrity": "sha512-RkaJzeJKDbaDWTIPiJwubyljaEPwpVWkm9Rt5h9Nd6h7tEXTJ3VB4qxdZBioV7JO5yLUaOKwz7vDOzlncUsegw==",
"dev": true,
"license": "MIT",
"engines": {
@ -4097,9 +4162,9 @@
}
},
"node_modules/browserslist": {
"version": "4.28.0",
"resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.28.0.tgz",
"integrity": "sha512-tbydkR/CxfMwelN0vwdP/pLkDwyAASZ+VfWm4EOwlB6SWhx1sYnWLqo8N5j0rAzPfzfRaxt0mM/4wPU/Su84RQ==",
"version": "4.28.1",
"resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.28.1.tgz",
"integrity": "sha512-ZC5Bd0LgJXgwGqUknZY/vkUQ04r8NXnJZ3yYi4vDmSiZmC/pdSN0NbNRPxZpbtO4uAfDUAFffO8IZoM3Gj8IkA==",
"dev": true,
"funding": [
{
@ -4117,11 +4182,11 @@
],
"license": "MIT",
"dependencies": {
"baseline-browser-mapping": "^2.8.25",
"caniuse-lite": "^1.0.30001754",
"electron-to-chromium": "^1.5.249",
"baseline-browser-mapping": "^2.9.0",
"caniuse-lite": "^1.0.30001759",
"electron-to-chromium": "^1.5.263",
"node-releases": "^2.0.27",
"update-browserslist-db": "^1.1.4"
"update-browserslist-db": "^1.2.0"
},
"bin": {
"browserslist": "cli.js"
@ -4307,9 +4372,9 @@
}
},
"node_modules/caniuse-lite": {
"version": "1.0.30001760",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001760.tgz",
"integrity": "sha512-7AAMPcueWELt1p3mi13HR/LHH0TJLT11cnwDJEs3xA4+CK/PLKeO9Kl1oru24htkyUKtkGCvAx4ohB0Ttry8Dw==",
"version": "1.0.30001762",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001762.tgz",
"integrity": "sha512-PxZwGNvH7Ak8WX5iXzoK1KPZttBXNPuaOvI2ZYU7NrlM+d9Ov+TUvlLOBNGzVXAntMSMMlJPd+jY6ovrVjSmUw==",
"dev": true,
"funding": [
{
@ -4464,9 +4529,9 @@
}
},
"node_modules/chromium-bidi": {
"version": "11.0.0",
"resolved": "https://registry.npmjs.org/chromium-bidi/-/chromium-bidi-11.0.0.tgz",
"integrity": "sha512-cM3DI+OOb89T3wO8cpPSro80Q9eKYJ7hGVXoGS3GkDPxnYSqiv+6xwpIf6XERyJ9Tdsl09hmNmY94BkgZdVekw==",
"version": "12.0.1",
"resolved": "https://registry.npmjs.org/chromium-bidi/-/chromium-bidi-12.0.1.tgz",
"integrity": "sha512-fGg+6jr0xjQhzpy5N4ErZxQ4wF7KLEvhGZXD6EgvZKDhu7iOhZXnZhcDxPJDcwTcrD48NPzOCo84RP2lv3Z+Cg==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
@ -5312,9 +5377,9 @@
}
},
"node_modules/electron-to-chromium": {
"version": "1.5.262",
"resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.262.tgz",
"integrity": "sha512-NlAsMteRHek05jRUxUR0a5jpjYq9ykk6+kO0yRaMi5moe7u0fVIOeQ3Y30A8dIiWFBNUoQGi1ljb1i5VtS9WQQ==",
"version": "1.5.267",
"resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.267.tgz",
"integrity": "sha512-0Drusm6MVRXSOJpGbaSVgcQsuB4hEkMpHXaVstcPmhu5LIedxs1xNK/nIxmQIU/RPC0+1/o0AVZfBTkTNJOdUw==",
"dev": true,
"license": "ISC"
},
@ -5512,9 +5577,9 @@
}
},
"node_modules/es-module-lexer": {
"version": "1.7.0",
"resolved": "https://registry.npmjs.org/es-module-lexer/-/es-module-lexer-1.7.0.tgz",
"integrity": "sha512-jEQoCwk8hyb2AZziIOLhDqpm5+2ww5uIE6lkO/6jcOCusfk6LhMHpXXfBLXTZ7Ydyt0j4VoUQv6uGNYbdW+kBA==",
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/es-module-lexer/-/es-module-lexer-2.0.0.tgz",
"integrity": "sha512-5POEcUuZybH7IdmGsD8wlf0AI55wMecM9rVBTI/qEAy2c1kTOm3DjFYjrBdI2K3BaJjJYfYFeRtM0t9ssnRuxw==",
"dev": true,
"license": "MIT"
},
@ -5829,18 +5894,17 @@
}
},
"node_modules/eslint-plugin-perfectionist": {
"version": "4.15.1",
"resolved": "https://registry.npmjs.org/eslint-plugin-perfectionist/-/eslint-plugin-perfectionist-4.15.1.tgz",
"integrity": "sha512-MHF0cBoOG0XyBf7G0EAFCuJJu4I18wy0zAoT1OHfx2o6EOx1EFTIzr2HGeuZa1kDcusoX0xJ9V7oZmaeFd773Q==",
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/eslint-plugin-perfectionist/-/eslint-plugin-perfectionist-5.2.0.tgz",
"integrity": "sha512-rLD4VyA6sxcCPlG/koqjp0D46JTNRURSDs22Jr1JeVbOiu1BoeRdROnJoqDoGESuXbwxvGEnMSqClX/Q3HSMig==",
"dev": true,
"license": "MIT",
"dependencies": {
"@typescript-eslint/types": "^8.38.0",
"@typescript-eslint/utils": "^8.38.0",
"@typescript-eslint/utils": "^8.51.0",
"natural-orderby": "^5.0.0"
},
"engines": {
"node": "^18.0.0 || >=20.0.0"
"node": "^20.0.0 || >=22.0.0"
},
"peerDependencies": {
"eslint": ">=8.45.0"
@ -5913,6 +5977,19 @@
"eslint": ">=9.38.0"
}
},
"node_modules/eslint-plugin-unicorn/node_modules/globals": {
"version": "16.5.0",
"resolved": "https://registry.npmjs.org/globals/-/globals-16.5.0.tgz",
"integrity": "sha512-c/c15i26VrJ4IRt5Z89DnIzCGDn9EcebibhAOjw5ibqEHsE1wLUgkPn9RDmNcUKyU87GeaL633nyJ+pplFR2ZQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/eslint-plugin-unicorn/node_modules/semver": {
"version": "7.7.3",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.7.3.tgz",
@ -6883,9 +6960,9 @@
}
},
"node_modules/globals": {
"version": "16.5.0",
"resolved": "https://registry.npmjs.org/globals/-/globals-16.5.0.tgz",
"integrity": "sha512-c/c15i26VrJ4IRt5Z89DnIzCGDn9EcebibhAOjw5ibqEHsE1wLUgkPn9RDmNcUKyU87GeaL633nyJ+pplFR2ZQ==",
"version": "17.0.0",
"resolved": "https://registry.npmjs.org/globals/-/globals-17.0.0.tgz",
"integrity": "sha512-gv5BeD2EssA793rlFWVPMMCqefTlpusw6/2TbAVMy0FzcG8wKJn4O+NqJ4+XWmmwrayJgw5TzrmWjFgmz1XPqw==",
"dev": true,
"license": "MIT",
"engines": {
@ -8703,9 +8780,9 @@
}
},
"node_modules/lodash": {
"version": "4.17.21",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==",
"version": "4.17.23",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.23.tgz",
"integrity": "sha512-LgVTMpQtIopCi79SJeDiP0TfWi5CNEc/L/aRdTh3yIvmZXTnheWpKjSZhnvMl8iXbC1tFg9gdHHDMLoV7CnG+w==",
"dev": true,
"license": "MIT"
},
@ -9297,6 +9374,13 @@
"node": ">= 0.4.0"
}
},
"node_modules/node-readable-to-web-readable-stream": {
"version": "0.4.2",
"resolved": "https://registry.npmjs.org/node-readable-to-web-readable-stream/-/node-readable-to-web-readable-stream-0.4.2.tgz",
"integrity": "sha512-/cMZNI34v//jUTrI+UIo4ieHAB5EZRY/+7OmXZgBxaWBMcW2tGdceIw06RFxWxrKZ5Jp3sI2i5TsRo+CBhtVLQ==",
"dev": true,
"license": "MIT"
},
"node_modules/node-releases": {
"version": "2.0.27",
"resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.27.tgz",
@ -9343,16 +9427,6 @@
"node": ">=0.10.0"
}
},
"node_modules/normalize-range": {
"version": "0.1.2",
"resolved": "https://registry.npmjs.org/normalize-range/-/normalize-range-0.1.2.tgz",
"integrity": "sha512-bdok/XvKII3nUpklnV6P2hxtMNrCboOjAcyBuQnWEhO665FwrSNRxU+AqpsyvO6LgGYPspN+lu5CLtw4jPRKNA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/now-and-later": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/now-and-later/-/now-and-later-3.0.0.tgz",
@ -10386,18 +10460,18 @@
}
},
"node_modules/puppeteer": {
"version": "24.33.0",
"resolved": "https://registry.npmjs.org/puppeteer/-/puppeteer-24.33.0.tgz",
"integrity": "sha512-nl3wsAztq5F8zybn4Tk41OCnYIzFIzGC6AN0WcF2KCUnWenajvRRPgBmS6LvNUV2HEeIzT2zRZHH0TgVxLDKew==",
"version": "24.34.0",
"resolved": "https://registry.npmjs.org/puppeteer/-/puppeteer-24.34.0.tgz",
"integrity": "sha512-Sdpl/zsYOsagZ4ICoZJPGZw8d9gZmK5DcxVal11dXi/1/t2eIXHjCf5NfmhDg5XnG9Nye+yo/LqMzIxie2rHTw==",
"dev": true,
"hasInstallScript": true,
"license": "Apache-2.0",
"dependencies": {
"@puppeteer/browsers": "2.11.0",
"chromium-bidi": "11.0.0",
"chromium-bidi": "12.0.1",
"cosmiconfig": "^9.0.0",
"devtools-protocol": "0.0.1534754",
"puppeteer-core": "24.33.0",
"puppeteer-core": "24.34.0",
"typed-query-selector": "^2.12.0"
},
"bin": {
@ -10408,18 +10482,18 @@
}
},
"node_modules/puppeteer-core": {
"version": "24.33.0",
"resolved": "https://registry.npmjs.org/puppeteer-core/-/puppeteer-core-24.33.0.tgz",
"integrity": "sha512-tPTxVg+Qdj/8av4cy6szv3GlhxeOoNhiiMZ955fjxQyvPQE/6DjCa6ZyF/x0WJrlgBZtaLSP8TQgJb7FdLDXXA==",
"version": "24.34.0",
"resolved": "https://registry.npmjs.org/puppeteer-core/-/puppeteer-core-24.34.0.tgz",
"integrity": "sha512-24evawO+mUGW4mvS2a2ivwLdX3gk8zRLZr9HP+7+VT2vBQnm0oh9jJEZmUE3ePJhRkYlZ93i7OMpdcoi2qNCLg==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@puppeteer/browsers": "2.11.0",
"chromium-bidi": "11.0.0",
"chromium-bidi": "12.0.1",
"debug": "^4.4.3",
"devtools-protocol": "0.0.1534754",
"typed-query-selector": "^2.12.0",
"webdriver-bidi-protocol": "0.3.9",
"webdriver-bidi-protocol": "0.3.10",
"ws": "^8.18.3"
},
"engines": {
@ -12643,6 +12717,54 @@
"node": ">=0.10.0"
}
},
"node_modules/tinyglobby": {
"version": "0.2.15",
"resolved": "https://registry.npmjs.org/tinyglobby/-/tinyglobby-0.2.15.tgz",
"integrity": "sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"fdir": "^6.5.0",
"picomatch": "^4.0.3"
},
"engines": {
"node": ">=12.0.0"
},
"funding": {
"url": "https://github.com/sponsors/SuperchupuDev"
}
},
"node_modules/tinyglobby/node_modules/fdir": {
"version": "6.5.0",
"resolved": "https://registry.npmjs.org/fdir/-/fdir-6.5.0.tgz",
"integrity": "sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12.0.0"
},
"peerDependencies": {
"picomatch": "^3 || ^4"
},
"peerDependenciesMeta": {
"picomatch": {
"optional": true
}
}
},
"node_modules/tinyglobby/node_modules/picomatch": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz",
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/to-regex-range": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
@ -12680,9 +12802,9 @@
}
},
"node_modules/ts-api-utils": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/ts-api-utils/-/ts-api-utils-2.1.0.tgz",
"integrity": "sha512-CUgTZL1irw8u29bzrOD/nH85jqyc74D6SshFgujOIA7osm2Rz7dYH77agkx7H4FBNxDq7Cjf+IjaX/8zwFW+ZQ==",
"version": "2.4.0",
"resolved": "https://registry.npmjs.org/ts-api-utils/-/ts-api-utils-2.4.0.tgz",
"integrity": "sha512-3TaVTaAv2gTiMB35i3FiGJaRfwb3Pyn/j3m/bfAvGe8FB7CF6u+LMYqYlDh7reQf7UNvoTvdfAqHGmPGOSsPmA==",
"dev": true,
"license": "MIT",
"engines": {
@ -13031,9 +13153,9 @@
}
},
"node_modules/update-browserslist-db": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/update-browserslist-db/-/update-browserslist-db-1.1.4.tgz",
"integrity": "sha512-q0SPT4xyU84saUX+tomz1WLkxUbuaJnR1xWt17M7fJtEJigJeWUNGUqrauFXsHnqev9y9JTRGwk13tFBuKby4A==",
"version": "1.2.3",
"resolved": "https://registry.npmjs.org/update-browserslist-db/-/update-browserslist-db-1.2.3.tgz",
"integrity": "sha512-Js0m9cx+qOgDxo0eMiFGEueWztz+d4+M3rGlmKPT+T4IS/jP4ylw3Nwpu6cpTTP8R1MAC1kF4VbdLt3ARf209w==",
"dev": true,
"funding": [
{
@ -13270,16 +13392,16 @@
}
},
"node_modules/webdriver-bidi-protocol": {
"version": "0.3.9",
"resolved": "https://registry.npmjs.org/webdriver-bidi-protocol/-/webdriver-bidi-protocol-0.3.9.tgz",
"integrity": "sha512-uIYvlRQ0PwtZR1EzHlTMol1G0lAlmOe6wPykF9a77AK3bkpvZHzIVxRE2ThOx5vjy2zISe0zhwf5rzuUfbo1PQ==",
"version": "0.3.10",
"resolved": "https://registry.npmjs.org/webdriver-bidi-protocol/-/webdriver-bidi-protocol-0.3.10.tgz",
"integrity": "sha512-5LAE43jAVLOhB/QqX4bwSiv0Hg1HBfMmOuwBSXHdvg4GMGu9Y0lIq7p4R/yySu6w74WmaR4GM4H9t2IwLW7hgw==",
"dev": true,
"license": "Apache-2.0"
},
"node_modules/webpack": {
"version": "5.103.0",
"resolved": "https://registry.npmjs.org/webpack/-/webpack-5.103.0.tgz",
"integrity": "sha512-HU1JOuV1OavsZ+mfigY0j8d1TgQgbZ6M+J75zDkpEAwYeXjWSqrGJtgnPblJjd/mAyTNQ7ygw0MiKOn6etz8yw==",
"version": "5.104.1",
"resolved": "https://registry.npmjs.org/webpack/-/webpack-5.104.1.tgz",
"integrity": "sha512-Qphch25abbMNtekmEGJmeRUhLDbe+QfiWTiqpKYkpCOWY64v9eyl+KRRLmqOFA2AvKPpc9DC6+u2n76tQLBoaA==",
"dev": true,
"license": "MIT",
"dependencies": {
@ -13291,10 +13413,10 @@
"@webassemblyjs/wasm-parser": "^1.14.1",
"acorn": "^8.15.0",
"acorn-import-phases": "^1.0.3",
"browserslist": "^4.26.3",
"browserslist": "^4.28.1",
"chrome-trace-event": "^1.0.2",
"enhanced-resolve": "^5.17.3",
"es-module-lexer": "^1.2.1",
"enhanced-resolve": "^5.17.4",
"es-module-lexer": "^2.0.0",
"eslint-scope": "5.1.1",
"events": "^3.2.0",
"glob-to-regexp": "^0.4.1",
@ -13305,7 +13427,7 @@
"neo-async": "^2.6.2",
"schema-utils": "^4.3.3",
"tapable": "^2.3.0",
"terser-webpack-plugin": "^5.3.11",
"terser-webpack-plugin": "^5.3.16",
"watchpack": "^2.4.4",
"webpack-sources": "^3.3.3"
},

View File

@ -10,12 +10,12 @@
"@fluent/dom": "^0.10.2",
"@metalsmith/layouts": "^3.0.0",
"@metalsmith/markdown": "^1.10.0",
"@napi-rs/canvas": "^0.1.84",
"@types/node": "^25.0.1",
"autoprefixer": "^10.4.22",
"@napi-rs/canvas": "^0.1.88",
"@types/node": "^25.0.3",
"autoprefixer": "^10.4.23",
"babel-loader": "^10.0.0",
"cached-iterable": "^0.3.0",
"caniuse-lite": "^1.0.30001760",
"caniuse-lite": "^1.0.30001762",
"core-js": "^3.47.0",
"eslint": "^9.39.2",
"eslint-config-prettier": "^10.1.8",
@ -23,10 +23,10 @@
"eslint-plugin-jasmine": "^4.2.2",
"eslint-plugin-json": "^4.0.1",
"eslint-plugin-no-unsanitized": "^4.1.4",
"eslint-plugin-perfectionist": "^4.15.1",
"eslint-plugin-perfectionist": "^5.2.0",
"eslint-plugin-prettier": "^5.5.4",
"eslint-plugin-unicorn": "^62.0.0",
"globals": "^16.5.0",
"globals": "^17.0.0",
"gulp": "^5.0.1",
"gulp-cli": "^3.1.0",
"gulp-postcss": "^10.0.0",
@ -39,6 +39,7 @@
"jstransformer-nunjucks": "^1.2.0",
"metalsmith": "^2.6.3",
"metalsmith-html-relative": "^2.0.9",
"node-readable-to-web-readable-stream": "^0.4.2",
"ordered-read-streams": "^2.0.0",
"pngjs": "^7.0.0",
"postcss": "^8.5.6",
@ -47,7 +48,7 @@
"postcss-nesting": "^13.0.2",
"postcss-values-parser": "^7.0.0",
"prettier": "^3.7.4",
"puppeteer": "^24.33.0",
"puppeteer": "^24.34.0",
"stylelint": "^16.26.1",
"stylelint-prettier": "^5.0.3",
"svglint": "^4.1.2",
@ -56,7 +57,7 @@
"ttest": "^4.0.0",
"typescript": "^5.9.3",
"vinyl": "^3.0.1",
"webpack": "^5.103.0",
"webpack": "^5.104.1",
"webpack-stream": "^7.0.0",
"yargs": "^18.0.0"
},

View File

@ -1,5 +1,5 @@
{
"stableVersion": "5.4.449",
"stableVersion": "5.4.624",
"baseVersion": "1b427a3af5e0a40c296a3cafb08edbd36d973ff1",
"versionPrefix": "5.4."
}

View File

@ -5290,8 +5290,8 @@ class FileAttachmentAnnotation extends MarkupAnnotation {
constructor(params) {
super(params);
const { dict, xref } = params;
const file = new FileSpec(dict.get("FS"), xref);
const { dict } = params;
const file = new FileSpec(dict.get("FS"));
this.data.annotationType = AnnotationType.FILEATTACHMENT;
this.data.hasOwnCanvas = this.data.noRotate;

View File

@ -68,6 +68,10 @@ class BaseStream {
return false;
}
get isImageStream() {
return false;
}
get canAsyncDecodeImageFromBuffer() {
return false;
}

86
src/core/brotli_stream.js Normal file
View File

@ -0,0 +1,86 @@
/* Copyright 2026 Mozilla Foundation
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import { BrotliDecode } from "../../external/brotli/decode.js";
import { DecodeStream } from "./decode_stream.js";
import { Stream } from "./stream.js";
class BrotliStream extends DecodeStream {
#isAsync = true;
constructor(stream, maybeLength) {
super(maybeLength);
this.stream = stream;
this.dict = stream.dict;
}
readBlock() {
// TODO: add some telemetry to measure how often we fall back here.
// Get all bytes from the input stream
const bytes = this.stream.getBytes();
const decodedData = BrotliDecode(
new Int8Array(bytes.buffer, bytes.byteOffset, bytes.length)
);
this.buffer = new Uint8Array(
decodedData.buffer,
decodedData.byteOffset,
decodedData.length
);
this.bufferLength = this.buffer.length;
this.eof = true;
}
async getImageData(length, _decoderOptions) {
const data = await this.asyncGetBytes();
if (!data) {
return this.getBytes(length);
}
if (data.length <= length) {
return data;
}
return data.subarray(0, length);
}
async asyncGetBytes() {
const { decompressed, compressed } =
await this.asyncGetBytesFromDecompressionStream("brotli");
if (decompressed) {
return decompressed;
}
// DecompressionStream failed (for example because there are some extra
// bytes after the end of the compressed data), so we fall back to our
// decoder.
// We already got the bytes from the underlying stream, so we just reuse
// them to avoid fetching them again.
this.#isAsync = false;
this.stream = new Stream(
compressed,
0,
compressed.length,
this.stream.dict
);
this.reset();
return null;
}
get isAsync() {
return this.#isAsync;
}
}
export { BrotliStream };
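
Below is a minimal, self-contained sketch (not part of the patch) of the DecompressionStream pattern that the new BrotliStream and the shared asyncGetBytesFromDecompressionStream helper rely on; the "deflate" format is broadly available, while "brotli" support is assumed to depend on the runtime.

async function decompress(bytes, format = "deflate") {
  const { readable, writable } = new DecompressionStream(format);
  const writer = writable.getWriter();
  // Don't await the write: the writable side only drains once the readable
  // side is consumed below; close() is queued after the write completes.
  writer.write(bytes).then(() => writer.close()).catch(() => {});
  const chunks = [];
  let total = 0;
  for await (const chunk of readable) {
    chunks.push(chunk);
    total += chunk.byteLength;
  }
  const out = new Uint8Array(total);
  let offset = 0;
  for (const c of chunks) {
    out.set(c, offset);
    offset += c.byteLength;
  }
  return out;
}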

View File

@ -1057,7 +1057,7 @@ class Catalog {
if (obj instanceof Dict && obj.has("EmbeddedFiles")) {
const nameTree = new NameTree(obj.getRaw("EmbeddedFiles"), this.xref);
for (const [key, value] of nameTree.getAll()) {
const fs = new FileSpec(value, this.xref);
const fs = new FileSpec(value);
attachments ??= Object.create(null);
attachments[stringToPDFString(key, /* keepEscapeSequence = */ true)] =
fs.serializable;
@ -1623,23 +1623,21 @@ class Catalog {
case "GoToR":
const urlDict = action.get("F");
if (urlDict instanceof Dict) {
const fs = new FileSpec(
urlDict,
/* xref = */ null,
/* skipContent = */ true
);
const { rawFilename } = fs.serializable;
url = rawFilename;
const fs = new FileSpec(urlDict, /* skipContent = */ true);
({ rawFilename: url } = fs.serializable);
} else if (typeof urlDict === "string") {
url = urlDict;
} else {
break;
}
// NOTE: the destination is relative to the *remote* document.
const remoteDest = fetchRemoteDest(action);
if (remoteDest && typeof url === "string") {
if (remoteDest) {
// NOTE: We don't use the `updateUrlHash` function here, since
// the `createValidAbsoluteUrl` function (see below) already
// handles parsing and validation of the final URL.
// the `createValidAbsoluteUrl` function (see below) already handles
// parsing/validation of the final URL and manual splitting also
// ensures that the `unsafeUrl` property will be available/correct.
url = /* baseUrl = */ url.split("#", 1)[0] + "#" + remoteDest;
}
// The 'NewWindow' property, equal to `LinkTarget.BLANK`.

View File

@ -465,20 +465,33 @@ const blackTable3 = [
* @param {Object} [options] - Decoding options.
*/
class CCITTFaxDecoder {
constructor(source, options = {}) {
constructor(
source,
options = {
K: 0,
EndOfLine: false,
EncodedByteAlign: false,
Columns: 1728,
Rows: 0,
EndOfBlock: true,
BlackIs1: false,
}
) {
if (typeof source?.next !== "function") {
throw new Error('CCITTFaxDecoder - invalid "source" parameter.');
}
this.source = source;
this.eof = false;
this.encoding = options.K || 0;
this.eoline = options.EndOfLine || false;
this.byteAlign = options.EncodedByteAlign || false;
this.columns = options.Columns || 1728;
this.rows = options.Rows || 0;
this.eoblock = options.EndOfBlock ?? true;
this.black = options.BlackIs1 || false;
({
K: this.encoding,
EndOfLine: this.eoline,
EncodedByteAlign: this.byteAlign,
Columns: this.columns,
Rows: this.rows,
EndOfBlock: this.eoblock,
BlackIs1: this.black,
} = options);
this.codingLine = new Uint32Array(this.columns + 1);
this.refLine = new Uint32Array(this.columns + 2);

View File

@ -13,47 +13,114 @@
* limitations under the License.
*/
import { shadow, warn } from "../shared/util.js";
import { CCITTFaxDecoder } from "./ccitt.js";
import { DecodeStream } from "./decode_stream.js";
import { Dict } from "./primitives.js";
import { JBig2CCITTFaxWasmImage } from "./jbig2_ccittFax_wasm.js";
class CCITTFaxStream extends DecodeStream {
constructor(str, maybeLength, params) {
super(maybeLength);
this.stream = str;
this.maybeLength = maybeLength;
this.dict = str.dict;
if (!(params instanceof Dict)) {
params = Dict.empty;
}
const source = {
next() {
return str.getByte();
},
this.params = {
K: params.get("K") || 0,
EndOfLine: !!params.get("EndOfLine"),
EncodedByteAlign: !!params.get("EncodedByteAlign"),
Columns: params.get("Columns") || 1728,
Rows: params.get("Rows") || 0,
EndOfBlock: !!(params.get("EndOfBlock") ?? true),
BlackIs1: !!params.get("BlackIs1"),
};
this.ccittFaxDecoder = new CCITTFaxDecoder(source, {
K: params.get("K"),
EndOfLine: params.get("EndOfLine"),
EncodedByteAlign: params.get("EncodedByteAlign"),
Columns: params.get("Columns"),
Rows: params.get("Rows"),
EndOfBlock: params.get("EndOfBlock"),
BlackIs1: params.get("BlackIs1"),
});
}
get bytes() {
// If `this.maybeLength` is null, we'll get the entire stream.
return shadow(this, "bytes", this.stream.getBytes(this.maybeLength));
}
readBlock() {
this.decodeImageFallback();
}
get isImageStream() {
return true;
}
get isAsyncDecoder() {
return true;
}
async decodeImage(bytes, length, _decoderOptions) {
if (this.eof) {
return this.buffer;
}
if (!bytes) {
bytes = this.stream.isAsync
? (await this.stream.asyncGetBytes()) || this.bytes
: this.bytes;
}
try {
this.buffer = await JBig2CCITTFaxWasmImage.decode(
bytes,
this.dict.get("W", "Width"),
this.dict.get("H", "Height"),
null,
this.params
);
} catch {
warn("CCITTFaxStream: Falling back to JS CCITTFax decoder.");
return this.decodeImageFallback(bytes, length);
}
this.bufferLength = this.buffer.length;
this.eof = true;
return this.buffer;
}
decodeImageFallback(bytes, length) {
if (this.eof) {
return this.buffer;
}
const { params } = this;
if (!bytes) {
this.stream.reset();
bytes = this.bytes;
}
let pos = 0;
const source = {
next() {
return bytes[pos++] ?? -1;
},
};
if (length && this.buffer.byteLength < length) {
this.buffer = new Uint8Array(length);
}
this.ccittFaxDecoder = new CCITTFaxDecoder(source, params);
let outPos = 0;
while (!this.eof) {
const c = this.ccittFaxDecoder.readNextChar();
if (c === -1) {
this.eof = true;
return;
break;
}
this.ensureBuffer(this.bufferLength + 1);
this.buffer[this.bufferLength++] = c;
if (!length) {
this.ensureBuffer(outPos + 1);
}
this.buffer[outPos++] = c;
}
this.bufferLength = this.buffer.length;
return this.buffer.subarray(0, length || this.bufferLength);
}
}

View File

@ -14,10 +14,16 @@
*/
import { arrayBuffersToBytes, MissingDataException } from "./core_utils.js";
import { assert } from "../shared/util.js";
import { assert, MathClamp } from "../shared/util.js";
import { Stream } from "./stream.js";
class ChunkedStream extends Stream {
progressiveDataLength = 0;
_lastSuccessfulEnsureByteChunk = -1; // Single-entry cache
_loadedChunks = new Set();
constructor(length, chunkSize, manager) {
super(
/* arrayBuffer = */ new Uint8Array(length),
@ -27,11 +33,8 @@ class ChunkedStream extends Stream {
);
this.chunkSize = chunkSize;
this._loadedChunks = new Set();
this.numChunks = Math.ceil(length / chunkSize);
this.manager = manager;
this.progressiveDataLength = 0;
this.lastSuccessfulEnsureByteChunk = -1; // Single-entry cache
}
// If a particular stream does not implement one or more of these methods,
@ -67,6 +70,12 @@ class ChunkedStream extends Stream {
throw new Error(`Bad end offset: ${end}`);
}
if (typeof PDFJSDev === "undefined" || PDFJSDev.test("TESTING")) {
assert(
chunk instanceof ArrayBuffer,
"onReceiveData - expected an ArrayBuffer."
);
}
this.bytes.set(new Uint8Array(chunk), begin);
const beginChunk = Math.floor(begin / chunkSize);
const endChunk = Math.floor((end - 1) / chunkSize) + 1;
@ -82,6 +91,12 @@ class ChunkedStream extends Stream {
let position = this.progressiveDataLength;
const beginChunk = Math.floor(position / this.chunkSize);
if (typeof PDFJSDev === "undefined" || PDFJSDev.test("TESTING")) {
assert(
data instanceof ArrayBuffer,
"onReceiveProgressiveData - expected an ArrayBuffer."
);
}
this.bytes.set(new Uint8Array(data), position);
position += data.byteLength;
this.progressiveDataLength = position;
@ -106,14 +121,14 @@ class ChunkedStream extends Stream {
if (chunk > this.numChunks) {
return;
}
if (chunk === this.lastSuccessfulEnsureByteChunk) {
if (chunk === this._lastSuccessfulEnsureByteChunk) {
return;
}
if (!this._loadedChunks.has(chunk)) {
throw new MissingDataException(pos, pos + 1);
}
this.lastSuccessfulEnsureByteChunk = chunk;
this._lastSuccessfulEnsureByteChunk = chunk;
}
ensureRange(begin, end) {
@ -257,40 +272,37 @@ class ChunkedStream extends Stream {
}
class ChunkedStreamManager {
constructor(pdfNetworkStream, args) {
aborted = false;
currRequestId = 0;
_chunksNeededByRequest = new Map();
_loadedStreamCapability = Promise.withResolvers();
_promisesByRequest = new Map();
_requestsByChunk = new Map();
constructor(pdfStream, args) {
this.length = args.length;
this.chunkSize = args.rangeChunkSize;
this.stream = new ChunkedStream(this.length, this.chunkSize, this);
this.pdfNetworkStream = pdfNetworkStream;
this.pdfStream = pdfStream;
this.disableAutoFetch = args.disableAutoFetch;
this.msgHandler = args.msgHandler;
this.currRequestId = 0;
this._chunksNeededByRequest = new Map();
this._requestsByChunk = new Map();
this._promisesByRequest = new Map();
this.progressiveDataLength = 0;
this.aborted = false;
this._loadedStreamCapability = Promise.withResolvers();
}
sendRequest(begin, end) {
const rangeReader = this.pdfNetworkStream.getRangeReader(begin, end);
if (!rangeReader.isStreamingSupported) {
rangeReader.onProgress = this.onProgress.bind(this);
}
const rangeReader = this.pdfStream.getRangeReader(begin, end);
let chunks = [],
loaded = 0;
let chunks = [];
return new Promise((resolve, reject) => {
const readChunk = ({ value, done }) => {
try {
if (done) {
const chunkData = arrayBuffersToBytes(chunks);
resolve(arrayBuffersToBytes(chunks));
chunks = null;
resolve(chunkData);
return;
}
if (typeof PDFJSDev === "undefined" || PDFJSDev.test("TESTING")) {
@ -299,12 +311,6 @@ class ChunkedStreamManager {
"readChunk (sendRequest) - expected an ArrayBuffer."
);
}
loaded += value.byteLength;
if (rangeReader.isStreamingSupported) {
this.onProgress({ loaded });
}
chunks.push(value);
rangeReader.read().then(readChunk, reject);
} catch (e) {
@ -316,7 +322,7 @@ class ChunkedStreamManager {
if (this.aborted) {
return; // Ignoring any data after abort.
}
this.onReceiveData({ chunk: data, begin });
this.onReceiveData({ chunk: data.buffer, begin });
});
}
@ -446,34 +452,26 @@ class ChunkedStreamManager {
return groupedChunks;
}
onProgress(args) {
this.msgHandler.send("DocProgress", {
loaded: this.stream.numChunksLoaded * this.chunkSize + args.loaded,
total: this.length,
});
}
onReceiveData(args) {
const { chunkSize, length, stream } = this;
const chunk = args.chunk;
const isProgressive = args.begin === undefined;
const begin = isProgressive ? this.progressiveDataLength : args.begin;
const begin = isProgressive ? stream.progressiveDataLength : args.begin;
const end = begin + chunk.byteLength;
const beginChunk = Math.floor(begin / this.chunkSize);
const beginChunk = Math.floor(begin / chunkSize);
const endChunk =
end < this.length
? Math.floor(end / this.chunkSize)
: Math.ceil(end / this.chunkSize);
end < length ? Math.floor(end / chunkSize) : Math.ceil(end / chunkSize);
if (isProgressive) {
this.stream.onReceiveProgressiveData(chunk);
this.progressiveDataLength = end;
stream.onReceiveProgressiveData(chunk);
} else {
this.stream.onReceiveData(begin, chunk);
stream.onReceiveData(begin, chunk);
}
if (this.stream.isDataLoaded) {
this._loadedStreamCapability.resolve(this.stream);
if (stream.isDataLoaded) {
this._loadedStreamCapability.resolve(stream);
}
const loadedRequests = [];
@ -502,16 +500,16 @@ class ChunkedStreamManager {
// unfetched chunk of the PDF file.
if (!this.disableAutoFetch && this._requestsByChunk.size === 0) {
let nextEmptyChunk;
if (this.stream.numChunksLoaded === 1) {
if (stream.numChunksLoaded === 1) {
// This is a special optimization so that after fetching the first
// chunk, rather than fetching the second chunk, we fetch the last
// chunk.
const lastChunk = this.stream.numChunks - 1;
if (!this.stream.hasChunk(lastChunk)) {
const lastChunk = stream.numChunks - 1;
if (!stream.hasChunk(lastChunk)) {
nextEmptyChunk = lastChunk;
}
} else {
nextEmptyChunk = this.stream.nextEmptyChunk(endChunk);
nextEmptyChunk = stream.nextEmptyChunk(endChunk);
}
if (Number.isInteger(nextEmptyChunk)) {
this._requestChunks([nextEmptyChunk]);
@ -525,8 +523,12 @@ class ChunkedStreamManager {
}
this.msgHandler.send("DocProgress", {
loaded: this.stream.numChunksLoaded * this.chunkSize,
total: this.length,
loaded: MathClamp(
stream.numChunksLoaded * chunkSize,
stream.progressiveDataLength,
length
),
total: length,
});
}
@ -544,7 +546,7 @@ class ChunkedStreamManager {
abort(reason) {
this.aborted = true;
this.pdfNetworkStream?.cancelAllRequests(reason);
this.pdfStream?.cancelAllRequests(reason);
for (const capability of this._promisesByRequest.values()) {
capability.reject(reason);

View File

@ -699,6 +699,12 @@ class CMapFactory {
if (encoding instanceof Name) {
return createBuiltInCMap(encoding.name, fetchBuiltInCMap);
} else if (encoding instanceof BaseStream) {
if (encoding.isAsync) {
const bytes = await encoding.asyncGetBytes();
if (bytes) {
encoding = new Stream(bytes, 0, bytes.length, encoding.dict);
}
}
const parsedCMap = await parseCMap(
/* cMap = */ new CMap(),
/* lexer = */ new Lexer(encoding),

View File

@ -102,12 +102,52 @@ class DecodeStream extends BaseStream {
async getImageData(length, decoderOptions) {
if (!this.canAsyncDecodeImageFromBuffer) {
if (this.isAsyncDecoder) {
return this.decodeImage(null, decoderOptions);
return this.decodeImage(null, length, decoderOptions);
}
return this.getBytes(length, decoderOptions);
}
const data = await this.stream.asyncGetBytes();
return this.decodeImage(data, decoderOptions);
return this.decodeImage(data, length, decoderOptions);
}
async asyncGetBytesFromDecompressionStream(name) {
this.stream.reset();
const bytes = this.stream.isAsync
? await this.stream.asyncGetBytes()
: this.stream.getBytes();
try {
const { readable, writable } = new DecompressionStream(name);
const writer = writable.getWriter();
await writer.ready;
// We can't await writer.write() because it'll block until the reader
// starts, which happens a few lines below.
writer
.write(bytes)
.then(async () => {
await writer.ready;
await writer.close();
})
.catch(() => {});
const chunks = [];
let totalLength = 0;
for await (const chunk of readable) {
chunks.push(chunk);
totalLength += chunk.byteLength;
}
const data = new Uint8Array(totalLength);
let offset = 0;
for (const chunk of chunks) {
data.set(chunk, offset);
offset += chunk.byteLength;
}
return { decompressed: data, compressed: bytes };
} catch {
return { decompressed: null, compressed: bytes };
}
}
reset() {
@ -148,7 +188,7 @@ class DecodeStream extends BaseStream {
class StreamsSequenceStream extends DecodeStream {
constructor(streams, onError = null) {
streams = streams.filter(s => s instanceof BaseStream);
streams = streams.filter(s => s instanceof BaseStream && !s.isImageStream);
let maybeLength = 0;
for (const stream of streams) {

View File

@ -27,7 +27,6 @@ import {
stringToBytes,
stringToPDFString,
stringToUTF8String,
toHexUtil,
unreachable,
Util,
warn,
@ -61,6 +60,7 @@ import {
RefSetCache,
} from "./primitives.js";
import { getXfaFontDict, getXfaFontName } from "./xfa_fonts.js";
import { NullStream, Stream } from "./stream.js";
import { BaseStream } from "./base_stream.js";
import { calculateMD5 } from "./calculate_md5.js";
import { Catalog } from "./catalog.js";
@ -68,7 +68,6 @@ import { clearGlobalCaches } from "./cleanup_helper.js";
import { DatasetReader } from "./dataset_reader.js";
import { Intersector } from "./intersector.js";
import { Linearization } from "./parser.js";
import { NullStream } from "./stream.js";
import { ObjectLoader } from "./object_loader.js";
import { OperatorList } from "./operator_list.js";
import { PartialEvaluator } from "./evaluator.js";
@ -270,10 +269,32 @@ class Page {
async getContentStream() {
const content = await this.pdfManager.ensure(this, "content");
if (content instanceof BaseStream) {
if (content instanceof BaseStream && !content.isImageStream) {
if (content.isAsync) {
const bytes = await content.asyncGetBytes();
if (bytes) {
return new Stream(bytes, 0, bytes.length, content.dict);
}
}
return content;
}
if (Array.isArray(content)) {
const promises = [];
for (let i = 0, ii = content.length; i < ii; i++) {
const item = content[i];
if (item instanceof BaseStream && item.isAsync) {
promises.push(
item.asyncGetBytes().then(bytes => {
if (bytes) {
content[i] = new Stream(bytes, 0, bytes.length, item.dict);
}
})
);
}
}
if (promises.length > 0) {
await Promise.all(promises);
}
return new StreamsSequenceStream(
content,
this.#onSubStreamError.bind(this)
@ -442,6 +463,8 @@ class Page {
task,
intent,
cacheKey,
pageId = this.pageIndex,
pageIndex = this.pageIndex,
annotationStorage = null,
modifiedIds = null,
}) {
@ -527,13 +550,12 @@ class Page {
RESOURCES_KEYS_OPERATOR_LIST
);
const opList = new OperatorList(intent, sink);
handler.send("StartRenderPage", {
transparency: partialEvaluator.hasBlendModes(
resources,
this.nonBlendModesSet
),
pageIndex: this.pageIndex,
pageIndex,
cacheKey,
});
@ -1605,8 +1627,8 @@ class PDFDocument {
}
return shadow(this, "fingerprints", [
toHexUtil(hashOriginal),
hashModified ? toHexUtil(hashModified) : null,
hashOriginal.toHex(),
hashModified?.toHex() ?? null,
]);
}
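
The "materialize an async stream" step above is repeated by getOperatorList, getTextContent and the font-loading path further down; purely as an illustration of that pattern (an assumption, not code from the patch), it could be expressed as a small helper:

import { Stream } from "./stream.js";

// Turn an async stream into a plain synchronous Stream, keeping the original
// stream when asyncGetBytes() returns null (i.e. the DecompressionStream path
// failed and the stream already reset itself to synchronous mode).
async function toSyncStream(stream) {
  if (!stream.isAsync) {
    return stream;
  }
  const bytes = await stream.asyncGetBytes();
  return bytes ? new Stream(bytes, 0, bytes.length, stream.dict) : stream;
}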

View File

@ -470,6 +470,8 @@ class PDFEditor {
* included ranges (inclusive) or indices.
* @property {Array<Array<number>|number>} [excludePages]
* excluded ranges (inclusive) or indices.
* @property {Array<number>} [pageIndices]
* position of the pages in the final document.
*/
/**
@ -482,10 +484,18 @@ class PDFEditor {
let newIndex = 0;
this.hasSingleFile = pageInfos.length === 1;
const allDocumentData = [];
for (const { document, includePages, excludePages } of pageInfos) {
for (const {
document,
includePages,
excludePages,
pageIndices,
} of pageInfos) {
if (!document) {
continue;
}
if (pageIndices) {
newIndex = -1;
}
const documentData = new DocumentData(document);
allDocumentData.push(documentData);
promises.push(this.#collectDocumentData(documentData));
@ -504,6 +514,7 @@ class PDFEditor {
(deletedIndices ||= new Set()).add(page);
}
}
let pageIndex = 0;
for (let i = 0, ii = document.numPages; i < ii; i++) {
if (deletedIndices?.has(i)) {
continue;
@ -539,7 +550,23 @@ class PDFEditor {
if (!takePage) {
continue;
}
const newPageIndex = newIndex++;
let newPageIndex;
if (pageIndices) {
newPageIndex = pageIndices[pageIndex++];
}
if (newPageIndex === undefined) {
if (newIndex !== -1) {
newPageIndex = newIndex++;
} else {
for (
newPageIndex = 0;
this.oldPages[newPageIndex] === undefined;
newPageIndex++
) {
/* empty */
}
}
}
promises.push(
document.getPage(i).then(page => {
this.oldPages[newPageIndex] = new PageData(page, documentData);
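
A hypothetical shape for the pageInfos argument described by the typedef above; the document handles, page counts and target positions are placeholders, not values from the patch:

function buildPageInfos(docA, docB) {
  return [
    // Keep pages 0 through 2 of the first document, in reading order.
    { document: docA, includePages: [[0, 2]] },
    // Drop page 1 of the second document and pin its remaining pages to
    // explicit positions 3 and 5 in the merged document.
    { document: docB, excludePages: [1], pageIndices: [3, 5] },
  ];
}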

View File

@ -1039,6 +1039,7 @@ class PartialEvaluator {
if (
isAddToPathSet ||
state.fillColorSpace.name === "Pattern" ||
state.strokeColorSpace.name === "Pattern" ||
font.disableFontFace
) {
PartialEvaluator.buildFontPaths(
@ -1705,7 +1706,7 @@ class PartialEvaluator {
return null;
}
getOperatorList({
async getOperatorList({
stream,
task,
resources,
@ -1714,6 +1715,13 @@ class PartialEvaluator {
fallbackFontDict = null,
prevRefs = null,
}) {
if (stream.isAsync) {
const bytes = await stream.asyncGetBytes();
if (bytes) {
stream = new Stream(bytes, 0, bytes.length, stream.dict);
}
}
const objId = stream.dict?.objId;
const seenRefs = new RefSet(prevRefs);
@ -2372,7 +2380,7 @@ class PartialEvaluator {
});
}
getTextContent({
async getTextContent({
stream,
task,
resources,
@ -2388,6 +2396,13 @@ class PartialEvaluator {
prevRefs = null,
intersector = null,
}) {
if (stream.isAsync) {
const bytes = await stream.asyncGetBytes();
if (bytes) {
stream = new Stream(bytes, 0, bytes.length, stream.dict);
}
}
const objId = stream.dict?.objId;
const seenRefs = new RefSet(prevRefs);
@ -2523,7 +2538,7 @@ class PartialEvaluator {
const preprocessor = new EvaluatorPreprocessor(stream, xref, stateManager);
let textState;
let textState, currentTextState;
function pushWhitespace({
width = 0,
@ -2785,7 +2800,9 @@ class PartialEvaluator {
// When the total height of the current chunk is negative
// then we're writing from bottom to top.
const textOrientation = Math.sign(textContentItem.height);
const textOrientation = Math.sign(
textContentItem.height || textContentItem.totalHeight
);
if (advanceY < textOrientation * textContentItem.negativeSpaceMax) {
if (
Math.abs(advanceX) >
@ -2849,7 +2866,9 @@ class PartialEvaluator {
// When the total width of the current chunk is negative
// then we're writing from right to left.
const textOrientation = Math.sign(textContentItem.width);
const textOrientation = Math.sign(
textContentItem.width || textContentItem.totalWidth
);
if (advanceX < textOrientation * textContentItem.negativeSpaceMax) {
if (
Math.abs(advanceY) >
@ -2907,6 +2926,15 @@ class PartialEvaluator {
}
function buildTextContentItem({ chars, extraSpacing }) {
if (
currentTextState !== textState &&
(currentTextState.fontName !== textState.fontName ||
currentTextState.fontSize !== textState.fontSize)
) {
flushTextContentItem();
currentTextState = textState.clone();
}
const font = textState.font;
if (!chars) {
// Just move according to the space we have.
@ -3162,8 +3190,8 @@ class PartialEvaluator {
break;
}
const previousState = textState;
textState = stateManager.state;
currentTextState ||= textState.clone();
const fn = operation.fn;
args = operation.args;
@ -3180,7 +3208,6 @@ class PartialEvaluator {
break;
}
flushTextContentItem();
textState.fontName = fontNameArg;
textState.fontSize = fontSizeArg;
next(handleSetFont(fontNameArg, null));
@ -3537,14 +3564,10 @@ class PartialEvaluator {
}
break;
case OPS.restore:
if (
previousState &&
(previousState.font !== textState.font ||
previousState.fontSize !== textState.fontSize ||
previousState.fontName !== textState.fontName)
) {
flushTextContentItem();
}
stateManager.restore();
break;
case OPS.save:
stateManager.save();
break;
} // switch
if (textContent.items.length >= (sink?.desiredSize ?? 1)) {
@ -4564,8 +4587,16 @@ class PartialEvaluator {
if (fontFile) {
if (!(fontFile instanceof BaseStream)) {
throw new FormatError("FontFile should be a stream");
} else if (fontFile.isEmpty) {
throw new FormatError("FontFile is empty");
} else {
if (fontFile.isAsync) {
const bytes = await fontFile.asyncGetBytes();
if (bytes) {
fontFile = new Stream(bytes, 0, bytes.length, fontFile.dict);
}
}
if (fontFile.isEmpty) {
throw new FormatError("FontFile is empty");
}
}
}
} catch (ex) {
@ -5060,7 +5091,7 @@ class TextState {
}
clone() {
const clone = Object.create(this);
const clone = Object.assign(Object.create(this), this);
clone.textMatrix = this.textMatrix.slice();
clone.textLineMatrix = this.textLineMatrix.slice();
clone.fontMatrix = this.fontMatrix.slice();
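
A tiny standalone illustration (not from the patch) of the difference the clone() change relies on: Object.create(this) only links the clone to the original through the prototype chain, so later mutations of the original show through, whereas Object.assign(Object.create(this), this) snapshots the own properties at clone time.

const original = { fontName: "F1", fontSize: 12 };

const linkedClone = Object.create(original);
const copiedClone = Object.assign(Object.create(original), original);

original.fontSize = 24;

console.log(linkedClone.fontSize); // 24 - reads through the prototype chain
console.log(copiedClone.fontSize); // 12 - own property captured when cloning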

View File

@ -13,26 +13,18 @@
* limitations under the License.
*/
import { shadow, stringToPDFString, warn } from "../shared/util.js";
import { stringToPDFString, warn } from "../shared/util.js";
import { BaseStream } from "./base_stream.js";
import { Dict } from "./primitives.js";
function pickPlatformItem(dict) {
if (!(dict instanceof Dict)) {
return null;
}
// Look for the filename in this order:
// UF, F, Unix, Mac, DOS
if (dict.has("UF")) {
return dict.get("UF");
} else if (dict.has("F")) {
return dict.get("F");
} else if (dict.has("Unix")) {
return dict.get("Unix");
} else if (dict.has("Mac")) {
return dict.get("Mac");
} else if (dict.has("DOS")) {
return dict.get("DOS");
if (dict instanceof Dict) {
// Look for the filename in this order: UF, F, Unix, Mac, DOS
for (const key of ["UF", "F", "Unix", "Mac", "DOS"]) {
if (dict.has(key)) {
return dict.get(key);
}
}
}
return null;
}
@ -51,11 +43,10 @@ function stripPath(str) {
class FileSpec {
#contentAvailable = false;
constructor(root, xref, skipContent = false) {
constructor(root, skipContent = false) {
if (!(root instanceof Dict)) {
return;
}
this.xref = xref;
this.root = root;
if (root.has("FS")) {
this.fs = root.get("FS");
@ -73,56 +64,46 @@ class FileSpec {
}
get filename() {
let filename = "";
const item = pickPlatformItem(this.root);
if (item && typeof item === "string") {
filename = stringToPDFString(item, /* keepEscapeSequence = */ true)
// NOTE: The following replacement order is INTENTIONAL, regardless of
// what some static code analysers (e.g. CodeQL) may claim.
return stringToPDFString(item, /* keepEscapeSequence = */ true)
.replaceAll("\\\\", "\\")
.replaceAll("\\/", "/")
.replaceAll("\\", "/");
}
return shadow(this, "filename", filename || "unnamed");
return "";
}
get content() {
if (!this.#contentAvailable) {
return null;
}
this._contentRef ||= pickPlatformItem(this.root?.get("EF"));
const ef = pickPlatformItem(this.root?.get("EF"));
let content = null;
if (this._contentRef) {
const fileObj = this.xref.fetchIfRef(this._contentRef);
if (fileObj instanceof BaseStream) {
content = fileObj.getBytes();
} else {
warn(
"Embedded file specification points to non-existing/invalid content"
);
}
} else {
warn("Embedded file specification does not have any content");
if (ef instanceof BaseStream) {
return ef.getBytes();
}
return content;
warn("Embedded file specification points to non-existing/invalid content");
return null;
}
get description() {
let description = "";
const desc = this.root?.get("Desc");
if (desc && typeof desc === "string") {
description = stringToPDFString(desc);
return stringToPDFString(desc);
}
return shadow(this, "description", description);
return "";
}
get serializable() {
const { filename, content, description } = this;
return {
rawFilename: this.filename,
filename: stripPath(this.filename),
content: this.content,
description: this.description,
rawFilename: filename,
filename: stripPath(filename) || "unnamed",
content,
description,
};
}
}
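
A small standalone check (not from the patch) of the escape handling in the filename getter above, showing why the replacement order matters:

// The characters below are C:\\folder\/doc.pdf, i.e. an escaped backslash
// followed later by an escaped slash, as they may appear in a PDF string.
const raw = "C:\\\\folder\\/doc.pdf";
const name = raw
  .replaceAll("\\\\", "\\")
  .replaceAll("\\/", "/")
  .replaceAll("\\", "/");
console.log(name); // "C:/folder/doc.pdf"
// Collapsing "\\" first prevents the later "\" -> "/" pass from turning the
// escaped backslash into two forward slashes.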

View File

@ -122,6 +122,8 @@ const fixedDistCodeTab = [
];
class FlateStream extends DecodeStream {
#isAsync = true;
constructor(str, maybeLength) {
super(maybeLength);
@ -161,58 +163,30 @@ class FlateStream extends DecodeStream {
}
async asyncGetBytes() {
this.stream.reset();
const bytes = this.stream.getBytes();
try {
const { readable, writable } = new DecompressionStream("deflate");
const writer = writable.getWriter();
await writer.ready;
// We can't await writer.write() because it'll block until the reader
// starts, which happens a few lines below.
writer
.write(bytes)
.then(async () => {
await writer.ready;
await writer.close();
})
.catch(() => {});
const chunks = [];
let totalLength = 0;
for await (const chunk of readable) {
chunks.push(chunk);
totalLength += chunk.byteLength;
}
const data = new Uint8Array(totalLength);
let offset = 0;
for (const chunk of chunks) {
data.set(chunk, offset);
offset += chunk.byteLength;
}
return data;
} catch {
// DecompressionStream failed (for example because there are some extra
// bytes after the end of the compressed data), so we fall back to our
// decoder.
// We already got the bytes from the underlying stream, so we just reuse
// them to avoid fetching them again.
this.stream = new Stream(
bytes,
2 /* = header size (see ctor) */,
bytes.length,
this.stream.dict
);
this.reset();
return null;
const { decompressed, compressed } =
await this.asyncGetBytesFromDecompressionStream("deflate");
if (decompressed) {
return decompressed;
}
// DecompressionStream failed (for example because there are some extra
// bytes after the end of the compressed data), so we fall back to our
// decoder.
// We already got the bytes from the underlying stream, so we just reuse
// them to avoid fetching them again.
this.#isAsync = false;
this.stream = new Stream(
compressed,
2 /* = header size (see ctor) */,
compressed.length,
this.stream.dict
);
this.reset();
return null;
}
get isAsync() {
return true;
return this.#isAsync;
}
getBits(bits) {

View File

@ -369,7 +369,7 @@ class PDFImage {
const inverseDecode = decode?.[0] > 0;
const computedLength = ((width + 7) >> 3) * height;
const imgArray = image.getBytes(computedLength);
const imgArray = await image.getImageData(computedLength);
const isSingleOpaquePixel =
width === 1 &&

View File

@ -22,7 +22,7 @@ const MIN_IMAGE_DIM = 2048;
// In Chrome, there are no max dimensions but only a max area. So an image with
// very large dimensions is acceptable, but it probably doesn't hurt to reduce
// it, considering that it will finally be rendered on a small canvas.
const MAX_IMAGE_DIM = 65537;
const MAX_IMAGE_DIM = 32768;
const MAX_ERROR = 128;
// Large images are encoded using the BMP format (it's way faster than

View File

@ -0,0 +1,142 @@
/* Copyright 2026 Mozilla Foundation
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import { BaseException, warn } from "../shared/util.js";
import { fetchBinaryData } from "./core_utils.js";
import JBig2 from "../../external/jbig2/jbig2.js";
class JBig2Error extends BaseException {
constructor(msg) {
super(msg, "Jbig2Error");
}
}
class JBig2CCITTFaxWasmImage {
static #buffer = null;
static #handler = null;
static #modulePromise = null;
static #useWasm = true;
static #useWorkerFetch = true;
static #wasmUrl = null;
static setOptions({ handler, useWasm, useWorkerFetch, wasmUrl }) {
this.#useWasm = useWasm;
this.#useWorkerFetch = useWorkerFetch;
this.#wasmUrl = wasmUrl;
if (!useWorkerFetch) {
this.#handler = handler;
}
}
static async #instantiateWasm(fallbackCallback, imports, successCallback) {
const filename = "jbig2.wasm";
try {
if (!this.#buffer) {
if (this.#useWorkerFetch) {
this.#buffer = await fetchBinaryData(`${this.#wasmUrl}${filename}`);
} else {
this.#buffer = await this.#handler.sendWithPromise(
"FetchBinaryData",
{ type: "wasmFactory", filename }
);
}
}
const results = await WebAssembly.instantiate(this.#buffer, imports);
return successCallback(results.instance);
} catch (reason) {
warn(`JBig2Image#instantiateWasm: ${reason}`);
return fallbackCallback(null);
} finally {
this.#handler = null;
}
}
static async decode(bytes, width, height, globals, CCITTOptions) {
if (!this.#modulePromise) {
const { promise, resolve } = Promise.withResolvers();
const promises = [promise];
if (this.#useWasm) {
promises.push(
JBig2({
warn,
instantiateWasm: this.#instantiateWasm.bind(this, resolve),
})
);
} else {
resolve(null);
}
this.#modulePromise = Promise.race(promises);
}
const module = await this.#modulePromise;
if (!module) {
throw new JBig2Error("JBig2 failed to initialize");
}
let ptr, globalsPtr;
try {
const size = bytes.length;
ptr = module._malloc(size);
module.writeArrayToMemory(bytes, ptr);
if (CCITTOptions) {
module._ccitt_decode(
ptr,
size,
width,
height,
CCITTOptions.K,
CCITTOptions.EndOfLine ? 1 : 0,
CCITTOptions.EncodedByteAlign ? 1 : 0,
CCITTOptions.BlackIs1 ? 1 : 0,
CCITTOptions.Columns,
CCITTOptions.Rows
);
} else {
const globalsSize = globals ? globals.length : 0;
if (globalsSize > 0) {
globalsPtr = module._malloc(globalsSize);
module.writeArrayToMemory(globals, globalsPtr);
}
module._jbig2_decode(ptr, size, width, height, globalsPtr, globalsSize);
}
if (!module.imageData) {
throw new JBig2Error("Unknown error");
}
const { imageData } = module;
module.imageData = null;
return imageData;
} finally {
if (ptr) {
module._free(ptr);
}
if (globalsPtr) {
module._free(globalsPtr);
}
}
}
static cleanup() {
this.#modulePromise = null;
}
}
export { JBig2CCITTFaxWasmImage, JBig2Error };
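
A minimal, self-contained sketch (not the actual module) of the fallback pattern used by decode() above: the wasm initialization races against a resolver that the failure path triggers, so a broken instantiation yields null rather than a hanging or rejected module promise.

function loadModule(initialize) {
  const { promise, resolve } = Promise.withResolvers();
  // The initializer receives a callback it can invoke when instantiation fails.
  return Promise.race([promise, initialize(() => resolve(null))]);
}

// Hypothetical initializer whose wasm instantiation fails.
const module = await loadModule(async fallback => {
  try {
    throw new Error("wasm instantiation failed");
  } catch {
    fallback();
    // Keep this branch pending so the race settles on the fallback value.
    return new Promise(() => {});
  }
});
console.log(module); // null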

View File

@ -13,11 +13,12 @@
* limitations under the License.
*/
import { shadow, warn } from "../shared/util.js";
import { BaseStream } from "./base_stream.js";
import { DecodeStream } from "./decode_stream.js";
import { Dict } from "./primitives.js";
import { JBig2CCITTFaxWasmImage } from "./jbig2_ccittFax_wasm.js";
import { Jbig2Image } from "./jbig2.js";
import { shadow } from "../shared/util.js";
/**
* For JBIG2's we use a library to decode these images and
@ -44,10 +45,47 @@ class Jbig2Stream extends DecodeStream {
}
readBlock() {
this.decodeImage();
this.decodeImageFallback();
}
decodeImage(bytes) {
get isAsyncDecoder() {
return true;
}
get isImageStream() {
return true;
}
async decodeImage(bytes, length, _decoderOptions) {
if (this.eof) {
return this.buffer;
}
bytes ||= this.bytes;
try {
let globals = null;
if (this.params instanceof Dict) {
const globalsStream = this.params.get("JBIG2Globals");
if (globalsStream instanceof BaseStream) {
globals = globalsStream.getBytes();
}
}
this.buffer = await JBig2CCITTFaxWasmImage.decode(
bytes,
this.dict.get("Width"),
this.dict.get("Height"),
globals
);
} catch {
warn("Jbig2Stream: Falling back to JS JBIG2 decoder.");
return this.decodeImageFallback(bytes, length);
}
this.bufferLength = this.buffer.length;
this.eof = true;
return this.buffer;
}
decodeImageFallback(bytes, _length) {
if (this.eof) {
return this.buffer;
}

View File

@ -194,6 +194,10 @@ class JpegStream extends DecodeStream {
decoder?.close();
}
}
get isImageStream() {
return true;
}
}
export { JpegStream };

View File

@ -49,7 +49,7 @@ class JpxStream extends DecodeStream {
return true;
}
async decodeImage(bytes, decoderOptions) {
async decodeImage(bytes, _length, decoderOptions) {
if (this.eof) {
return this.buffer;
}
@ -64,6 +64,10 @@ class JpxStream extends DecodeStream {
get canAsyncDecodeImageFromBuffer() {
return this.stream.isAsync;
}
get isImageStream() {
return true;
}
}
export { JpxStream };
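The new isImageStream / isAsyncDecoder getters hint at how callers can distinguish image streams that must be decoded asynchronously; the snippet below only sketches that pattern, and the surrounding variables (stream, decoderOptions) are assumptions rather than code from this diff:

// Assumed caller shape, inside some async image-handling code:
let data;
if (stream.isImageStream && stream.isAsyncDecoder) {
  // E.g. Jbig2Stream above: decoding happens off the synchronous parsing path.
  data = await stream.decodeImage(/* bytes = */ null, /* length = */ null, decoderOptions);
} else {
  data = stream.getBytes();
}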

View File

@ -29,6 +29,7 @@ import {
import { NullStream, Stream } from "./stream.js";
import { Ascii85Stream } from "./ascii_85_stream.js";
import { AsciiHexStream } from "./ascii_hex_stream.js";
import { BrotliStream } from "./brotli_stream.js";
import { CCITTFaxStream } from "./ccitt_stream.js";
import { FlateStream } from "./flate_stream.js";
import { Jbig2Stream } from "./jbig2_stream.js";
@ -822,6 +823,8 @@ class Parser {
return new RunLengthStream(stream, maybeLength);
case "JBIG2Decode":
return new Jbig2Stream(stream, maybeLength, params);
case "BrotliDecode":
return new BrotliStream(stream, maybeLength);
}
warn(`Filter "${name}" is not supported.`);
return stream;

View File

@ -22,6 +22,7 @@ import {
} from "../shared/util.js";
import { ChunkedStreamManager } from "./chunked_stream.js";
import { ImageResizer } from "./image_resizer.js";
import { JBig2CCITTFaxWasmImage } from "./jbig2_ccittFax_wasm.js";
import { JpegStream } from "./jpeg_stream.js";
import { JpxImage } from "./jpx.js";
import { MissingDataException } from "./core_utils.js";
@ -81,6 +82,7 @@ class BasePdfManager {
JpxImage.setOptions(options);
IccColorSpace.setOptions(options);
CmykICCBasedCS.setOptions(options);
JBig2CCITTFaxWasmImage.setOptions(options);
}
get docId() {

View File

@ -51,6 +51,7 @@ class Stream extends BaseStream {
const strEnd = this.end;
if (!length) {
this.pos = strEnd;
return bytes.subarray(pos, strEnd);
}
let end = pos + length;

View File

@ -219,23 +219,23 @@ class WorkerMessageHandler {
return new LocalPdfManager(pdfManagerArgs);
}
const pdfStream = new PDFWorkerStream(handler),
fullRequest = pdfStream.getFullReader();
const pdfStream = new PDFWorkerStream({ msgHandler: handler }),
fullReader = pdfStream.getFullReader();
const pdfManagerCapability = Promise.withResolvers();
let newPdfManager,
cachedChunks = [],
loaded = 0;
fullRequest.headersReady
fullReader.headersReady
.then(function () {
if (!fullRequest.isRangeSupported) {
if (!fullReader.isRangeSupported) {
return;
}
pdfManagerArgs.source = pdfStream;
pdfManagerArgs.length = fullRequest.contentLength;
pdfManagerArgs.length = fullReader.contentLength;
// We don't need auto-fetch when streaming is enabled.
pdfManagerArgs.disableAutoFetch ||= fullRequest.isStreamingSupported;
pdfManagerArgs.disableAutoFetch ||= fullReader.isStreamingSupported;
newPdfManager = new NetworkPdfManager(pdfManagerArgs);
// There may be a chance that `newPdfManager` is not initialized for
@ -282,10 +282,10 @@ class WorkerMessageHandler {
}
loaded += value.byteLength;
if (!fullRequest.isStreamingSupported) {
if (!fullReader.isStreamingSupported) {
handler.send("DocProgress", {
loaded,
total: Math.max(loaded, fullRequest.contentLength || 0),
total: Math.max(loaded, fullReader.contentLength || 0),
});
}
@ -294,12 +294,12 @@ class WorkerMessageHandler {
} else {
cachedChunks.push(value);
}
fullRequest.read().then(readChunk, reject);
fullReader.read().then(readChunk, reject);
} catch (e) {
reject(e);
}
};
fullRequest.read().then(readChunk, reject);
fullReader.read().then(readChunk, reject);
}).catch(function (e) {
pdfManagerCapability.reject(e);
cancelXHRs = null;
@ -319,7 +319,9 @@ class WorkerMessageHandler {
}
function onFailure(ex) {
ensureNotTerminated();
if (terminated) {
return;
}
if (ex instanceof PasswordException) {
const task = new WorkerTask(`PasswordException: response ${ex.code}`);
@ -853,8 +855,8 @@ class WorkerMessageHandler {
);
handler.on("GetOperatorList", function (data, sink) {
const pageIndex = data.pageIndex;
pdfManager.getPage(pageIndex).then(function (page) {
const { pageId, pageIndex } = data;
pdfManager.getPage(pageId).then(function (page) {
const task = new WorkerTask(`GetOperatorList: page ${pageIndex}`);
startWorkerTask(task);
@ -871,6 +873,7 @@ class WorkerMessageHandler {
cacheKey: data.cacheKey,
annotationStorage: data.annotationStorage,
modifiedIds: data.modifiedIds,
pageIndex,
})
.then(
function (operatorListInfo) {
@ -899,9 +902,10 @@ class WorkerMessageHandler {
});
handler.on("GetTextContent", function (data, sink) {
const { pageIndex, includeMarkedContent, disableNormalization } = data;
const { pageId, pageIndex, includeMarkedContent, disableNormalization } =
data;
pdfManager.getPage(pageIndex).then(function (page) {
pdfManager.getPage(pageId).then(function (page) {
const task = new WorkerTask("GetTextContent: page " + pageIndex);
startWorkerTask(task);

View File

@ -13,77 +13,35 @@
* limitations under the License.
*/
import { assert } from "../shared/util.js";
import {
BasePDFStream,
BasePDFStreamRangeReader,
BasePDFStreamReader,
} from "../shared/base_pdf_stream.js";
/** @implements {IPDFStream} */
class PDFWorkerStream {
constructor(msgHandler) {
this._msgHandler = msgHandler;
this._contentLength = null;
this._fullRequestReader = null;
this._rangeRequestReaders = [];
}
getFullReader() {
assert(
!this._fullRequestReader,
"PDFWorkerStream.getFullReader can only be called once."
);
this._fullRequestReader = new PDFWorkerStreamReader(this._msgHandler);
return this._fullRequestReader;
}
getRangeReader(begin, end) {
const reader = new PDFWorkerStreamRangeReader(begin, end, this._msgHandler);
this._rangeRequestReaders.push(reader);
return reader;
}
cancelAllRequests(reason) {
this._fullRequestReader?.cancel(reason);
for (const reader of this._rangeRequestReaders.slice(0)) {
reader.cancel(reason);
}
class PDFWorkerStream extends BasePDFStream {
constructor(source) {
super(source, PDFWorkerStreamReader, PDFWorkerStreamRangeReader);
}
}
/** @implements {IPDFStreamReader} */
class PDFWorkerStreamReader {
constructor(msgHandler) {
this._msgHandler = msgHandler;
this.onProgress = null;
class PDFWorkerStreamReader extends BasePDFStreamReader {
_reader = null;
this._contentLength = null;
this._isRangeSupported = false;
this._isStreamingSupported = false;
constructor(stream) {
super(stream);
const { msgHandler } = stream._source;
const readableStream = this._msgHandler.sendWithStream("GetReader");
const readableStream = msgHandler.sendWithStream("GetReader");
this._reader = readableStream.getReader();
this._headersReady = this._msgHandler
.sendWithPromise("ReaderHeadersReady")
.then(data => {
this._isStreamingSupported = data.isStreamingSupported;
this._isRangeSupported = data.isRangeSupported;
this._contentLength = data.contentLength;
});
}
msgHandler.sendWithPromise("ReaderHeadersReady").then(data => {
this._contentLength = data.contentLength;
this._isStreamingSupported = data.isStreamingSupported;
this._isRangeSupported = data.isRangeSupported;
get headersReady() {
return this._headersReady;
}
get contentLength() {
return this._contentLength;
}
get isStreamingSupported() {
return this._isStreamingSupported;
}
get isRangeSupported() {
return this._isRangeSupported;
this._headersCapability.resolve();
}, this._headersCapability.reject);
}
async read() {
@ -101,23 +59,20 @@ class PDFWorkerStreamReader {
}
}
/** @implements {IPDFStreamRangeReader} */
class PDFWorkerStreamRangeReader {
constructor(begin, end, msgHandler) {
this._msgHandler = msgHandler;
this.onProgress = null;
class PDFWorkerStreamRangeReader extends BasePDFStreamRangeReader {
_reader = null;
const readableStream = this._msgHandler.sendWithStream("GetRangeReader", {
constructor(stream, begin, end) {
super(stream, begin, end);
const { msgHandler } = stream._source;
const readableStream = msgHandler.sendWithStream("GetRangeReader", {
begin,
end,
});
this._reader = readableStream.getReader();
}
get isStreamingSupported() {
return false;
}
async read() {
const { value, done } = await this._reader.read();
if (done) {

View File

@ -90,7 +90,6 @@ import {
XFAObject,
XFAObjectArray,
} from "./xfa_object.js";
import { fromBase64Util, Util, warn } from "../../shared/util.js";
import {
getBBox,
getColor,
@ -103,6 +102,7 @@ import {
getStringOption,
HTMLResult,
} from "./utils.js";
import { Util, warn } from "../../shared/util.js";
import { getMetrics } from "./fonts.js";
import { recoverJsURL } from "../core_utils.js";
import { searchNode } from "./som.js";
@ -3420,7 +3420,7 @@ class Image extends StringObject {
}
if (!buffer && this.transferEncoding === "base64") {
buffer = fromBase64Util(this[$content]);
buffer = Uint8Array.fromBase64(this[$content]);
}
if (!buffer) {

View File

@ -18,9 +18,6 @@
// eslint-disable-next-line max-len
/** @typedef {import("../../web/text_accessibility.js").TextAccessibilityManager} TextAccessibilityManager */
// eslint-disable-next-line max-len
/** @typedef {import("../../web/interfaces").IDownloadManager} IDownloadManager */
/** @typedef {import("../../web/interfaces").IPDFLinkService} IPDFLinkService */
// eslint-disable-next-line max-len
/** @typedef {import("../src/display/editor/tools.js").AnnotationEditorUIManager} AnnotationEditorUIManager */
// eslint-disable-next-line max-len
/** @typedef {import("../../web/struct_tree_layer_builder.js").StructTreeLayerBuilder} StructTreeLayerBuilder */
@ -57,8 +54,8 @@ const TIMEZONE_OFFSET = new Date().getTimezoneOffset() * 60 * 1000;
* @typedef {Object} AnnotationElementParameters
* @property {Object} data
* @property {HTMLDivElement} layer
* @property {IPDFLinkService} linkService
* @property {IDownloadManager} [downloadManager]
* @property {PDFLinkService} linkService
* @property {BaseDownloadManager} [downloadManager]
* @property {AnnotationStorage} [annotationStorage]
* @property {string} [imageResourcesPath] - Path for image resources, mainly
* for annotation icons. Include trailing slash.
@ -293,7 +290,7 @@ class AnnotationElement {
this.annotationStorage.setValue(`${AnnotationEditorPrefix}${data.id}`, {
id: data.id,
annotationType: data.annotationType,
pageIndex: this.parent.page._pageIndex,
page: this.parent.page,
popup,
popupRef: data.popupRef,
modificationDate: new Date(),
@ -3736,8 +3733,8 @@ class FileAttachmentAnnotationElement extends AnnotationElement {
* @property {HTMLDivElement} div
* @property {Array} annotations
* @property {PDFPageProxy} page
* @property {IPDFLinkService} linkService
* @property {IDownloadManager} [downloadManager]
* @property {PDFLinkService} linkService
* @property {BaseDownloadManager} [downloadManager]
* @property {AnnotationStorage} [annotationStorage]
* @property {string} [imageResourcesPath] - Path for image resources, mainly
* for annotation icons. Include trailing slash.
@ -4018,8 +4015,6 @@ class AnnotationLayer {
* Add link annotations to the annotation layer.
*
* @param {Array<Object>} annotations
* @param {IPDFLinkService} linkService
* @memberof AnnotationLayer
*/
async addLinkAnnotations(annotations) {
const elementParams = {

View File

@ -196,6 +196,10 @@ class AnnotationStorage {
val instanceof AnnotationEditor
? val.serialize(/* isForCopying = */ false, context)
: val;
if (val.page) {
val.pageIndex = val.page._pageIndex;
delete val.page;
}
if (serialized) {
map.set(key, serialized);

View File

@ -25,6 +25,7 @@ import {
getVerbosityLevel,
info,
isNodeJS,
MathClamp,
RenderingIntentFlag,
setVerbosityLevel,
shadow,
@ -40,6 +41,7 @@ import {
deprecated,
isDataScheme,
isValidFetchUrl,
PagesMapper,
PageViewport,
RenderingCancelledException,
StatTimer,
@ -439,6 +441,7 @@ function getDocument(src = {}) {
ownerDocument,
pdfBug,
styleElement,
enableHWA,
loadingParams: {
disableAutoFetch,
enableXfa,
@ -462,7 +465,8 @@ function getDocument(src = {}) {
let networkStream;
if (rangeTransport) {
networkStream = new PDFDataTransportStream(rangeTransport, {
networkStream = new PDFDataTransportStream({
pdfDataRangeTransport: rangeTransport,
disableRange,
disableStream,
});
@ -507,8 +511,7 @@ function getDocument(src = {}) {
task,
networkStream,
transportParams,
transportFactory,
enableHWA
transportFactory
);
task._transport = transport;
messageHandler.send("Ready", null);
@ -523,6 +526,8 @@ function getDocument(src = {}) {
* @typedef {Object} OnProgressParameters
* @property {number} loaded - Currently loaded number of bytes.
* @property {number} total - Total number of bytes in the PDF file.
* @property {number} percent - Currently loaded percentage, as an integer value
* in the [0, 100] range. If `total` is undefined, the percentage is `NaN`.
*/
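As an illustration of the new `percent` field, a consumer of the existing onProgress callback could use it roughly as follows; `progressBar` and the URL are placeholders, not part of this diff:

const loadingTask = pdfjsLib.getDocument({ url: "document.pdf" });
loadingTask.onProgress = ({ loaded, total, percent }) => {
  // `percent` is clamped to [0, 100]; it is NaN when `total` is unknown.
  progressBar.value = Number.isNaN(percent) ? 0 : percent;
};
await loadingTask.promise;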
/**
@ -634,8 +639,6 @@ class PDFDataRangeTransport {
#progressiveReadListeners = [];
#progressListeners = [];
#rangeListeners = [];
/**
@ -654,6 +657,18 @@ class PDFDataRangeTransport {
this.initialData = initialData;
this.progressiveDone = progressiveDone;
this.contentDispositionFilename = contentDispositionFilename;
if (typeof PDFJSDev === "undefined" || PDFJSDev.test("GENERIC")) {
Object.defineProperty(this, "onDataProgress", {
value: () => {
deprecated(
"`PDFDataRangeTransport.prototype.onDataProgress` - method was " +
"removed, since loading progress is now reported automatically " +
"through the `PDFDataTransportStream` class (and related code)."
);
},
});
}
}
/**
@ -663,13 +678,6 @@ class PDFDataRangeTransport {
this.#rangeListeners.push(listener);
}
/**
* @param {function} listener
*/
addProgressListener(listener) {
this.#progressListeners.push(listener);
}
/**
* @param {function} listener
*/
@ -694,18 +702,6 @@ class PDFDataRangeTransport {
}
}
/**
* @param {number} loaded
* @param {number|undefined} total
*/
onDataProgress(loaded, total) {
this.#capability.promise.then(() => {
for (const listener of this.#progressListeners) {
listener(loaded, total);
}
});
}
/**
* @param {Uint8Array|null} chunk
*/
@ -1328,6 +1324,8 @@ class PDFDocumentProxy {
class PDFPageProxy {
#pendingCleanup = false;
#pagesMapper = PagesMapper.instance;
constructor(pageIndex, pageInfo, transport, pdfBug = false) {
this._pageIndex = pageIndex;
this._pageInfo = pageInfo;
@ -1350,6 +1348,13 @@ class PDFPageProxy {
return this._pageIndex + 1;
}
/**
* @param {number} value - The page number to set. First page is 1.
*/
set pageNumber(value) {
this._pageIndex = value - 1;
}
/**
* @type {number} The number of degrees the page is rotated clockwise.
*/
@ -1699,6 +1704,7 @@ class PDFPageProxy {
return this._transport.messageHandler.sendWithStream(
"GetTextContent",
{
pageId: this.#pagesMapper.getPageId(this._pageIndex + 1) - 1,
pageIndex: this._pageIndex,
includeMarkedContent: includeMarkedContent === true,
disableNormalization: disableNormalization === true,
@ -1884,6 +1890,7 @@ class PDFPageProxy {
const readableStream = this._transport.messageHandler.sendWithStream(
"GetOperatorList",
{
pageId: this.#pagesMapper.getPageId(this._pageIndex + 1) - 1,
pageIndex: this._pageIndex,
intent: renderingIntent,
cacheKey,
@ -2379,8 +2386,14 @@ class PDFWorker {
* @ignore
*/
class WorkerTransport {
downloadInfoCapability = Promise.withResolvers();
#fullReader = null;
#methodPromises = new Map();
#networkStream = null;
#pageCache = new Map();
#pagePromises = new Map();
@ -2389,21 +2402,19 @@ class WorkerTransport {
#passwordCapability = null;
constructor(
messageHandler,
loadingTask,
networkStream,
params,
factory,
enableHWA
) {
#pagesMapper = PagesMapper.instance;
constructor(messageHandler, loadingTask, networkStream, params, factory) {
this.messageHandler = messageHandler;
this.loadingTask = loadingTask;
this.#networkStream = networkStream;
this.commonObjs = new PDFObjects();
this.fontLoader = new FontLoader({
ownerDocument: params.ownerDocument,
styleElement: params.styleElement,
});
this.enableHWA = params.enableHWA;
this.loadingParams = params.loadingParams;
this._params = params;
@ -2416,14 +2427,10 @@ class WorkerTransport {
this.destroyed = false;
this.destroyCapability = null;
this._networkStream = networkStream;
this._fullReader = null;
this._lastProgress = null;
this.downloadInfoCapability = Promise.withResolvers();
this.enableHWA = enableHWA;
this.setupMessageHandler();
this.#pagesMapper.addListener(this.#updateCaches.bind(this));
if (typeof PDFJSDev === "undefined" || PDFJSDev.test("TESTING")) {
// For testing purposes.
Object.defineProperty(this, "getNetworkStreamName", {
@ -2448,6 +2455,24 @@ class WorkerTransport {
}
}
#updateCaches() {
const newPageCache = new Map();
const newPromiseCache = new Map();
for (let i = 0, ii = this.#pagesMapper.pagesNumber; i < ii; i++) {
const prevPageIndex = this.#pagesMapper.getPrevPageNumber(i + 1) - 1;
const page = this.#pageCache.get(prevPageIndex);
if (page) {
newPageCache.set(i, page);
}
const promise = this.#pagePromises.get(prevPageIndex);
if (promise) {
newPromiseCache.set(i, promise);
}
}
this.#pageCache = newPageCache;
this.#pagePromises = newPromiseCache;
}
#cacheSimpleMethod(name, data = null) {
const cachedPromise = this.#methodPromises.get(name);
if (cachedPromise) {
@ -2459,6 +2484,14 @@ class WorkerTransport {
return promise;
}
#onProgress({ loaded, total }) {
this.loadingTask.onProgress?.({
loaded,
total,
percent: MathClamp(Math.round((loaded / total) * 100), 0, 100),
});
}
get annotationStorage() {
return shadow(this, "annotationStorage", new AnnotationStorage());
}
@ -2570,7 +2603,7 @@ class WorkerTransport {
this.filterFactory.destroy();
TextLayer.cleanup();
this._networkStream?.cancelAllRequests(
this.#networkStream?.cancelAllRequests(
new AbortException("Worker was terminated.")
);
@ -2587,18 +2620,16 @@ class WorkerTransport {
messageHandler.on("GetReader", (data, sink) => {
assert(
this._networkStream,
"GetReader - no `IPDFStream` instance available."
this.#networkStream,
"GetReader - no `BasePDFStream` instance available."
);
this._fullReader = this._networkStream.getFullReader();
this._fullReader.onProgress = evt => {
this._lastProgress = {
loaded: evt.loaded,
total: evt.total,
};
};
this.#fullReader = this.#networkStream.getFullReader();
// If stream or range turn out to be disabled, once `headersReady` is
// resolved, this is our only way to report loading progress.
this.#fullReader.onProgress = evt => this.#onProgress(evt);
sink.onPull = () => {
this._fullReader
this.#fullReader
.read()
.then(function ({ value, done }) {
if (done) {
@ -2619,7 +2650,7 @@ class WorkerTransport {
};
sink.onCancel = reason => {
this._fullReader.cancel(reason);
this.#fullReader.cancel(reason);
sink.ready.catch(readyReason => {
if (this.destroyed) {
@ -2631,40 +2662,29 @@ class WorkerTransport {
});
messageHandler.on("ReaderHeadersReady", async data => {
await this._fullReader.headersReady;
await this.#fullReader.headersReady;
const { isStreamingSupported, isRangeSupported, contentLength } =
this._fullReader;
this.#fullReader;
// If stream or range are disabled, it's our only way to report
// loading progress.
if (!isStreamingSupported || !isRangeSupported) {
if (this._lastProgress) {
loadingTask.onProgress?.(this._lastProgress);
}
this._fullReader.onProgress = evt => {
loadingTask.onProgress?.({
loaded: evt.loaded,
total: evt.total,
});
};
if (isStreamingSupported && isRangeSupported) {
this.#fullReader.onProgress = null; // See comment in "GetReader" above.
}
return { isStreamingSupported, isRangeSupported, contentLength };
});
messageHandler.on("GetRangeReader", (data, sink) => {
assert(
this._networkStream,
"GetRangeReader - no `IPDFStream` instance available."
this.#networkStream,
"GetRangeReader - no `BasePDFStream` instance available."
);
const rangeReader = this._networkStream.getRangeReader(
const rangeReader = this.#networkStream.getRangeReader(
data.begin,
data.end
);
// When streaming is enabled, it's possible that the data requested here
// has already been fetched via the `_fullRequestReader` implementation.
// has already been fetched via the `#fullReader` implementation.
// However, given that the PDF data is loaded asynchronously on the
// main-thread and then sent via `postMessage` to the worker-thread,
// it may not have been available during parsing (hence the attempt to
@ -2672,7 +2692,7 @@ class WorkerTransport {
//
// To avoid wasting time and resources here, we'll thus *not* dispatch
// range requests if the data was already loaded but has not been sent to
// the worker-thread yet (which will happen via the `_fullRequestReader`).
// the worker-thread yet (which will happen via the `#fullReader`).
if (!rangeReader) {
sink.close();
return;
@ -2710,6 +2730,7 @@ class WorkerTransport {
});
messageHandler.on("GetDoc", ({ pdfInfo }) => {
this.#pagesMapper.pagesNumber = pdfInfo.numPages;
this._numPages = pdfInfo.numPages;
this._htmlForXfa = pdfInfo.htmlForXfa;
delete pdfInfo.htmlForXfa;
@ -2745,10 +2766,7 @@ class WorkerTransport {
messageHandler.on("DataLoaded", data => {
// For consistency: Ensure that progress is always reported when the
// entire PDF file has been loaded, regardless of how it was fetched.
loadingTask.onProgress?.({
loaded: data.length,
total: data.length,
});
this.#onProgress({ loaded: data.length, total: data.length });
this.downloadInfoCapability.resolve(data);
});
@ -2871,10 +2889,7 @@ class WorkerTransport {
if (this.destroyed) {
return; // Ignore any pending requests if the worker was terminated.
}
loadingTask.onProgress?.({
loaded: data.loaded,
total: data.total,
});
this.#onProgress(data);
});
messageHandler.on("FetchBinaryData", async data => {
@ -2915,7 +2930,7 @@ class WorkerTransport {
isPureXfa: !!this._htmlForXfa,
numPages: this._numPages,
annotationStorage: map,
filename: this._fullReader?.filename ?? null,
filename: this.#fullReader?.filename ?? null,
},
transfer
)
@ -2932,26 +2947,27 @@ class WorkerTransport {
if (
!Number.isInteger(pageNumber) ||
pageNumber <= 0 ||
pageNumber > this._numPages
pageNumber > this.#pagesMapper.pagesNumber
) {
return Promise.reject(new Error("Invalid page request."));
}
const pageIndex = pageNumber - 1;
const newPageIndex = this.#pagesMapper.getPageId(pageNumber) - 1;
const pageIndex = pageNumber - 1,
cachedPromise = this.#pagePromises.get(pageIndex);
const cachedPromise = this.#pagePromises.get(pageIndex);
if (cachedPromise) {
return cachedPromise;
}
const promise = this.messageHandler
.sendWithPromise("GetPage", {
pageIndex,
pageIndex: newPageIndex,
})
.then(pageInfo => {
if (this.destroyed) {
throw new Error("Transport destroyed");
}
if (pageInfo.refStr) {
this.#pageRefCache.set(pageInfo.refStr, pageNumber);
this.#pageRefCache.set(pageInfo.refStr, newPageIndex);
}
const page = new PDFPageProxy(
@ -2967,19 +2983,20 @@ class WorkerTransport {
return promise;
}
getPageIndex(ref) {
async getPageIndex(ref) {
if (!isRefProxy(ref)) {
return Promise.reject(new Error("Invalid pageIndex request."));
throw new Error("Invalid pageIndex request.");
}
return this.messageHandler.sendWithPromise("GetPageIndex", {
const index = await this.messageHandler.sendWithPromise("GetPageIndex", {
num: ref.num,
gen: ref.gen,
});
return this.#pagesMapper.getPageNumber(index + 1) - 1;
}
getAnnotations(pageIndex, intent) {
return this.messageHandler.sendWithPromise("GetAnnotations", {
pageIndex,
pageIndex: this.#pagesMapper.getPageId(pageIndex + 1) - 1,
intent,
});
}
@ -3046,13 +3063,13 @@ class WorkerTransport {
getPageJSActions(pageIndex) {
return this.messageHandler.sendWithPromise("GetPageJSActions", {
pageIndex,
pageIndex: this.#pagesMapper.getPageId(pageIndex + 1) - 1,
});
}
getStructTree(pageIndex) {
return this.messageHandler.sendWithPromise("GetStructTree", {
pageIndex,
pageIndex: this.#pagesMapper.getPageId(pageIndex + 1) - 1,
});
}
@ -3081,8 +3098,8 @@ class WorkerTransport {
.then(results => ({
info: results[0],
metadata: results[1] ? new Metadata(results[1]) : null,
contentDispositionFilename: this._fullReader?.filename ?? null,
contentLength: this._fullReader?.contentLength ?? null,
contentDispositionFilename: this.#fullReader?.filename ?? null,
contentLength: this.#fullReader?.contentLength ?? null,
hasStructTree: results[2],
}));
this.#methodPromises.set(name, promise);
@ -3122,7 +3139,10 @@ class WorkerTransport {
return null;
}
const refStr = ref.gen === 0 ? `${ref.num}R` : `${ref.num}R${ref.gen}`;
return this.#pageRefCache.get(refStr) ?? null;
const pageIndex = this.#pageRefCache.get(refStr);
return pageIndex >= 0
? this.#pagesMapper.getPageNumber(pageIndex + 1)
: null;
}
}
@ -3130,7 +3150,7 @@ class WorkerTransport {
* Allows controlling of the rendering tasks.
*/
class RenderTask {
#internalRenderTask = null;
_internalRenderTask = null;
/**
* Callback for incremental rendering -- a function that will be called
@ -3151,12 +3171,12 @@ class RenderTask {
onError = null;
constructor(internalRenderTask) {
this.#internalRenderTask = internalRenderTask;
this._internalRenderTask = internalRenderTask;
if (typeof PDFJSDev === "undefined" || PDFJSDev.test("TESTING")) {
// For testing purposes.
Object.defineProperty(this, "getOperatorList", {
value: () => this.#internalRenderTask.operatorList,
value: () => this._internalRenderTask.operatorList,
});
}
}
@ -3166,7 +3186,7 @@ class RenderTask {
* @type {Promise<void>}
*/
get promise() {
return this.#internalRenderTask.capability.promise;
return this._internalRenderTask.capability.promise;
}
/**
@ -3177,7 +3197,7 @@ class RenderTask {
* @param {number} [extraDelay]
*/
cancel(extraDelay = 0) {
this.#internalRenderTask.cancel(/* error = */ null, extraDelay);
this._internalRenderTask.cancel(/* error = */ null, extraDelay);
}
/**
@ -3185,11 +3205,11 @@ class RenderTask {
* @type {boolean}
*/
get separateAnnots() {
const { separateAnnots } = this.#internalRenderTask.operatorList;
const { separateAnnots } = this._internalRenderTask.operatorList;
if (!separateAnnots) {
return false;
}
const { annotationCanvasMap } = this.#internalRenderTask;
const { annotationCanvasMap } = this._internalRenderTask;
return (
separateAnnots.form ||
(separateAnnots.canvas && annotationCanvasMap?.size > 0)
@ -3389,7 +3409,6 @@ class InternalRenderTask {
if (this.operatorList.lastChunk) {
this.gfx.endDrawing();
InternalRenderTask.#canvasInUse.delete(this._canvas);
this.callback();
}
}

View File

@ -25,7 +25,7 @@ function getUrlProp(val) {
return null; // The 'url' is unused with `PDFDataRangeTransport`.
}
if (val instanceof URL) {
return val.href;
return val;
}
if (typeof val === "string") {
if (
@ -33,13 +33,18 @@ function getUrlProp(val) {
PDFJSDev.test("GENERIC") &&
isNodeJS
) {
return val; // Use the url as-is in Node.js environments.
if (/^[a-z][a-z0-9\-+.]+:/i.test(val)) {
return new URL(val);
}
// eslint-disable-next-line no-undef
const url = process.getBuiltinModule("url");
return new URL(url.pathToFileURL(val));
}
// The full path is required in the 'url' field.
const url = URL.parse(val, window.location);
if (url) {
return url.href;
return url;
}
}
throw new Error(

View File

@ -17,6 +17,7 @@ import {
BaseException,
DrawOPS,
FeatureTest,
MathClamp,
shadow,
Util,
warn,
@ -463,7 +464,7 @@ function isValidFetchUrl(url, baseUrl) {
}
const res = baseUrl ? URL.parse(url, baseUrl) : URL.parse(url);
// The Fetch API only supports the http/https protocols, and not file/ftp.
return res?.protocol === "http:" || res?.protocol === "https:";
return /https?:/.test(res?.protocol ?? "");
}
/**
@ -798,7 +799,7 @@ class CSSConstants {
}
function applyOpacity(r, g, b, opacity) {
opacity = Math.min(Math.max(opacity ?? 1, 0), 1);
opacity = MathClamp(opacity ?? 1, 0, 1);
const white = 255 * (1 - opacity);
r = Math.round(r * opacity + white);
g = Math.round(g * opacity + white);
@ -1034,6 +1035,226 @@ function makePathFromDrawOPS(data) {
return path;
}
/**
* Maps between page IDs and page numbers, allowing bidirectional conversion
* between the two representations. This is useful when the page numbering
* in the PDF document doesn't match the default sequential ordering.
*/
class PagesMapper {
/**
* Maps page IDs to their corresponding page numbers.
* @type {Uint32Array|null}
*/
static #idToPageNumber = null;
/**
* Maps page numbers to their corresponding page IDs.
* @type {Uint32Array|null}
*/
static #pageNumberToId = null;
/**
* Previous mapping of page IDs to page numbers.
* @type {Uint32Array|null}
*/
static #prevIdToPageNumber = null;
/**
* The total number of pages.
* @type {number}
*/
static #pagesNumber = 0;
/**
* Listeners for page changes.
* @type {Array<function>}
*/
static #listeners = [];
/**
* Gets the total number of pages.
* @returns {number} The number of pages.
*/
get pagesNumber() {
return PagesMapper.#pagesNumber;
}
/**
* Sets the total number of pages and initializes default mappings
* where page IDs equal page numbers (1-indexed).
* @param {number} n - The total number of pages.
*/
set pagesNumber(n) {
if (PagesMapper.#pagesNumber === n) {
return;
}
PagesMapper.#pagesNumber = n;
if (n === 0) {
PagesMapper.#pageNumberToId = null;
PagesMapper.#idToPageNumber = null;
}
}
addListener(listener) {
PagesMapper.#listeners.push(listener);
}
removeListener(listener) {
const index = PagesMapper.#listeners.indexOf(listener);
if (index >= 0) {
PagesMapper.#listeners.splice(index, 1);
}
}
#updateListeners() {
for (const listener of PagesMapper.#listeners) {
listener();
}
}
#init(mustInit) {
if (PagesMapper.#pageNumberToId) {
return;
}
const n = PagesMapper.#pagesNumber;
// Allocate a single array for better memory locality.
const array = new Uint32Array(3 * n);
const pageNumberToId = (PagesMapper.#pageNumberToId = array.subarray(0, n));
const idToPageNumber = (PagesMapper.#idToPageNumber = array.subarray(
n,
2 * n
));
if (mustInit) {
for (let i = 0; i < n; i++) {
pageNumberToId[i] = idToPageNumber[i] = i + 1;
}
}
PagesMapper.#prevIdToPageNumber = array.subarray(2 * n);
}
/**
* Move a set of pages to a new position while keeping the ID ↔ number mappings in
* sync.
*
* @param {Set<number>} selectedPages - Page numbers being moved (1-indexed).
* @param {number[]} pagesToMove - Ordered list of page numbers to move.
* @param {number} index - Zero-based insertion index in the page-number list.
*/
movePages(selectedPages, pagesToMove, index) {
this.#init(true);
const pageNumberToId = PagesMapper.#pageNumberToId;
const idToPageNumber = PagesMapper.#idToPageNumber;
PagesMapper.#prevIdToPageNumber.set(idToPageNumber);
const movedCount = pagesToMove.length;
const mappedPagesToMove = new Uint32Array(movedCount);
let removedBeforeTarget = 0;
for (let i = 0; i < movedCount; i++) {
const pageIndex = pagesToMove[i] - 1;
mappedPagesToMove[i] = pageNumberToId[pageIndex];
if (pageIndex < index) {
removedBeforeTarget += 1;
}
}
const pagesNumber = PagesMapper.#pagesNumber;
// target index after removing elements that were before it
let adjustedTarget = index - removedBeforeTarget;
const remainingLen = pagesNumber - movedCount;
adjustedTarget = MathClamp(adjustedTarget, 0, remainingLen);
// Create the new mapping.
// First copy over the pages that are not being moved.
// Then insert the moved pages at the target position.
for (let i = 0, r = 0; i < pagesNumber; i++) {
if (!selectedPages.has(i + 1)) {
pageNumberToId[r++] = pageNumberToId[i];
}
}
// Shift the pages after the target position.
pageNumberToId.copyWithin(
adjustedTarget + movedCount,
adjustedTarget,
remainingLen
);
// Finally insert the moved pages.
pageNumberToId.set(mappedPagesToMove, adjustedTarget);
let hasChanged = false;
for (let i = 0, ii = pagesNumber; i < ii; i++) {
const id = pageNumberToId[i];
hasChanged ||= id !== i + 1;
idToPageNumber[id - 1] = i + 1;
}
this.#updateListeners();
if (!hasChanged) {
// Reset.
this.pagesNumber = 0;
}
}
/**
* Checks if the page mappings have been altered from their initial state.
* @returns {boolean} True if the mappings have been altered, false otherwise.
*/
hasBeenAltered() {
return PagesMapper.#pageNumberToId !== null;
}
/**
* Gets the current page mapping suitable for saving.
* @returns {Object} An object containing the page indices.
*/
getPageMappingForSaving() {
// Saving is index-based.
return {
pageIndices: PagesMapper.#idToPageNumber
? PagesMapper.#idToPageNumber.map(x => x - 1)
: null,
};
}
getPrevPageNumber(pageNumber) {
return PagesMapper.#prevIdToPageNumber[
PagesMapper.#pageNumberToId[pageNumber - 1] - 1
];
}
/**
* Gets the page number for a given page ID.
* @param {number} id - The page ID (1-indexed).
* @returns {number} The page number, or the ID itself if no mapping exists.
*/
getPageNumber(id) {
return PagesMapper.#idToPageNumber?.[id - 1] ?? id;
}
/**
* Gets the page ID for a given page number.
* @param {number} pageNumber - The page number (1-indexed).
* @returns {number} The page ID, or the page number itself if no mapping
* exists.
*/
getPageId(pageNumber) {
return PagesMapper.#pageNumberToId?.[pageNumber - 1] ?? pageNumber;
}
/**
* Gets or creates a singleton instance of PagesMapper.
* @returns {PagesMapper} The singleton instance.
*/
static get instance() {
return shadow(this, "instance", new PagesMapper());
}
getMapping() {
return PagesMapper.#pageNumberToId.subarray(0, this.pagesNumber);
}
}
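A small sketch of the mapping semantics, using only the methods defined above (the page numbers are illustrative):

// A 4-page document where the user moves page 3 to the front.
const mapper = PagesMapper.instance;
mapper.pagesNumber = 4;
mapper.movePages(new Set([3]), [3], /* index = */ 0);

mapper.getPageId(1);         // 3: the first displayed page is the original page 3.
mapper.getPageNumber(3);     // 1: the original page 3 is now shown first.
mapper.getPrevPageNumber(1); // 3: where that page sat before this latest move.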
export {
applyOpacity,
ColorScheme,
@ -1054,6 +1275,7 @@ export {
makePathFromDrawOPS,
noContextMenu,
OutputScale,
PagesMapper,
PageViewport,
PDFDateString,
PixelsPerInch,

View File

@ -30,10 +30,6 @@ class DrawLayer {
static #id = 0;
constructor({ pageIndex }) {
this.pageIndex = pageIndex;
}
setParent(parent) {
if (!this.#parent) {
this.#parent = parent;
@ -103,7 +99,7 @@ class DrawLayer {
root.append(defs);
const path = DrawLayer._svgFactory.createElement("path");
defs.append(path);
const pathId = `path_p${this.pageIndex}_${id}`;
const pathId = `path_${id}`;
path.setAttribute("id", pathId);
path.setAttribute("vector-effect", "non-scaling-stroke");
@ -135,7 +131,7 @@ class DrawLayer {
root.append(defs);
const path = DrawLayer._svgFactory.createElement("path");
defs.append(path);
const pathId = `path_p${this.pageIndex}_${id}`;
const pathId = `path_${id}`;
path.setAttribute("id", pathId);
path.setAttribute("vector-effect", "non-scaling-stroke");
@ -143,7 +139,7 @@ class DrawLayer {
if (mustRemoveSelfIntersections) {
const mask = DrawLayer._svgFactory.createElement("mask");
defs.append(mask);
maskId = `mask_p${this.pageIndex}_${id}`;
maskId = `mask_${id}`;
mask.setAttribute("id", maskId);
mask.setAttribute("maskUnits", "objectBoundingBox");
const rect = DrawLayer._svgFactory.createElement("rect");

View File

@ -18,7 +18,6 @@
/** @typedef {import("../display_utils.js").PageViewport} PageViewport */
// eslint-disable-next-line max-len
/** @typedef {import("../../../web/text_accessibility.js").TextAccessibilityManager} TextAccessibilityManager */
/** @typedef {import("../../../web/interfaces").IL10n} IL10n */
// eslint-disable-next-line max-len
/** @typedef {import("../annotation_layer.js").AnnotationLayer} AnnotationLayer */
/** @typedef {import("../draw_layer.js").DrawLayer} DrawLayer */
@ -47,7 +46,7 @@ import { StampEditor } from "./stamp.js";
* @property {boolean} enabled
* @property {TextAccessibilityManager} [accessibilityManager]
* @property {number} pageIndex
* @property {IL10n} l10n
* @property {L10n} l10n
* @property {AnnotationLayer} [annotationLayer]
* @property {HTMLDivElement} [textLayer]
* @property {DrawLayer} drawLayer
@ -144,6 +143,10 @@ class AnnotationEditorLayer {
this.#uiManager.addLayer(this);
}
updatePageIndex(newPageIndex) {
this.pageIndex = newPageIndex;
}
get isEmpty() {
return this.#editors.size === 0;
}

View File

@ -314,6 +314,20 @@ class Comment {
this.#deleted = false;
}
/**
* Restore the comment data (used for undo).
* @param {Object} data - The comment data to restore.
* @param {string} data.text - The comment text.
* @param {string|null} data.richText - The rich text content.
* @param {Date|null} data.date - The original date.
*/
restoreData({ text, richText, date }) {
this.#text = text;
this.#richText = richText;
this.#date = date;
this.#deleted = false;
}
setInitialText(text, richText = null) {
this.#initialText = text;
this.data = text;

View File

@ -13,10 +13,10 @@
* limitations under the License.
*/
import { fromBase64Util, toBase64Util, warn } from "../../../shared/util.js";
import { ContourDrawOutline } from "./contour.js";
import { InkDrawOutline } from "./inkdraw.js";
import { Outline } from "./outline.js";
import { warn } from "../../../shared/util.js";
const BASE_HEADER_LENGTH = 8;
const POINTS_PROPERTIES_NUMBER = 3;
@ -749,12 +749,12 @@ class SignatureExtractor {
const buf = await new Response(cs.readable).arrayBuffer();
const bytes = new Uint8Array(buf);
return toBase64Util(bytes);
return bytes.toBase64();
}
static async decompressSignature(signatureData) {
try {
const bytes = fromBase64Util(signatureData);
const bytes = Uint8Array.fromBase64(signatureData);
const { readable, writable } = new DecompressionStream("deflate-raw");
const writer = writable.getWriter();
await writer.ready;
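The `toBase64()` / `Uint8Array.fromBase64()` calls introduced above are the native helpers from the TC39 Uint8Array base64 proposal, replacing the previous toBase64Util / fromBase64Util imports. A quick round-trip, assuming an engine that already ships them:

const bytes = Uint8Array.fromBase64("SGVsbG8=");
// bytes is Uint8Array [72, 101, 108, 108, 111], i.e. "Hello".
bytes.toBase64(); // "SGVsbG8="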

View File

@ -209,6 +209,10 @@ class AnnotationEditor {
this.deleted = false;
}
updatePageIndex(newPageIndex) {
this.pageIndex = newPageIndex;
}
get editorType() {
return Object.getPrototypeOf(this).constructor._type;
}
@ -1204,6 +1208,9 @@ class AnnotationEditor {
}
get comment() {
if (!this.#comment) {
return null;
}
const {
data: { richText, text, date, deleted },
} = this.#comment;
@ -1217,9 +1224,14 @@ class AnnotationEditor {
};
}
set comment(text) {
set comment(value) {
this.#comment ||= new Comment(this);
this.#comment.data = text;
if (typeof value === "object" && value !== null) {
// Restore full comment data (used for undo).
this.#comment.restoreData(value);
} else {
this.#comment.data = value;
}
if (this.hasComment) {
this.removeCommentButtonFromToolbar();
this.addStandaloneCommentButton();

View File

@ -948,6 +948,7 @@ class AnnotationEditorUIManager {
evt => this.updateParams(evt.type, evt.value),
{ signal }
);
eventBus._on("pagesedited", this.onPagesEdited.bind(this), { signal });
window.addEventListener(
"pointerdown",
() => {
@ -1177,6 +1178,23 @@ class AnnotationEditorUIManager {
this.#commentManager?.removeComments([editor.uid]);
}
/**
* Delete a comment from an editor with undo support.
* @param {AnnotationEditor} editor - The editor whose comment to delete.
* @param {Object} savedData - The comment data to save for undo.
*/
deleteComment(editor, savedData) {
const undo = () => {
editor.comment = savedData;
};
const cmd = () => {
this._editorUndoBar?.show(undo, "comment");
this.toggleComment(/* editor = */ null);
editor.comment = null;
};
this.addCommands({ cmd, undo, mustExec: true });
}
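A hypothetical call site for the method above, wiring a comment deletion through the undo stack (everything except deleteComment and the comment accessor is assumed):

// E.g. from a comment-popup "Delete" handler:
const savedData = editor.comment; // { text, richText, date, ... }
uiManager.deleteComment(editor, savedData);
// Choosing "Undo" in the editor undo-bar re-assigns `editor.comment = savedData`,
// which now routes through Comment.restoreData() shown earlier.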
toggleComment(editor, isSelected, visibility = undefined) {
this.#commentManager?.toggleCommentPopup(editor, isSelected, visibility);
}
@ -1242,6 +1260,26 @@ class AnnotationEditorUIManager {
}
}
onPagesEdited({ pagesMapper }) {
for (const editor of this.#allEditors.values()) {
editor.updatePageIndex(
pagesMapper.getPrevPageNumber(editor.pageIndex + 1) - 1
);
}
const allLayers = this.#allLayers;
const newAllLayers = (this.#allLayers = new Map());
for (const [pageIndex, layer] of allLayers) {
const prevPageIndex = pagesMapper.getPrevPageNumber(pageIndex + 1) - 1;
if (prevPageIndex === -1) {
// TODO: handle the case where the deletion of the page has been undone.
layer.destroy();
continue;
}
newAllLayers.set(prevPageIndex, layer);
layer.updatePageIndex(prevPageIndex);
}
}
onPageChanging({ pageNumber }) {
this.#currentPageIndex = pageNumber - 1;
}

View File

@ -14,13 +14,18 @@
*/
import { AbortException, assert, warn } from "../shared/util.js";
import {
BasePDFStream,
BasePDFStreamRangeReader,
BasePDFStreamReader,
} from "../shared/base_pdf_stream.js";
import {
createHeaders,
createResponseError,
ensureResponseOrigin,
extractFilenameFromHeader,
getResponseOrigin,
validateRangeRequestCapabilities,
validateResponseStatus,
} from "./network_utils.js";
if (typeof PDFJSDev !== "undefined" && PDFJSDev.test("MOZCENTRAL")) {
@ -29,15 +34,21 @@ if (typeof PDFJSDev !== "undefined" && PDFJSDev.test("MOZCENTRAL")) {
);
}
function createFetchOptions(headers, withCredentials, abortController) {
return {
function fetchUrl(url, headers, withCredentials, abortController) {
return fetch(url, {
method: "GET",
headers,
signal: abortController.signal,
mode: "cors",
credentials: withCredentials ? "include" : "same-origin",
redirect: "follow",
};
});
}
function ensureResponseStatus(status, url) {
if (status !== 200 && status !== 206) {
throw createResponseError(status, url);
}
}
function getArrayBuffer(val) {
@ -51,95 +62,58 @@ function getArrayBuffer(val) {
return new Uint8Array(val).buffer;
}
/** @implements {IPDFStream} */
class PDFFetchStream {
class PDFFetchStream extends BasePDFStream {
_responseOrigin = null;
constructor(source) {
this.source = source;
this.isHttp = /^https?:/i.test(source.url);
this.headers = createHeaders(this.isHttp, source.httpHeaders);
super(source, PDFFetchStreamReader, PDFFetchStreamRangeReader);
const { httpHeaders, url } = source;
this._fullRequestReader = null;
this._rangeRequestReaders = [];
}
get _progressiveDataLength() {
return this._fullRequestReader?._loaded ?? 0;
}
getFullReader() {
assert(
!this._fullRequestReader,
"PDFFetchStream.getFullReader can only be called once."
/https?:/.test(url.protocol),
"PDFFetchStream only supports http(s):// URLs."
);
this._fullRequestReader = new PDFFetchStreamReader(this);
return this._fullRequestReader;
}
getRangeReader(begin, end) {
if (end <= this._progressiveDataLength) {
return null;
}
const reader = new PDFFetchStreamRangeReader(this, begin, end);
this._rangeRequestReaders.push(reader);
return reader;
}
cancelAllRequests(reason) {
this._fullRequestReader?.cancel(reason);
for (const reader of this._rangeRequestReaders.slice(0)) {
reader.cancel(reason);
}
this.headers = createHeaders(/* isHttp = */ true, httpHeaders);
}
}
/** @implements {IPDFStreamReader} */
class PDFFetchStreamReader {
constructor(stream) {
this._stream = stream;
this._reader = null;
this._loaded = 0;
this._filename = null;
const source = stream.source;
this._withCredentials = source.withCredentials || false;
this._contentLength = source.length;
this._headersCapability = Promise.withResolvers();
this._disableRange = source.disableRange || false;
this._rangeChunkSize = source.rangeChunkSize;
if (!this._rangeChunkSize && !this._disableRange) {
this._disableRange = true;
}
class PDFFetchStreamReader extends BasePDFStreamReader {
_abortController = new AbortController();
this._abortController = new AbortController();
this._isStreamingSupported = !source.disableStream;
this._isRangeSupported = !source.disableRange;
_reader = null;
constructor(stream) {
super(stream);
const {
disableRange,
disableStream,
length,
rangeChunkSize,
url,
withCredentials,
} = stream._source;
this._contentLength = length;
this._isStreamingSupported = !disableStream;
this._isRangeSupported = !disableRange;
// Always create a copy of the headers.
const headers = new Headers(stream.headers);
const url = source.url;
fetch(
url,
createFetchOptions(headers, this._withCredentials, this._abortController)
)
fetchUrl(url, headers, withCredentials, this._abortController)
.then(response => {
stream._responseOrigin = getResponseOrigin(response.url);
if (!validateResponseStatus(response.status)) {
throw createResponseError(response.status, url);
}
ensureResponseStatus(response.status, url);
this._reader = response.body.getReader();
this._headersCapability.resolve();
const responseHeaders = response.headers;
const { allowRangeRequests, suggestedLength } =
validateRangeRequestCapabilities({
responseHeaders,
isHttp: stream.isHttp,
rangeChunkSize: this._rangeChunkSize,
disableRange: this._disableRange,
isHttp: true,
rangeChunkSize,
disableRange,
});
this._isRangeSupported = allowRangeRequests;
@ -153,30 +127,10 @@ class PDFFetchStreamReader {
if (!this._isStreamingSupported && this._isRangeSupported) {
this.cancel(new AbortException("Streaming is disabled."));
}
this._headersCapability.resolve();
})
.catch(this._headersCapability.reject);
this.onProgress = null;
}
get headersReady() {
return this._headersCapability.promise;
}
get filename() {
return this._filename;
}
get contentLength() {
return this._contentLength;
}
get isRangeSupported() {
return this._isRangeSupported;
}
get isStreamingSupported() {
return this._isStreamingSupported;
}
async read() {
@ -186,10 +140,7 @@ class PDFFetchStreamReader {
return { value, done };
}
this._loaded += value.byteLength;
this.onProgress?.({
loaded: this._loaded,
total: this._contentLength,
});
this._callOnProgress();
return { value: getArrayBuffer(value), done: false };
}
@ -200,48 +151,32 @@ class PDFFetchStreamReader {
}
}
/** @implements {IPDFStreamRangeReader} */
class PDFFetchStreamRangeReader {
constructor(stream, begin, end) {
this._stream = stream;
this._reader = null;
this._loaded = 0;
const source = stream.source;
this._withCredentials = source.withCredentials || false;
this._readCapability = Promise.withResolvers();
this._isStreamingSupported = !source.disableStream;
class PDFFetchStreamRangeReader extends BasePDFStreamRangeReader {
_abortController = new AbortController();
_readCapability = Promise.withResolvers();
_reader = null;
constructor(stream, begin, end) {
super(stream, begin, end);
const { url, withCredentials } = stream._source;
this._abortController = new AbortController();
// Always create a copy of the headers.
const headers = new Headers(stream.headers);
headers.append("Range", `bytes=${begin}-${end - 1}`);
const url = source.url;
fetch(
url,
createFetchOptions(headers, this._withCredentials, this._abortController)
)
fetchUrl(url, headers, withCredentials, this._abortController)
.then(response => {
const responseOrigin = getResponseOrigin(response.url);
if (responseOrigin !== stream._responseOrigin) {
throw new Error(
`Expected range response-origin "${responseOrigin}" to match "${stream._responseOrigin}".`
);
}
if (!validateResponseStatus(response.status)) {
throw createResponseError(response.status, url);
}
this._readCapability.resolve();
ensureResponseOrigin(responseOrigin, stream._responseOrigin);
ensureResponseStatus(response.status, url);
this._reader = response.body.getReader();
this._readCapability.resolve();
})
.catch(this._readCapability.reject);
this.onProgress = null;
}
get isStreamingSupported() {
return this._isStreamingSupported;
}
async read() {
@ -250,9 +185,6 @@ class PDFFetchStreamRangeReader {
if (done) {
return { value, done };
}
this._loaded += value.byteLength;
this.onProgress?.({ loaded: this._loaded });
return { value: getArrayBuffer(value), done: false };
}

View File

@ -19,7 +19,6 @@ import {
isNodeJS,
shadow,
string32,
toBase64Util,
unreachable,
warn,
} from "../shared/util.js";
@ -408,7 +407,7 @@ class FontFaceObject {
return null;
}
// Add the @font-face rule to the document.
const url = `url(data:${this.mimetype};base64,${toBase64Util(this.data)});`;
const url = `url(data:${this.mimetype};base64,${this.data.toBase64()});`;
let rule;
if (!this.cssFontInfo) {
rule = `@font-face {font-family:"${this.loadedName}";src:${url}}`;

View File

@ -14,9 +14,15 @@
*/
import { assert, stringToBytes, warn } from "../shared/util.js";
import {
BasePDFStream,
BasePDFStreamRangeReader,
BasePDFStreamReader,
} from "../shared/base_pdf_stream.js";
import {
createHeaders,
createResponseError,
ensureResponseOrigin,
extractFilenameFromHeader,
getResponseOrigin,
validateRangeRequestCapabilities,
@ -31,77 +37,77 @@ if (typeof PDFJSDev !== "undefined" && PDFJSDev.test("MOZCENTRAL")) {
const OK_RESPONSE = 200;
const PARTIAL_CONTENT_RESPONSE = 206;
function getArrayBuffer(xhr) {
const data = xhr.response;
if (typeof data !== "string") {
return data;
}
return stringToBytes(data).buffer;
function getArrayBuffer(val) {
return typeof val !== "string" ? val : stringToBytes(val).buffer;
}
class NetworkManager {
class PDFNetworkStream extends BasePDFStream {
#pendingRequests = new WeakMap();
_responseOrigin = null;
constructor({ url, httpHeaders, withCredentials }) {
this.url = url;
this.isHttp = /^https?:/i.test(url);
this.headers = createHeaders(this.isHttp, httpHeaders);
this.withCredentials = withCredentials || false;
constructor(source) {
super(source, PDFNetworkStreamReader, PDFNetworkStreamRangeReader);
const { httpHeaders, url } = source;
this.currXhrId = 0;
this.pendingRequests = Object.create(null);
this.url = url;
this.isHttp = /https?:/.test(url.protocol);
this.headers = createHeaders(this.isHttp, httpHeaders);
}
request(args) {
/**
* @ignore
*/
_request(args) {
const xhr = new XMLHttpRequest();
const xhrId = this.currXhrId++;
const pendingRequest = (this.pendingRequests[xhrId] = { xhr });
const pendingRequest = {
validateStatus: null,
onHeadersReceived: args.onHeadersReceived,
onDone: args.onDone,
onError: args.onError,
onProgress: args.onProgress,
};
this.#pendingRequests.set(xhr, pendingRequest);
xhr.open("GET", this.url);
xhr.withCredentials = this.withCredentials;
xhr.withCredentials = this._source.withCredentials;
for (const [key, val] of this.headers) {
xhr.setRequestHeader(key, val);
}
if (this.isHttp && "begin" in args && "end" in args) {
xhr.setRequestHeader("Range", `bytes=${args.begin}-${args.end - 1}`);
pendingRequest.expectedStatus = PARTIAL_CONTENT_RESPONSE;
// From http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35.2:
// "A server MAY ignore the Range header". This means it's possible to
// get a 200 rather than a 206 response from a range request.
pendingRequest.validateStatus = status =>
status === PARTIAL_CONTENT_RESPONSE || status === OK_RESPONSE;
} else {
pendingRequest.expectedStatus = OK_RESPONSE;
pendingRequest.validateStatus = status => status === OK_RESPONSE;
}
xhr.responseType = "arraybuffer";
assert(args.onError, "Expected `onError` callback to be provided.");
xhr.onerror = () => {
args.onError(xhr.status);
};
xhr.onreadystatechange = this.onStateChange.bind(this, xhrId);
xhr.onprogress = this.onProgress.bind(this, xhrId);
pendingRequest.onHeadersReceived = args.onHeadersReceived;
pendingRequest.onDone = args.onDone;
pendingRequest.onError = args.onError;
pendingRequest.onProgress = args.onProgress;
xhr.onerror = () => args.onError(xhr.status);
xhr.onreadystatechange = this.#onStateChange.bind(this, xhr);
xhr.onprogress = this.#onProgress.bind(this, xhr);
xhr.send(null);
return xhrId;
return xhr;
}
onProgress(xhrId, evt) {
const pendingRequest = this.pendingRequests[xhrId];
if (!pendingRequest) {
return; // Maybe abortRequest was called...
}
pendingRequest.onProgress?.(evt);
#onProgress(xhr, evt) {
const pendingRequest = this.#pendingRequests.get(xhr);
pendingRequest?.onProgress?.(evt);
}
onStateChange(xhrId, evt) {
const pendingRequest = this.pendingRequests[xhrId];
#onStateChange(xhr, evt) {
const pendingRequest = this.#pendingRequests.get(xhr);
if (!pendingRequest) {
return; // Maybe abortRequest was called...
}
const xhr = pendingRequest.xhr;
if (xhr.readyState >= 2 && pendingRequest.onHeadersReceived) {
pendingRequest.onHeadersReceived();
delete pendingRequest.onHeadersReceived;
@ -111,13 +117,12 @@ class NetworkManager {
return;
}
if (!(xhrId in this.pendingRequests)) {
if (!this.#pendingRequests.has(xhr)) {
// The XHR request might have been aborted in onHeadersReceived()
// callback, in which case we should abort the request.
return;
}
delete this.pendingRequests[xhrId];
this.#pendingRequests.delete(xhr);
// Success status == 0 can be on ftp, file and other protocols.
if (xhr.status === 0 && this.isHttp) {
@ -126,147 +131,79 @@ class NetworkManager {
}
const xhrStatus = xhr.status || OK_RESPONSE;
// From http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35.2:
// "A server MAY ignore the Range header". This means it's possible to
// get a 200 rather than a 206 response from a range request.
const ok_response_on_range_request =
xhrStatus === OK_RESPONSE &&
pendingRequest.expectedStatus === PARTIAL_CONTENT_RESPONSE;
if (
!ok_response_on_range_request &&
xhrStatus !== pendingRequest.expectedStatus
) {
if (!pendingRequest.validateStatus(xhrStatus)) {
pendingRequest.onError(xhr.status);
return;
}
const chunk = getArrayBuffer(xhr);
const chunk = getArrayBuffer(xhr.response);
if (xhrStatus === PARTIAL_CONTENT_RESPONSE) {
const rangeHeader = xhr.getResponseHeader("Content-Range");
const matches = /bytes (\d+)-(\d+)\/(\d+)/.exec(rangeHeader);
if (matches) {
pendingRequest.onDone({
begin: parseInt(matches[1], 10),
chunk,
});
if (/bytes (\d+)-(\d+)\/(\d+)/.test(rangeHeader)) {
pendingRequest.onDone(chunk);
} else {
warn(`Missing or invalid "Content-Range" header.`);
pendingRequest.onError(0);
}
} else if (chunk) {
pendingRequest.onDone({
begin: 0,
chunk,
});
pendingRequest.onDone(chunk);
} else {
pendingRequest.onError(xhr.status);
}
}
getRequestXhr(xhrId) {
return this.pendingRequests[xhrId].xhr;
}
isPendingRequest(xhrId) {
return xhrId in this.pendingRequests;
}
abortRequest(xhrId) {
const xhr = this.pendingRequests[xhrId].xhr;
delete this.pendingRequests[xhrId];
xhr.abort();
}
}
/** @implements {IPDFStream} */
class PDFNetworkStream {
constructor(source) {
this._source = source;
this._manager = new NetworkManager(source);
this._rangeChunkSize = source.rangeChunkSize;
this._fullRequestReader = null;
this._rangeRequestReaders = [];
}
_onRangeRequestReaderClosed(reader) {
const i = this._rangeRequestReaders.indexOf(reader);
if (i >= 0) {
this._rangeRequestReaders.splice(i, 1);
/**
* Abort the request, if it's pending.
* @ignore
*/
_abortRequest(xhr) {
if (this.#pendingRequests.has(xhr)) {
this.#pendingRequests.delete(xhr);
xhr.abort();
}
}
getFullReader() {
assert(
!this._fullRequestReader,
"PDFNetworkStream.getFullReader can only be called once."
);
this._fullRequestReader = new PDFNetworkStreamFullRequestReader(
this._manager,
this._source
);
return this._fullRequestReader;
}
getRangeReader(begin, end) {
const reader = new PDFNetworkStreamRangeRequestReader(
this._manager,
begin,
end
);
reader.onClosed = this._onRangeRequestReaderClosed.bind(this);
this._rangeRequestReaders.push(reader);
return reader;
}
const reader = super.getRangeReader(begin, end);
cancelAllRequests(reason) {
this._fullRequestReader?.cancel(reason);
for (const reader of this._rangeRequestReaders.slice(0)) {
reader.cancel(reason);
if (reader) {
reader.onClosed = () => this._rangeReaders.delete(reader);
}
return reader;
}
}
class PDFNetworkStreamReader extends BasePDFStreamReader {
_cachedChunks = [];
_done = false;
_requests = [];
_storedError = null;
constructor(stream) {
super(stream);
const { length } = stream._source;
this._contentLength = length;
// Note that `XMLHttpRequest` doesn't support streaming, and range requests
// will be enabled (if supported) in `this.#onHeadersReceived` below.
this._fullRequestXhr = stream._request({
onHeadersReceived: this.#onHeadersReceived.bind(this),
onDone: this.#onDone.bind(this),
onError: this.#onError.bind(this),
onProgress: this.#onProgress.bind(this),
});
}
#onHeadersReceived() {
const stream = this._stream;
const { disableRange, rangeChunkSize } = stream._source;
const fullRequestXhr = this._fullRequestXhr;
stream._responseOrigin = getResponseOrigin(fullRequestXhr.responseURL);
const rawResponseHeaders = fullRequestXhr.getAllResponseHeaders();
const responseHeaders = new Headers(
@ -285,9 +222,9 @@ class PDFNetworkStreamFullRequestReader {
const { allowRangeRequests, suggestedLength } =
validateRangeRequestCapabilities({
responseHeaders,
isHttp: stream.isHttp,
rangeChunkSize,
disableRange,
});
if (allowRangeRequests) {
@ -303,68 +240,46 @@ class PDFNetworkStreamFullRequestReader {
// requests, there will be an issue for sites where you can only
// request the pdf once. However, if this is the case, then the
// server should not be returning that it can support range requests.
stream._abortRequest(fullRequestXhr);
}
this._headersCapability.resolve();
}
#onDone(chunk) {
if (this._requests.length > 0) {
const capability = this._requests.shift();
capability.resolve({ value: chunk, done: false });
} else {
this._cachedChunks.push(chunk);
}
this._done = true;
if (this._cachedChunks.length > 0) {
return;
}
for (const capability of this._requests) {
capability.resolve({ value: undefined, done: true });
}
this._requests.length = 0;
}
#onError(status) {
this._storedError = createResponseError(status, this._stream.url);
this._headersCapability.reject(this._storedError);
for (const capability of this._requests) {
capability.reject(this._storedError);
}
this._requests.length = 0;
this._cachedChunks.length = 0;
}
#onProgress(evt) {
this.onProgress?.({
loaded: evt.loaded,
total: evt.lengthComputable ? evt.total : this._contentLength,
});
}
get filename() {
return this._filename;
}
get isRangeSupported() {
return this._isRangeSupported;
}
get isStreamingSupported() {
return this._isStreamingSupported;
}
get contentLength() {
return this._contentLength;
}
get headersReady() {
return this._headersCapability.promise;
}
async read() {
await this._headersCapability.promise;
@ -378,100 +293,82 @@ class PDFNetworkStreamFullRequestReader {
if (this._done) {
return { value: undefined, done: true };
}
const capability = Promise.withResolvers();
this._requests.push(capability);
return capability.promise;
}
cancel(reason) {
this._done = true;
this._headersCapability.reject(reason);
for (const capability of this._requests) {
capability.resolve({ value: undefined, done: true });
}
this._requests.length = 0;
this._stream._abortRequest(this._fullRequestXhr);
this._fullRequestXhr = null;
}
}
class PDFNetworkStreamRangeReader extends BasePDFStreamRangeReader {
onClosed = null;
_done = false;
_queuedChunk = null;
_requests = [];
_storedError = null;
constructor(stream, begin, end) {
super(stream, begin, end);
this._requestXhr = stream._request({
begin,
end,
onHeadersReceived: this.#onHeadersReceived.bind(this),
onDone: this.#onDone.bind(this),
onError: this.#onError.bind(this),
onProgress: null,
});
}
#onHeadersReceived() {
const responseOrigin = getResponseOrigin(this._requestXhr?.responseURL);
try {
ensureResponseOrigin(responseOrigin, this._stream._responseOrigin);
} catch (ex) {
this._storedError = ex;
this.#onError(0);
}
}
#onDone(chunk) {
if (this._requests.length > 0) {
const capability = this._requests.shift();
capability.resolve({ value: chunk, done: false });
} else {
this._queuedChunk = chunk;
}
this._done = true;
for (const capability of this._requests) {
capability.resolve({ value: undefined, done: true });
}
this._requests.length = 0;
this.onClosed?.();
}
#onError(status) {
this._storedError ??= createResponseError(status, this._stream.url);
for (const capability of this._requests) {
capability.reject(this._storedError);
}
this._requests.length = 0;
this._queuedChunk = null;
}
_onProgress(evt) {
if (!this.isStreamingSupported) {
this.onProgress?.({ loaded: evt.loaded });
}
}
get isStreamingSupported() {
return false;
}
async read() {
if (this._storedError) {
throw this._storedError;
@ -484,21 +381,20 @@ class PDFNetworkStreamRangeRequestReader {
if (this._done) {
return { value: undefined, done: true };
}
const capability = Promise.withResolvers();
this._requests.push(capability);
return capability.promise;
}
cancel(reason) {
this._done = true;
for (const capability of this._requests) {
capability.resolve({ value: undefined, done: true });
}
this._requests.length = 0;
this._stream._abortRequest(this._requestXhr);
this.onClosed?.();
}
}

View File

@ -101,21 +101,25 @@ function extractFilenameFromHeader(responseHeaders) {
function createResponseError(status, url) {
return new ResponseException(
`Unexpected server response (${status}) while retrieving PDF "${url.href}".`,
status,
/* missing = */ status === 404 || (status === 0 && url.protocol === "file:")
);
}
function ensureResponseOrigin(rangeOrigin, origin) {
if (rangeOrigin !== origin) {
throw new Error(
`Expected range response-origin "${rangeOrigin}" to match "${origin}".`
);
}
}
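A minimal usage sketch of the helper above (the variable names here are illustrative only, not taken from the patch): the range reader compares the origin of its own response against the origin recorded for the initial full request, and turns a mismatch into a stored error.
// Illustrative only -- rangeXhr, fullRequestOrigin and storedError are hypothetical names.
try {
ensureResponseOrigin(
getResponseOrigin(rangeXhr.responseURL), // origin of the range response
fullRequestOrigin // origin recorded when the full request answered
);
} catch (ex) {
storedError = ex; // the mismatch is surfaced to pending read() calls
}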
export {
createHeaders,
createResponseError,
ensureResponseOrigin,
extractFilenameFromHeader,
getResponseOrigin,
validateRangeRequestCapabilities,
};

View File

@ -14,7 +14,12 @@
*/
/* globals process */
import { AbortException, assert, warn } from "../shared/util.js";
import {
BasePDFStream,
BasePDFStreamRangeReader,
BasePDFStreamReader,
} from "../shared/base_pdf_stream.js";
import { createResponseError } from "./network_utils.js";
if (typeof PDFJSDev !== "undefined" && PDFJSDev.test("MOZCENTRAL")) {
@ -23,273 +28,144 @@ if (typeof PDFJSDev !== "undefined" && PDFJSDev.test("MOZCENTRAL")) {
);
}
function getReadableStream(readStream) {
const { Readable } = process.getBuiltinModule("stream");
if (typeof Readable.toWeb === "function") {
// See https://nodejs.org/api/stream.html#streamreadabletowebstreamreadable-options
return Readable.toWeb(readStream);
}
// Fallback to support Node.js versions older than `24.0.0` and `22.17.0`.
const require = process
.getBuiltinModule("module")
.createRequire(import.meta.url);
const polyfill = require("node-readable-to-web-readable-stream");
return polyfill.makeDefaultReadableStreamFromNodeReadable(readStream);
}
function getArrayBuffer(val) {
if (val instanceof Uint8Array) {
return val.buffer;
}
if (val instanceof ArrayBuffer) {
return val;
}
warn(`getArrayBuffer - unexpected data format: ${val}`);
return new Uint8Array(val).buffer;
}
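A short aside on why this helper distinguishes the two cases (a sketch of standard typed-array behaviour, not code from the patch): wrapping an existing ArrayBuffer in a Uint8Array only creates a view, whereas constructing a Uint8Array from another typed array copies the data.
// Illustrative only: view vs. copy semantics of the Uint8Array constructor.
const buf = new ArrayBuffer(8);
const view = new Uint8Array(buf); // view.buffer === buf, no copy made
const copy = new Uint8Array(view); // copy.buffer !== buf, the data is duplicated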
class PDFNodeStream extends BasePDFStream {
constructor(source) {
super(source, PDFNodeStreamReader, PDFNodeStreamRangeReader);
const { url } = source;
assert(
url.protocol === "file:",
"PDFNodeStream only supports file:// URLs."
);
}
get _progressiveDataLength() {
return this._fullRequestReader?._loaded ?? 0;
}
getFullReader() {
assert(
!this._fullRequestReader,
"PDFNodeStream.getFullReader can only be called once."
);
this._fullRequestReader = new PDFNodeStreamFsFullReader(this);
return this._fullRequestReader;
}
getRangeReader(start, end) {
if (end <= this._progressiveDataLength) {
return null;
}
const rangeReader = new PDFNodeStreamFsRangeReader(this, start, end);
this._rangeRequestReaders.push(rangeReader);
return rangeReader;
}
cancelAllRequests(reason) {
this._fullRequestReader?.cancel(reason);
for (const reader of this._rangeRequestReaders.slice(0)) {
reader.cancel(reason);
}
}
}
class PDFNodeStreamReader extends BasePDFStreamReader {
_reader = null;
constructor(stream) {
super(stream);
const { disableRange, disableStream, length, rangeChunkSize, url } =
stream._source;
this._contentLength = length;
this._isStreamingSupported = !disableStream;
this._isRangeSupported = !disableRange;
const fs = process.getBuiltinModule("fs");
fs.promises
.lstat(url)
.then(stat => {
const readStream = fs.createReadStream(url);
const readableStream = getReadableStream(readStream);
this._reader = readableStream.getReader();
const { size } = stat;
if (size <= 2 * rangeChunkSize) {
// The file size is smaller than the size of two chunks, so it doesn't
// make any sense to abort the request and retry with a range request.
this._isRangeSupported = false;
}
// Setting right content length.
this._contentLength = size;
// We need to stop reading when range is supported and streaming is
// disabled.
if (!this._isStreamingSupported && this._isRangeSupported) {
this.cancel(new AbortException("Streaming is disabled."));
}
this._headersCapability.resolve();
})
.catch(error => {
if (error.code === "ENOENT") {
error = createResponseError(/* status = */ 0, url);
}
this._storedError = error;
this._headersCapability.reject(error);
});
}
async read() {
await this._headersCapability.promise;
const { value, done } = await this._reader.read();
if (done) {
return { value, done };
}
this._loaded += value.byteLength;
this._callOnProgress();
return { value: getArrayBuffer(value), done: false };
}
cancel(reason) {
this._reader?.cancel(reason);
}
}
class PDFNodeStreamRangeReader extends BasePDFStreamRangeReader {
_readCapability = Promise.withResolvers();
_reader = null;
constructor(stream, begin, end) {
super(stream, begin, end);
const { url } = stream._source;
const fs = process.getBuiltinModule("fs");
try {
const readStream = fs.createReadStream(url, {
start: begin,
end: end - 1,
});
const readableStream = getReadableStream(readStream);
this._reader = readableStream.getReader();
this._readCapability.resolve();
} catch (error) {
this._readCapability.reject(error);
}
}
async read() {
await this._readCapability.promise;
const { value, done } = await this._reader.read();
if (done) {
return { value, done };
}
return { value: getArrayBuffer(value), done: false };
}
cancel(reason) {
this._reader?.cancel(reason);
}
}

View File

@ -294,6 +294,9 @@ class TextLayer {
if (item.id) {
this.#container.setAttribute("id", `${item.id}`);
}
if (item.tag === "Artifact") {
this.#container.ariaHidden = true;
}
parent.append(this.#container);
} else if (item.type === "endMarkedContent") {
this.#container = this.#container.parentNode;

View File

@ -13,185 +13,141 @@
* limitations under the License.
*/
/** @typedef {import("../interfaces").IPDFStream} IPDFStream */
/** @typedef {import("../interfaces").IPDFStreamReader} IPDFStreamReader */
// eslint-disable-next-line max-len
/** @typedef {import("../interfaces").IPDFStreamRangeReader} IPDFStreamRangeReader */
import {
BasePDFStream,
BasePDFStreamRangeReader,
BasePDFStreamReader,
} from "../shared/base_pdf_stream.js";
import { assert } from "../shared/util.js";
import { isPdfFile } from "./display_utils.js";
function getArrayBuffer(val) {
// Prevent any possible issues by only transferring a Uint8Array that
// completely "utilizes" its underlying ArrayBuffer.
return val instanceof Uint8Array && val.byteLength === val.buffer.byteLength
? val.buffer
: new Uint8Array(val).buffer;
}
class PDFDataTransportStream extends BasePDFStream {
_progressiveDone = false;
_queuedChunks = [];
constructor(source) {
super(
source,
PDFDataTransportStreamReader,
PDFDataTransportStreamRangeReader
);
const { pdfDataRangeTransport } = source;
const { initialData, progressiveDone } = pdfDataRangeTransport;
if (initialData?.length > 0) {
const buffer = getArrayBuffer(initialData);
this._queuedChunks.push(buffer);
}
this._progressiveDone = progressiveDone;
pdfDataRangeTransport.addRangeListener((begin, chunk) => {
this.#onReceiveData(begin, chunk);
});
pdfDataRangeTransport.addProgressiveReadListener(chunk => {
this.#onReceiveData(/* begin = */ undefined, chunk);
});
pdfDataRangeTransport.addProgressiveDoneListener(() => {
this._fullReader?.progressiveDone();
this._progressiveDone = true;
});
pdfDataRangeTransport.transportReady();
}
#onReceiveData(begin, chunk) {
const buffer = getArrayBuffer(chunk);
if (begin === undefined) {
if (this._fullReader) {
this._fullReader._enqueue(buffer);
} else {
this._queuedChunks.push(buffer);
}
} else {
const rangeReader = this._rangeReaders
.keys()
.find(r => r._begin === begin);
assert(
rangeReader,
"#onReceiveData - no `PDFDataTransportStreamRangeReader` instance found."
);
rangeReader._enqueue(buffer);
}
}
getFullReader() {
const reader = super.getFullReader();
this._queuedChunks = null;
return reader;
}
getRangeReader(begin, end) {
const reader = super.getRangeReader(begin, end);
if (reader) {
reader.onDone = () => this._rangeReaders.delete(reader);
this._source.pdfDataRangeTransport.requestDataRange(begin, end);
}
return reader;
}
cancelAllRequests(reason) {
super.cancelAllRequests(reason);
this._source.pdfDataRangeTransport.abort();
}
}
class PDFDataTransportStreamReader extends BasePDFStreamReader {
_done = false;
_queuedChunks = null;
_requests = [];
constructor(stream) {
super(stream);
const { pdfDataRangeTransport, disableRange, disableStream } =
stream._source;
const { length, contentDispositionFilename } = pdfDataRangeTransport;
this._queuedChunks = stream._queuedChunks || [];
for (const chunk of this._queuedChunks) {
this._loaded += chunk.byteLength;
}
this._done = stream._progressiveDone;
this._contentLength = length;
this._isStreamingSupported = !disableStream;
this._isRangeSupported = !disableRange;
if (isPdfFile(contentDispositionFilename)) {
this._filename = contentDispositionFilename;
}
this._headersCapability.resolve();
// Report loading progress when there is `initialData`, and `_enqueue` has
// not been invoked, but with a small delay to give an `onProgress` callback
// a chance to be registered first.
const loaded = this._loaded;
Promise.resolve().then(() => {
if (loaded > 0 && this._loaded === loaded) {
this._callOnProgress();
}
});
}
_enqueue(chunk) {
@ -199,32 +155,13 @@ class PDFDataTransportStreamReader {
return; // Ignore new data.
}
if (this._requests.length > 0) {
const capability = this._requests.shift();
capability.resolve({ value: chunk, done: false });
} else {
this._queuedChunks.push(chunk);
}
this._loaded += chunk.byteLength;
this._callOnProgress();
}
async read() {
@ -235,38 +172,38 @@ class PDFDataTransportStreamReader {
if (this._done) {
return { value: undefined, done: true };
}
const capability = Promise.withResolvers();
this._requests.push(capability);
return capability.promise;
}
cancel(reason) {
this._done = true;
for (const capability of this._requests) {
capability.resolve({ value: undefined, done: true });
}
this._requests.length = 0;
}
progressiveDone() {
this._done ||= true;
}
}
class PDFDataTransportStreamRangeReader extends BasePDFStreamRangeReader {
onDone = null;
_begin = -1;
_done = false;
_queuedChunk = null;
_requests = [];
constructor(stream, begin, end) {
super(stream, begin, end);
this._begin = begin;
}
_enqueue(chunk) {
@ -276,19 +213,16 @@ class PDFDataTransportStreamRangeReader {
if (this._requests.length === 0) {
this._queuedChunk = chunk;
} else {
const firstCapability = this._requests.shift();
firstCapability.resolve({ value: chunk, done: false });
for (const capability of this._requests) {
capability.resolve({ value: undefined, done: true });
}
this._requests.length = 0;
}
this._done = true;
this.onDone?.();
}
async read() {
@ -300,18 +234,18 @@ class PDFDataTransportStreamRangeReader {
if (this._done) {
return { value: undefined, done: true };
}
const capability = Promise.withResolvers();
this._requests.push(capability);
return capability.promise;
}
cancel(reason) {
this._done = true;
for (const capability of this._requests) {
capability.resolve({ value: undefined, done: true });
}
this._requests.length = 0;
this.onDone?.();
}
}
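All of the readers above share the same queueing idea: read() either hands back an already-received chunk or parks a Promise.withResolvers() capability that a later enqueue/onDone call resolves. A self-contained sketch of that pattern follows (the ChunkQueue name is hypothetical, not from the patch):
// Illustrative only: the capability-queue pattern used by the readers above.
class ChunkQueue {
_chunks = [];
_pending = [];
push(chunk) {
const waiter = this._pending.shift();
if (waiter) {
waiter.resolve({ value: chunk, done: false }); // resolve a parked read()
} else {
this._chunks.push(chunk); // no reader waiting, buffer the chunk
}
}
async read() {
if (this._chunks.length > 0) {
return { value: this._chunks.shift(), done: false };
}
const capability = Promise.withResolvers();
this._pending.push(capability); // park until the next push() arrives
return capability.promise;
}
}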

View File

@ -16,7 +16,6 @@
// eslint-disable-next-line max-len
/** @typedef {import("./annotation_storage").AnnotationStorage} AnnotationStorage */
/** @typedef {import("./display_utils").PageViewport} PageViewport */
/** @typedef {import("../../web/interfaces").IPDFLinkService} IPDFLinkService */
import { XfaText } from "./xfa_text.js";
@ -26,7 +25,7 @@ import { XfaText } from "./xfa_text.js";
* @property {HTMLDivElement} div
* @property {Object} xfaHtml
* @property {AnnotationStorage} [annotationStorage]
* @property {PDFLinkService} linkService
* @property {string} [intent] - (default value is 'display').
*/

View File

@ -57,6 +57,7 @@ import {
isPdfFile,
noContextMenu,
OutputScale,
PagesMapper,
PDFDateString,
PixelsPerInch,
RenderingCancelledException,
@ -128,6 +129,7 @@ globalThis.pdfjsLib = {
normalizeUnicode,
OPS,
OutputScale,
PagesMapper,
PasswordResponses,
PDFDataRangeTransport,
PDFDateString,
@ -187,6 +189,7 @@ export {
normalizeUnicode,
OPS,
OutputScale,
PagesMapper,
PasswordResponses,
PDFDataRangeTransport,
PDFDateString,

View File

@ -13,56 +13,123 @@
* limitations under the License.
*/
import { assert, unreachable } from "./util.js";
/**
* Interface that represents PDF data transport. If possible, it allows
* progressively load entire or fragment of the PDF binary data.
*
* @interface
*/
class BasePDFStream {
#PDFStreamReader = null;
#PDFStreamRangeReader = null;
_fullReader = null;
_rangeReaders = new Set();
_source = null;
constructor(source, PDFStreamReader, PDFStreamRangeReader) {
if (
(typeof PDFJSDev === "undefined" || PDFJSDev.test("TESTING")) &&
this.constructor === BasePDFStream
) {
unreachable("Cannot initialize BasePDFStream.");
}
this._source = source;
this.#PDFStreamReader = PDFStreamReader;
this.#PDFStreamRangeReader = PDFStreamRangeReader;
}
get _progressiveDataLength() {
return this._fullReader?._loaded ?? 0;
}
/**
* Gets a reader for the entire PDF data.
* @returns {BasePDFStreamReader}
*/
getFullReader() {
assert(
!this._fullReader,
"BasePDFStream.getFullReader can only be called once."
);
return (this._fullReader = new this.#PDFStreamReader(this));
}
/**
* Gets a reader for the range of the PDF data.
*
* NOTE: Currently this method is only expected to be invoked *after*
* the `BasePDFStreamReader.prototype.headersReady` promise has resolved.
*
* @param {number} begin - the start offset of the data.
* @param {number} end - the end offset of the data.
* @returns {BasePDFStreamRangeReader}
*/
getRangeReader(begin, end) {
if (end <= this._progressiveDataLength) {
return null;
}
const reader = new this.#PDFStreamRangeReader(this, begin, end);
this._rangeReaders.add(reader);
return reader;
}
/**
* Cancels all opened reader and closes all their opened requests.
* @param {Object} reason - the reason for cancelling
*/
cancelAllRequests(reason) {
this._fullReader?.cancel(reason);
// Always create a copy of the rangeReaders.
for (const reader of new Set(this._rangeReaders)) {
reader.cancel(reason);
}
}
}
/**
* Interface for a PDF binary data reader.
*
* @interface
*/
class BasePDFStreamReader {
/**
* Sets or gets the progress callback. The callback can be useful when the
* isStreamingSupported property of the object is defined as false.
* The callback is called with one parameter: an object with the loaded and
* total properties.
*/
onProgress = null;
_contentLength = 0;
_filename = null;
_headersCapability = Promise.withResolvers();
_isRangeSupported = false;
_isStreamingSupported = false;
_loaded = 0;
_stream = null;
constructor(stream) {
if (
(typeof PDFJSDev === "undefined" || PDFJSDev.test("TESTING")) &&
this.constructor === BasePDFStreamReader
) {
unreachable("Cannot initialize BasePDFStreamReader.");
}
this._stream = stream;
}
_callOnProgress() {
this.onProgress?.({ loaded: this._loaded, total: this._contentLength });
}
/**
@ -71,7 +138,7 @@ class IPDFStreamReader {
* @type {Promise}
*/
get headersReady() {
return this._headersCapability.promise;
}
/**
@ -81,7 +148,7 @@ class IPDFStreamReader {
* header is missing/invalid.
*/
get filename() {
return this._filename;
}
/**
@ -90,7 +157,7 @@ class IPDFStreamReader {
* @type {number} The data length (or 0 if unknown).
*/
get contentLength() {
return this._contentLength;
}
/**
@ -100,7 +167,7 @@ class IPDFStreamReader {
* @type {boolean}
*/
get isRangeSupported() {
return this._isRangeSupported;
}
/**
@ -109,7 +176,7 @@ class IPDFStreamReader {
* @type {boolean}
*/
get isStreamingSupported() {
return this._isStreamingSupported;
}
/**
@ -120,37 +187,33 @@ class IPDFStreamReader {
* set to true.
* @returns {Promise}
*/
async read() {
unreachable("Abstract method `read` called");
}
/**
* Cancels all pending read requests and closes the stream.
* @param {Object} reason
*/
cancel(reason) {
unreachable("Abstract method `cancel` called");
}
}
/**
* Interface for a PDF binary data fragment reader.
*
* @interface
*/
class BasePDFStreamRangeReader {
_stream = null;
/**
* Gets ability of the stream to progressively load binary data.
* @type {boolean}
*/
get isStreamingSupported() {
return false;
constructor(stream, begin, end) {
if (
(typeof PDFJSDev === "undefined" || PDFJSDev.test("TESTING")) &&
this.constructor === BasePDFStreamRangeReader
) {
unreachable("Cannot initialize BasePDFStreamRangeReader.");
}
this._stream = stream;
}
/**
@ -161,13 +224,17 @@ class IPDFStreamRangeReader {
* set to true.
* @returns {Promise}
*/
async read() {
unreachable("Abstract method `read` called");
}
/**
* Cancels all pending read requests and closes the stream.
* @param {Object} reason
*/
cancel(reason) {
unreachable("Abstract method `cancel` called");
}
}
export { BasePDFStream, BasePDFStreamRangeReader, BasePDFStreamReader };
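A rough sketch of how these base classes are meant to be wired together (the MyStream/MyReader/MyRangeReader names are hypothetical, not from the patch): a concrete stream hands its reader classes to the BasePDFStream constructor, and the base class takes care of creating and tracking the readers.
// Illustrative only; the subclass names below are hypothetical.
class MyReader extends BasePDFStreamReader {
async read() {
// Fetch data and return { value: <ArrayBuffer>, done: false } chunks.
return { value: undefined, done: true };
}
cancel(reason) {}
}
class MyRangeReader extends BasePDFStreamRangeReader {
async read() {
return { value: undefined, done: true };
}
cancel(reason) {}
}
class MyStream extends BasePDFStream {
constructor(source) {
// getFullReader()/getRangeReader() instantiate the classes passed here.
super(source, MyReader, MyRangeReader);
}
}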

View File

@ -1235,47 +1235,12 @@ function MathClamp(v, min, max) {
return Math.min(Math.max(v, min), max);
}
// TODO: Remove this once `Uint8Array.prototype.toHex` is generally available.
function toHexUtil(arr) {
if (Uint8Array.prototype.toHex) {
return arr.toHex();
}
return Array.from(arr, num => hexNumbers[num]).join("");
}
// TODO: Remove this once `Uint8Array.prototype.toBase64` is generally
// available.
function toBase64Util(arr) {
if (Uint8Array.prototype.toBase64) {
return arr.toBase64();
}
return btoa(bytesToString(arr));
}
// TODO: Remove this once `Uint8Array.fromBase64` is generally available.
function fromBase64Util(str) {
if (Uint8Array.fromBase64) {
return Uint8Array.fromBase64(str);
}
return stringToBytes(atob(str));
}
// TODO: Remove this once `Math.sumPrecise` is generally available.
if (
(typeof PDFJSDev === "undefined" ||
PDFJSDev.test("SKIP_BABEL && !MOZCENTRAL")) &&
typeof Math.sumPrecise !== "function"
) {
// Note that this isn't a "proper" polyfill, but since we're only using it to
// replace `Array.prototype.reduce()` invocations it should be fine.
Math.sumPrecise = function (numbers) {
@ -1338,7 +1303,6 @@ export {
FeatureTest,
FONT_IDENTITY_MATRIX,
FormatError,
fromBase64Util,
getModificationDate,
getUuid,
getVerbosityLevel,
@ -1368,8 +1332,6 @@ export {
stringToPDFString,
stringToUTF8String,
TextRenderingMode,
toBase64Util,
toHexUtil,
UnknownErrorException,
unreachable,
updateUrlHash,

View File

@ -628,6 +628,13 @@ class Driver {
`[${this.currentTask + 1}/${this.manifest.length}] ${task.id}:\n`
);
if (task.type === "skip-because-failing") {
this._log(` Skipping file "${task.file}" because it's failing\n`);
this.currentTask++;
this._nextTask();
return;
}
// Support *linked* test-cases for the other suites, e.g. unit- and
// integration-tests, without needing to run them as reference-tests.
if (task.type === "other") {

View File

@ -1,7 +1,7 @@
import { ttx, verifyTtxOutput } from "./fontutils.js";
describe("font1", function () {
const font1_1 = Uint8Array.fromBase64(
// eslint-disable-next-line max-len
"T1RUTwAJAIAAAwAQQ0ZGIP/t0rAAAACcAAADKU9TLzJDxycMAAADyAAAAGBjbWFwwFIBcgAABCgAAABUaGVhZKsnTJ4AAAR8AAAANmhoZWEDHvxTAAAEtAAAACRobXR4AAAAAAAABNgAAAA4bWF4cAAOUAAAAAUQAAAABm5hbWX8Fq+xAAAFGAAAAfhwb3N0AAMAAAAABxAAAAAgAQAEAgABAQEMS0hQRkxFK01UU1kAAQEBOfgeAPgfAfggAvghA/gXBIv+Tvqn+bAFHQAAAMgPHQAAAL0QHQAAANsRHQAAACcdAAADARL4IAwWAAcBAQgUGx5TV19yYWRpY2FsY2lyY2xlY29weXJ0c2ltaWxhcjEuMUNvcHlyaWdodCAoQykgMTk5MiwgMTk5MyBUaGUgVGVYcGxvcmF0b3JzIENvcnBvcmF0aW9uTVRTWU1hdGhUaW1lAAAAAAkAAg0YQ0RmZ3AAAKYAqAGIAYkADAAeAFwAXgGHAAoCAAEAAwAWAFoAtgDxARcBNgGKAd4CDiAO93W9Ad/4+AP5TPd1Fb38+FkHDvfslp/3PtH3Pp8B9xjR9zDQ9zDRFPz4P/eAFfd193UFRQb7UvtS+1L3UgVFBvd1+3X7dvt1BdIG91L3UvdS+1IF0gYO+MT7ZbP5vLMBw7P5vLMD+kT3fxX3iPtc91z7iPuI+1z7XPuI+4j3XPtc94j3iPdc91z3iB78UPwoFft0+0f3SPd093T3R/dI93T3dPdI+0j7dPt0+0j7SPt0Hw73Zb33Br0Bw/kwA/ln+C8VT3o8Lz8hMvc4+xYbP0E/WncfQIwH3KLi0Mb3AuL7OPcUG9nc272ZH9IHDjig97O997SfAfgBvQP5aPd1Fb37yffIWfvI+8lZ98n7yL33yAcO9MP3JsMBw/kwA/lo98cVw/0wUwf5MPteFcP9MFMHDkX7SaD4JJ/4JJ8B9yXVA/dv9w0V0n6yPZwejQfZnZiy0hr3PAfQn7HSmx6WByRNd/sLH/tGB0t7bEZ5HtB4m2xLGvtFB/sMyXfyHpYHRJt3sdAaDkX7SaD4JJ/4JJ8B9yvVA/d19xwVy5uq0J4eRp17qssa90UH9wxNnyQegAfSe59lRhr7PAdEmGTZeh6JBz15fmREGvs8B0Z3ZUR7HoAH8smf9wsfDvgq/k6g99/k+LCfAcD5yAP4Kf5OFZUG+F76fQVWBvwe/fT7cffE+yz7KJp23dsFDnie+GWenJD3K54G+2WiBx4KBI8MCb0KvQufqQwMqZ8MDfmgFPhMFQAAAAAAAwIkAfQABQAAAooCuwAAAIwCigK7AAAB3wAxAQIAAAAABgAAAAAAAAAAAAABEAAAAAAAAAAAAAAAKjIxKgAAAEPgBwMc/EYAZAMcA7oAAAAAAAAAAAAAAAAAAABDAAMAAAABAAMAAQAAAAwABABIAAAACgAIAAIAAgBEAGcAcOAH//8AAABDAGYAcOAA////wv+h/5kAAAABAAAAAAAAAAQAAAABAAEAAgACAAMAAwAEAAQAAQAAAAAQAAAAAABfDzz1AAAD6AAAAACeC34nAAAAAJ4LficAAPxGD/8DHAAAABEAAAAAAAAAAAABAAADHPxGAAD//wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAABQAAAOAAAAAAAUAPYAAQAAAAAAAAAQAAAAAQAAAAAAAQALABAAAQAAAAAAAgAHABsAAQAAAAAAAwAIACIAAQAAAAAABAALACoAAQAAAAAABQAMADUAAQAAAAAABgAAAEEAAQAAAAAABwAHAEEAAQAAAAAACAAHAEgAAQAAAAAACQAHAE8AAwABBAkAAAAgAFYAAwABBAkAAQAWAHYAAwABBAkAAgAOAIwAAwABBAkAAwAQAJoAAwABBAkABAAWAKoAAwABBAkABQAYAMAAAwABBAkABgAAANgAAwABBAkABwAOANgAAwABBAkACAAOAOYAAwABBAkACQAOAPRPcmlnaW5hbCBsaWNlbmNlS0hQRkxFK01UU1lVbmtub3dudW5pcXVlSURLSFBGTEUrTVRTWVZlcnNpb24gMC4xMVVua25vd25Vbmtub3duVW5rbm93bgBPAHIAaQBnAGkAbgBhAGwAIABsAGkAYwBlAG4AYwBlAEsASABQAEYATABFACsATQBUAFMAWQBVAG4AawBuAG8AdwBuAHUAbgBpAHEAdQBlAEkARABLAEgAUABGAEwARQArAE0AVABTAFkAVgBlAHIAcwBpAG8AbgAgADAALgAxADEAVQBuAGsAbgBvAHcAbgBVAG4AawBuAG8AdwBuAFUAbgBrAG4AbwB3AG4AAwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=="
);

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@ -14,22 +14,10 @@
* limitations under the License.
*/
import { bytesToString, stringToBytes } from "../../src/shared/util.js";
function decodeFontData(base64) {
const str = atob(base64);
return stringToBytes(str);
}
function encodeFontData(data) {
const str = bytesToString(data);
return btoa(str);
}
async function ttx(data) {
const response = await fetch("/ttx", {
method: "POST",
body: data.toBase64(),
});
if (!response.ok) {
@ -45,4 +33,4 @@ function verifyTtxOutput(output) {
}
}
export { ttx, verifyTtxOutput };
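For reference, the built-in helpers that replace the removed wrappers round-trip as follows (a small sketch of standard behaviour, not code from the patch):
// Illustrative only.
const bytes = Uint8Array.fromBase64("AQIDBA=="); // Uint8Array of [1, 2, 3, 4]
const base64 = bytes.toBase64(); // "AQIDBA=="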

View File

@ -430,4 +430,185 @@ describe("accessibility", () => {
);
});
});
describe("Artifacts must be aria-hidden", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait("bug1937438_mml_from_latex.pdf", ".textLayer");
});
afterEach(async () => {
await closePages(pages);
});
it("must check that some artifacts are aria-hidden", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
const parentSquareRootHidden = await page.evaluate(() => {
for (const span of document.querySelectorAll(".textLayer span")) {
if (span.textContent === "√") {
return span.parentElement.getAttribute("aria-hidden");
}
}
return false;
});
expect(parentSquareRootHidden)
.withContext(`In ${browserName}`)
.toEqual("true");
})
);
});
});
describe("No alt-text with MathML", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait("bug2004951.pdf", ".textLayer");
});
afterEach(async () => {
await closePages(pages);
});
it("must check that there's no alt-text on the MathML node", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
const isSanitizerSupported = await page.evaluate(() => {
try {
// eslint-disable-next-line no-undef
return typeof Sanitizer !== "undefined";
} catch {
return false;
}
});
const ariaLabel = await page.$eval(
"span[aria-owns='p3R_mc2']",
el => el.getAttribute("aria-label") || ""
);
if (isSanitizerSupported) {
expect(ariaLabel).withContext(`In ${browserName}`).toEqual("");
} else {
expect(ariaLabel)
.withContext(`In ${browserName}`)
.toEqual("cube root of , x plus y end cube root ");
}
})
);
});
});
describe("Text elements must be aria-hidden when there's MathML and annotations", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait("bug2009627.pdf", ".textLayer");
});
afterEach(async () => {
await closePages(pages);
});
it("must check that the text in text layer is aria-hidden", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
const isSanitizerSupported = await page.evaluate(() => {
try {
// eslint-disable-next-line no-undef
return typeof Sanitizer !== "undefined";
} catch {
return false;
}
});
const ariaHidden = await page.evaluate(() =>
Array.from(
document.querySelectorAll(".structTree :has(> math)")
).map(el =>
document
.getElementById(el.getAttribute("aria-owns"))
.getAttribute("aria-hidden")
)
);
if (isSanitizerSupported) {
expect(ariaHidden)
.withContext(`In ${browserName}`)
.toEqual(["true", "true", "true"]);
} else {
// eslint-disable-next-line no-console
console.log(
`Pending in Chrome: Sanitizer API (in ${browserName}) is not supported`
);
}
})
);
});
});
describe("A TH in a TR itself in a TBody is rowheader", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait("bug2014080.pdf", ".textLayer");
});
afterEach(async () => {
await closePages(pages);
});
it("must check that the table has the right structure", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
let elementRole = await page.evaluate(() =>
Array.from(
document.querySelector(".structTree [role='table']").children
).map(child => child.getAttribute("role"))
);
// THeader and TBody must be rowgroup.
expect(elementRole)
.withContext(`In ${browserName}`)
.toEqual(["rowgroup", "rowgroup"]);
elementRole = await page.evaluate(() =>
Array.from(
document.querySelector(
".structTree [role='table'] > [role='rowgroup'] > [role='row']"
).children
).map(child => child.getAttribute("role"))
);
// THeader has 3 columnheader.
expect(elementRole)
.withContext(`In ${browserName}`)
.toEqual(["columnheader", "columnheader", "columnheader"]);
elementRole = await page.evaluate(() =>
Array.from(
document.querySelector(
".structTree [role='table'] > [role='rowgroup']:nth-child(2)"
).children
).map(child => child.getAttribute("role"))
);
// TBody has 5 rows.
expect(elementRole)
.withContext(`In ${browserName}`)
.toEqual(["row", "row", "row", "row", "row"]);
elementRole = await page.evaluate(() =>
Array.from(
document.querySelector(
".structTree [role='table'] > [role='rowgroup']:nth-child(2) > [role='row']:first-child"
).children
).map(child => child.getAttribute("role"))
);
// First row has a rowheader and 2 cells.
expect(elementRole)
.withContext(`In ${browserName}`)
.toEqual(["rowheader", "cell", "cell"]);
})
);
});
});
});

View File

@ -24,6 +24,8 @@ import {
highlightSpan,
kbModifierDown,
kbModifierUp,
kbRedo,
kbUndo,
loadAndWait,
scrollIntoView,
selectEditor,
@ -109,7 +111,7 @@ describe("Comment", () => {
".annotationEditorLayer",
"page-width",
null,
{ enableComment: true, localeProperties: "ar" }
{ enableComment: true, locale: "ar" }
);
});
@ -965,4 +967,213 @@ describe("Comment", () => {
);
});
});
describe("Undo deletion popup for comments (bug 1999154)", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait(
"tracemonkey.pdf",
".annotationEditorLayer",
"page-fit",
null,
{ enableComment: true }
);
});
afterEach(async () => {
await closePages(pages);
});
it("must check that deleting a comment can be undone using the undo button", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await switchToHighlight(page);
await highlightSpan(page, 1, "Abstract");
const editorSelector = getEditorSelector(0);
const comment = "Test comment for undo";
await editComment(page, editorSelector, comment);
// Stay in highlight mode - don't disable it
await waitAndClick(
page,
`${editorSelector} .annotationCommentButton`
);
await page.waitForSelector("#commentPopup", { visible: true });
// Capture the date before deletion
const dateBefore = await page.evaluate(
() =>
document.querySelector("#commentPopup .commentPopupTime")
?.textContent
);
await waitAndClick(page, "button.commentPopupDelete");
await page.waitForSelector("#editorUndoBar", { visible: true });
await page.waitForSelector("#editorUndoBarUndoButton", {
visible: true,
});
await page.click("#editorUndoBarUndoButton");
// Check that the comment is restored by hovering to show the popup
await page.hover(`${editorSelector} .annotationCommentButton`);
await page.waitForSelector("#commentPopup", { visible: true });
const popupText = await page.evaluate(
() =>
document.querySelector("#commentPopup .commentPopupText")
?.textContent
);
expect(popupText).withContext(`In ${browserName}`).toEqual(comment);
// Check that the date is preserved
const dateAfter = await page.evaluate(
() =>
document.querySelector("#commentPopup .commentPopupTime")
?.textContent
);
expect(dateAfter)
.withContext(`In ${browserName}`)
.toEqual(dateBefore);
})
);
});
it("must check that the undo deletion popup displays 'Comment removed' message", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await switchToHighlight(page);
await highlightSpan(page, 1, "Abstract");
const editorSelector = getEditorSelector(0);
await editComment(page, editorSelector, "Test comment");
// Stay in highlight mode - don't disable it
await waitAndClick(
page,
`${editorSelector} .annotationCommentButton`
);
await page.waitForSelector("#commentPopup", { visible: true });
await waitAndClick(page, "button.commentPopupDelete");
await page.waitForFunction(() => {
const messageElement = document.querySelector(
"#editorUndoBarMessage"
);
return messageElement && messageElement.textContent.trim() !== "";
});
const message = await page.waitForSelector("#editorUndoBarMessage");
const messageText = await page.evaluate(
el => el.textContent,
message
);
expect(messageText)
.withContext(`In ${browserName}`)
.toContain("Comment removed");
})
);
});
it("must check that the undo bar closes when clicking the close button", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await switchToHighlight(page);
await highlightSpan(page, 1, "Abstract");
const editorSelector = getEditorSelector(0);
await editComment(page, editorSelector, "Test comment");
// Stay in highlight mode - don't disable it
await waitAndClick(
page,
`${editorSelector} .annotationCommentButton`
);
await page.waitForSelector("#commentPopup", { visible: true });
await waitAndClick(page, "button.commentPopupDelete");
await page.waitForSelector("#editorUndoBar", { visible: true });
await waitAndClick(page, "#editorUndoBarCloseButton");
await page.waitForSelector("#editorUndoBar", { hidden: true });
})
);
});
it("must check that deleting a comment can be undone using Ctrl+Z", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await switchToHighlight(page);
await highlightSpan(page, 1, "Abstract");
const editorSelector = getEditorSelector(0);
const comment = "Test comment for Ctrl+Z undo";
await editComment(page, editorSelector, comment);
// Stay in highlight mode - don't disable it
await waitAndClick(
page,
`${editorSelector} .annotationCommentButton`
);
await page.waitForSelector("#commentPopup", { visible: true });
await waitAndClick(page, "button.commentPopupDelete");
await page.waitForSelector("#editorUndoBar", { visible: true });
// Use Ctrl+Z to undo
await kbUndo(page);
// The undo bar should be hidden after undo
await page.waitForSelector("#editorUndoBar", { hidden: true });
// Check that the comment is restored by hovering to show the popup
await page.hover(`${editorSelector} .annotationCommentButton`);
await page.waitForSelector("#commentPopup", { visible: true });
const popupText = await page.evaluate(
() =>
document.querySelector("#commentPopup .commentPopupText")
?.textContent
);
expect(popupText).withContext(`In ${browserName}`).toEqual(comment);
})
);
});
it("must check that the comment popup is hidden after redo", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await switchToHighlight(page);
await highlightSpan(page, 1, "Abstract");
const editorSelector = getEditorSelector(0);
const comment = "Test comment for redo";
await editComment(page, editorSelector, comment);
// Show the popup by clicking the comment button
await waitAndClick(
page,
`${editorSelector} .annotationCommentButton`
);
await page.waitForSelector("#commentPopup", { visible: true });
// Delete the comment
await waitAndClick(page, "button.commentPopupDelete");
await page.waitForSelector("#editorUndoBar", { visible: true });
// Undo the deletion
await kbUndo(page);
await page.waitForSelector("#editorUndoBar", { hidden: true });
// Show the popup again by clicking the comment button
await waitAndClick(
page,
`${editorSelector} .annotationCommentButton`
);
await page.waitForSelector("#commentPopup", { visible: true });
// Redo the deletion - popup should be hidden
await kbRedo(page);
await page.waitForSelector("#commentPopup", { hidden: true });
})
);
});
});
});

View File

@ -3645,4 +3645,46 @@ describe("FreeText Editor", () => {
);
});
});
describe("No exception when moving (issue 20571)", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait(
"tracemonkey.pdf",
".annotationEditorLayer",
100
);
});
afterEach(async () => {
await closePages(pages);
});
it("must check that the buttons work correctly", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await switchToFreeText(page);
const rect = await getRect(page, ".annotationEditorLayer");
await createFreeTextEditor({
page,
x: rect.x + 100,
y: rect.y + 100,
data: "Hello PDF.js World !!",
});
await switchToFreeText(page, /* disable = */ true);
await switchToFreeText(page);
const editorSelector = getEditorSelector(0);
await selectEditor(page, editorSelector);
await dragAndDrop(page, editorSelector, [[10, 10]]);
await switchToFreeText(page, /* disable = */ true);
await switchToFreeText(page);
})
);
});
});
});

View File

@ -37,6 +37,7 @@ async function runTests(results) {
"freetext_editor_spec.mjs",
"highlight_editor_spec.mjs",
"ink_editor_spec.mjs",
"reorganize_pages_spec.mjs",
"scripting_spec.mjs",
"signature_editor_spec.mjs",
"stamp_editor_spec.mjs",

View File

@ -0,0 +1,633 @@
/* Copyright 2026 Mozilla Foundation
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import {
awaitPromise,
clearInput,
closePages,
createPromise,
dragAndDrop,
getAnnotationSelector,
getRect,
getThumbnailSelector,
loadAndWait,
scrollIntoView,
waitForDOMMutation,
} from "./test_utils.mjs";
async function waitForThumbnailVisible(page, pageNums) {
await page.click("#viewsManagerToggleButton");
const thumbSelector = "#thumbnailsView .thumbnailImage";
await page.waitForSelector(thumbSelector, { visible: true });
if (!pageNums) {
return null;
}
if (!Array.isArray(pageNums)) {
pageNums = [pageNums];
}
return Promise.all(
pageNums.map(pageNum =>
page.waitForSelector(getThumbnailSelector(pageNum), { visible: true })
)
);
}
function waitForPagesEdited(page) {
return createPromise(page, resolve => {
window.PDFViewerApplication.eventBus.on(
"pagesedited",
({ pagesMapper }) => {
resolve(Array.from(pagesMapper.getMapping()));
},
{
once: true,
}
);
});
}
async function waitForHavingContents(page, expected) {
await page.evaluate(() => {
// Make sure all the pages will be visible.
window.PDFViewerApplication.pdfViewer.scrollMode = 2 /* = ScrollMode.WRAPPED = */;
window.PDFViewerApplication.pdfViewer.updateScale({
drawingDelay: 0,
scaleFactor: 0.01,
});
});
return page.waitForFunction(
ex => {
const buffer = [];
for (const textLayer of document.querySelectorAll(".textLayer")) {
buffer.push(parseInt(textLayer.textContent.trim(), 10));
}
return ex.length === buffer.length && ex.every((v, i) => v === buffer[i]);
},
{},
expected
);
}
function getSearchResults(page) {
return page.evaluate(() => {
const pages = document.querySelectorAll(".page");
const results = [];
for (let i = 0; i < pages.length; i++) {
const domPage = pages[i];
const highlights = domPage.querySelectorAll("span.highlight");
if (highlights.length === 0) {
continue;
}
results.push([
i + 1,
Array.from(highlights).map(span => span.textContent),
]);
}
return results;
});
}
function movePages(page, selectedPages, atIndex) {
return page.evaluate(
(selected, index) => {
const viewer = window.PDFViewerApplication.pdfViewer;
const pagesToMove = Array.from(selected).sort((a, b) => a - b);
viewer.pagesMapper.pagesNumber =
document.querySelectorAll(".page").length;
viewer.pagesMapper.movePages(new Set(pagesToMove), pagesToMove, index);
window.PDFViewerApplication.eventBus.dispatch("pagesedited", {
pagesMapper: viewer.pagesMapper,
index,
pagesToMove,
});
},
selectedPages,
atIndex
);
}
describe("Reorganize Pages View", () => {
describe("Drag & Drop", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait(
"page_with_number.pdf",
"#viewsManagerToggleButton",
"page-fit",
null,
{ enableSplitMerge: true }
);
});
afterEach(async () => {
await closePages(pages);
});
it("should show a drag marker when dragging a thumbnail", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
const rect1 = await getRect(page, getThumbnailSelector(1));
const rect2 = await getRect(page, getThumbnailSelector(2));
const handleAddedMarker = await waitForDOMMutation(
page,
mutationList => {
for (const mutation of mutationList) {
if (mutation.type !== "childList") {
continue;
}
for (const node of mutation.addedNodes) {
if (node.classList.contains("dragMarker")) {
return true;
}
}
}
return false;
}
);
const handleRemovedMarker = await waitForDOMMutation(
page,
mutationList => {
for (const mutation of mutationList) {
if (mutation.type !== "childList") {
continue;
}
for (const node of mutation.removedNodes) {
if (node.classList.contains("dragMarker")) {
return true;
}
}
}
return false;
}
);
const dndPromise = dragAndDrop(
page,
getThumbnailSelector(1),
[[0, rect2.y - rect1.y + rect2.height / 2]],
10
);
await dndPromise;
await awaitPromise(handleAddedMarker);
await awaitPromise(handleRemovedMarker);
})
);
});
it("should reorder thumbnails after dropping", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
const rect1 = await getRect(page, getThumbnailSelector(1));
const rect2 = await getRect(page, getThumbnailSelector(2));
const handlePagesEdited = await waitForPagesEdited(page);
await dragAndDrop(
page,
getThumbnailSelector(1),
[[0, rect2.y - rect1.y + rect2.height / 2]],
10
);
const pagesMapping = await awaitPromise(handlePagesEdited);
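// Page 1 was dropped below page 2, so the first two pages are swapped.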
const expected = [
2, 1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17,
];
expect(pagesMapping)
.withContext(`In ${browserName}`)
.toEqual(expected);
await waitForHavingContents(page, expected);
})
);
});
it("should reorder thumbnails after dropping at position 0", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
const rect1 = await getRect(page, getThumbnailSelector(1));
const rect2 = await getRect(page, getThumbnailSelector(2));
const handlePagesEdited = await waitForPagesEdited(page);
await dragAndDrop(
page,
getThumbnailSelector(2),
[[0, rect1.y - rect2.y - rect1.height]],
10
);
const pagesMapping = await awaitPromise(handlePagesEdited);
const expected = [
2, 1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17,
];
expect(pagesMapping)
.withContext(`In ${browserName}`)
.toEqual(expected);
await waitForHavingContents(page, expected);
})
);
});
it("should reorder thumbnails after dropping two adjacent pages", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
const rect2 = await getRect(page, getThumbnailSelector(2));
const rect4 = await getRect(page, getThumbnailSelector(4));
await page.click(`.thumbnail:has(${getThumbnailSelector(1)}) input`);
const handlePagesEdited = await waitForPagesEdited(page);
await dragAndDrop(
page,
getThumbnailSelector(2),
[[0, rect4.y - rect2.y]],
10
);
const pagesMapping = await awaitPromise(handlePagesEdited);
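// With page 1 selected, dragging page 2 moves both pages; they land after page 4.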
const expected = [
3, 4, 1, 2, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17,
];
expect(pagesMapping)
.withContext(`In ${browserName}`)
.toEqual(expected);
await waitForHavingContents(page, expected);
})
);
});
it("should reorder thumbnails after dropping two non-adjacent pages", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
const rect1 = await getRect(page, getThumbnailSelector(1));
const rect2 = await getRect(page, getThumbnailSelector(2));
await (await page.$(".thumbnail[page-id='14'")).scrollIntoView();
await page.waitForSelector(getThumbnailSelector(14), {
visible: true,
});
await page.click(`.thumbnail:has(${getThumbnailSelector(14)}) input`);
await (await page.$(".thumbnail[page-id='1'")).scrollIntoView();
await page.waitForSelector(getThumbnailSelector(1), {
visible: true,
});
const handlePagesEdited = await waitForPagesEdited(page);
await dragAndDrop(
page,
getThumbnailSelector(1),
[[0, rect2.y - rect1.y + rect2.height / 2]],
10
);
const pagesMapping = await awaitPromise(handlePagesEdited);
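// Page 14 was selected, so dragging page 1 moves both pages; they are inserted after page 2.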
const expected = [
2, 1, 14, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 15, 16, 17,
];
expect(pagesMapping)
.withContext(`In ${browserName}`)
.toEqual(expected);
await waitForHavingContents(page, expected);
})
);
});
it("should select the dropped page (bug 2010820)", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
const rect1 = await getRect(page, getThumbnailSelector(1));
const rect2 = await getRect(page, getThumbnailSelector(2));
await page.click(getThumbnailSelector(2));
await page.waitForSelector(
`${getThumbnailSelector(2)}[aria-current="page"]`
);
const handlePagesEdited = await waitForPagesEdited(page);
await dragAndDrop(
page,
getThumbnailSelector(1),
[[0, rect2.y - rect1.y + rect2.height / 2]],
10
);
await awaitPromise(handlePagesEdited);
await page.waitForSelector(
`${getThumbnailSelector(2)}[aria-current="page"]`
);
await page.waitForSelector(
`${getThumbnailSelector(1)}[aria-current="false"]`
);
})
);
});
});
describe("Search in pdf", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait(
"page_with_number.pdf",
"#viewsManagerToggleButton",
"1",
null,
{ enableSplitMerge: true }
);
});
afterEach(async () => {
await closePages(pages);
});
it("should check if the search is working after moving pages", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await page.click("#viewFindButton");
await page.waitForSelector(":has(> #findHighlightAll)", {
visible: true,
});
await page.click(":has(> #findHighlightAll)");
await page.waitForSelector("#findInput", { visible: true });
await page.type("#findInput", "1");
await page.keyboard.press("Enter");
await page.waitForFunction(
() => document.querySelectorAll("span.highlight").length === 10
);
let results = await getSearchResults(page);
expect(results)
.withContext(`In ${browserName}`)
.toEqual([
// Page number, [matches]
[1, ["1"]],
[10, ["1"]],
[11, ["1", "1"]],
[12, ["1"]],
[13, ["1"]],
[14, ["1"]],
[15, ["1"]],
[16, ["1"]],
[17, ["1"]],
]);
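// Move pages 2 and 11 to index 3, then rerun the search and check that the matches follow the moved pages.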
await movePages(page, [11, 2], 3);
await page.waitForFunction(
() => document.querySelectorAll("span.highlight").length === 0
);
await clearInput(page, "#findInput", true);
await page.type("#findInput", "1");
await page.keyboard.press("Enter");
await page.waitForFunction(
() => document.querySelectorAll("span.highlight").length === 10
);
results = await getSearchResults(page);
expect(results)
.withContext(`In ${browserName}`)
.toEqual([
// Page number, [matches]
[1, ["1"]],
[4, ["1", "1"]],
[11, ["1"]],
[12, ["1"]],
[13, ["1"]],
[14, ["1"]],
[15, ["1"]],
[16, ["1"]],
[17, ["1"]],
]);
await movePages(page, [13], 0);
await page.waitForFunction(
() => document.querySelectorAll("span.highlight").length === 0
);
await clearInput(page, "#findInput", true);
await page.type("#findInput", "1");
await page.keyboard.press("Enter");
await page.waitForFunction(
() => document.querySelectorAll("span.highlight").length === 10
);
results = await getSearchResults(page);
expect(results)
.withContext(`In ${browserName}`)
.toEqual([
// Page number, [matches]
[1, ["1"]],
[2, ["1"]],
[5, ["1", "1"]],
[12, ["1"]],
[13, ["1"]],
[14, ["1"]],
[15, ["1"]],
[16, ["1"]],
[17, ["1"]],
]);
})
);
});
});
describe("Links and outlines", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait(
"page_with_number_and_link.pdf",
"#viewsManagerToggleButton",
"page-fit",
null,
{ enableSplitMerge: true }
);
});
afterEach(async () => {
await closePages(pages);
});
it("should check that link is updated after moving pages", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
await movePages(page, [2], 10);
await scrollIntoView(page, getAnnotationSelector("107R"));
await page.click(getAnnotationSelector("107R"));
const currentPage = await page.$eval(
"#pageNumber",
el => el.valueAsNumber
);
expect(currentPage).withContext(`In ${browserName}`).toBe(10);
})
);
});
it("should check that outlines are updated after moving pages", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
await movePages(page, [2, 4], 10);
await page.click("#viewsManagerSelectorButton");
await page.click("#outlinesViewMenu");
await page.waitForSelector("#outlinesView", { visible: true });
await page.click("#outlinesView .treeItem:nth-child(2)");
const currentPage = await page.$eval(
"#pageNumber",
el => el.valueAsNumber
);
// 9 because 2 and 4 were moved after page 10.
expect(currentPage).withContext(`In ${browserName}`).toBe(9);
})
);
});
});
describe("Drag marker must have the right non-zero dimensions", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait(
"page_with_number_and_link.pdf",
"#viewsManagerToggleButton",
"1",
null,
{
enableSplitMerge: true,
sidebarViewOnLoad: 2 /* = SidebarView.OUTLINES */,
}
);
});
afterEach(async () => {
await closePages(pages);
});
it("should check if the drag marker width is non-zero", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await page.waitForSelector("#outlinesView", { visible: true });
await page.waitForSelector("#viewsManagerSelectorButton", {
visible: true,
});
await page.click("#viewsManagerSelectorButton");
await page.waitForSelector("#thumbnailsViewMenu", { visible: true });
await page.click("#thumbnailsViewMenu");
const thumbSelector = "#thumbnailsView .thumbnailImage";
await page.waitForSelector(thumbSelector, { visible: true });
const rect1 = await getRect(page, getThumbnailSelector(1));
const rect2 = await getRect(page, getThumbnailSelector(2));
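// The drag marker must be created with a non-zero width, even though the thumbnails view was only shown after the outline view.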
const handleAddedMarker = await waitForDOMMutation(
page,
mutationList => {
for (const mutation of mutationList) {
if (mutation.type !== "childList") {
continue;
}
for (const node of mutation.addedNodes) {
if (node.classList.contains("dragMarker")) {
const rect = node.getBoundingClientRect();
return rect.width !== 0;
}
}
}
return false;
}
);
await dragAndDrop(
page,
getThumbnailSelector(1),
[[0, rect2.y - rect1.y + rect2.height / 2]],
10
);
await awaitPromise(handleAddedMarker);
})
);
});
});
describe("Save a pdf", () => {
let pages;
beforeEach(async () => {
pages = await loadAndWait(
"page_with_number.pdf",
"#viewsManagerToggleButton",
"1",
null,
{ enableSplitMerge: true }
);
});
afterEach(async () => {
await closePages(pages);
});
it("should check that a save is triggered", async () => {
await Promise.all(
pages.map(async ([browserName, page]) => {
await waitForThumbnailVisible(page, 1);
await page.waitForSelector("#viewsManagerStatusActionButton", {
visible: true,
});
const rect1 = await getRect(page, getThumbnailSelector(1));
const rect2 = await getRect(page, getThumbnailSelector(2));
await dragAndDrop(
page,
getThumbnailSelector(1),
[[0, rect2.y - rect1.y + rect2.height / 2]],
10
);
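// Resolve with the 0-based page indices passed along when the edited PDF is saved.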
const handleSaveAs = await createPromise(page, resolve => {
window.PDFViewerApplication.eventBus.on(
"savepageseditedpdf",
({ data }) => {
resolve(Array.from(data.pageIndices));
},
{
once: true,
}
);
});
await page.click("#viewsManagerStatusActionButton");
await page.waitForSelector("#viewsManagerStatusActionSaveAs", {
visible: true,
});
await page.click("#viewsManagerStatusActionSaveAs");
const pageIndices = await awaitPromise(handleSaveAs);
expect(pageIndices)
.withContext(`In ${browserName}`)
.toEqual([
1, 0, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16,
]);
})
);
});
});
});

View File

@@ -177,7 +177,7 @@ describe("Signature Editor", () => {
const editorSelector = getEditorSelector(0);
await page.waitForSelector(editorSelector, { visible: true });
await page.waitForSelector(
`.canvasWrapper > svg use[href="#path_p1_0"]`,
`.canvasWrapper > svg use[href="#path_0"]`,
{ visible: true }
);
@@ -282,7 +282,7 @@ describe("Signature Editor", () => {
});
await page.waitForSelector(
".canvasWrapper > svg use[href='#path_p1_0']"
".canvasWrapper > svg use[href='#path_0']"
);
})
);
@@ -340,7 +340,7 @@ describe("Signature Editor", () => {
});
await page.waitForSelector(
".canvasWrapper > svg use[href='#path_p1_0']"
".canvasWrapper > svg use[href='#path_0']"
);
})
);
@@ -427,7 +427,7 @@ describe("Signature Editor", () => {
const editorSelector = getEditorSelector(0);
await page.waitForSelector(editorSelector, { visible: true });
await page.waitForSelector(
`.canvasWrapper > svg use[href="#path_p1_0"]`,
`.canvasWrapper > svg use[href="#path_0"]`,
{ visible: true }
);
@@ -527,13 +527,13 @@ describe("Signature Editor", () => {
const editorSelector = getEditorSelector(0);
await page.waitForSelector(editorSelector, { visible: true });
await page.waitForSelector(
`.canvasWrapper > svg use[href="#path_p1_0"]`,
`.canvasWrapper > svg use[href="#path_0"]`,
{ visible: true }
);
const color = await page.evaluate(() => {
const use = document.querySelector(
`.canvasWrapper > svg use[href="#path_p1_0"]`
`.canvasWrapper > svg use[href="#path_0"]`
);
return use.parentNode.getAttribute("fill");
});
@@ -583,13 +583,13 @@ describe("Signature Editor", () => {
const editorSelector = getEditorSelector(0);
await page.waitForSelector(editorSelector, { visible: true });
await page.waitForSelector(
`.canvasWrapper > svg use[href="#path_p1_0"]`,
`.canvasWrapper > svg use[href="#path_0"]`,
{ visible: true }
);
const color = await page.evaluate(() => {
const use = document.querySelector(
`.canvasWrapper > svg use[href="#path_p1_0"]`
`.canvasWrapper > svg use[href="#path_0"]`
);
return use.parentNode.getAttribute("fill");
});
@@ -672,7 +672,7 @@ describe("Signature Editor", () => {
});
const { width, height } = await getRect(
page,
".canvasWrapper > svg use[href='#path_p1_0']"
".canvasWrapper > svg use[href='#path_0']"
);
expect(Math.abs(contentWidth / width - contentHeight / height))

View File

@@ -1174,7 +1174,9 @@ describe("Stamp Editor", () => {
// Run sequentially to avoid clipboard issues.
for (const [, page] of pages) {
await page.evaluate(() => {
window.PDFViewerApplication.mlManager.enableAltTextModelDownload = false;
const { mlManager } = window.PDFViewerApplication;
mlManager.enableGuessAltText =
mlManager.enableAltTextModelDownload = false;
});
await switchToStamp(page);
@@ -1197,7 +1199,9 @@ describe("Stamp Editor", () => {
// Run sequentially to avoid clipboard issues.
for (const [browserName, page] of pages) {
await page.evaluate(() => {
window.PDFViewerApplication.mlManager.enableAltTextModelDownload = true;
const { mlManager } = window.PDFViewerApplication;
mlManager.enableGuessAltText =
mlManager.enableAltTextModelDownload = true;
});
await switchToStamp(page);

View File

@@ -158,6 +158,24 @@ async function waitForSandboxTrip(page) {
await awaitPromise(handle);
}
async function waitForDOMMutation(page, callback) {
return page.evaluateHandle(
cb => [
new Promise(resolve => {
const mutationObserver = new MutationObserver(mutationList => {
// eslint-disable-next-line no-eval
if (eval(`(${cb})`)(mutationList)) {
mutationObserver.disconnect();
resolve();
}
});
mutationObserver.observe(document, { childList: true, subtree: true });
}),
],
callback.toString()
);
}
function waitForTimeout(milliseconds) {
/**
* Wait for the given number of milliseconds.
@@ -234,6 +252,10 @@ function getAnnotationSelector(id) {
return `[data-annotation-id="${id}"]`;
}
function getThumbnailSelector(pageNumber) {
return `.thumbnailImage[data-l10n-args='{"page":${pageNumber}}']`;
}
async function getSpanRectFromText(page, pageNumber, text) {
await page.waitForSelector(
`.page[data-page-number="${pageNumber}"] > .textLayer .endOfContent`
@@ -957,6 +979,7 @@ export {
getSelector,
getSerialized,
getSpanRectFromText,
getThumbnailSelector,
getXY,
highlightSpan,
isCanvasMonochrome,
@@ -991,6 +1014,7 @@ export {
waitAndClick,
waitForAnnotationEditorLayer,
waitForAnnotationModeChanged,
waitForDOMMutation,
waitForEntryInStorage,
waitForEvent,
waitForNoElement,

View File

@@ -124,6 +124,13 @@ describe("PDF Thumbnail View", () => {
.withContext(`In ${browserName}`)
.toBe(true);
await kbFocusNext(page);
expect(
await isElementFocused(page, "#viewsManagerStatusActionButton")
)
.withContext(`In ${browserName}`)
.toBe(true);
await kbFocusNext(page);
expect(
await isElementFocused(

test/pdfs/.gitignore (vendored, 105 changed lines)
View File

@@ -768,3 +768,108 @@
!issue20513.pdf
!issue20516.pdf
!issue20489.pdf
!bug2004951.pdf
!bitmap-composite-and-xnor-halftone.pdf
!bitmap-composite-and-xnor-refine.pdf
!bitmap-composite-and-xnor-text.pdf
!bitmap-composite-and-xnor.pdf
!bitmap-composite-or-xor-replace-halftone.pdf
!bitmap-composite-or-xor-replace-refine.pdf
!bitmap-composite-or-xor-replace-text.pdf
!bitmap-composite-or-xor-replace.pdf
!bitmap-customat-tpgdon.pdf
!bitmap-customat.pdf
!bitmap-halftone-10bpp-mmr.pdf
!bitmap-halftone-10bpp.pdf
!bitmap-halftone-composite.pdf
!bitmap-halftone-grid.pdf
!bitmap-halftone-refine.pdf
!bitmap-halftone-skip-dummy.pdf
!bitmap-halftone-skip-grid-template1.pdf
!bitmap-halftone-skip-grid-template2.pdf
!bitmap-halftone-skip-grid-template3.pdf
!bitmap-halftone-skip-grid.pdf
!bitmap-halftone-template1.pdf
!bitmap-halftone-template2.pdf
!bitmap-halftone-template3.pdf
!bitmap-halftone.pdf
!bitmap-initially-unknown-size.pdf
!bitmap-mmr.pdf
!bitmap-p32-eof.pdf
!bitmap-randomaccess.pdf
!bitmap-refine-customat-tpgron.pdf
!bitmap-refine-customat.pdf
!bitmap-refine-lossless.pdf
!bitmap-refine-page-subrect.pdf
!bitmap-refine-page.pdf
!bitmap-refine-refine.pdf
!bitmap-refine-template1-tpgron.pdf
!bitmap-refine-template1.pdf
!bitmap-refine-tpgron.pdf
!bitmap-refine.pdf
!bitmap-stripe-initially-unknown-height.pdf
!bitmap-stripe-last-implicit.pdf
!bitmap-stripe-single-no-end-of-stripe.pdf
!bitmap-stripe-single.pdf
!bitmap-stripe.pdf
!bitmap-symbol-big-segmentid.pdf
!bitmap-symbol-context-reuse.pdf
!bitmap-symbol-empty.pdf
!bitmap-symbol-negative-sbdsoffset.pdf
!bitmap-symbol-refine.pdf
!bitmap-symbol-symbolrefine-textrefine.pdf
!bitmap-symbol-symbolrefineone-customat.pdf
!bitmap-symbol-symbolrefineone-template1.pdf
!bitmap-symbol-symbolrefineone.pdf
!bitmap-symbol-symbolrefineseveral.pdf
!bitmap-symbol-symhuff-texthuff.pdf
!bitmap-symbol-symhuff-texthuffB10B13.pdf
!bitmap-symbol-symhuffB5B3-texthuffB7B9B12.pdf
!bitmap-symbol-symhuffcustom-texthuffcustom.pdf
!bitmap-symbol-symhuffrefine-textrefine.pdf
!bitmap-symbol-symhuffrefineone.pdf
!bitmap-symbol-symhuffrefineseveral.pdf
!bitmap-symbol-symhuffuncompressed-texthuff.pdf
!bitmap-symbol-textbottomleft.pdf
!bitmap-symbol-textbottomlefttranspose.pdf
!bitmap-symbol-textbottomright.pdf
!bitmap-symbol-textbottomrighttranspose.pdf
!bitmap-symbol-textcomposite.pdf
!bitmap-symbol-texthuffrefine.pdf
!bitmap-symbol-texthuffrefineB15.pdf
!bitmap-symbol-texthuffrefinecustom.pdf
!bitmap-symbol-texthuffrefinecustomdims.pdf
!bitmap-symbol-texthuffrefinecustompos.pdf
!bitmap-symbol-texthuffrefinecustomposdims.pdf
!bitmap-symbol-texthuffrefinecustomsize.pdf
!bitmap-symbol-textrefine-customat.pdf
!bitmap-symbol-textrefine-negative-delta-width.pdf
!bitmap-symbol-textrefine.pdf
!bitmap-symbol-texttopright.pdf
!bitmap-symbol-texttoprighttranspose.pdf
!bitmap-symbol-texttranspose.pdf
!bitmap-symbol.pdf
!bitmap-template1-customat-tpgdon.pdf
!bitmap-template1-customat.pdf
!bitmap-template1-tpgdon.pdf
!bitmap-template1.pdf
!bitmap-template2-customat-tpgdon.pdf
!bitmap-template2-customat.pdf
!bitmap-template2-tpgdon.pdf
!bitmap-template2.pdf
!bitmap-template3-customat-tpgdon.pdf
!bitmap-template3-customat.pdf
!bitmap-template3-tpgdon.pdf
!bitmap-template3.pdf
!bitmap-tpgdon.pdf
!bitmap-trailing-7fff-stripped-harder-refine.pdf
!bitmap-trailing-7fff-stripped-harder.pdf
!bitmap-trailing-7fff-stripped.pdf
!bitmap.pdf
!bomb_giant.pdf
!bug2009627.pdf
!page_with_number.pdf
!page_with_number_and_link.pdf
!Brotli-Prototype-FileA.pdf
!bug2013793.pdf
!bug2014080.pdf

Binary file not shown.

View File

@@ -0,0 +1 @@
https://github.com/pdfminer/pdfminer.six/files/8399364/64.pdf

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Some files were not shown because too many files have changed in this diff.