Compare commits

...

630 Commits

Author SHA1 Message Date
Brian DeHamer
f31c2921c1
Merge pull request #2058 from actions/dependabot/npm_and_yarn/packages/attest/undici-5.29.0
Bump undici from 5.28.5 to 5.29.0 in /packages/attest
2025-05-25 16:30:11 -07:00
dependabot[bot]
41b3ce3141
Bump undici from 5.28.5 to 5.29.0 in /packages/attest
Bumps [undici](https://github.com/nodejs/undici) from 5.28.5 to 5.29.0.
- [Release notes](https://github.com/nodejs/undici/releases)
- [Commits](https://github.com/nodejs/undici/compare/v5.28.5...v5.29.0)

---
updated-dependencies:
- dependency-name: undici
  dependency-version: 5.29.0
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-15 16:30:57 +00:00
Josh Gross
8d8a914a94
Document context.runAttempt in @actions/github 6.0.1 (#2054) 2025-05-13 10:37:14 -04:00
Brian DeHamer
36db4d62ad
Merge pull request #2045 from actions/dependabot/npm_and_yarn/packages/attest/octokit/endpoint-9.0.6
Bump @octokit/endpoint from 9.0.5 to 9.0.6 in /packages/attest
2025-05-08 10:47:59 -07:00
Brian DeHamer
a25b686a45
Merge pull request #2044 from actions/dependabot/npm_and_yarn/packages/attest/octokit/request-error-5.1.1
Bump @octokit/request-error from 5.1.0 to 5.1.1 in /packages/attest
2025-05-08 10:47:20 -07:00
dependabot[bot]
957610a37a
Bump @octokit/request-error from 5.1.0 to 5.1.1 in /packages/attest
Bumps [@octokit/request-error](https://github.com/octokit/request-error.js) from 5.1.0 to 5.1.1.
- [Release notes](https://github.com/octokit/request-error.js/releases)
- [Commits](https://github.com/octokit/request-error.js/compare/v5.1.0...v5.1.1)

---
updated-dependencies:
- dependency-name: "@octokit/request-error"
  dependency-version: 5.1.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-08 11:19:50 +00:00
dependabot[bot]
6ed621e7d1
Bump @octokit/endpoint from 9.0.5 to 9.0.6 in /packages/attest
Bumps [@octokit/endpoint](https://github.com/octokit/endpoint.js) from 9.0.5 to 9.0.6.
- [Release notes](https://github.com/octokit/endpoint.js/releases)
- [Commits](https://github.com/octokit/endpoint.js/compare/v9.0.5...v9.0.6)

---
updated-dependencies:
- dependency-name: "@octokit/endpoint"
  dependency-version: 9.0.6
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-08 11:19:48 +00:00
Ryan Ghadimi
8007c1c535
Merge pull request #2049 from actions/ghadimir/audit_fix
NPM audit fixes
2025-05-08 12:18:34 +01:00
Ryan Ghadimi
6444290c57 release prep 2025-05-08 08:53:55 +00:00
Ryan Ghadimi
f32d6bc043 bump octokit core 2025-05-08 08:42:32 +00:00
Ryan Ghadimi
2e4ab87130 artifact deps 2025-05-08 08:38:48 +00:00
Ryan Ghadimi
ef199a9ab0
Merge pull request #2043 from actions/ghadimir/audit_fix
NPM Audit Fixes
2025-05-07 15:58:29 +01:00
Ryan Ghadimi
917a43eb6e bump octokit methods 2025-05-07 11:17:56 +00:00
Ryan Ghadimi
07cac0a6b3 bump gh package ver 2025-05-07 11:12:29 +00:00
Ryan Ghadimi
2046ee6d6b gh package release prep 2025-05-07 11:08:28 +00:00
Ryan Ghadimi
2b476323c4 fix packages/gh deps 2025-05-07 11:05:00 +00:00
Ryan Ghadimi
aebe304a19
Merge pull request #2041 from actions/ghadimir/fix_cache_tests
Fix cache tests
2025-05-07 09:53:32 +01:00
Ryan Ghadimi
e8f276a715 alphabetically order them 2025-05-07 08:31:17 +00:00
Ryan Ghadimi
d156bcaa78 maybe this works instead 2025-05-06 20:22:05 +00:00
Ryan Ghadimi
5ae4c5be28 don't need that maybe 2025-05-06 20:08:50 +00:00
Ryan Ghadimi
d50f1ac1b9 change url 2025-05-06 20:02:27 +00:00
Ryan Ghadimi
87cb7035bb add env variable for cache tests 2025-05-06 19:50:44 +00:00
Alisson Tenório
1b1e81526b
Update README.md (#1719) 2025-04-09 10:46:07 -04:00
Salman Chishti
525ebf0c50
Merge pull request #2004 from AbhiPrasad/patch-1
fix link in `@actions/artifact` `RELEASES.md`
2025-04-09 15:34:10 +01:00
Abhijeet Prasad
07341e11d8
fix link in @actions/artifact RELEASES.md 2025-03-26 11:22:14 -04:00
Salman Chishti
930c890727
Merge pull request #1995 from actions/salmanmkc/2-new-cache-artifacts-release
Prepare Cache v4.0.3 & Artifact v2.3.2 releases
2025-03-17 21:22:10 +00:00
Salman Chishti
a410c4a9cf
remove extra brace 2025-03-17 17:14:25 +00:00
Salman Chishti
10277d48ca
Add update to release doc, as will include it in this release 2025-03-17 17:12:32 +00:00
JoannaaKL
857c61a9df
Merge pull request #1994 from gitulisca-enterprise-cloud-testing/gitulisca/log-restore-request-version
Log cache version requested on debugging message
2025-03-17 17:58:16 +01:00
Salman Chishti
c40bccc9c3
Use patch instead of minor 2025-03-17 14:08:42 +00:00
Salman Chishti
ff4d4afef8
shared instead of secure 2025-03-17 12:48:56 +00:00
Salman Chishti
4d4bbebd6a
update package-lock.json 2025-03-17 12:47:54 +00:00
Salman Chishti
261fcae498
change it to minor version instead of patch 2025-03-17 12:44:51 +00:00
Salman Chishti
4059d2af66
update versions for cache and artifact 2025-03-17 12:09:16 +00:00
Salman Chishti
2559a2ac8a
Merge pull request #1982 from actions/salmanmkc/obfuscate-sas
Remove logging of any SAS tokens in Actions/Cache and Actions/Artifact
2025-03-17 11:47:29 +00:00
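The masking approach referenced in this PR (and the related commits below) can be pictured with a short sketch. This is an illustration of the idea, not the toolkit's exact implementation; `maskSasSignature` is a hypothetical name:

```typescript
import * as core from '@actions/core'

// Register the SAS signature (raw and URL-encoded forms) as a secret so
// the runner masks it anywhere the URL would otherwise be logged.
function maskSasSignature(rawUrl: string): void {
  try {
    const sig = new URL(rawUrl).searchParams.get('sig')
    if (sig) {
      core.setSecret(sig)
      core.setSecret(encodeURIComponent(sig))
    }
  } catch {
    // malformed URL: nothing to mask, but surface it for debugging
    core.debug('Failed to parse URL while masking SAS signature')
  }
}
```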
Art Leo
514314311c
Log cache version requested 2025-03-15 10:13:43 +11:00
Salman Chishti
957d42e6c5 add encoding back with extra tests 2025-03-14 06:38:57 -07:00
Salman Chishti
39419dd8c3 don't need to url encode or set var 2025-03-14 06:21:41 -07:00
Salman Chishti
d13e6311f1 fix tests 2025-03-14 04:28:22 -07:00
Salman Chishti
6876e2a664 update ts docs 2025-03-13 04:47:49 -07:00
Salman Chishti
fc482662af PR feedback, back to simplified approach, no export on client as well 2025-03-13 04:23:45 -07:00
Salman Chishti
abd9054c61 Log debug error when failing to decode 2025-03-12 08:14:01 -07:00
Ryan Ghadimi
253e837c4d
Merge pull request #1991 from actions/ghadimir/hash_to_digest_upload
Change hash to digest for consistent terminology across runner logs
2025-03-12 12:26:25 +00:00
Salman Chishti
3ac34ffcb7 Mask different situations, malformed URL, encoded, decoded, raw signatures, nested parameters, and moved to a utility file 2025-03-12 03:17:35 -07:00
Ryan Ghadimi
56c5a39afb
Update blob-upload.ts 2025-03-12 07:59:00 +00:00
Ryan Ghadimi
7ae578ddd1
Merge pull request #1987 from actions/ghadimir/digest_typo
Bump release version
2025-03-11 11:07:20 +00:00
Ryan Ghadimi
b2d2270685 Bump package.json 2025-03-11 11:02:42 +00:00
Ryan Ghadimi
0d1d5c7687 Bump release version 2025-03-11 10:58:38 +00:00
Ryan Ghadimi
769bb0fea1
Merge pull request #1986 from actions/ghadimir/digest_typo
Fix comment on expectedHash
2025-03-11 10:57:05 +00:00
Ryan Ghadimi
d7ddca4309 Fix comment on expectedHash 2025-03-11 10:52:19 +00:00
Ryan Ghadimi
8780507298
Merge pull request #1985 from actions/ghadimir/dropdown_releases
Dropdown for package when releasing
2025-03-10 15:42:45 +00:00
Ryan Ghadimi
790c56665a
Update releases.yml 2025-03-10 15:33:38 +00:00
Ryan Ghadimi
9d8017eadb
Merge pull request #1976 from actions/ghadimir/prep_artifact_release
Prepare for Artifact v2.3.0 release
2025-03-10 15:23:55 +00:00
Ryan Ghadimi
20fee3ea63
Update @actions/artifact version to 2.3.0 2025-03-10 15:12:36 +00:00
Ryan Ghadimi
7501423b6f
Update RELEASES.md for version 2.3.0 2025-03-10 15:11:43 +00:00
Ryan Ghadimi
d0cc3418ea
Bump version to 2.3.0
Better semver
2025-03-10 15:11:18 +00:00
Salman Chishti
5007821c77 Remove clean script 2025-03-10 06:51:30 -07:00
Salman Chishti
47c4fa85df masks the whole URL, update tests 2025-03-10 06:47:52 -07:00
Salman Chishti
1cd2f8a538 Instead of using utility method in core lib, use method in both twirp clients 2025-03-07 06:01:25 -08:00
Ryan Ghadimi
b85d4e6b38 Prepare for Artifact v2.2.3 release 2025-03-07 10:14:36 +00:00
Ryan Ghadimi
dc22dc7cad
Merge pull request #1975 from actions/ghadimir/update_call_to_list_artifacts
Compare Artifact Digests
2025-03-07 09:51:05 +00:00
Ryan Ghadimi
8c05dc87d8
Change info logs to debug logs 2025-03-07 09:38:33 +00:00
Salman Chishti
884aa17886 remove these changes 2025-03-06 14:31:21 -08:00
Salman Chishti
944e6b78db Add secret and signature masking for cache and artifact packages 2025-03-06 14:25:32 -08:00
JoannaaKL
d70fb49aaa
Merge pull request #1974 from actions/list-artifacts-fix
Don't skip pages
2025-03-06 09:35:57 +01:00
Ryan Ghadimi
3726c11433 Please the linter 2025-03-05 14:44:58 +00:00
Ryan Ghadimi
71b40f7024 nicer wording 2025-03-05 14:35:01 +00:00
Ryan Ghadimi
83e5e2517b Change some debug -> info for artifacts hash logging 2025-03-05 14:30:51 +00:00
Ryan Ghadimi
d5c8a0fa27 Update proto artifact interface, retrieve artifact digests, return indicator of mismatch failure 2025-03-05 11:29:44 +00:00
JoannaaKL
780e24be34
Don't skip pages 2025-03-05 09:27:35 +00:00
Brian DeHamer
ec9716b3cc
Merge pull request #1969 from actions/bdehamer/workflow-ref
set workflow.ref provenance field from ref claim
2025-02-26 09:50:14 -08:00
Brian DeHamer
0bc338adab
set workflow.ref provenance field from ref claim
Updates the `buildSLSAProvenancePredicate` function to populate the
`workflow.ref` field from the `ref` claim in the OIDC token.

Signed-off-by: Brian DeHamer <bdehamer@github.com>
2025-02-26 08:47:27 -08:00
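A minimal sketch of the change described here, assuming the standard GitHub OIDC claim names (`workflow_ref`, `repository`, `ref`); the helper is illustrative, not the package's exact code:

```typescript
interface OIDCClaims {
  workflow_ref: string // e.g. "owner/repo/.github/workflows/ci.yml@refs/heads/main"
  repository: string   // e.g. "owner/repo"
  ref: string          // e.g. "refs/heads/main"
}

// Build the SLSA provenance workflow descriptor, taking workflow.ref
// directly from the token's `ref` claim instead of re-deriving it.
function workflowFromClaims(claims: OIDCClaims) {
  const workflowPath = claims.workflow_ref
    .replace(`${claims.repository}/`, '')
    .split('@')[0]
  return {
    repository: claims.repository,
    path: workflowPath,
    ref: claims.ref
  }
}
```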
Rob Herley
5378ea8eca
Merge pull request #1968 from actions/robherley/cache/v4.0.2
cache: prep v4.0.2 release
2025-02-25 16:00:06 -05:00
Brian DeHamer
b95b593ca5
Merge pull request #1957 from actions/bdehamer/update-undici
Bump undici to v5.28.5
2025-02-25 12:54:29 -08:00
Rob Herley
4fedf471b1
cache: prep v4.0.2 release 2025-02-25 15:03:37 -05:00
Rob Herley
1b9063ee0e
Merge pull request #1966 from actions/robherley/wrap-create-cache-err
cache: wrap create failures in ReserveCacheError
2025-02-25 15:00:25 -05:00
Rob Herley
d096588f08
cache: wrap create failures in ReserveCacheError 2025-02-25 12:49:08 -05:00
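A sketch of the wrapping described above, assuming the `ReserveCacheError` type exported by `@actions/cache`; the service call is a hypothetical stand-in:

```typescript
import {ReserveCacheError} from '@actions/cache'

// hypothetical stand-in for the cache service's create/reserve request
declare function reserveOnService(key: string, version: string): Promise<string>

async function reserveCache(key: string, version: string): Promise<string> {
  try {
    return await reserveOnService(key, version)
  } catch (error) {
    // wrap so callers can tell reservation failures apart from other errors
    throw new ReserveCacheError(
      `Unable to reserve cache with key ${key}: ${(error as Error).message}`
    )
  }
}
```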
Yang Cao
662b9d91f5
Merge pull request #1963 from actions/yacaovsnc/release_2_2_2
Prepare artifact release 2.2.2
2025-02-20 16:29:30 -05:00
Yang Cao
a62f530b6f Update package-lock.json 2025-02-20 21:20:28 +00:00
Yang Cao
2995cdf0a1 Prepare artifact release 2.2.2 2025-02-20 21:12:25 +00:00
Yang Cao
f10f9c8217
Merge pull request #1962 from actions/yacaovsnc/set_default_concurrency_to_5
Default upload artifacts concurrency to 5
2025-02-20 13:56:30 -05:00
Yang Cao
c26e6f3aba Default upload artifacts concurrency to 5 2025-02-20 17:03:29 +00:00
Rob Herley
2b08dc18f2
Merge pull request #1958 from actions/robherley/cache/v4.0.1
Update manifests & release notes for cache v4.0.1
2025-02-14 12:20:52 -05:00
Brian DeHamer
412108cd55
add undici to @actions/github dependencies
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2025-02-14 08:12:00 -08:00
Rob Herley
8fcec1fb58
update manifests & release notes for cache v4.0.1 2025-02-14 11:02:13 -05:00
Brian DeHamer
95e747361e
bump undici to 5.28.5
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2025-02-14 08:02:10 -08:00
Rob Herley
aad39a371f
Merge pull request #1954 from actions/robherley/miss-msg
Cache miss as debug, not warning annotation
2025-02-14 10:58:45 -05:00
Rob Herley
7fe619c58c
update mocks 2025-02-14 09:42:41 -05:00
Rob Herley
e6fb8f1c5d
cache miss as debug, not warning annotation 2025-02-14 09:28:01 -05:00
Rob Herley
6a942b304d
Merge pull request #1947 from actions/robherley/rm-twirp-ts
Remove runtime dependency on `twirp-ts`
2025-02-14 09:14:17 -05:00
Ehsan Hosseini
340a6b15b5
update undici package to 5.25.5 (#1942) 2025-01-28 10:14:55 -05:00
Rob Herley
e0c069db55
remove runtime dependency on twirp-ts 2025-01-27 17:52:55 +00:00
Josh Gross
1f7c2c79e0
[tool-cache] Update @actions/core and prepare 2.0.2 release (#1872)
* Update `@actions/core` and prepare 2.0.2 release

* Include these changes in the release notes
2025-01-15 15:57:09 -05:00
Yang Cao
5e8c25d1f5
Merge pull request #1929 from actions/yacaovsnc/release_artifact_2_2_1
Prep release packages/artifact v2.2.1
2025-01-09 09:21:32 -05:00
Yang Cao
3095d112ef Prep release packages/artifact v2.2.1 2025-01-08 21:11:59 +00:00
Yang Cao
16ef1448d7
Merge pull request #1928 from actions/yacaovsnc/artifact_upload_concurrency_and_timeout
Make both upload concurrency and timeout settings configurable with env variables.
2025-01-08 16:07:30 -05:00
Yang Cao
e55409315f Rename the prefix to be more specific 2025-01-08 20:32:45 +00:00
Yang Cao
d4385a64a7 Concurrency has a min of 1 2025-01-08 18:14:04 +00:00
Yang Cao
ede05b95d7 Make concurrency change opt-in, but can only go lower 2025-01-08 18:11:38 +00:00
Yang Cao
f3c12d5561 Set default concurrency to 10 and make timeout configurable 2025-01-08 16:19:09 +00:00
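Taken together, these commits describe behavior along these lines; the env variable names and the exact clamping are assumptions for illustration:

```typescript
const DEFAULT_CONCURRENCY = 10 // later lowered to 5, per the commits above
const DEFAULT_TIMEOUT_MS = 300_000

// Opt-in override that may only lower concurrency, never raise it,
// and is clamped to a minimum of 1.
function getUploadConcurrency(): number {
  const raw = process.env['ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY'] // assumed name
  if (!raw) return DEFAULT_CONCURRENCY
  const n = Number(raw)
  if (!Number.isInteger(n) || n < 1) {
    throw new Error('Upload concurrency must be a positive integer')
  }
  return Math.min(n, DEFAULT_CONCURRENCY)
}

function getUploadTimeoutMs(): number {
  const raw = process.env['ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS'] // assumed name
  return raw ? Number(raw) : DEFAULT_TIMEOUT_MS
}
```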
Josh Gross
adb9c4a7f4
Remove more unused cache APIs (#1909) 2024-12-19 13:26:19 -05:00
Josh Gross
01f21badd5
Remove more unused cache APIs 2024-12-17 14:51:57 -05:00
Josh Gross
26f8f84a96
Remove unused cache API (#1907) 2024-12-17 14:04:05 -05:00
Brian DeHamer
433f76091b
Merge pull request #1908 from actions/bdehamer/artifact-2.2.0
Prepare artifact release 2.2.0
2024-12-17 10:24:18 -08:00
Brian DeHamer
4426b4ea91
Prepare artifact release 2.2.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-12-17 10:05:45 -08:00
Brian DeHamer
f522fdf89d
Merge pull request #1896 from actions/bdehamer/artifact-digest
return artifact digest on upload
2024-12-17 10:01:16 -08:00
Brian DeHamer
1e0c16f0dc
return artifact digest on upload
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-12-06 14:27:46 -08:00
Bassem Dghaidi
b7a00a3203
Merge pull request #1886 from actions/Link-/cache-4.0.0
Prepare `@actions/cache` `4.0.0` release
2024-12-04 20:09:19 +01:00
Bassem Dghaidi
0827eef58f Rerun CI 2024-12-04 10:53:00 -08:00
Bassem Dghaidi
cd9197e9bd Add announcement link 2024-12-04 08:23:10 -08:00
Bassem Dghaidi
72447df44c Update deprecation notice 2024-12-04 05:33:47 -08:00
Bassem Dghaidi
59845ec372 Update deprecation notice 2024-12-04 05:30:50 -08:00
Bassem Dghaidi
cb001af8a3 Update README to include deprecation notice 2024-12-03 02:52:39 -08:00
Bassem Dghaidi
4498687c5e Prepare @actions/cache 4.0.0 release 2024-12-03 02:40:00 -08:00
Bassem Dghaidi
a10e209c8d
Merge pull request #1882 from actions/enhance-blob-client
Enhance blob client resilience & performance
2024-12-02 20:48:46 +01:00
Bassem Dghaidi
c02c929c56 Minor comment adjustments 2024-12-02 11:10:25 -08:00
Bassem Dghaidi
c649df4b94 Minor comment adjustments 2024-12-02 10:55:33 -08:00
Bassem Dghaidi
fb40492b6f
Merge branch 'enhance-blob-client' of github.com:actions/toolkit into enhance-blob-client 2024-12-02 10:55:00 -08:00
Bassem Dghaidi
502e8ce651 Minor comment adjustments 2024-12-02 10:53:29 -08:00
Bassem Dghaidi
3f7df8ec5a
Fix comments
Co-authored-by: Josh Gross <joshmgross@github.com>
2024-12-02 19:46:18 +01:00
Bassem Dghaidi
b24632bd80
Fix comments
Co-authored-by: Josh Gross <joshmgross@github.com>
2024-12-02 19:46:11 +01:00
Bassem Dghaidi
792ec716de Tune upload options 2024-12-02 07:32:33 -08:00
Bassem Dghaidi
7ad18fd6bd Fix linter complaints 2024-12-02 04:24:17 -08:00
Bassem Dghaidi
87171e29ca Fix tests 2024-12-02 04:18:46 -08:00
Bassem Dghaidi
a762876d6d Minor refactoring 2024-12-02 04:08:21 -08:00
Bassem Dghaidi
d89855bb90 Fix upload progress bug 2024-12-02 03:55:57 -08:00
Bassem Dghaidi
db1d01308c Troubleshoot 2024-12-02 03:35:20 -08:00
Bassem Dghaidi
4a272e9053 Troubleshoot 2024-12-02 03:08:05 -08:00
Bassem Dghaidi
ee1c07d0aa Add error handling for failed uploads 2024-12-02 02:38:51 -08:00
Bassem Dghaidi
c6f1224d30 Add progress tracking for blob uploads 2024-12-02 02:33:27 -08:00
Bassem Dghaidi
1d403c2fd8 Fix tests 2024-11-29 07:36:51 -08:00
Bassem Dghaidi
65892d5ffe Fine tune blob uploads 2024-11-29 07:09:05 -08:00
Bassem Dghaidi
8c5f6f2dc5 Force use of Azure for restoreCacheV2 2024-11-28 07:42:07 -08:00
Bassem Dghaidi
62f5f1885b Refactor saveCacheV2 to use saveCache from cacheHttpClient 2024-11-28 07:22:01 -08:00
Bassem Dghaidi
eaf0083ee2 Respect download options for restore 2024-11-28 04:56:37 -08:00
Bassem Dghaidi
c1fb081674
Linter fixes 2024-11-28 03:53:34 -08:00
Bassem Dghaidi
df166709a3
Refactor cache upload functionality and improve test cases 2024-11-28 03:52:09 -08:00
Bassem Dghaidi
c5a5de05f6 Delete download-cache 2024-11-28 03:36:32 -08:00
Bassem Dghaidi
3a128c88c3 Merge branch 'main' into enhance-blob-client 2024-11-27 08:25:51 -08:00
John Sudol
9cc30cb0d3
Add saveCacheV2 tests (#1879) 2024-11-27 09:30:36 -05:00
Bassem Dghaidi
35d87ab129
Refactor code formatting for consistency and readability 2024-11-27 05:58:22 -08:00
Bassem Dghaidi
af3981c955 Update the useragent of the old http client to pass cache version 2024-11-27 05:50:01 -08:00
Bassem Dghaidi
27e5cf2514 Replace downloadCacheFile with downloadCacheStorageSDK 2024-11-27 04:51:21 -08:00
John Sudol
b050504b2d Add test case for when the uploadFile fails on the blobclient 2024-11-27 01:45:46 +00:00
John Sudol
5d0a4af70a Remove unused mock 2024-11-26 23:33:19 +00:00
John Sudol
94f18eb26e Only mock the cacheUtil methods we need 2024-11-26 23:05:11 +00:00
John Sudol
208dbe2131 PR feedback 2024-11-26 16:36:12 +00:00
John Sudol
46174ed573 run prettier 2024-11-26 00:56:07 +00:00
John Sudol
1f087496ca Add debug message for uploadResponse 2024-11-26 00:43:37 +00:00
John Sudol
8f606682c2 Add saveCacheV2 tests 2024-11-26 00:23:42 +00:00
Bassem Dghaidi
928d3e806d
Merge pull request #1876 from actions/add-restore-tests
Add `restoreCacheV2` tests
2024-11-25 21:35:31 +01:00
Bassem Dghaidi
35ede8fcf0 Add a new debug message for downloads 2024-11-25 12:08:07 -08:00
Bassem Dghaidi
4d31e1048a Add the download cache file status code to debug log 2024-11-25 07:34:52 -08:00
Bassem Dghaidi
0e321b26f4 Add the download cache file status code to debug log 2024-11-25 07:34:07 -08:00
Bassem Dghaidi
2d2513915c
Remove unused package
Co-authored-by: Rob Herley <robherley@github.com>
2024-11-25 16:13:20 +01:00
Bassem Dghaidi
de236da416 Fix cache lookup scenario 2024-11-25 05:47:51 -08:00
Bassem Dghaidi
4dadd612d6 Add support for matching on restore key values 2024-11-25 05:42:50 -08:00
Bassem Dghaidi
54ac2dd012 Add cache service version debug message 2024-11-25 04:08:47 -08:00
Bassem Dghaidi
4de30f744e Add more tests for restoreCacheV2 2024-11-25 03:53:03 -08:00
Bassem Dghaidi
27dfd2c41c Merge branch 'main' into add-restore-tests 2024-11-22 10:23:10 -08:00
Bassem Dghaidi
20ed2908f1
Merge pull request #1857 from actions/neo-cache-service
Integrate cache service v2
2024-11-22 19:22:23 +01:00
Bassem Dghaidi
39d19810a8 Add restore tests 2024-11-22 09:01:59 -08:00
Bassem Dghaidi
e2028d43a2 Linter fixes and remove unnecessary dependency 2024-11-21 04:05:04 -08:00
Bassem Dghaidi
267841d7bd
Add isGhes gate and refactor to clean up circular dependencies 2024-11-21 04:01:44 -08:00
Bassem Dghaidi
ab58a59f33 Bump cross-spawn to 7.0.6 2024-11-20 14:02:54 -08:00
Bassem Dghaidi
a1e6ef3759 Update cache service APIs & cleanup 2024-11-20 13:53:47 -08:00
Bassem Dghaidi
8616c313a2 Remove unused definitions 2024-11-14 07:11:12 -08:00
Bassem Dghaidi
3ca85474b8 Merge branch 'neo-cache-service' of github.com:actions/toolkit into neo-cache-service 2024-11-14 06:50:01 -08:00
Bassem Dghaidi
6c11d441a5
Remove unnecessary type hints 2024-11-14 06:49:55 -08:00
Bassem Dghaidi
68ab87caa2
Add check to make sure archive has been created already
Co-authored-by: Josh Gross <joshmgross@github.com>
2024-11-14 15:49:02 +01:00
Bassem Dghaidi
555b03f6fd Revert package.json 2024-11-14 06:40:10 -08:00
Bassem Dghaidi
ab8110fa2f Remove unnecessary packages from top level package.json 2024-11-14 06:36:42 -08:00
Bassem Dghaidi
5e9ef8532f Lint fixes 2024-11-14 04:47:27 -08:00
Bassem Dghaidi
ea4bf4810a Remove unnecessary debug information 2024-11-14 04:39:30 -08:00
Bassem Dghaidi
c3e354da23 Remove unnecessary debug information 2024-11-14 04:33:31 -08:00
Bassem Dghaidi
2ee77e654f Add missing function return types 2024-11-14 03:42:14 -08:00
Bassem Dghaidi
83baffc3f6
Package upgrades with security fixes 2024-11-14 03:34:32 -08:00
Bassem Dghaidi
19cdd5f210
Linter cleanups 2024-11-14 03:34:13 -08:00
Bassem Dghaidi
b2557ac90c Formatting and stylistic cleanup 2024-11-14 03:22:03 -08:00
Bassem Dghaidi
69409b3acd
Fix broken test 2024-11-14 03:10:48 -08:00
Bassem Dghaidi
9dff82c727
Port dependencies & remove dependency on toolkit/artifacts 2024-11-14 03:01:04 -08:00
Bassem Dghaidi
d109d9c03e
Handle ACTIONS_CACHE_SERVICE_V2 feature flag 2024-11-14 03:00:43 -08:00
Bassem Dghaidi
4e1912a3c3 Restore __tests__ 2024-11-14 02:08:24 -08:00
Bassem Dghaidi
9da70ffbd7 Post merge cleanup 2024-11-14 02:04:20 -08:00
Bassem Dghaidi
75cdb2c08f
Merge branch 'main' into neo-cache-service 2024-11-14 02:02:55 -08:00
Josh Gross
bb2278e5cf
Extend Node version test coverage (#1843)
* Extend Node version test coverage

* Remove Node 16
2024-11-08 10:30:18 -05:00
Josh Gross
77f247b2f3
Prepare @actions/cache 3.3.0 release (#1871) 2024-11-01 13:32:42 -04:00
Brian DeHamer
d13839fcf4
Merge pull request #1870 from actions/bdehamer/attest-1.5-release-notes
`@actions/attest`: Release notes for v1.5.0 release
2024-11-01 09:55:13 -07:00
Brian DeHamer
7e54468896
update release notes for @actions/attest v1.5.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-11-01 09:45:11 -07:00
Brian DeHamer
339447c5d3
Merge pull request #1863 from meriadec/attest-provenance-tags
Handle tags containing "@" character in `buildSLSAProvenancePredicate`
2024-11-01 09:35:13 -07:00
Brian DeHamer
43ce96d373
Merge pull request #1865 from actions/bdehamer/multi-subject
`@actions/attest`: Support multi-subject attestations
2024-11-01 09:33:11 -07:00
Brian DeHamer
265a5be8bc
support multi-subject attestations
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-11-01 09:08:19 -07:00
Brian DeHamer
65ee4d33af
use macos-latest-large in test/release workflows (#1869)
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-11-01 11:59:55 -04:00
Meriadec Pillet
717ba9d9a4
Handle tags containing "@" character in buildSLSAProvenancePredicate
When using some monorepo tooling (like [changesets](https://github.com/changesets/changesets)),
the produced tags have a special format that includes the `@` character.

For example, a `foo` package in a monorepo managed with changesets will
produce Git tags like `foo@1.0.0`.

When used in combination with `actions/attest-build-provenance`, the
action did not properly reconstruct the tag in `buildSLSAProvenancePredicate`
because it always split the workflow ref on `@` and took the second
element.

This resulted in the following error on CI:

```
Error: Error: Failed to persist attestation: Invalid Argument - values do not match: refs/tags/foo != refs/tags/foo@1.0.0 - https://docs.github.com/rest/repos/repos#create-an-attestation
```

This PR slightly updates the logic to instead take everything located
after the first `@`. This shouldn't introduce any breaking change,
while adding support for custom tags.

I've added the corresponding test case and it passes; however, I couldn't
successfully run the full test suite (not even on `main`). Looking
forward to the CI outcome.

Thanks in advance for the review 🙏.
2024-10-30 14:29:42 +01:00
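The one-line difference the PR describes, sketched with an illustrative `workflow_ref` value:

```typescript
// OIDC workflow_ref claim for a changesets-style tag "foo@1.0.0"
const workflowRef = 'owner/repo/.github/workflows/release.yml@refs/tags/foo@1.0.0'

// Before: split on '@' and take the second element, truncating the tag
const before = workflowRef.split('@')[1] // "refs/tags/foo"

// After: take everything after the *first* '@', keeping the full tag
const after = workflowRef.slice(workflowRef.indexOf('@') + 1) // "refs/tags/foo@1.0.0"
```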
Bassem Dghaidi
01bf918aa5 Refactoring & cleanup 2024-10-24 06:09:23 -07:00
Bassem Dghaidi
28dbd8ff93
Cleanups and package refactoring 2024-10-24 05:19:48 -07:00
Josh Gross
7f5921cddd
Document unreleased changes in cache and tool-cache (#1856) 2024-10-22 12:01:31 -04:00
Bassem Dghaidi
89354f6540
Cleanup implementation and use tarballs instead of streaming zip 2024-10-21 05:21:32 -07:00
Bassem Dghaidi
d399e33060 Merge branch 'main' into neo-cache-service 2024-10-21 02:25:12 -07:00
Brian DeHamer
29d342f176
Merge pull request #1848 from actions/bdehamer/attest-prep-1-5
`@actions/attest`: prep release of @actions/attest v1.5.0
2024-10-14 12:49:33 -07:00
Brian DeHamer
72113fe791
Merge pull request #1847 from actions/bdehamer/attest-update-core
`@actions/attest`: bump @actions/core from 1.10.1 to 1.11.1
2024-10-14 12:49:15 -07:00
Brian DeHamer
7b4d9763cc
Merge pull request #1846 from actions/bdehamer/sigstore-3-0-0
`@actions/attest`: bump @sigstore/sign from 2.3.2 to 3.0.0
2024-10-14 12:48:55 -07:00
Brian DeHamer
26c752f562
prep release of @actions/attest v1.5.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-10-14 12:33:10 -07:00
Brian DeHamer
ac1332a8e2
bump @actions/core from 1.10.1 to 1.11.1
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-10-14 12:16:09 -07:00
Brian DeHamer
c6c5ef6b8e
bump @sigstore/sign from 2.3.2 to 3.0.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-10-14 12:06:26 -07:00
Bassem Dghaidi
4d1dedf2c7
Merge branch 'main' into neo-cache-service 2024-10-09 07:45:11 -07:00
Bassem Dghaidi
13abc95165
Port restoreCache to new service 2024-10-09 04:32:57 -07:00
Rob Herley
ee93b05ee9
Merge pull request #1845 from actions/robherley/update-release-notes
Update artifact release notes
2024-10-08 14:11:08 -04:00
Rob Herley
799f8f5f3d
Update artifact release notes
Includes:
- #1815
2024-10-08 14:06:04 -04:00
Rob Herley
201b082ce1
Merge pull request #1844 from actions/robherley/artifact-2.1.11
Properly resolve relative symlinks
2024-10-08 13:08:45 -04:00
Rob Herley
49cbbbcd99
Update symlink bug fix reference number 2024-10-08 13:02:06 -04:00
Rob Herley
545e0e6b95
properly resolve relative symlinks 2024-10-08 12:35:48 -04:00
JoannaaKL
c18a7d2f73
Merge pull request #1815 from mydea/fn/remove-crypto
Use native `crypto` package from node
2024-10-07 11:06:38 +02:00
Josh Gross
d14afd7973
Explicitly import crypto (#1842)
* Explicitly import `crypto`

* Add release notes for 1.11.1

* Fix crypto mock in test

* Fix `crypto` mock

* Lint
2024-10-04 17:23:42 -04:00
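The gist of the change, as a sketch: import Node's `crypto` module explicitly instead of relying on the `crypto` global (which is not available on older Node runtimes):

```typescript
import * as crypto from 'crypto'

// e.g. generating an id without the external `uuid` package
const id = crypto.randomUUID()
```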
Josh Gross
22a72ac3d7
Include #1551 in @actions/core 1.11.0 release notes (#1840) 2024-10-02 14:30:25 -04:00
Josh Gross
6ca0d9b637
Release @actions/core v1.11.0 (#1839) 2024-10-02 13:49:03 -04:00
Rob Herley
650f7c6aa3
Merge pull request #1830 from actions/robherley/artifact-2.1.10
Fix regression, auto readlink on symlinks again
2024-10-02 13:06:15 -04:00
Josh Gross
78af634e7e
Remove dependency on uuid package (#1824) 2024-10-02 12:28:06 -04:00
Rob Herley
2a8f1c5ddd
bump package lock version 2024-10-01 16:43:30 -04:00
Bassem Dghaidi
e62c6428e7 Fix service urls 2024-09-24 03:29:14 -07:00
Bassem Dghaidi
07e51a445e Add cache service v2 client 2024-09-24 03:17:44 -07:00
Bassem Dghaidi
70e5684b1f
Merge branch 'main' into neo-cache-service 2024-09-24 02:36:02 -07:00
Rob Herley
5a62022195
/ 2024-09-20 17:52:14 -04:00
Rob Herley
8551843690
fix assertion 2024-09-20 17:45:55 -04:00
Rob Herley
d6694e491d
update release notes 2024-09-20 17:31:40 -04:00
Rob Herley
7f19a7886a
fix regression, auto readlink on symlinks again 2024-09-20 17:23:43 -04:00
Brian DeHamer
6dd369c0e6
Merge pull request #1823 from actions/bdehamer/enterprise-issuer
[@actions/attest] Fix bug with customized OIDC issuer
2024-09-05 09:17:37 -07:00
Brian DeHamer
2a07de1333
fix bug with customized oidc issuer
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-09-04 10:24:28 -07:00
Francesco Novy
2e1998fc42 update lockfile 2024-08-30 09:41:33 +02:00
Francesco Novy
b7a914b73b Use native crypto package from node 2024-08-30 09:30:02 +02:00
Brian DeHamer
6c4e082c18
Merge pull request #1805 from actions/bdehamer/update-http-client
bump @actions/http-client from 2.2.1 to 2.2.3
2024-08-22 08:39:26 -07:00
Brian DeHamer
1e69bffbba
bump @actions/http-client from 2.2.1 to 2.2.3
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-22 07:52:03 -07:00
Thomas Boop
d1aa255c7f
HTTP Client 2.2.3 Release (#1804)
* http-client 2.2.3

* fix audit

* Revert "fix audit"

This reverts commit 724956ffa7d2369e0fcc7e0a4f0ae7f6fb2ff034.

* update versions

* Revert "update versions"

This reverts commit 139b3391a00f8d8a03a2bc782f40e7cefbe9354c.

* exclude dev dependencies while we work on removing lerna
2024-08-22 10:13:36 -04:00
Brian DeHamer
7298ff3219
Merge pull request #1799 from actions/bdehamer/http-client-proxy-auth
fix encoding for proxy auth token
2024-08-21 06:41:49 -07:00
Brian DeHamer
571d782946
Merge pull request #1797 from actions/bdehamer/attester-release-notes
improve release notes for @actions/attest
2024-08-19 07:38:36 -07:00
Brian DeHamer
ada9e00cda
fix encoding for proxy auth token
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-16 15:03:40 -07:00
Josh Gross
faf9cb2ea2
Include the package name in the Publish Workflow run (#1793) 2024-08-16 16:15:14 -04:00
Brian DeHamer
ac3a063583
improve release notes for @actions/attest
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-16 12:43:39 -07:00
Brian DeHamer
7cc96bb976
Merge pull request #1796 from actions/bdehamer/attest-issuer
derive default OIDC issuer from current tenant
2024-08-16 12:21:00 -07:00
Brian DeHamer
fa6cc53297
derive default OIDC issuer from current tenant
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-16 12:07:23 -07:00
Thomas Boop
f299e8ba1e
HTTP Client 2.2.2 Release (#1794)
* 2.2.2 release

* update nodes
2024-08-16 13:11:10 -04:00
Yu
1b9927d1c7
Handle Encoded URL for Proxy Username and Password in HTTP Client (#1782)
* uri-decode-fix

Signed-off-by: Yu <yu.yang@anz.com>

* http-client URLdecode fix

Signed-off-by: Yu <yu.yang@anz.com>

* http-client URLdecode test typo fix

Signed-off-by: Yu <yu.yang@anz.com>

---------

Signed-off-by: Yu <yu.yang@anz.com>
2024-08-16 12:43:10 -04:00
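A sketch of the decoding behavior this PR describes; the values are illustrative:

```typescript
// URL parsing leaves username/password percent-encoded, so credentials
// containing reserved characters must be decoded before building the
// Proxy-Authorization header.
const proxyUrl = new URL('http://user%40example.com:p%40ss@proxy.local:8080')

const username = decodeURIComponent(proxyUrl.username) // "user@example.com"
const password = decodeURIComponent(proxyUrl.password) // "p@ss"

const header = `Basic ${Buffer.from(`${username}:${password}`).toString('base64')}`
```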
Brian DeHamer
279e891118
Merge pull request #1790 from actions/bdehamer/attest-headers
support for headers param in attest functions
2024-08-16 07:21:46 -07:00
Brian DeHamer
340a1033a5
support for headers param in attest functions
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-15 15:35:32 -07:00
Josh Gross
50f2977cce
Add glob option to ignore hidden files (#1791)
* Add glob option to ignore hidden files

* Use the basename of the file/directory to check for `.`

* Ensure the `excludeHiddenFiles` is properly copied

* Allow the root directory to be matched

* Fix description of `excludeHiddenFiles`

* Document Windows hidden attribute limitation

* Bump version

* `lint`

* Document 0.5.0 release

* Lint again
2024-08-15 17:13:49 -04:00
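A usage sketch for the option this PR adds, assuming the `excludeHiddenFiles` glob option documented for `@actions/glob` 0.5.0:

```typescript
import * as glob from '@actions/glob'

async function listVisibleFiles(): Promise<void> {
  // dotfiles and dot-directories are skipped when excludeHiddenFiles is set
  const globber = await glob.create('**/*', {excludeHiddenFiles: true})
  for await (const file of globber.globGenerator()) {
    console.log(file)
  }
}
```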
Thomas Boop
48a65377c0
Fix HTTP client tests (#1792)
* fix tests and update dependencies
2024-08-15 16:53:06 -04:00
Rob Herley
f003268b32
Merge pull request #1786 from SMoraisAnsys/fix/chunk-timeout
refactor: set chunk timeout back to 5 minutes
2024-08-06 12:12:38 -04:00
Sébastien Morais
3a33cca851
FIX: Set chunk timeout back to 5 minutes 2024-08-06 10:27:41 +02:00
Rob Herley
bb6c500939
Merge pull request #1781 from actions/robherley/artifact-2.1.9
Prep for @actions/artifact v2.1.9
2024-08-01 09:42:30 -04:00
Rob Herley
76b6e24aee
bump pkg lock 2024-07-31 10:12:04 -04:00
Rob Herley
58d14c4ef5
prep for @actions/artifact v2.1.9 2024-07-31 10:05:34 -04:00
Rob Herley
7463cf3da6
Merge pull request #1771 from rmunn/fix-too-many-open-files
Prevent "too many open files" in artifact upload
2024-07-31 09:20:36 -04:00
Brian DeHamer
90d9783552
Merge pull request #1776 from actions/bdehamer/jwks-proxy-fix
fix proxy support for jwks retrieval
2024-07-29 16:31:41 -07:00
Robin Munn
7c61054649 Remove unused import 2024-07-27 17:00:02 +07:00
Brian DeHamer
b28406bd1f
fix proxy support for jwks retrieval
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-07-26 15:03:40 -07:00
Robin Munn
9517cdf52d Prevent "too many open files" in artifact upload
See https://www.archiverjs.com/docs/archiver/#file
2024-07-26 08:49:34 +07:00
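Per the linked archiver docs, the fix amounts to queueing entries lazily instead of opening a read stream per file up front; a hedged sketch:

```typescript
import archiver from 'archiver'

const filesToUpload = ['a.txt', 'b.txt'] // illustrative
const archive = archiver('zip', {zlib: {level: 9}})

for (const filePath of filesToUpload) {
  // Eager (can exhaust file descriptors on large uploads):
  //   archive.append(fs.createReadStream(filePath), {name: filePath})
  // Lazy: archiver opens each file only when the entry is actually written
  archive.file(filePath, {name: filePath})
}
void archive.finalize()
```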
Rob Herley
49927e464a
Merge pull request #1774 from actions/robherley/fix-chunk-timeout
Fix artifact upload chunk timeout logic + update tests
2024-07-25 13:52:09 -04:00
Rob Herley
3e34f6d19c
add comment for chunk timeout 2024-07-24 12:40:57 -04:00
Rob Herley
182702d2df
fix chunk timeout + update tests 2024-07-23 21:57:39 -04:00
Rob Herley
1db73622df
Merge pull request #1768 from actions/robherley/artifacts-allow-localhost
Allow localhost hostnames for artifact checks
2024-07-03 14:38:52 -04:00
Rob Herley
56832696fc
npm audit fix 2024-07-03 17:03:40 +00:00
Rob Herley
176b40a888
allow localhost hostnames for artifact checks 2024-07-03 16:55:53 +00:00
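A sketch of the hostname check in the spirit of this change and the later GHES-host updates below; the exact suffix list is an assumption:

```typescript
function isGhes(): boolean {
  const ghUrl = new URL(process.env['GITHUB_SERVER_URL'] || 'https://github.com')
  const hostname = ghUrl.hostname.trimEnd().toUpperCase()

  const isGitHubHost = hostname === 'GITHUB.COM'
  const isGheHost = hostname.endsWith('.GHE.COM')
  const isLocalHost = hostname.endsWith('.LOCALHOST') // allows localhost dev setups

  return !isGitHubHost && !isGheHost && !isLocalHost
}
```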
Bassem Dghaidi
4902d3a118 Add backend ids 2024-06-24 01:16:11 -07:00
Bassem Dghaidi
04d1a7ec3c Add fix cache paths 2024-06-17 03:36:06 -07:00
Bassem Dghaidi
e1b7e78d60 Fix cache misses 2024-06-17 02:39:45 -07:00
Bassem Dghaidi
7640cf17c1 Fix cache misses 2024-06-17 02:35:25 -07:00
Bassem Dghaidi
8d7ed4fb57 Fix cache service url bug 2024-06-17 01:32:41 -07:00
Bassem Dghaidi
5afc042a74 Add download cache v2 2024-06-17 01:17:10 -07:00
Bassem Dghaidi
5e5faf73fc Use zlib for compression 2024-06-13 03:16:59 -07:00
Brian DeHamer
361a115e53
Merge pull request #1759 from actions/bdehamer/rekor-409
config rekor to fetch on conflict
2024-06-12 12:25:06 -07:00
Brian DeHamer
dddc440d56
config rekor to fetch on conflict
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-06-12 11:57:18 -07:00
Brian DeHamer
08d6f14ea8
Merge pull request #1745 from actions/bdehamer/attest-provenance
(@actions/attest) New GHA provenance build type
2024-06-12 11:45:37 -07:00
Bassem Dghaidi
9e63a77e7a Implement cache v2 2024-06-10 12:19:52 -07:00
Bassem Dghaidi
146143a9b4 Implement cache v2 2024-06-10 11:55:28 -07:00
Bassem Dghaidi
6635d12ce0 Implement cache v2 2024-06-10 11:36:37 -07:00
Bassem Dghaidi
dccc3f7f1c Fix upload mechanics 2024-06-10 11:01:01 -07:00
Bassem Dghaidi
66d5434f23
Add v2 cache upload 2024-06-10 10:56:20 -07:00
Brian DeHamer
73100a7f85
new GHA build provenance
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-06-05 14:54:34 -07:00
Brian DeHamer
c6b487124a
Merge pull request #1738 from actions/bdehamer/attest-1.3.0
(@actions/attest) prepare 1.3.0 release
2024-06-05 14:53:11 -07:00
Bassem Dghaidi
c8466d1fac Add twirp client 2024-05-29 08:31:54 -07:00
Bassem Dghaidi
264230c2c5 add debug 2024-05-23 09:04:37 -07:00
Bassem Dghaidi
32dbccb77b Add debug message 2024-05-23 07:25:17 -07:00
Brian DeHamer
8735a7e2da
prep 1.3.0 release of @actions/attest
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-05-21 13:11:37 -07:00
Fredrik Skogman
d1df13e178
Merge pull request #1735 from kommendorkapten/dynamic-urls
Read the server url from the environment variable.
2024-05-21 07:35:07 +02:00
Fredrik Skogman
d3d7736bae
Fixed a spelling error 2024-05-20 07:57:44 +02:00
Fredrik Skogman
7d18e7aa0d
PR feedback. Use more JS-idiomatic code 2024-05-20 07:52:36 +02:00
Fredrik Skogman
e60694077d
Read the server url from the environment variable.
Instead of having the urls hardcoded, read them from the environment.
I opted to read from the environment variable instead of the github context
because it would be easier to test.
2024-05-16 17:00:35 +02:00
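The described approach, sketched (the variable name and fallback are assumptions):

```typescript
// Read the server URL from the environment rather than hardcoding it;
// an env variable is also easier to stub in tests than the github context.
function getServerUrl(): string {
  return process.env['GITHUB_SERVER_URL'] || 'https://github.com'
}
```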
Brian DeHamer
ae38557bb0
Merge pull request #1730 from actions/bdehamer/attest-readme
Update @actions/attest README
2024-05-01 11:48:55 -07:00
Brian DeHamer
abb586d71e
add doc link in @actions/attest readme
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-05-01 11:30:45 -07:00
Brian DeHamer
81a73aba8b
Merge pull request #1725 from actions/bdehamer/attest-retry-persist
(@actions/attest) retry request on failure to save attestation
2024-04-24 19:59:43 -07:00
Brian DeHamer
0e8fe8af62
retry request on failure to save attestation
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-04-24 15:07:39 -07:00
Bethany
29885a805e
Merge pull request #1724 from actions/bethanyj28/update-unzip-stream
Use latest `unzip-stream` and `unzip.Extract`
2024-04-24 09:09:09 -04:00
bethanyj28
9eb3d3a673 lint 2024-04-23 16:10:57 -04:00
bethanyj28
6e642f628f lint 2024-04-23 16:06:02 -04:00
bethanyj28
0159bbe7f2 bump version 2024-04-23 16:03:52 -04:00
bethanyj28
476276bf98 use latest unzip-stream 2024-04-23 15:54:54 -04:00
Brian DeHamer
d82fd09f99
Merge pull request #1714 from actions/bdehamer/attest-no-make-fetch-happen
(@actions/attest) remove dep on make-fetch-happen
2024-04-23 10:39:57 -07:00
Brian DeHamer
2961d73391
remove dep on make-fetch-happen
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-04-23 09:39:17 -07:00
Rob Herley
eb1cb3649c
Merge pull request #1721 from actions/robherley/retry-502-invalid-body
artifact client: retry on non-JSON response
2024-04-19 14:02:46 -04:00
Rob Herley
b384fe17ba
bump pkg version + release notes 2024-04-19 15:08:30 +00:00
Rob Herley
ccb1df45d1
artifact client: retry on non-JSON response 2024-04-19 14:03:47 +00:00
eggyhead
5a736647a1
Merge pull request #1712 from actions/vmjoseph/update-archiver
Upgrading `upload-artifact` and `download-artifact` archiver package
2024-04-15 13:03:10 -07:00
Vallie Joseph
918b468a41 replacing writeFile with writeFileSync 2024-04-15 16:57:28 +00:00
Vallie Joseph
234761dc05 replacing writeFile with writeFileSync 2024-04-15 16:51:30 +00:00
Vallie Joseph
fa1cb5d153 correcting imports 2024-04-15 16:49:47 +00:00
Vallie Joseph
e998cf1216 cleaning up tests 2024-04-15 16:32:31 +00:00
Vallie Joseph
2bbbf928ae re-adding minor ver for now 2024-04-15 16:20:24 +00:00
Vallie Joseph
fa06a1eadf removing minor ver for now 2024-04-15 16:18:41 +00:00
Vallie Joseph
5eea9e34e7 cleaning up comments and removing clear timeout outside of finally 2024-04-15 16:08:45 +00:00
Vallie Joseph
75b5e5376d updating artifact version 2024-04-15 15:32:08 +00:00
Vallie Joseph
be507421b1 . 2024-04-15 15:24:57 +00:00
Vallie Joseph
5d943d4b7f Revert http 2024-04-15 12:59:58 +00:00
Vallie Joseph
67951b1f2b Merge branch 'main' into vmjoseph/update-archiver 2024-04-15 12:18:10 +00:00
eggyhead
c104cf5dc0
Merge pull request #1713 from actions/eggyhead/fix-tar-ddos-vuln
fixing https://github.com/advisories/GHSA-f5x3-32g6-xq36
2024-04-12 13:41:10 -07:00
Vallie Joseph
4fb4c6ed94 Merge branch 'eggyhead/fix-tar-ddos-vuln' into vmjoseph/update-archiver 2024-04-12 20:31:55 +00:00
eggyhead
df5a794b3d fixing new-package script instruction 2024-04-10 21:48:57 +00:00
eggyhead
c01bc907ed fixing https://github.com/advisories/GHSA-f5x3-32g6-xq36 2024-04-10 21:30:24 +00:00
Vallie Joseph
222733049e . 2024-04-09 21:22:40 +00:00
Vallie Joseph
fa9db3c8fa wrapping timeout in try catch 2024-04-09 21:18:30 +00:00
Vallie Joseph
18a8a22c65 updating upload try catch to always call cleartimeout 2024-04-09 21:05:58 +00:00
Vallie Joseph
425f05e29d moving timer outside of uploadZipToBlobStorage 2024-04-09 21:04:29 +00:00
Vallie Joseph
90fca23920 replacing timeout 2024-04-09 20:51:12 +00:00
Vallie Joseph
0d3d3bbb40 Adding missing progress time 2024-04-09 20:40:08 +00:00
Vallie Joseph
98ce947a6c updating timeout 2024-04-09 19:38:57 +00:00
Vallie Joseph
2ed9516172 updating timeout 2024-04-09 19:24:52 +00:00
Vallie Joseph
4fc93ec115 . 2024-04-09 19:01:54 +00:00
Vallie Joseph
61d6acdeb1 updating test 2024-04-09 18:52:19 +00:00
Vallie Joseph
f98ccd1e39 updating tests 2024-04-09 18:21:41 +00:00
Vallie Joseph
7f0a981b2e Revert http 2024-04-09 18:09:34 +00:00
Vallie Joseph
2e7a11c409 upgrading archiver package along with chunk timeout 2024-04-09 18:02:48 +00:00
Brian DeHamer
9ddf153e00
Merge pull request #1701 from actions/bdehamer/attest-v03-bundle
(@actions/attest) generate attestations using v0.3 bundle format
2024-04-03 13:51:26 -07:00
Brian DeHamer
f8d95a85df
generate v0.3 bundles in attest package
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-04-03 12:12:26 -07:00
Brian DeHamer
59e9d284e9
Merge pull request #1693 from actions/bdehamer/oidc-provenance
(@actions/attest) build provenance statement from OIDC claims
2024-03-28 13:44:22 -07:00
Brian DeHamer
4ce4c767e2
npm audit fix
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-03-22 12:44:24 -07:00
Brian DeHamer
a0e6af1e53
build provenance stmt from OIDC claims
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-03-22 12:34:42 -07:00
Bethany
ef77c9d60b
Merge pull request #1683 from Smeb/fix-1579
fix #1579: add test to check getCacheVersion does not mutate arguments
2024-03-07 10:48:45 -05:00
Smeb
8fee77b04b fix #1579: add test to check getCacheVersion does not mutate arguments 2024-03-07 16:23:04 +01:00
Luke Tomlinson
b807fc9c54
Update http-client to 2.2.1 (#1679) 2024-03-01 15:09:37 -05:00
Bethany
55c7a1e03d
Merge pull request #1678 from actions/bethanyj28/logging
Add info level logging for zip extract
2024-03-01 13:09:41 -05:00
bethanyj28
4799020e28 bump version 2024-03-01 13:04:16 -05:00
bethanyj28
bb420e4681 add info level logging for zip extract 2024-03-01 12:54:40 -05:00
Bethany
0c735ba79d
Merge pull request #1677 from actions/bethanyj28/update-releases
Flip releases update order
2024-02-29 12:01:04 -05:00
Bethany
e918bf24ae
Update RELEASES.md 2024-02-29 10:41:57 -05:00
Bethany
eea6b7f517
Update RELEASES.md 2024-02-29 10:40:22 -05:00
teatimeguest
ff435e591d
Make sure RequestOptions.keepAlive is applied properly on node20 runtime (#1572) 2024-02-28 12:10:57 -05:00
Bethany
df3315bbea
Merge pull request #1676 from actions/bethanyj28/flip-releases
Flip releases order
2024-02-28 10:46:45 -05:00
Bethany
b7770574c2
flip releases order 2024-02-28 10:35:01 -05:00
Brian DeHamer
29bf378d97
Merge pull request #1675 from actions/provenance-permissions
fix permissions for release workflow
2024-02-26 11:40:12 -08:00
Brian DeHamer
68b042febd
fix permissions for release workflow
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-02-26 11:32:45 -08:00
Brian DeHamer
c366a07d62
Merge pull request #1672 from actions/attest-v1.0.0
bump @actions/attest to 1.0.0
2024-02-26 11:13:48 -08:00
Brian DeHamer
9e5eb95517
Merge pull request #1674 from actions/npm-provenance
publish npm packages with build provenance
2024-02-26 11:13:32 -08:00
Brian DeHamer
7f96bd610d
publish npm packages with build provenance
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-02-26 10:42:33 -08:00
Thomas Boop
8f53a1d37f
Update CODEOWNERS (#1673) 2024-02-26 13:31:23 -05:00
Brian DeHamer
37a562b194
bump @actions/attest to 1.0.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-02-26 10:21:47 -08:00
Brian DeHamer
ad1f156c7c
Merge pull request #1667 from actions/bdehamer/attest
add new @actions/attest package
2024-02-26 10:15:14 -08:00
Brian DeHamer
6079dea4c4
add new @actions/attest package
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-02-26 08:52:20 -08:00
Bethany
437f2be56d
Merge pull request #1671 from actions/bethanyj28/update-version
Update artifacts to 2.1.3
2024-02-26 10:24:29 -05:00
bethanyj28
97c606b612 update to 2.1.3 2024-02-26 10:18:02 -05:00
Bethany
5a7faf0eb5
Merge pull request #1670 from actions/bethanyj28/fix-callback
Ensure callback is only called once
2024-02-26 10:04:37 -05:00
bethanyj28
dcc55dfd04 feedback 2024-02-26 09:56:00 -05:00
bethanyj28
902046e4d8 ensure callback is only called once 2024-02-26 09:36:35 -05:00
Bethany
88f7a7bc65
Merge pull request #1666 from actions/bethanyj28/download-path
Use `unzip.Parse` over `unzip.Extract`
2024-02-23 16:22:24 -05:00
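The `unzip.Parse` approach referenced here streams entries one by one, which lets each destination path be validated before anything is written; a hedged sketch (typings simplified):

```typescript
import * as fs from 'fs'
import * as path from 'path'
import * as stream from 'stream'
import * as unzip from 'unzip-stream'

function extract(zipStream: stream.Readable, directory: string): void {
  zipStream.pipe(unzip.Parse()).on('entry', (entry: any) => {
    const fullPath = path.resolve(directory, entry.path)
    if (!fullPath.startsWith(directory)) {
      throw new Error(`Malformed extraction path: ${entry.path}`) // blocks traversal
    }
    if (entry.type === 'Directory') {
      fs.mkdirSync(fullPath, {recursive: true})
      entry.autodrain()
    } else {
      entry.pipe(fs.createWriteStream(fullPath))
    }
  })
}
```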
bethanyj28
6cf4fbcef8 add a comment 2024-02-23 15:33:24 -05:00
bethanyj28
7fa864a4f4 go back to normalize() 2024-02-23 15:28:25 -05:00
Bethany
f77cbc9ef7
Update packages/artifact/src/internal/download/download-artifact.ts
Co-authored-by: Tingluo Huang <tingluohuang@github.com>
2024-02-23 15:20:01 -05:00
bethanyj28
8a1800c5da use resolve instead of normalize 2024-02-23 15:15:17 -05:00
bethanyj28
90894a8853 bump version 2024-02-23 15:03:09 -05:00
bethanyj28
614f27a4fb use stream transform 2024-02-23 14:34:39 -05:00
bethanyj28
ac84a9bee3 re-add noop logs and format + lint 2024-02-23 13:46:22 -05:00
bethanyj28
4256ea99c5 update test case and handling 2024-02-23 13:41:40 -05:00
bethanyj28
76489f433b attempt with comparing index 2024-02-23 11:59:36 -05:00
bethanyj28
e9005f7727 ensure no path traversal 2024-02-23 10:54:12 -05:00
bethanyj28
8d03fb4787 prettier 2024-02-23 08:46:56 -05:00
bethanyj28
d3301c9bc2 update path parsing 2024-02-23 08:42:23 -05:00
bethanyj28
1e326de474 use existing function 2024-02-23 08:28:37 -05:00
bethanyj28
83731e6528 remove awaits from on entry 2024-02-22 22:06:32 -05:00
bethanyj28
a24b9c0184 handle directories 2024-02-22 21:54:54 -05:00
bethanyj28
31c555afda prettier 2024-02-22 20:31:49 -05:00
bethanyj28
9dea373bba wait for upload to finish 2024-02-22 20:29:42 -05:00
bethanyj28
b956d8a4dd audit, lint, format 2024-02-22 17:55:53 -05:00
bethanyj28
81d5e48db0 update tests 2024-02-22 17:51:15 -05:00
bethanyj28
bc5b3a85ae use on entry 2024-02-22 17:16:32 -05:00
Konrad Pabjan
415c42d27c
Update workflows to use v4 actions (#1652)
* Update releases.yml to use v4 actions

* Bump all workflows
2024-02-01 12:50:47 -05:00
eggyhead
e6c1cd0d8c
Merge pull request #1651 from actions/eggyhead/update-ghescheck-cache-v3.2.4
updating cache version and release to include ghes check change
2024-02-01 09:21:58 -08:00
eggyhead
39621898ff
Merge pull request #1650 from actions/eggyhead/update-ghescheck-artifacts-v2.1.1
updating artifact version and release to include ghes check change
2024-02-01 08:43:20 -08:00
eggyhead
c500de6dea updating cache version and release to include ghes check change
Revert "updating cache version and release to include ghes check change"

This reverts commit 7185d8964514361b7b8dcdba1f9dd54ef24b8bdd.

updating cache version and release to include ghes check change
2024-01-31 21:23:20 +00:00
eggyhead
c4f4f5ae07 updating artifact version and release to include ghes check change 2024-01-31 21:15:11 +00:00
eggyhead
f1d9b4b985
Merge pull request #1648 from actions/eggyhead/ghescheck-updatehosts
Update GHES host check
2024-01-31 10:33:31 -08:00
eggyhead
d134334a38 lint fixes 2024-01-31 16:51:04 +00:00
eggyhead
3b02a6fdc5 updating allowed hosts in isGhes check
updating allowed hosts in artifact ghes check

using dot-prepended ghe host
2024-01-31 16:30:37 +00:00
eggyhead
1fe633e27c
Merge pull request #1627 from actions/eggyhead/hyperlinks-faq
adding hyperlinks for new section of artifacts faq
2024-01-19 08:40:40 -08:00
eggyhead
74bca717aa
Update packages/artifact/docs/faq.md
Consistent spacing in version table

Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2024-01-19 08:37:38 -08:00
eggyhead
bb4505e078 yaml formatting 2024-01-18 17:36:26 +00:00
eggyhead
dbfca0275d removing numbered list 2024-01-18 17:35:08 +00:00
eggyhead
d01372220d bold text 2024-01-18 17:33:39 +00:00
eggyhead
8e13afa0db updating language and adding compatibility table 2024-01-18 17:32:19 +00:00
Rob Herley
4e3b068ce1
Merge pull request #1629 from actions/robherley/update-docs-2.1.0
v2.1.0 Generate docs + update release notes
2024-01-18 11:25:50 -05:00
Rob Herley
017d757dd4
update releases.md 2024-01-18 11:07:25 -05:00
eggyhead
5212cb5ed9
Merge pull request #1628 from actions/eggyhead/update-getartifact-errmessage
updating artifact not found error message
2024-01-18 08:02:52 -08:00
eggyhead
cca96584eb removing newline and camelcasing GitHub 2024-01-18 15:57:21 +00:00
Rob Herley
58c2878fce
generate docs + update releases 2024-01-18 09:51:01 -05:00
Rob Herley
daf23ba955
Merge pull request #1626 from actions/robherley/delete-artifacts
Add methods to delete artifacts
2024-01-18 09:46:52 -05:00
eggyhead
5016db01fe update message for internal method 2024-01-18 04:14:39 +00:00
eggyhead
30942cc4ae updating artifact not found error message to include more information and link to FAQ 2024-01-18 04:10:35 +00:00
eggyhead
98f72c3040 adding hyperlinks for new section of artifacts faq 2024-01-18 04:03:48 +00:00
eggyhead
64c0992283 adding version compatibility and retention to artifacts FAQ 2024-01-18 03:58:06 +00:00
Rob Herley
1852eb2115
more delete examples 2024-01-17 18:58:58 -05:00
Rob Herley
abe0bd98df
delete example 2024-01-17 18:21:25 -05:00
Rob Herley
2ad687a32e
add integration test for delete 2024-01-17 17:54:10 -05:00
Rob Herley
2f5fb3f92b
list for correct backend ids in internal delete 2024-01-17 17:53:49 -05:00
Rob Herley
7fd71a5e13
fix typo 2024-01-17 16:56:34 -05:00
Rob Herley
b62d4c91b6
add public and internal methods to delete artifacts 2024-01-17 16:18:49 -05:00
Rob Herley
1b5a6e26f4
Merge pull request #1623 from actions/robherley/update-cache-release
Updates release notes for @actions/cache v3.2.3
2024-01-10 17:40:55 -05:00
Rob Herley
7c27528ab4
Update RELEASES.md
Updates release notes for @actions/cache v3.2.3
2024-01-10 17:32:52 -05:00
Rob Herley
82e8bc69b8
Merge pull request #1622 from actions/robherley/bump-cache-version
Update cache npm package version
2024-01-10 17:29:16 -05:00
Rob Herley
b9079670eb
Update cache npm package version 2024-01-10 17:05:13 -05:00
Rob Herley
cab491a426
Merge pull request #1378 from MSP-Greg/00-cache-paths-dup
cache - getCacheVersion - dup paths array
2024-01-10 17:01:51 -05:00
Vallie Joseph
0389dcd5e4
updating release notes (#1620) 2024-01-10 10:43:38 -05:00
Ryan Troost
64b2775394
Merge pull request #1613 from actions/srryan/download-v4-client-blob
Update `http.client` to retry transient network hang ups
2024-01-09 16:01:39 -05:00
Vallie Joseph
439cd9b37e appeasing linter 2024-01-09 19:47:25 +00:00
Vallie Joseph
c1ded1dc4d appeasing linter 2024-01-09 19:47:02 +00:00
Vallie Joseph
f37c445bc5 reverting jest 2024-01-09 19:46:17 +00:00
Vallie Joseph
e95bcfe359
Update jest.config.js 2024-01-09 14:44:29 -05:00
Vallie Joseph
7549d1b218 removing info logs 2024-01-09 19:42:04 +00:00
Vallie Joseph
2124ef2413 cleaning up logs 2024-01-09 19:36:26 +00:00
Vallie Joseph
d617670abc updating timer; removing logs 2024-01-09 19:23:57 +00:00
Vallie Joseph
47157e5ade fixing true 2024-01-09 19:05:11 +00:00
Vallie Joseph
8a6aae0a16 updating global timeout 2024-01-09 19:03:41 +00:00
Vallie Joseph
58ec2bdcc9 increase timeout 2024-01-09 18:55:50 +00:00
Vallie Joseph
e19b629130 increasing timeout 2024-01-09 18:45:26 +00:00
Vallie Joseph
d63a8c4d3f updating package-json 2024-01-09 17:13:35 +00:00
Vallie Joseph
67d2d582dc adding delayed response to message body http mock 2024-01-09 16:44:12 +00:00
Vallie Joseph
9d70b8a9fb testing reject after timeout 2024-01-08 15:20:05 +00:00
Vallie Joseph
7f47ffaee2 committing v1 2023-12-22 03:51:47 +00:00
Vallie Joseph
98e1a813db testing ci 2023-12-21 20:22:20 +00:00
Vallie Joseph
0d39975814 updating test with blob timeouts 2023-12-21 18:31:01 +00:00
Vallie Joseph
f482643a6e updating timeout for retries 2023-12-21 15:10:01 +00:00
bethanyj28
ff2c524611 lint and format 2023-12-21 09:25:34 -05:00
srryan
ecb4df89bf remove the exit 2023-12-20 18:23:47 -05:00
srryan
03319fcffa client fixes for retries + logging 2023-12-20 18:08:00 -05:00
srryan
c33724abbd update to http client 2023-12-20 15:45:19 -05:00
Rob Herley
d6f3ee93b8
reject don't throw 2023-12-20 14:37:13 -05:00
Rob Herley
34a411f3c0
add timeout in between data chunks 2023-12-20 13:59:31 -05:00
Rob Herley
2d6ba67518
retry the promise 2023-12-20 13:11:04 -05:00
Yukai Chou
5430c5d848
fix typo (#1611) 2023-12-20 03:16:52 -05:00
James Renaud
bc68ce94ea
chore(docs): add missing job summary documentation (#1574)
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-12-20 03:12:17 -05:00
srryan
78ed49ff88 update error handling abort 2023-12-19 12:46:58 -05:00
srryan
c119fcd773 update optional settings for blob client 2023-12-19 12:02:10 -05:00
srryan
73babeabef add explicit options 2023-12-19 11:49:39 -05:00
Vallie Joseph
bf93b54558 adding logger for blob client and response 2023-12-18 23:09:10 +00:00
srryan
0c0770ce57 cleanup 2023-12-18 17:52:55 -05:00
srryan
571bf222ee update to use blob client over http client 2023-12-18 17:11:14 -05:00
Rob Herley
68f22927e7
Merge pull request #1608 from actions/robherley/artifact-client-import
Update artifact module quick start
2023-12-14 15:46:14 -05:00
Rob Herley
11a2dd3117
update artifact module quick start 2023-12-14 15:38:49 -05:00
Rob Herley
43c63eef65
Merge pull request #1607 from actions/robherley/update-artifact-tests
Update artifact workflow tests
2023-12-13 12:47:12 -05:00
Rob Herley
6a9034d692
update artifact workflow tests 2023-12-13 12:19:14 -05:00
Rob Herley
eff198be5b
Merge pull request #1605 from actions/robherley/usage-message
Better error message for artifact usage limits
2023-12-12 09:49:55 -05:00
Rob Herley
16b786a545
better error message for usage limits 2023-12-11 22:01:08 -05:00
Rob Herley
18ce228b82
Merge pull request #1603 from actions/robherley/network-errors
Add specific messages for network-specific node error codes
2023-12-11 17:34:24 -05:00
Rob Herley
a4bd0f1214
Add specific messages for network-specific node error codes 2023-12-11 17:07:48 -05:00
Rob Herley
37a66ebd47
Merge pull request #1602 from actions/robherley/replace-unzip-lib
[artifact] replace unzipper with unzip-stream
2023-12-11 14:22:07 -05:00
Rob Herley
09249a72d7
push null at end of mocked message 2023-12-11 13:41:11 -05:00
Rob Herley
4c531c013a
update packages 2023-12-11 12:24:41 -05:00
Rob Herley
3c3af56b29
replace unzipper with unzip-stream 2023-12-11 12:15:40 -05:00
Vallie Joseph
950e1711a1
Improve error messages (duplicate artifacts; too many artifacts) (#1600)
* cleaning up error messages

* updating package-json

* updating package-lock

* .

* .

* testing return message

* updating error check

* adding test

* rmv unused var

* updating status code to match conflict message
2023-12-11 11:26:54 -05:00
Jonathan Tamsut
88b76de595
Add back 429 to list of retryable requests (#1599)
* add back 429 to list of retryable requests

* fix lint error
2023-12-08 11:00:44 -08:00
Jonathan Tamsut
55a05255d7
Remove 429 request from list of retry-able status codes (#1597)
* remove 429 request from retryable

* remove 413

* make linter happy
2023-12-07 13:22:17 -08:00
Rob Herley
64d1b104d0
Generate Typescript Docs for @actions/artifact (#1595)
* autogenerate artifact documentation

* clean up comments for better autogen docs
2023-12-07 09:57:20 -08:00
Rob Herley
43ccaf05d9
Merge pull request #1596 from actions/robherley/cleanup-handlers
Cleanup artifact handlers hanging node process
2023-12-06 19:27:30 -05:00
Rob Herley
f732e4cd62
linter 2023-12-06 23:57:33 +00:00
Rob Herley
8c317a0e59
one too many parses 2023-12-06 23:51:16 +00:00
Rob Herley
715b1acc05
cleanup artifact handlers hanging node process 2023-12-06 23:42:07 +00:00
Rob Herley
207747e7af
Merge pull request #1594 from actions/robherley/artifact-docs-updates
@actions/artifact doc updates
2023-12-06 14:30:00 -05:00
Rob Herley
c042a30d3d
Update packages/artifact/CONTRIBUTIONS.md
Co-authored-by: Mattia Richetto <mattiaerre@github.com>
2023-12-06 14:05:38 -05:00
Rob Herley
70cad3f635
Update packages/artifact/README.md
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-12-06 13:19:38 -05:00
Rob Herley
1f87038676
Update packages/artifact/README.md
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-12-06 13:19:32 -05:00
Rob Herley
8cd4434523
mention job limit 2023-12-06 17:30:13 +00:00
Rob Herley
2e6c9a1f14
pr feedback 2023-12-06 17:28:03 +00:00
Rob Herley
c08a7d1b2e
Update packages/artifact/README.md
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-12-06 12:19:49 -05:00
Rob Herley
49ef8b93a8
fix typo 2023-12-06 15:38:59 +00:00
Rob Herley
19d4d9d3b2
releases.md: link to breaking v2 changes 2023-12-06 14:52:49 +00:00
Rob Herley
b43b97985c
Update packages/artifact/docs/faq.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:31:55 -05:00
Rob Herley
23fb8c4782
Update packages/artifact/README.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:31:09 -05:00
Rob Herley
dc515188a8
Update packages/artifact/README.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:30:53 -05:00
Rob Herley
79ace256d6
Update packages/artifact/README.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:30:35 -05:00
Rob Herley
68958c2486
Update packages/artifact/README.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:30:20 -05:00
Rob Herley
0c9621922e
add faq, update releases 2023-12-06 04:22:18 +00:00
Rob Herley
9b31b03496
more readme updates 2023-12-06 04:10:46 +00:00
Rob Herley
befa19f3a8
initialize artifact client as default export 2023-12-06 04:00:07 +00:00
Rob Herley
e27efe5620
readme & error updates 2023-12-05 21:55:22 +00:00
Rob Herley
449b28aee2
update contributing docs 2023-12-05 21:10:48 +00:00
Rob Herley
04945c6048
Merge pull request #1593 from actions/robherley/api-consistency
Consistent error behavior for Artifact methods
2023-12-05 15:22:16 -05:00
Rob Herley
5f152b798e
Update artifact-tests.yml 2023-12-05 13:54:14 -05:00
Rob Herley
c390199be6
Update artifact-tests.yml 2023-12-05 13:51:51 -05:00
Rob Herley
a3053b5cc2
fix typo 2023-12-05 18:47:37 +00:00
Rob Herley
b9872153b8
update GHES warning behavior 2023-12-05 18:42:36 +00:00
Rob Herley
ce9eae0785
consistent promise behavior for download artifact 2023-12-05 18:35:26 +00:00
Rob Herley
d3c5f358d1
consistent promise behavior for get artifact 2023-12-05 17:56:18 +00:00
Rob Herley
75a3586061
consistent promise behavior for upload artifact 2023-12-05 17:35:46 +00:00
Rob Herley
8ac8bf1d3d
Merge pull request #1592 from actions/robherley/get-list-artifact-updates
Additional get/list artifact updates
2023-12-04 12:40:59 -05:00
Rob Herley
141b3509e4
update import 2023-12-03 21:13:55 +00:00
Rob Herley
790e6f7194
more docs 2023-12-03 20:52:36 +00:00
Rob Herley
ef454f0991
add tests for list-artifacts 2023-12-03 20:48:33 +00:00
Rob Herley
86ce0b159a
get artifact tests 2023-12-03 19:43:37 +00:00
Rob Herley
c11a7cdeac
wip 2023-12-03 06:24:49 +00:00
Rob Herley
c94ca49c9c
ability to filter artifacts by latest 2023-12-03 05:01:20 +00:00
Rob Herley
fa7657714a
fix import 2023-12-02 21:34:07 -05:00
Rob Herley
c1f9d37323
updates to get/list artifacts 2023-12-02 21:18:22 -05:00
Rob Herley
8f1c589e25
Merge pull request #1591 from actions/robherley/artifact-internal-apis
Implement internal APIs for list/get/download artifacts
2023-12-01 16:17:26 -05:00
Rob Herley
281697ecbe
fix test expectations 2023-12-01 16:34:27 +00:00
Rob Herley
a59f976dd4
minor fixes 2023-12-01 09:05:46 -05:00
Rob Herley
57db7a6302
more debug info 2023-12-01 03:04:10 +00:00
Rob Herley
4789a46578
make FindOptions interface more user friendly 2023-12-01 02:15:25 +00:00
Rob Herley
32549e8197
update download-artifact tests for public and internal impl 2023-12-01 01:32:45 +00:00
Rob Herley
22b7aeb707
some test updates 2023-12-01 00:31:27 +00:00
Rob Herley
e9d6649a14
consume new pb wrappers 2023-11-30 19:10:07 +00:00
Rob Herley
695bf98f84
rewrite artifacts client to have public and internal implementations 2023-11-30 03:47:04 +00:00
Tingluo Huang
0787a93181
Merge pull request #1588 from sshmaxime/main
Add RUN_ATTEMPT to `@actions/github` Context class
2023-11-28 10:43:43 -05:00
Maxime Aubanel
faa425440f
Add RUN_ATTEMPT to Github context 2023-11-28 16:32:10 +01:00
Rob Herley
0407266511
Merge pull request #1584 from actions/robherley/upload-v4-improvements
Increase Artifact v4 upload speed
2023-11-20 16:30:50 -05:00
Rob Herley
a920781ca9
fix results url construction 2023-11-20 18:06:44 +00:00
Rob Herley
9e7201ff5b
audit fix 2023-11-20 16:51:13 +00:00
Rob Herley
3a610e848c
linter 2023-11-20 16:46:08 +00:00
Rob Herley
606ebdcf6d
extra log line for debug 2023-11-20 16:27:35 +00:00
Rob Herley
7b01731091
increase upload concurrency based on cpus, adjust highWaterMark, specify compression level 2023-11-20 15:03:58 +00:00
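
A rough sketch of the three knobs this commit tunes; the formula, sizes, and names here are illustrative assumptions, not the package's exact heuristics.

```js
const os = require('os')
const zlib = require('zlib')

// Derive upload concurrency from the number of CPUs, with sane bounds
// (illustrative formula, not the package's exact heuristic)
const concurrency = Math.min(32, Math.max(4, os.cpus().length * 2))

// A larger highWaterMark lets Node streams buffer bigger chunks per read
const highWaterMark = 8 * 1024 * 1024 // 8 MiB, assumed for illustration

// zlib compression level trades CPU time against transfer size
const compressionLevel = zlib.constants.Z_DEFAULT_COMPRESSION

console.log({concurrency, highWaterMark, compressionLevel})
```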
Nikolai Laevskii
20f826bfe7
Add platform info utilities to @actions/core (#1551)
* Introduce platform utilities into @actions/core

* Add tests for the platform helper

* Update README.md

* Update README.md with more details
2023-11-14 14:15:26 -05:00
Rob Herley
fe3e7ce9a7
Merge pull request #1563 from actions/robherley/artifact-v4/sha256
Use sha256 instead of md5 for artifact v4 integrity hash
2023-10-16 13:31:00 -04:00
Rob Herley
8cd02dfabc
audit fix 2023-10-16 16:27:26 +00:00
Rob Herley
82474125c8
use sha256 instead of md5 for artifact v4 integrity hash 2023-10-16 16:20:24 +00:00
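
Streaming SHA-256 in Node looks like the sketch below; this shows the general technique, not the artifact package's exact code.

```js
const crypto = require('crypto')
const fs = require('fs')

// Hash a file as a stream so large artifact archives never sit in memory
function sha256OfFile(filePath) {
  return new Promise((resolve, reject) => {
    const hash = crypto.createHash('sha256')
    fs.createReadStream(filePath)
      .on('error', reject)
      .pipe(hash)
      .on('finish', () => resolve(hash.digest('hex')))
  })
}

sha256OfFile('artifact.zip').then(digest => console.log(digest))
```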
Tatyana Kostromskaya
494f12bcd9
Update dependencies in github package (#1553)
* Update octokit package

* define type for function

* fix linter

* Update github package to latest

* Update RELEASES.md
2023-10-10 16:04:42 +02:00
Tatyana Kostromskaya
797f48fcfa
Update release notes for http-client@2.2.0 (#1549) 2023-10-06 16:03:00 +02:00
Tatyana Kostromskaya
c8d1588732
Merge pull request #1547 from actions/takost/update-http-client
Add function to return proxy agent dispatcher for compatibility with latest `octokit` packages
2023-10-06 14:47:16 +02:00
Tatyana Kostromskaya
13e0ce9cf7 resolve comments 2023-10-06 12:39:29 +00:00
Tatyana Kostromskaya
eae1b66cb0 fix audit 2023-10-05 16:41:02 +02:00
Tatyana Kostromskaya
129f884271 fix format 2023-10-05 16:34:31 +02:00
Tatyana Kostromskaya
0faced6a0b Add function to return proxy agent dispatcher for compatibility with latest octokit 2023-10-05 16:20:26 +02:00
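
Newer Octokit versions are fetch-based and take an undici dispatcher instead of a classic `http.Agent`. A usage sketch, assuming the `getAgentDispatcher` method this change describes:

```js
const {HttpClient} = require('@actions/http-client')

const http = new HttpClient()

// Assumed per this change: returns an undici ProxyAgent when the proxy
// env vars (https_proxy/no_proxy) apply to the URL, otherwise undefined
const dispatcher = http.getAgentDispatcher('https://api.github.com')

console.log(dispatcher ? 'requests will go through the proxy' : 'no proxy configured')
```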
Patrick Ellis
0d63834474
Merge pull request #1541 from actions/pje/upgrade-codeql-actions-to-v2
Upgrade codeql actions to v2
2023-09-27 16:14:48 -04:00
Patrick Ellis
8f032d304a
Upgrade codeql actions to v2
Currently we're using v1, and there have been some important changes since then.

In particular, the latest version, v2.14.6, contains an important security patch:

> The CodeQL CLI no longer supports the `SEMMLE_JAVA_ARGS` environment variable. All previous versions of the CodeQL CLI perform command substitution on the `SEMMLE_JAVA_ARGS` value (for example, replacing `'$(echo foo)'` with `'foo'`) when starting a new Java virtual machine, which, depending on the execution environment, may have security implications. Users are advised to check their environments for possible `SEMMLE_JAVA_ARGS` misuse.

See the [codeql-cli-binaries release notes](https://github.com/github/codeql-cli-binaries/releases/tag/v2.14.4) for full details.
2023-09-27 15:18:59 -04:00
Tatyana Kostromskaya
28b09e224f
Merge pull request #1526 from actions/takost/upd-dependencies
Update dependencies to latest
2023-09-27 12:37:10 +02:00
Tatyana Kostromskaya
111c95866e fix test + update semver 2023-09-26 11:10:18 +00:00
Tatyana Kostromskaya
ddc9c52eb6 revert octokit changes 2023-09-26 11:05:37 +00:00
Tatyana Kostromskaya
6d37c6eb2b try to fix tests 2023-09-15 15:04:21 +00:00
Tatyana Kostromskaya
6477ef1460 tests 2023-09-15 13:54:28 +00:00
Tatyana Kostromskaya
2e5b10e3bd fix tests 2023-09-15 13:45:26 +00:00
Tatyana Kostromskaya
8c1e6a00f0 try to fix test 2023-09-15 13:28:29 +00:00
Tatyana Kostromskaya
b2d5fa216f update github package 2023-09-14 14:32:08 +00:00
Luke Tomlinson
c5c786523e
@actions/core v1.10.1 (#1529) 2023-09-11 10:45:23 -04:00
Sean Burgess
63c648f3c2
Fix error message reference (#1511) 2023-09-11 10:25:17 -04:00
Tatyana Kostromskaya
ce31408ff5 Update dependencies 2023-09-08 14:29:27 +00:00
Tatyana Kostromskaya
e26febd988
Merge pull request #1508 from actions/takost/update-workflows-to-node20
Update workflows to node20
2023-09-01 11:55:53 +02:00
Tatyana Kostromskaya
b051b4bada . 2023-08-29 14:56:32 +00:00
Tatyana Kostromskaya
a08d666c78 . 2023-08-29 14:23:58 +00:00
Tatyana Kostromskaya
83bb7cdeef . 2023-08-29 14:09:20 +00:00
Tatyana Kostromskaya
b552972717 revert 2023-08-29 11:19:06 +00:00
Tatyana Kostromskaya
e3b0601228 test 2023-08-29 10:43:51 +00:00
Tatyana Kostromskaya
0956e634df test 2023-08-29 10:27:11 +00:00
Tatyana Kostromskaya
c171cf52fb upd 2023-08-28 17:09:50 +02:00
Tatyana Kostromskaya
2f1b34f165 test tests 2023-08-28 16:59:29 +02:00
Tatyana Kostromskaya
b61854c5ca update workflows to node20 2023-08-28 16:40:06 +02:00
Bethany
3d652d3133
Merge pull request #1505 from actions/bethanyj28/upload-tests
Add tests for `upload-artifact.ts`
2023-08-24 09:29:01 -04:00
Rob Herley
c3df0928e2
Merge pull request #1502 from actions/robherley/download-artifact
[Artifacts] Support streaming download of artifact archive from blob storage
2023-08-24 09:27:27 -04:00
Rob Herley
9d756b2bc9
linter 2023-08-24 09:16:35 -04:00
Rob Herley
67c3b7a45c
add tests for download artifact 2023-08-23 23:18:03 -04:00
Bethany
3963c722d8 merge download changes and lint 2023-08-23 14:02:50 -07:00
Bethany
3b44a4cc23 prettier 2023-08-23 13:55:26 -07:00
Bethany
03a876f0a7 add tests for upload 2023-08-23 13:54:31 -07:00
Bethany
62f943c0cc
Merge pull request #1503 from actions/bethanyj28/download-artifact
Get a single artifact by name and download to `GITHUB_WORKSPACE`
2023-08-23 14:11:06 -04:00
Bethany
291200d54f include get artifact changes 2023-08-23 10:40:25 -07:00
Bethany
06e751600e move constants to retry-options 2023-08-23 10:36:33 -07:00
Bethany
4b6a4d80e1 use inline eslint disable 2023-08-23 10:12:06 -07:00
Bethany
b2da9aa12c use string interpolation 2023-08-23 07:35:23 -07:00
Bethany
88f749f686 lint 2023-08-23 07:28:17 -07:00
Bethany
b4f8e602b2 remove folder option in favor of path 2023-08-23 07:21:01 -07:00
Bethany
ced07aa89c Use options to specify download folder 2023-08-23 06:47:51 -07:00
Bethany
6adf053d36 prettier 2023-08-22 11:47:14 -07:00
Bethany
671bf1ebd5 use GITHUB_WORKSPACE as default download dir 2023-08-22 11:44:38 -07:00
Bethany
dd26bb1149 use require 2023-08-22 11:33:00 -07:00
Bethany
81a802e7e0 lint 2023-08-22 10:06:40 -07:00
Bethany
4214a1ff24 update dependencies and prettier 2023-08-22 09:57:14 -07:00
Bethany
0555a5f458 add get-artifact logic 2023-08-22 09:17:43 -07:00
Rob Herley
3aaff6685b
cleanup 2023-08-21 17:47:17 -04:00
Rob Herley
9b383229c1
add download apis to stream zip from blob storage 2023-08-21 21:23:54 +00:00
Konrad Pabjan
7b617c260d
[Artifacts] @actions/artifact list artifact functionality + download interface setup (#1495)
* actions/artifact preparation for download-artifact v4

* Test matrix strategy

* Fix needs dependency

* Improve list artifact test

* Fix typo

* Fix variables

* Cleanup download-all interfaces

* Fix tsc error

* Simplify to just name instead of artifactName

* Simplify to id instead of ArtifactId

* PR cleanup
2023-08-17 14:40:33 -04:00
Konrad Pabjan
20afb1a9fc
[Artifacts] Add tests for E2E artifact upload (#1497)
* Add tests for E2E artifact upload

* Trigger Build

* Extra debug logs

* Debug dumping GitHub Context

* More logging

* Minor cleanup

* Trigger Build

* Unique artifact name

* Fix typo

* Fix

* Try using github-script

* Potential fix

* Cleanup

* More cleanup
2023-08-17 12:32:55 -04:00
Konrad Pabjan
c9dab8c79d
[Artifacts] Save md5 hash for each artifact upload (#1494)
* Hash artifact upload using md5

* Add imports

* Small tweaks

* PR feedback

* PR Feedback
2023-08-15 13:39:57 -04:00
Konrad Pabjan
45c49b09df
[Artifacts] zip creation + blob storage upload functionality (#1488)
* Artifact zip creation + blob storage upload functionality

* Fix lint

* PR feedback
2023-08-10 15:28:41 -04:00
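
Creating a zip as a stream, so it can be piped straight to a blob upload, can be sketched with the popular `archiver` package; treat this as an illustration of the approach rather than the package's exact implementation.

```js
const archiver = require('archiver')
const fs = require('fs')

// Build a zip archive as a readable stream from a list of source files
function createZipStream(files) {
  const archive = archiver('zip', {zlib: {level: 6}})
  for (const {sourcePath, destinationPath} of files) {
    archive.append(fs.createReadStream(sourcePath), {name: destinationPath})
  }
  archive.finalize()
  return archive
}

// For illustration we pipe to disk; the real flow pipes to blob storage
createZipStream([{sourcePath: 'hello.txt', destinationPath: 'hello.txt'}])
  .pipe(fs.createWriteStream('artifact.zip'))
```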
Bethany
ab78839e86
Merge pull request #1487 from actions/bethanyj28/add-artifact-api-logic
Utilize client to create and finalize artifact
2023-08-10 10:19:24 -04:00
Bethany
f03b6d639f update import 2023-08-09 17:50:46 -07:00
Bethany
58858b5078 don't use non-null assertions 2023-08-09 17:48:53 -07:00
Bethany
188abfc20b implement feedback 2023-08-09 17:42:14 -07:00
Bethany
2f42c127c7 update tests 2023-08-09 13:20:06 -07:00
Bethany
4dda3ab8a0 move getExpiration to upload-artifact 2023-08-09 13:12:30 -07:00
Bethany
4b219f79f3 Add tests for backend id fetch 2023-08-09 12:29:43 -07:00
Bethany
08d6314f7c prettier 2023-08-09 12:09:17 -07:00
Bethany
b851b70474 catch errors at the root, remove unneccessary disabled rule 2023-08-09 12:08:43 -07:00
Bethany
e8fb71c4bb lint 2023-08-09 11:34:18 -07:00
Bethany
73ad88882e utilize client, fetch IDs 2023-08-09 11:26:33 -07:00
Bethany
92695f58da
Merge pull request #1486 from actions/bethanyj28/add-twirp-client
Add twirp client
2023-08-09 10:46:24 -04:00
Bethany
760f3fd3d1
Update packages/artifact/src/internal/shared/config.ts
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-08-09 10:30:50 -04:00
Bethany
c6117995d3
Update packages/artifact/src/internal/shared/config.ts
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-08-09 10:30:44 -04:00
Bethany
24da3e2d1c lint 2023-08-09 07:10:43 -07:00
Bethany
deda97d5e6 don't add extra line breaks 2023-08-09 07:02:06 -07:00
Bethany
cfad1451e9 Update generated files to not use bigint 2023-08-09 07:00:27 -07:00
Bethany
c0684c5add prettier 2023-08-08 13:19:43 -07:00
Bethany
ad9b955fe9 remove unused package 2023-08-08 12:51:54 -07:00
Bethany
1718e0d97c revert target to es6 2023-08-08 12:49:45 -07:00
Bethany
e85cd96d85 tests and fix bug for retry 2023-08-08 12:49:05 -07:00
Bethany
bc24adbfd6
Merge pull request #1481 from actions/bethanyj28/add-twirp-definitions
[Artifacts] Add artifact Twirp API definitions
2023-08-08 10:36:34 -04:00
Bethany
af1621025d wip 2023-08-07 16:26:07 -07:00
Bethany
6552cb9722 wip 2023-08-07 14:24:58 -07:00
Chad Kimes
f74ff155bd
Add option for concurrent cache downloads with timeout (#1484)
* Add option for concurrent cache downloads with timeout

* Add release notes

* Fix lint
2023-08-07 13:25:56 -04:00
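
The general shape of "concurrent download with timeout" is racing each ranged request against an abort timer. A generic sketch follows; names and sizes are illustrative, not @actions/cache's API.

```js
// Download one byte range, aborting if it exceeds the timeout
async function downloadSegment(url, start, end, timeoutMs = 30000) {
  const controller = new AbortController()
  const timer = setTimeout(() => controller.abort(), timeoutMs)
  try {
    const res = await fetch(url, {
      headers: {range: `bytes=${start}-${end}`},
      signal: controller.signal
    })
    return Buffer.from(await res.arrayBuffer())
  } finally {
    clearTimeout(timer)
  }
}

// Fetch all segments of a blob concurrently and reassemble them in order
async function downloadConcurrently(url, totalSize, segmentSize = 8 * 1024 * 1024) {
  const tasks = []
  for (let start = 0; start < totalSize; start += segmentSize) {
    const end = Math.min(start + segmentSize - 1, totalSize - 1)
    tasks.push(downloadSegment(url, start, end))
  }
  return Buffer.concat(await Promise.all(tasks))
}
```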
Bethany
a66e49ec8a Merge branch 'bethanyj28/add-twirp-definitions' into bethanyj28/add-twirp-client 2023-08-07 09:10:06 -07:00
Bethany
d4c2fa4c68 add generated to eslintignore 2023-08-07 09:01:14 -07:00
Bethany
8c9ab93da7 Merge remote-tracking branch 'origin' into bethanyj28/add-twirp-definitions 2023-08-07 08:56:08 -07:00
Bethany
3773ef22b1 prettier and add generated files to prettierignore 2023-08-07 08:55:42 -07:00
Bethany
80e4680ac8 Merge branch 'bethanyj28/add-twirp-definitions' into bethanyj28/add-twirp-client 2023-08-07 08:48:20 -07:00
Bethany
c608703ecf revert root tsconfig.json 2023-08-07 08:48:15 -07:00
Bethany
66ac937f2f target es2020 2023-08-07 08:45:28 -07:00
Bethany
efcab31d38 pass in http client to constructor 2023-08-07 08:43:39 -07:00
Bethany
4c6d88f93a Start writing tests 2023-08-04 13:00:58 -07:00
Chad Kimes
19e0016878
actions/http-client 2.1.1 release (#1483) 2023-08-04 15:00:50 -04:00
Chad Kimes
2820b17d9d
Add readBodyBuffer method to HttpClientResponse (#1475)
* Add readBodyBuffer method to HttpClientResponse

* Implement method in other package tests

* Make method optional to satisfy the test process
2023-08-04 14:35:26 -04:00
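
Usage of the new method is straightforward; note the commit makes it optional on the interface, hence the guarded call below.

```js
const {HttpClient} = require('@actions/http-client')

async function main() {
  const http = new HttpClient('toolkit-example')
  const res = await http.get('https://example.com/file.bin')

  // readBodyBuffer resolves the raw response bytes instead of a utf-8
  // string; it is optional on the interface, so guard the call
  const buf = await res.readBodyBuffer?.()
  console.log(buf ? `received ${buf.length} bytes` : 'readBodyBuffer unavailable')
}

main()
```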
Bethany
8a5343d54a add twirp client 2023-08-04 09:23:14 -07:00
Bethany
3ebee1e8b4 package-lock.json conflict 2023-08-04 07:12:36 -07:00
Konrad Pabjan
7da3ac6eda
[Artifacts] Name validation + zip specification creation (#1482)
* Artifact name validation + zip specification creation

* Fix linting issues

* Grammar fix

* Update test description
2023-08-04 09:53:42 -04:00
Vallie Joseph
2461056696
Audit Fix (#1480)
* fixing audit failures

* replacing lerna bootstrap with npm command

* audit fix for cache and tool-cache

* updating tunnel

* upgrading core packages

* re-adding tunnel as prod dep

* updating dependencies

* updating exec deps

* updating exec io package

* .

* Revert

* updating packages

* adding core as dep

* updating learna config

* updating lerna commands

* Removing audit failing packages in cache + tool-cache

* updating contribution bootstrap description

* updating libraries

* prettier lint

* hiding stricter rules

* updating prettier command

* Removing unknown flag

* Adding eslint prettier

* ignoring sym links

* updating ignore path

* updating prettier rules

* changing prettier + github ver

* updating ts and ignores

* Revert ts

* Adding unknown ignores

* downgrading lerna

* .

* adding nx

* Adding lint auto lint rules

* updating eslint ignore for glob packages

* Adding subdirs to ignore

* adding flag for ignore pattern in linter

* Expanding ignore regex

* Adding ignore rules

* adding another ignore pattern to tsconfig eslint

* adding ignore pattern to eslintrc

* syncing package-json

* updating traverse

* .

* test adding core and http client to base package

* running npm ci

* adding tsconfig paths

* adding base URL

* Adding explicit path to core and http-client

* editing tsc call

* updating artifact packages

* force build

* updating lock file version

* updating lock file version

* upgrading node version

* Adding babel traverse back

* fixing build issue

* fixing typescript ver

* updating package json

* Adding ignore for artifact test

* adding ignore to flags

* unlink after test completes

* cleanup

* merge + package edit
2023-08-03 16:36:11 -04:00
Bethany
3749c51d21 npm install 2023-08-03 12:59:51 -07:00
Bethany
769c896931 add artifact api twirp definitions 2023-08-03 12:43:45 -07:00
Konrad Pabjan
c4f5ce2665
[Artifacts] Prepare for v2.0.0 of @actions/artifact (#1479)
* Prepare for v2.0.0 of @actions/artifact

* Run prettier

* temporary disable unused vars
2023-08-03 13:34:41 -04:00
MSP-Greg
0747ab3577
cache - getCacheVersion - dup paths array 2023-03-20 18:29:46 -05:00
226 changed files with 28634 additions and 28324 deletions

@@ -2,3 +2,4 @@ node_modules/
 packages/*/node_modules/
 packages/*/lib/
 packages/glob/__tests__/_temp
+packages/*/src/generated/*/

@@ -1,6 +1,13 @@
 {
-  "plugins": ["jest", "@typescript-eslint"],
-  "extends": ["plugin:github/recommended"],
+  "plugins": [
+    "jest",
+    "@typescript-eslint",
+    "prettier"
+  ],
+  "extends": [
+    "plugin:github/recommended",
+    "plugin:prettier/recommended"
+  ],
   "parser": "@typescript-eslint/parser",
   "parserOptions": {
     "ecmaVersion": 9,
@@ -8,14 +15,33 @@
     "project": "./tsconfig.eslint.json"
   },
   "rules": {
+    "prettier/prettier": [
+      "error",
+      {
+        "endOfLine": "auto"
+      }
+    ],
     "eslint-comments/no-use": "off",
+    "no-constant-condition": ["error", { "checkLoops": false }],
     "github/no-then": "off",
     "import/no-namespace": "off",
     "no-shadow": "off",
     "no-unused-vars": "off",
+    "i18n-text/no-en": "off",
+    "filenames/match-regex": "off",
+    "import/no-commonjs": "off",
+    "import/named": "off",
+    "no-sequences": "off",
+    "import/no-unresolved": "off",
     "no-undef": "off",
+    "no-only-tests/no-only-tests": "off",
     "@typescript-eslint/no-unused-vars": "error",
-    "@typescript-eslint/explicit-member-accessibility": ["error", {"accessibility": "no-public"}],
+    "@typescript-eslint/explicit-member-accessibility": [
+      "error",
+      {
+        "accessibility": "no-public"
+      }
+    ],
     "@typescript-eslint/no-require-imports": "error",
     "@typescript-eslint/array-type": "error",
     "@typescript-eslint/await-thenable": "error",
@@ -23,8 +49,16 @@
     "camelcase": "off",
     "@typescript-eslint/camelcase": "off",
     "@typescript-eslint/consistent-type-assertions": "off",
-    "@typescript-eslint/explicit-function-return-type": ["error", {"allowExpressions": true}],
-    "@typescript-eslint/func-call-spacing": ["error", "never"],
+    "@typescript-eslint/explicit-function-return-type": [
+      "error",
+      {
+        "allowExpressions": true
+      }
+    ],
+    "@typescript-eslint/func-call-spacing": [
+      "error",
+      "never"
+    ],
     "@typescript-eslint/naming-convention": [
       "error",
       {
@@ -56,12 +90,15 @@
     "@typescript-eslint/prefer-string-starts-ends-with": "error",
     "@typescript-eslint/promise-function-async": "error",
     "@typescript-eslint/require-array-sort-compare": "error",
+    "@typescript-eslint/restrict-plus-operands": "error",
     "semi": "off",
-    "@typescript-eslint/semi": ["error", "never"],
+    "@typescript-eslint/semi": [
+      "error",
+      "never"
+    ],
     "@typescript-eslint/type-annotation-spacing": "error",
     "@typescript-eslint/unbound-method": "error"
   },
+  "ignorePatterns": "packages/glob/__tests__/_temp/**/",
   "env": {
     "node": true,
     "es6": true,

@@ -28,7 +28,7 @@ Note that before a PR will be accepted, you must ensure:
 ### Useful Scripts
-- `npm run bootstrap` This runs `lerna bootstrap` which will install dependencies in this repository's packages and cross-link packages where necessary.
+- `npm run bootstrap` This runs `lerna exec -- npm install` which will install dependencies in this repository's packages and cross-link packages where necessary.
 - `npm run build` This compiles TypeScript code in each package (this is especially important if one package relies on changes in another when you're running tests). This is just an alias for `lerna run tsc`.
 - `npm run format` This checks that formatting has been applied with Prettier.
 - `npm test` This runs all Jest tests in all packages in this repository.
@@ -43,7 +43,7 @@ Note that before a PR will be accepted, you must ensure:
 1. In a new branch, create a new Lerna package:
 ```console
-$ npm run create-package new-package
+$ npm run new-package [name]
 ```
 This will ask you some questions about the new package. Start with `0.0.0` as the first version (look generally at some of the other packages for how the package.json is structured).

@@ -10,8 +10,8 @@ on:
       - '**.md'
 jobs:
-  build:
-    name: Build
+  upload:
+    name: Upload
     strategy:
       matrix:
@@ -22,17 +22,12 @@ jobs:
     steps:
     - name: Checkout
-      uses: actions/checkout@v3
-    - name: Set Node.js 16.x
-      uses: actions/setup-node@v3
+      uses: actions/checkout@v4
+    - name: Set Node.js 20.x
+      uses: actions/setup-node@v4
       with:
-        node-version: 16.x
-    # In order to upload & download artifacts from a shell script, certain env variables need to be set that are only available in the
-    # node context. This runs a local action that gets and sets the necessary env variables that are needed
-    - name: Set env variables
-      uses: ./packages/artifact/__tests__/ci-test-action/
+        node-version: 20.x
     # Need root node_modules because certain npm packages like jest are configured for the entire repository and it won't be possible
     # without these to just compile the artifacts package
@@ -45,51 +40,155 @@ jobs:
         npm run tsc
       working-directory: packages/artifact
-    - name: Set artifact file contents
-      shell: bash
-      run: |
-        echo "non-gzip-artifact-content=hello" >> $GITHUB_ENV
-        echo "gzip-artifact-content=Some large amount of text that has a compression ratio that is greater than 100%. If greater than 100%, gzip is used to upload the file" >> $GITHUB_ENV
-        echo "empty-artifact-content=_EMPTY_" >> $GITHUB_ENV
     - name: Create files that will be uploaded
       run: |
         mkdir artifact-path
-        echo '${{ env.non-gzip-artifact-content }}' > artifact-path/world.txt
-        echo '${{ env.gzip-artifact-content }}' > artifact-path/gzip.txt
-        touch artifact-path/empty.txt
+        echo -n 'hello from file 1' > artifact-path/first.txt
+        echo -n 'hello from file 2' > artifact-path/second.txt
-    # We're using node -e to call the functions directly available in the @actions/artifact package
-    - name: Upload artifacts using uploadArtifact()
-      run: |
-        node -e "Promise.resolve(require('./packages/artifact/lib/artifact-client').create().uploadArtifact('my-artifact-1',['artifact-path/world.txt'], process.argv[1]))" "${{ github.workspace }}"
-        node -e "Promise.resolve(require('./packages/artifact/lib/artifact-client').create().uploadArtifact('my-artifact-2',['artifact-path/gzip.txt'], process.argv[1]))" "${{ github.workspace }}"
-        node -e "Promise.resolve(require('./packages/artifact/lib/artifact-client').create().uploadArtifact('my-artifact-3',['artifact-path/empty.txt'], process.argv[1]))" "${{ github.workspace }}"
-    - name: Download artifacts using downloadArtifact()
-      run: |
-        mkdir artifact-1-directory
-        node -e "Promise.resolve(require('./packages/artifact/lib/artifact-client').create().downloadArtifact('my-artifact-1','artifact-1-directory'))"
-        mkdir artifact-2-directory
-        node -e "Promise.resolve(require('./packages/artifact/lib/artifact-client').create().downloadArtifact('my-artifact-2','artifact-2-directory'))"
-        mkdir artifact-3-directory
-        node -e "Promise.resolve(require('./packages/artifact/lib/artifact-client').create().downloadArtifact('my-artifact-3','artifact-3-directory'))"
-    - name: Verify downloadArtifact()
-      shell: bash
-      run: |
-        packages/artifact/__tests__/test-artifact-file.sh "artifact-1-directory/artifact-path/world.txt" "${{ env.non-gzip-artifact-content }}"
-        packages/artifact/__tests__/test-artifact-file.sh "artifact-2-directory/artifact-path/gzip.txt" "${{ env.gzip-artifact-content }}"
-        packages/artifact/__tests__/test-artifact-file.sh "artifact-3-directory/artifact-path/empty.txt" "${{ env.empty-artifact-content }}"
-    - name: Download artifacts using downloadAllArtifacts()
-      run: |
-        mkdir multi-artifact-directory
-        node -e "Promise.resolve(require('./packages/artifact/lib/artifact-client').create().downloadAllArtifacts('multi-artifact-directory'))"
-    - name: Verify downloadAllArtifacts()
-      shell: bash
-      run: |
-        packages/artifact/__tests__/test-artifact-file.sh "multi-artifact-directory/my-artifact-1/artifact-path/world.txt" "${{ env.non-gzip-artifact-content }}"
-        packages/artifact/__tests__/test-artifact-file.sh "multi-artifact-directory/my-artifact-2/artifact-path/gzip.txt" "${{ env.gzip-artifact-content }}"
-        packages/artifact/__tests__/test-artifact-file.sh "multi-artifact-directory/my-artifact-3/artifact-path/empty.txt" "${{ env.empty-artifact-content }}"
+    - name: Upload Artifacts
+      uses: actions/github-script@v7
+      with:
+        script: |
+          const {default: artifact} = require('./packages/artifact/lib/artifact')
+
+          const artifactName = 'my-artifact-${{ matrix.runs-on }}'
+          console.log('artifactName: ' + artifactName)
+          const fileContents = ['artifact-path/first.txt','artifact-path/second.txt']
+
+          const uploadResult = await artifact.uploadArtifact(artifactName, fileContents, './')
+          console.log(uploadResult)
+
+          const size = uploadResult.size
+          const id = uploadResult.id
+
+          console.log(`Successfully uploaded artifact ${id}`)
+
+          try {
+            await artifact.uploadArtifact(artifactName, fileContents, './')
+            throw new Error('should have failed second upload')
+          } catch (err) {
+            console.log('Successfully blocked second artifact upload')
+          }
+  verify:
+    name: Verify and Delete
+    runs-on: ubuntu-latest
+    needs: [upload]
+    steps:
+    - name: Checkout
+      uses: actions/checkout@v4
+    - name: Set Node.js 20.x
+      uses: actions/setup-node@v4
+      with:
+        node-version: 20.x
+    # Need root node_modules because certain npm packages like jest are configured for the entire repository and it won't be possible
+    # without these to just compile the artifacts package
+    - name: Install root npm packages
+      run: npm ci
+    - name: Compile artifact package
+      run: |
+        npm ci
+        npm run tsc
+      working-directory: packages/artifact
+    - name: List and Download Artifacts
+      uses: actions/github-script@v7
+      with:
+        script: |
+          const {default: artifactClient} = require('./packages/artifact/lib/artifact')
+          const {readFile} = require('fs/promises')
+          const path = require('path')
+
+          const findBy = {
+            repositoryOwner: process.env.GITHUB_REPOSITORY.split('/')[0],
+            repositoryName: process.env.GITHUB_REPOSITORY.split('/')[1],
+            token: '${{ secrets.GITHUB_TOKEN }}',
+            workflowRunId: process.env.GITHUB_RUN_ID
+          }
+
+          const listResult = await artifactClient.listArtifacts({latest: true, findBy})
+          console.log(listResult)
+          const artifacts = listResult.artifacts
+          const expected = [
+            'my-artifact-ubuntu-latest',
+            'my-artifact-windows-latest',
+            'my-artifact-macos-latest'
+          ]
+
+          const foundArtifacts = artifacts.filter(artifact =>
+            expected.includes(artifact.name)
+          )
+
+          if (foundArtifacts.length !== 3) {
+            console.log('Unexpected length of found artifacts', foundArtifacts)
+            throw new Error(
+              `Expected 3 artifacts but found ${foundArtifacts.length} artifacts.`
+            )
+          }
+
+          console.log('Successfully listed artifacts that were uploaded')
+
+          const files = [
+            {name: 'artifact-path/first.txt', content: 'hello from file 1'},
+            {name: 'artifact-path/second.txt', content: 'hello from file 2'}
+          ]
+
+          for (const artifact of foundArtifacts) {
+            const {downloadPath} = await artifactClient.downloadArtifact(artifact.id, {
+              path: artifact.name,
+              findBy
+            })
+
+            console.log('Downloaded artifact to:', downloadPath)
+
+            for (const file of files) {
+              const filepath = path.join(
+                process.env.GITHUB_WORKSPACE,
+                downloadPath,
+                file.name
+              )
+              console.log('Checking file:', filepath)
+              const content = await readFile(filepath, 'utf8')
+              if (content.trim() !== file.content.trim()) {
+                throw new Error(
+                  `Expected file '${file.name}' to contain '${file.content}' but found '${content}'`
+                )
+              }
+            }
+          }
+    - name: Delete Artifacts
+      uses: actions/github-script@v7
+      with:
+        script: |
+          const {default: artifactClient} = require('./packages/artifact/lib/artifact')
+
+          const artifactsToDelete = [
+            'my-artifact-ubuntu-latest',
+            'my-artifact-windows-latest',
+            'my-artifact-macos-latest'
+          ]
+
+          for (const artifactName of artifactsToDelete) {
+            const {id} = await artifactClient.deleteArtifact(artifactName)
+          }
+
+          const {artifacts} = await artifactClient.listArtifacts({latest: true})
+          const foundArtifacts = artifacts.filter(artifact =>
+            artifactsToDelete.includes(artifact.name)
+          )
+
+          if (foundArtifacts.length !== 0) {
+            console.log('Unexpected length of found artifacts:', foundArtifacts)
+            throw new Error(
+              `Expected 0 artifacts but found ${foundArtifacts.length} artifacts.`
+            )
+          }

@@ -18,12 +18,12 @@ jobs:
     steps:
     - name: Checkout
-      uses: actions/checkout@v3
-    - name: Set Node.js 16.x
-      uses: actions/setup-node@v3
+      uses: actions/checkout@v4
+    - name: Set Node.js 20.x
+      uses: actions/setup-node@v4
       with:
-        node-version: 16.x
+        node-version: 20.x
     - name: npm install
       run: npm install
@@ -32,7 +32,7 @@ jobs:
       run: npm run bootstrap
     - name: audit tools (without allow-list)
-      run: npm audit --audit-level=moderate
+      run: npm audit --audit-level=moderate --omit dev
     - name: audit packages
       run: npm run audit-all

@@ -22,12 +22,12 @@ jobs:
     steps:
     - name: Checkout
-      uses: actions/checkout@v3
-    - name: Set Node.js 16.x
-      uses: actions/setup-node@v3
+      uses: actions/checkout@v4
+    - name: Set Node.js 20.x
+      uses: actions/setup-node@v4
       with:
-        node-version: 16.x
+        node-version: 20.x
     # In order to save & restore cache from a shell script, certain env variables need to be set that are only available in the
     # node context. This runs a local action that gets and sets the necessary env variables that are needed

@@ -23,10 +23,10 @@ jobs:
       run: |
         rm "C:\Program Files\Git\usr\bin\tar.exe"
-    - name: Set Node.js 12.x
+    - name: Set Node.js 20.x
       uses: actions/setup-node@v1
       with:
-        node-version: 12.x
+        node-version: 20.x
     # In order to save & restore cache from a shell script, certain env variables need to be set that are only available in the
     # node context. This runs a local action that gets and sets the necessary env variables that are needed

@@ -20,18 +20,18 @@ jobs:
     steps:
     - name: Checkout repository
-      uses: actions/checkout@v3
+      uses: actions/checkout@v4
     # Initializes the CodeQL tools for scanning.
     - name: Initialize CodeQL
-      uses: github/codeql-action/init@v1
+      uses: github/codeql-action/init@v2
       with:
         languages: javascript
     # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
     # If this step fails, then you should remove it and run the build manually (see below)
     - name: Autobuild
-      uses: github/codeql-action/autobuild@v1
+      uses: github/codeql-action/autobuild@v2
     - name: Perform CodeQL Analysis
-      uses: github/codeql-action/analyze@v1
+      uses: github/codeql-action/analyze@v2

@@ -1,27 +1,42 @@
 name: Publish NPM
+run-name: Publish NPM - ${{ github.event.inputs.package }}
 on:
   workflow_dispatch:
     inputs:
       package:
+        type: choice
         required: true
-        description: 'core, artifact, cache, exec, github, glob, http-client, io, tool-cache'
+        description: 'Which package to release'
+        options:
+          - artifact
+          - attest
+          - cache
+          - core
+          - exec
+          - github
+          - glob
+          - http-client
+          - io
+          - tool-cache
 jobs:
   test:
-    runs-on: macos-latest
+    runs-on: macos-latest-large
     steps:
     - name: setup repo
-      uses: actions/checkout@v3
+      uses: actions/checkout@v4
     - name: verify package exists
      run: ls packages/${{ github.event.inputs.package }}
-    - name: Set Node.js 16.x
-      uses: actions/setup-node@v3
+    - name: Set Node.js 20.x
+      uses: actions/setup-node@v4
       with:
-        node-version: 16.x
+        node-version: 20.x
     - name: npm install
       run: npm install
@@ -40,19 +55,22 @@ jobs:
       working-directory: packages/${{ github.event.inputs.package }}
     - name: upload artifact
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
      with:
        name: ${{ github.event.inputs.package }}
        path: packages/${{ github.event.inputs.package }}/*.tgz
   publish:
-    runs-on: macos-latest
+    runs-on: macos-latest-large
     needs: test
     environment: npm-publish
+    permissions:
+      contents: read
+      id-token: write
     steps:
     - name: download artifact
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: ${{ github.event.inputs.package }}
@@ -62,7 +80,7 @@ jobs:
         NPM_TOKEN: ${{ secrets.TOKEN }}
     - name: publish
-      run: npm publish *.tgz
+      run: npm publish --provenance *.tgz
     - name: notify slack on failure
       if: failure()

@@ -16,19 +16,23 @@ jobs:
     strategy:
       matrix:
-        runs-on: [ubuntu-latest, macos-latest, windows-latest]
+        runs-on: [ubuntu-latest, macos-latest-large, windows-latest]
+        # Node 18 is the current default Node version in hosted runners, so users may still use the toolkit with it when running tests (see https://github.com/actions/toolkit/issues/1841)
+        # Node 20 is the currently support Node version for actions - https://docs.github.com/actions/sharing-automations/creating-actions/metadata-syntax-for-github-actions#runsusing-for-javascript-actions
+        node-version: [18.x, 20.x]
       fail-fast: false
     runs-on: ${{ matrix.runs-on }}
     steps:
     - name: Checkout
-      uses: actions/checkout@v3
-    - name: Set Node.js 16.x
-      uses: actions/setup-node@v3
+      uses: actions/checkout@v4
+    - name: Set up Node ${{ matrix.node-version }}
+      uses: actions/setup-node@v4
       with:
-        node-version: 16.x
+        node-version: ${{ matrix.node-version }}
     - name: npm install
       run: npm install
@@ -40,7 +44,7 @@ jobs:
       run: npm run build
     - name: npm test
-      run: npm test -- --runInBand
+      run: npm test -- --runInBand --forceExit
       env:
         GITHUB_TOKEN: ${{ github.token }}

@@ -9,7 +9,7 @@ jobs:
     if: ${{ github.repository_owner == 'actions' }}
     steps:
     - name: Checkout repository
-      uses: actions/checkout@v3
+      uses: actions/checkout@v4
     - name: Update Octokit
       working-directory: packages/github
       run: |

@@ -1,3 +1,5 @@
 node_modules/
 packages/*/node_modules/
 packages/*/lib/
+packages/glob/__tests__/_temp/**/
+packages/*/src/generated/*/

@@ -7,5 +7,6 @@
   "trailingComma": "none",
   "bracketSpacing": false,
   "arrowParens": "avoid",
-  "parser": "typescript"
+  "parser": "typescript",
+  "endOfLine": "auto"
 }

@@ -2,3 +2,4 @@
 /packages/artifact/ @actions/artifacts-actions
 /packages/cache/ @actions/actions-cache
+/packages/attest/ @actions/package-security

@@ -24,7 +24,7 @@ The GitHub Actions ToolKit provides a set of packages to make creating actions e
 Provides functions for inputs, outputs, results, logging, secrets and variables. Read more [here](packages/core)
 ```bash
-$ npm install @actions/core
+npm install @actions/core
 ```
 <br/>
@@ -33,7 +33,7 @@ $ npm install @actions/core
 Provides functions to exec cli tools and process output. Read more [here](packages/exec)
 ```bash
-$ npm install @actions/exec
+npm install @actions/exec
 ```
 <br/>
@@ -42,7 +42,7 @@ $ npm install @actions/exec
 Provides functions to search for files matching glob patterns. Read more [here](packages/glob)
 ```bash
-$ npm install @actions/glob
+npm install @actions/glob
 ```
 <br/>
@@ -51,7 +51,7 @@ $ npm install @actions/glob
 A lightweight HTTP client optimized for building actions. Read more [here](packages/http-client)
 ```bash
-$ npm install @actions/http-client
+npm install @actions/http-client
 ```
 <br/>
@@ -60,7 +60,7 @@ $ npm install @actions/http-client
 Provides disk i/o functions like cp, mv, rmRF, which etc. Read more [here](packages/io)
 ```bash
-$ npm install @actions/io
+npm install @actions/io
 ```
 <br/>
@@ -71,7 +71,7 @@ Provides functions for downloading and caching tools. e.g. setup-* actions. Rea
 See @actions/cache for caching workflow dependencies.
 ```bash
-$ npm install @actions/tool-cache
+npm install @actions/tool-cache
 ```
 <br/>
@@ -80,7 +80,7 @@ $ npm install @actions/tool-cache
 Provides an Octokit client hydrated with the context that the current action is being run in. Read more [here](packages/github)
 ```bash
-$ npm install @actions/github
+npm install @actions/github
 ```
 <br/>
@@ -89,7 +89,7 @@ $ npm install @actions/github
 Provides functions to interact with actions artifacts. Read more [here](packages/artifact)
 ```bash
-$ npm install @actions/artifact
+npm install @actions/artifact
 ```
 <br/>
@@ -98,7 +98,16 @@ $ npm install @actions/artifact
 Provides functions to cache dependencies and build outputs to improve workflow execution time. Read more [here](packages/cache)
 ```bash
-$ npm install @actions/cache
+npm install @actions/cache
 ```
 <br/>
+:lock_with_ink_pen: [@actions/attest](packages/attest)
+Provides functions to write attestations for workflow artifacts. Read more [here](packages/attest)
+```bash
+npm install @actions/attest
+```
+<br/>

@@ -32,7 +32,7 @@ jobs:
       os: [ubuntu-16.04, windows-2019]
     runs-on: ${{matrix.os}}
     actions:
-    - uses: actions/setup-node@v3
+    - uses: actions/setup-node@v4
       with:
        version: ${{matrix.node}}
     - run: |

@@ -18,7 +18,7 @@ e.g. To use https://github.com/actions/setup-node, users will author:
 ```yaml
 steps:
-    using: actions/setup-node@v3
+    using: actions/setup-node@v4
 ```
 # Define Metadata

@@ -1,6 +1,6 @@
 {
   "packages": [
-    "packages/*"
+    "packages/**/*"
   ],
   "version": "independent"
 }

nx.json (new file, +24)

@@ -0,0 +1,24 @@
+{
+  "tasksRunnerOptions": {
+    "default": {
+      "runner": "nx/tasks-runners/default",
+      "options": {
+        "cacheableOperations": []
+      }
+    }
+  },
+  "affected": {
+    "defaultBase": "master"
+  },
+  "$schema": "./node_modules/nx/schemas/nx-schema.json",
+  "namedInputs": {
+    "default": [
+      "{projectRoot}/**/*",
+      "sharedGlobals"
+    ],
+    "sharedGlobals": [],
+    "production": [
+      "default"
+    ]
+  }
+}

package-lock.json (generated, 27,181 lines changed): file diff suppressed because it is too large.

@@ -3,30 +3,34 @@
   "private": true,
   "scripts": {
     "audit-all": "lerna run audit-moderate",
-    "bootstrap": "lerna bootstrap",
+    "bootstrap": "lerna exec -- npm install",
     "build": "lerna run tsc",
+    "clean": "lerna clean",
+    "repair": "lerna repair",
     "check-all": "concurrently \"npm:format-check\" \"npm:lint\" \"npm:test\" \"npm:build -- -- --noEmit\"",
     "format": "prettier --write packages/**/*.ts",
     "format-check": "prettier --check packages/**/*.ts",
     "lint": "eslint packages/**/*.ts",
     "lint-fix": "eslint packages/**/*.ts --fix",
     "new-package": "scripts/create-package",
-    "test": "jest --testTimeout 10000"
+    "test": "jest --testTimeout 70000"
   },
   "devDependencies": {
-    "@types/jest": "^27.0.2",
-    "@types/node": "^16.18.1",
+    "@types/jest": "^29.5.4",
+    "@types/node": "^20.5.7",
     "@types/signale": "^1.4.1",
-    "@typescript-eslint/parser": "^4.0.0",
     "concurrently": "^6.1.0",
-    "eslint": "^7.23.0",
-    "eslint-plugin-github": "^4.1.3",
-    "eslint-plugin-jest": "^22.21.0",
+    "eslint": "^8.0.1",
+    "eslint-config-prettier": "^8.9.0",
+    "eslint-plugin-github": "^4.9.2",
+    "eslint-plugin-jest": "^27.2.3",
+    "eslint-plugin-prettier": "^5.0.0",
     "flow-bin": "^0.115.0",
-    "jest": "^27.2.5",
-    "lerna": "^5.4.0",
-    "prettier": "^1.19.1",
-    "ts-jest": "^27.0.5",
-    "typescript": "^3.9.9"
+    "jest": "^29.6.4",
+    "lerna": "^6.4.1",
+    "nx": "16.6.0",
+    "prettier": "^3.0.0",
+    "ts-jest": "^29.1.1",
+    "typescript": "^5.2.2"
   }
 }

@@ -1,30 +1,44 @@
 # Contributions
-This package is used internally by the v2+ versions of [upload-artifact](https://github.com/actions/upload-artifact) and [download-artifact](https://github.com/actions/download-artifact). This package can also be used by other actions to interact with artifacts. Any changes or updates to this package will propagate updates to these actions so it is important that major changes or updates get properly tested.
+This package is used internally by the v4 versions of [upload-artifact](https://github.com/actions/upload-artifact) and [download-artifact](https://github.com/actions/download-artifact). This package can also be used by other actions to interact with artifacts. Any changes or updates to this package will propagate updates to these actions so it is important that major changes or updates get properly tested.
 Any issues or feature requests that are related to the artifact actions should be filled in the appropriate repo.
 A limited range of unit tests run as part of each PR when making changes to the artifact packages. For small contributions and fixes, they should be sufficient.
-If making large changes, there are a few scenarios that should be tested.
-- Uploading very large artifacts (large artifacts get compressed using gzip so compression/decompression must be tested)
-- Uploading artifacts with lots of small files (each file is uploaded with its own HTTP call, timeouts and non-success HTTP responses can be expected so they must be properly handled)
+If making large changes, there are a few scenarios that should be tested:
+- Uploading very large artifacts
+- Uploading artifacts with lots of small files
 - Uploading artifacts using a self-hosted runner (uploads and downloads behave differently due to extra latency)
 - Downloading a single artifact (large and small, if lots of small files are part of an artifact, timeouts and non-success HTTP responses can be expected)
 - Downloading all artifacts at once
 Large architectural changes can impact upload/download performance so it is important to separately run extra tests. We request that any large contributions/changes have extra detailed testing so we can verify performance and possible regressions.
-It is not possible to run end-to-end tests for artifacts as part of a PR in this repo because certain env variables such as `ACTIONS_RUNTIME_URL` are only available from the context of an action as opposed to a shell script. These env variables are needed in order to make the necessary API calls.
+Tests will run for every push/pull_request [via Actions](https://github.com/actions/toolkit/blob/main/.github/workflows/artifact-tests.yml).
 # Testing
-Any easy way to test changes is to fork the artifact actions and to use `npm link` to test your changes.
-1. Fork the [upload-artifact](https://github.com/actions/upload-artifact) and [download-artifact](https://github.com/actions/download-artifact) repos
-2. Clone the forks locally
-3. With your local changes to the toolkit repo, type `npm link` after ensuring there are no errors when running `tsc`
-4. In the locally cloned fork, type `npm link @actions/artifact`
-4. Create a new release for your local fork using `tsc` and `npm run release` (this will create a new `dist/index.js` file using `@vercel/ncc`)
-5. Commit and push your local changes, you will then be able to test your changes with your forked action
+## Package tests
+To run unit tests for the `@actions/artifact` package:
+1. Clone `actions/toolkit` locally
+2. Install dependencies: `npm bootstrap`
+3. Change working directory to `packages/artifact`
+4. Run jest tests: `npm run test`
+## Within upload-artifact or download-artifact actions
+Any easy way to test changes for the official upload/download actions is to fork them, compile changes and run them.
+1. For your local `actions/toolkit` changes:
+   1. Change directory to `packages/artifact`
+   2. Compile the changes: `npm run tsc`
+   3. Symlink your package change: `npm link`
+2. Fork and clone either [upload-artifact](https://github.com/actions/upload-artifact) and [download-artifact](https://github.com/actions/download-artifact)
+   1. In the locally cloned fork, link to your local toolkit changes: `npm link @actions/artifact`
+   2. Then, compile your changes with: `npm run release`. The local `dist/index.js` should be updated with your changes.
+   3. Commit and push to your fork, you can then test with a `uses:` in your workflow pointed at your fork.
+   4. The format for the above is `<username>/<repository-name>/@<ref>`, i.e. `me/myrepo/@HEAD`

View File

@ -1,213 +1,192 @@
# `@actions/artifact` # `@actions/artifact`
## Usage Interact programmatically with [Actions Artifacts](https://docs.github.com/en/actions/using-workflows/storing-workflow-data-as-artifacts).
You can use this package to interact with the actions artifacts. This is the core library that powers the [`@actions/upload-artifact`](https://github.com/actions/upload-artifact) and [`@actions/download-artifact`](https://github.com/actions/download-artifact) actions.
- [Upload an Artifact](#Upload-an-Artifact)
- [Download a Single Artifact](#Download-a-Single-Artifact)
- [Download All Artifacts](#Download-all-Artifacts)
- [Additional Documentation](#Additional-Documentation)
- [Contributions](#Contributions)
Relative paths and absolute paths are both allowed. Relative paths are rooted against the current working directory.
## Upload an Artifact - [`@actions/artifact`](#actionsartifact)
- [v2 - What's New](#v2---whats-new)
- [Improvements](#improvements)
- [Breaking changes](#breaking-changes)
- [Quick Start](#quick-start)
- [Examples](#examples)
- [Upload and Download](#upload-and-download)
- [Delete an Artifact](#delete-an-artifact)
- [Downloading from other workflow runs or repos](#downloading-from-other-workflow-runs-or-repos)
- [Speeding up large uploads](#speeding-up-large-uploads)
- [Additional Resources](#additional-resources)
Method Name: `uploadArtifact` ## v2 - What's New
#### Inputs > [!IMPORTANT]
- `name` > @actions/artifact v2+, upload-artifact@v4+, and download-artifact@v4+ are not currently supported on GHES yet. The previous version of this package can be found at [this tag](https://github.com/actions/toolkit/tree/@actions/artifact@1.1.2/packages/artifact) and [on npm](https://www.npmjs.com/package/@actions/artifact/v/1.1.2).
- The name of the artifact that is being uploaded
- Required
- `files`
- A list of file paths that describe what should be uploaded as part of the artifact
- If a path is provided that does not exist, an error will be thrown
- Can be absolute or relative. Internally everything is normalized and resolved
- Required
- `rootDirectory`
- A file path that denotes the root directory of the files being uploaded. This path is used to strip the paths provided in `files` to control how they are uploaded and structured
- If a file specified in `files` is not in the `rootDirectory`, an error will be thrown
- Required
- `options`
- Extra options that allow for the customization of the upload behavior
- Optional
#### Available Options The release of `@actions/artifact@v2` (including `upload-artifact@v4` and `download-artifact@v4`) are major changes to the backend architecture of Artifacts. They have numerous performance and behavioral improvements.
- `continueOnError` ### Improvements
- Indicates if the artifact upload should continue in the event a file fails to upload. If there is a error during upload, a partial artifact will always be created and available for download at the end. The `size` reported will be the amount of storage that the user or org will be charged for the partial artifact.
- If set to `false`, and an error is encountered, all other uploads will stop and any files that were queued will not be attempted to be uploaded. The partial artifact available will only include files up until the failure.
- If set to `true` and an error is encountered, the failed file will be skipped and ignored and all other queued files will be attempted to be uploaded. There will be an artifact available for download at the end with everything excluding the file that failed to upload
- Optional, defaults to `true` if not specified
- `retentionDays`
- Duration after which artifact will expire in days
- Minimum value: 1
- Maximum value: 90 unless changed by repository setting
- If this is set to a greater value than the retention settings allowed, the retention on artifacts will be reduced to match the max value allowed on the server, and the upload process will continue. An input of 0 assumes default retention value.
#### Example using Absolute File Paths 1. All upload and download operations are much quicker, up to 80% faster download times and 96% faster upload times in worst case scenarios.
2. Once uploaded, an Artifact ID is returned and Artifacts are immediately available in the UI and [REST API](https://docs.github.com/en/rest/actions/artifacts). Previously, you would have to wait for the run to be completed before an ID was available or any APIs could be utilized.
3. Artifacts can now be downloaded and deleted from the UI _before_ the entire workflow run finishes.
4. The contents of an Artifact are uploaded together into an _immutable_ archive. They cannot be altered by subsequent jobs. Both of these factors help reduce the possibility of accidentally corrupting Artifact files. (Digest/integrity hash coming soon in the API!)
5. This library (and `actions/download-artifact`) now support downloading Artifacts from _other_ repositories and runs if a `GITHUB_TOKEN` with sufficient `actions:read` permissions are provided.
### Breaking changes
1. Firewall rules required for self-hosted runners.
If you are using self-hosted runners behind a firewall, you must have flows open to [Actions endpoints](https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners#communication-between-self-hosted-runners-and-github). If you cannot use wildcard rules for your firewall, see the GitHub [meta endpoint](https://api.github.com/meta) for specific endpoints.
e.g.
```bash
curl https://api.github.com/meta | jq .domains.actions
```
2. Uploading to the same named Artifact multiple times.
Due to how Artifacts are created in this new version, it is no longer possible to upload to the same named Artifact multiple times. You must either split the uploads into multiple Artifacts with different names, or only upload once.
3. Limit of Artifacts for an individual job.
Each job in a workflow run now has a limit of 10 artifacts.
## Quick Start
Install the package:
```bash
npm i @actions/artifact
```
Import the module:
```js ```js
const artifact = require('@actions/artifact'); // ES6 module
const artifactClient = artifact.create() import {DefaultArtifactClient} from '@actions/artifact'
const artifactName = 'my-artifact';
const files = [ // CommonJS
'/home/user/files/plz-upload/file1.txt', const {DefaultArtifactClient} = require('@actions/artifact')
'/home/user/files/plz-upload/file2.txt', ```
'/home/user/files/plz-upload/dir/file3.txt'
] Then instantiate:
const rootDirectory = '/home/user/files/plz-upload'
const options = { ```js
continueOnError: true const artifact = new DefaultArtifactClient()
```
For a comprehensive list of classes, interfaces, functions and more, see the [generated documentation](./docs/generated/README.md).
## Examples
### Upload and Download
The most basic scenario is uploading one or more files to an Artifact, then downloading that Artifact. Downloads are based on the Artifact ID, which can be obtained in the response of `uploadArtifact`, `getArtifact`, `listArtifacts` or via the [REST API](https://docs.github.com/en/rest/actions/artifacts).
```js
const {id, size} = await artifact.uploadArtifact(
  // name of the artifact
  'my-artifact',
  // files to include (supports absolute and relative paths)
  ['/absolute/path/file1.txt', './relative/file2.txt'],
  // root directory of the files above
  '/absolute/path',
  {
    // optional: how long to retain the artifact
    // if unspecified, defaults to repository/org retention settings (the limit of this value)
    retentionDays: 10
  }
)

console.log(`Created artifact with id: ${id} (bytes: ${size})`)

const {downloadPath} = await artifact.downloadArtifact(id, {
  // optional: download destination path. otherwise defaults to $GITHUB_WORKSPACE
  path: '/tmp/dst/path'
})

console.log(`Downloaded artifact ${id} to: ${downloadPath}`)
```
### Delete an Artifact
To delete an artifact, all you need is the name.
```js
const {id} = await artifact.deleteArtifact(
  // name of the artifact
  'my-artifact'
)

console.log(`Deleted Artifact ID '${id}'`)
```
It also supports options to delete Artifacts from other repos/runs, provided a GitHub token with `actions:write` permission on the target repository is supplied.
```js
const findBy = {
  // must have actions:write permission on target repository
  token: process.env['GITHUB_TOKEN'],
  workflowRunId: 123,
  repositoryOwner: 'actions',
  repositoryName: 'toolkit'
}

const {id} = await artifact.deleteArtifact(
  // name of the artifact
  'my-artifact',
  // options to find by other repo/owner
  {findBy}
)

console.log(`Deleted Artifact ID '${id}' from ${findBy.repositoryOwner}/${findBy.repositoryName}`)
```
### Downloading from other workflow runs or repos
It may be useful to download Artifacts from other workflow runs, or even other repositories. By default, the permissions are scoped so they can only download Artifacts within the current workflow run. To elevate permissions for this scenario, you must specify `options.findBy` to `downloadArtifact`.
```ts
const findBy = {
  // must have actions:read permission on target repository
  token: process.env['GITHUB_TOKEN'],
  workflowRunId: 123,
  repositoryOwner: 'actions',
  repositoryName: 'toolkit'
}

await artifact.downloadArtifact(1337, {
  findBy
})

// can also be used in other methods
await artifact.getArtifact('my-artifact', {
  findBy
})

await artifact.listArtifacts({
  findBy
})
```
### Speeding up large uploads
If you have large files that need to be uploaded (or file types that don't compress well), you may benefit from changing the compression level of the Artifact archive. NOTE: This is a tradeoff between artifact upload time and stored data size.
```ts
await artifact.uploadArtifact(
  'my-massive-artifact',
  ['big_file.bin'],
  // root directory of the file above
  '.',
  {
    // The level of compression for Zlib to be applied to the artifact archive.
    // - 0: No compression
    // - 1: Best speed
    // - 6: Default compression (same as GNU Gzip)
    // - 9: Best compression
    compressionLevel: 0
  }
)
```
## Additional Resources
- [Releases](./RELEASES.md)
- [Contribution Guide](./CONTRIBUTIONS.md)
- [Frequently Asked Questions](./docs/faq.md)
## Download a Single Artifact
Method Name: `downloadArtifact`
#### Inputs
- `name`
- The name of the artifact to download
- Required
- `path`
- Path that denotes where the artifact will be downloaded to
- Optional. Defaults to the GitHub workspace directory (`$GITHUB_WORKSPACE`) if not specified
- `options`
- Extra options that allow for the customization of the download behavior
- Optional
#### Available Options
- `createArtifactFolder`
- Specifies if a folder (named after the artifact) is created for the artifact that is downloaded (contents downloaded into this folder)
- Optional. Defaults to false if not specified
#### Example
```js
const artifact = require('@actions/artifact');
const artifactClient = artifact.create()
const artifactName = 'my-artifact';
const path = 'some/directory'
const options = {
  createArtifactFolder: false
}
const downloadResponse = await artifactClient.downloadArtifact(artifactName, path, options)
// Post download, the directory structure will look like this
/some
  /directory
    /file1.txt
    /file2.txt
    /dir
      /file3.txt

// If createArtifactFolder is set to true, the directory structure will look like this
/some
  /directory
    /my-artifact
      /file1.txt
      /file2.txt
      /dir
        /file3.txt
```
#### Download Response
The returned `DownloadResponse` will contain the following information
- `artifactName`
- The name of the artifact that was downloaded
- `downloadPath`
- The full Path to where the artifact was downloaded
## Download All Artifacts
Method Name: `downloadAllArtifacts`
#### Inputs
- `path`
- Path that denotes where the artifact will be downloaded to
- Optional. Defaults to the GitHub workspace directory (`$GITHUB_WORKSPACE`) if not specified
```js
const artifact = require('@actions/artifact');
const artifactClient = artifact.create();
const downloadResponse = await artifactClient.downloadAllArtifacts();
// output result
for (const response of downloadResponse) {
console.log(response.artifactName);
console.log(response.downloadPath);
}
```
Because there are multiple artifacts, an extra directory (denoted by the name of the artifact) will be created for each artifact in the path. With 2 artifacts (`my-artifact-1` and `my-artifact-2`, for example) and the default path, the directory structure will be as follows:
```js
/GITHUB_WORKSPACE
  /my-artifact-1
    / .. contents of `my-artifact-1`
  /my-artifact-2
    / .. contents of `my-artifact-2`
```
#### Download Result
An array will be returned that describes the results for downloading all artifacts. The number of items in the array indicates the number of artifacts that were downloaded.
Each artifact will have the same `DownloadResponse` as if it were individually downloaded.
- `artifactName`
- The name of the artifact that was downloaded
- `downloadPath`
- The full Path to where the artifact was downloaded
## Additional Documentation
Check out [additional-information](docs/additional-information.md) for extra documentation around usage, restrictions and behavior.
Check out [implementation-details](docs/implementation-details.md) for extra information about the implementation of this package.
## Contributions
See [contributor guidelines](https://github.com/actions/toolkit/blob/main/.github/CONTRIBUTING.md) for general guidelines and information about toolkit contributions.
For contributions related to this package, see [artifact contributions](CONTRIBUTIONS.md) for more information.

View File

@@ -1,66 +1,120 @@
# @actions/artifact Releases
### 2.3.3
- Dependency updates [#2049](https://github.com/actions/toolkit/pull/2049)
### 2.3.2
- Added masking for Shared Access Signature (SAS) artifact URLs [#1982](https://github.com/actions/toolkit/pull/1982)
- Change hash to digest for consistent terminology across runner logs [#1991](https://github.com/actions/toolkit/pull/1991)
### 2.3.1
- Fix comment typo on expectedHash. [#1986](https://github.com/actions/toolkit/pull/1986)
### 2.3.0
- Allow ArtifactClient to perform digest comparisons, if supplied. [#1975](https://github.com/actions/toolkit/pull/1975)
### 2.2.2
- Default concurrency to 5 for uploading artifacts [#1962](https://github.com/actions/toolkit/pull/1962)
### 2.2.1
- Add `ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY` and `ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS` environment variables [#1928](https://github.com/actions/toolkit/pull/1928)
### 2.2.0
- Return artifact digest on upload [#1896](https://github.com/actions/toolkit/pull/1896)
### 2.1.11
- Fixed a bug with relative symlinks resolution [#1844](https://github.com/actions/toolkit/pull/1844)
- Use native `crypto` [#1815](https://github.com/actions/toolkit/pull/1815)
### 2.1.10
- Fixed a regression with symlinks not being automatically resolved [#1830](https://github.com/actions/toolkit/pull/1830)
- Fixed a regression with chunk timeout [#1786](https://github.com/actions/toolkit/pull/1786)
### 2.1.9
- Fixed artifact upload chunk timeout logic [#1774](https://github.com/actions/toolkit/pull/1774)
- Use lazy stream to prevent issues with open file limits [#1771](https://github.com/actions/toolkit/pull/1771)
### 2.1.8
- Allows `*.localhost` domains for hostname checks for local development.
### 2.1.7
- Update unzip-stream dependency and reverted to using `unzip.Extract()`
### 2.1.6
- Will retry on invalid request responses.
### 2.1.5
- Bumped `archiver` dependency to 7.0.1
### 2.1.4
- Adds info-level logging for zip extraction
### 2.1.3
- Fixes a bug in the extract logic updated in 2.1.2
### 2.1.2
- Updated the stream extract functionality to use `unzip.Parse()` instead of `unzip.Extract()` for greater control of unzipping artifacts
### 2.1.1
- Updated `isGhes` check to include `.ghe.com` and `.ghe.localhost` as accepted hosts
### 2.1.0
- Added `ArtifactClient#deleteArtifact` to delete artifacts by name [#1626](https://github.com/actions/toolkit/pull/1626)
- Update error messaging to be more useful [#1628](https://github.com/actions/toolkit/pull/1628)
### 2.0.1
- Patch to fix transient request timeouts https://github.com/actions/download-artifact/issues/249
### 2.0.0
- Major release. Supports new Artifact backend for improved speed, reliability and behavior.
- Numerous API changes, [some breaking](./README.md#breaking-changes).
- [Blog post with more info](https://github.blog/2024-02-12-get-started-with-v4-of-github-actions-artifacts/)
### 1.1.1
- Fixed a bug in Node16 where if an HTTP download finished too quickly (<1ms, e.g. when it's mocked) we attempt to delete a temp file that has not been created yet [#1278](https://github.com/actions/toolkit/pull/1278/commits/b9de68a590daf37c6747e38d3cb4f1dd2cfb791c)
### 1.1.0
- Add `x-actions-results-crc64` and `x-actions-results-md5` checksum headers on upload [#1063](https://github.com/actions/toolkit/pull/1063)
### 1.0.2
- Update to v2.0.1 of `@actions/http-client` [#1087](https://github.com/actions/toolkit/pull/1087)
### 1.0.1
- Update to v2.0.0 of `@actions/http-client`
### 1.0.0
- Update `lockfileVersion` to `v2` in `package-lock.json` [#1009](https://github.com/actions/toolkit/pull/1009)
### 0.6.1
- Fix for failing 0 byte file uploads on Windows [#962](https://github.com/actions/toolkit/pull/962)
### 0.6.0
@@ -71,26 +125,64 @@
- Faster upload speeds for certain types of large files by exempting gzip compression [#956](https://github.com/actions/toolkit/pull/956)
- More detailed logging when dealing with chunked uploads [#957](https://github.com/actions/toolkit/pull/957)
### 0.5.2
- Add HTTP 500 as a retryable status code for artifact upload and download.
### 0.5.1
- Bump @actions/http-client to version 1.0.11 to fix proxy related issues during artifact upload and download
### 0.5.0
- Improved retry-ability for all http calls during artifact upload and download if an error is encountered
### 0.4.2
- Improved retry-ability when a partial artifact download is encountered
### 0.4.1
- Update to latest @actions/core version
### 0.4.0
- Add option to specify custom retentions on artifacts
### 0.3.5
- Retry in the event of a 413 response
### 0.3.3
- Increase chunk size during upload from 4MB to 8MB
- Improve user-agent strings during API calls to help internally diagnose issues
### 0.3.2
- Fix to ensure readstreams get correctly reset in the event of a retry
### 0.3.1
- Fix to ensure temporary gzip files get correctly deleted during artifact upload
- Remove spaces as a forbidden character during upload
### 0.3.0
- Fixes to gzip decompression when downloading artifacts
- Support handling 429 response codes
- Improved download experience when dealing with empty files
- Exponential backoff when retryable status codes are encountered
- Clearer error message if storage quota has been reached
- Improved logging and output during artifact download
### 0.2.0
- Fixes to TCP connections not closing
- GZip file compression to speed up downloads
- Improved logging and output
- Extra documentation
### 0.1.0
- Initial release

View File

@@ -0,0 +1,348 @@
import * as http from 'http'
import * as net from 'net'
import {HttpClient} from '@actions/http-client'
import * as config from '../src/internal/shared/config'
import {internalArtifactTwirpClient} from '../src/internal/shared/artifact-twirp-client'
import {noopLogs} from './common'
import {NetworkError, UsageError} from '../src/internal/shared/errors'
jest.mock('@actions/http-client')
const clientOptions = {
maxAttempts: 5,
retryIntervalMs: 1,
retryMultiplier: 1.5
}
describe('artifact-http-client', () => {
beforeAll(() => {
noopLogs()
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('http://localhost:8080')
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('token')
})
beforeEach(() => {
jest.clearAllMocks()
})
it('should successfully create a client', () => {
const client = internalArtifactTwirpClient()
expect(client).toBeDefined()
})
it('should make a request', async () => {
const mockPost = jest.fn(() => {
const msg = new http.IncomingMessage(new net.Socket())
msg.statusCode = 200
return {
message: msg,
readBody: async () => {
return Promise.resolve(
`{"ok": true, "signedUploadUrl": "http://localhost:8080/upload"}`
)
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient()
const artifact = await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
expect(artifact).toBeDefined()
expect(artifact.ok).toBe(true)
expect(artifact.signedUploadUrl).toBe('http://localhost:8080/upload')
})
it('should retry if the request fails', async () => {
const mockPost = jest
.fn(() => {
const msgSucceeded = new http.IncomingMessage(new net.Socket())
msgSucceeded.statusCode = 200
return {
message: msgSucceeded,
readBody: async () => {
return Promise.resolve(
`{"ok": true, "signedUploadUrl": "http://localhost:8080/upload"}`
)
}
}
})
.mockImplementationOnce(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 500
msgFailed.statusMessage = 'Internal Server Error'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve(`{"ok": false}`)
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient(clientOptions)
const artifact = await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(artifact).toBeDefined()
expect(artifact.ok).toBe(true)
expect(artifact.signedUploadUrl).toBe('http://localhost:8080/upload')
expect(mockPost).toHaveBeenCalledTimes(2)
})
it('should retry if invalid body response', async () => {
const mockPost = jest
.fn(() => {
const msgSucceeded = new http.IncomingMessage(new net.Socket())
msgSucceeded.statusCode = 200
return {
message: msgSucceeded,
readBody: async () => {
return Promise.resolve(
`{"ok": true, "signedUploadUrl": "http://localhost:8080/upload"}`
)
}
}
})
.mockImplementationOnce(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 502
msgFailed.statusMessage = 'Bad Gateway'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve('💥')
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient(clientOptions)
const artifact = await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(artifact).toBeDefined()
expect(artifact.ok).toBe(true)
expect(artifact.signedUploadUrl).toBe('http://localhost:8080/upload')
expect(mockPost).toHaveBeenCalledTimes(2)
})
it('should fail if the request fails 5 times', async () => {
const mockPost = jest.fn(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 500
msgFailed.statusMessage = 'Internal Server Error'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve(`{"ok": false}`)
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient(clientOptions)
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
}).rejects.toThrowError(
'Failed to make request after 5 attempts: Failed request: (500) Internal Server Error'
)
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(5)
})
it('should fail immediately if there is a non-retryable error', async () => {
const mockPost = jest.fn(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 401
msgFailed.statusMessage = 'Unauthorized'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve(`{"ok": false}`)
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient(clientOptions)
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
}).rejects.toThrowError(
'Received non-retryable error: Failed request: (401) Unauthorized'
)
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
})
it('should fail with a descriptive error', async () => {
// 409 duplicate error
const mockPost = jest.fn(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 409
msgFailed.statusMessage = 'Conflict'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve(
`{"msg": "an artifact with this name already exists on the workflow run"}`
)
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient(clientOptions)
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
}).rejects.toThrowError(
'Failed to CreateArtifact: Received non-retryable error: Failed request: (409) Conflict: an artifact with this name already exists on the workflow run'
)
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
})
it('should properly describe a network failure', async () => {
class FakeNodeError extends Error {
code: string
constructor(code: string) {
super()
this.code = code
}
}
const mockPost = jest.fn(() => {
throw new FakeNodeError('ENOTFOUND')
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient()
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
}).rejects.toThrowError(new NetworkError('ENOTFOUND').message)
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
})
it('should properly describe a usage error', async () => {
const mockPost = jest.fn(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 403
msgFailed.statusMessage = 'Forbidden'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve(
`{"msg": "insufficient usage to create artifact"}`
)
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient()
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
}).rejects.toThrowError(new UsageError().message)
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
})
})
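The retry behavior these tests pin down can be summarized with a small sketch. This is a hypothetical reconstruction consistent with the assertions above, not the actual `artifact-twirp-client` implementation:
```ts
// Sketch: retry 5xx responses and unparsable bodies with multiplicative
// backoff up to maxAttempts; fail immediately on 4xx errors
const retrySketch = async <T>(
  call: () => Promise<{statusCode: number; body: string}>,
  maxAttempts = 5,
  retryIntervalMs = 1,
  retryMultiplier = 1.5
): Promise<T> => {
  let delayMs = retryIntervalMs
  let lastMessage = ''
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const {statusCode, body} = await call()
    if (statusCode >= 200 && statusCode < 300) {
      try {
        return JSON.parse(body) as T // a malformed body falls through to a retry
      } catch {
        lastMessage = 'Failed to parse response body'
      }
    } else if (statusCode >= 400 && statusCode < 500) {
      throw new Error(
        `Received non-retryable error: Failed request: (${statusCode})`
      )
    } else {
      lastMessage = `Failed request: (${statusCode})`
    }
    await new Promise(resolve => setTimeout(resolve, delayMs))
    delayMs *= retryMultiplier
  }
  throw new Error(
    `Failed to make request after ${maxAttempts} attempts: ${lastMessage}`
  )
}
```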

View File

@@ -1,5 +0,0 @@
name: 'Set env variables'
description: 'Sets certain env variables so that e2e artifact upload and download can be tested in a shell'
runs:
using: 'node12'
main: 'index.js'

View File

@@ -1,14 +0,0 @@
// Certain env variables are not set by default in a shell context and are only available in a node context from a running action
// In order to be able to upload and download artifacts e2e in a shell when running CI tests, we need these env variables set
const fs = require('fs');
const os = require('os');
const filePath = process.env[`GITHUB_ENV`]
fs.appendFileSync(filePath, `ACTIONS_RUNTIME_URL=${process.env.ACTIONS_RUNTIME_URL}${os.EOL}`, {
encoding: 'utf8'
})
fs.appendFileSync(filePath, `ACTIONS_RUNTIME_TOKEN=${process.env.ACTIONS_RUNTIME_TOKEN}${os.EOL}`, {
encoding: 'utf8'
})
fs.appendFileSync(filePath, `GITHUB_RUN_ID=${process.env.GITHUB_RUN_ID}${os.EOL}`, {
encoding: 'utf8'
})

View File

@@ -0,0 +1,9 @@
import * as core from '@actions/core'
// noopLogs mocks the console.log and core.* functions to prevent output in the console while testing
export const noopLogs = (): void => {
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
}

View File

@@ -0,0 +1,103 @@
import * as config from '../src/internal/shared/config'
import os from 'os'
// Mock the 'os' module
jest.mock('os', () => ({
cpus: jest.fn()
}))
beforeEach(() => {
jest.resetModules()
})
describe('isGhes', () => {
it('should return false when the request domain is github.com', () => {
process.env.GITHUB_SERVER_URL = 'https://github.com'
expect(config.isGhes()).toBe(false)
})
it('should return false when the request domain ends with ghe.com', () => {
process.env.GITHUB_SERVER_URL = 'https://my.domain.ghe.com'
expect(config.isGhes()).toBe(false)
})
it('should return false when the request domain ends with ghe.localhost', () => {
process.env.GITHUB_SERVER_URL = 'https://my.domain.ghe.localhost'
expect(config.isGhes()).toBe(false)
})
it('should return false when the request domain ends with .localhost', () => {
process.env.GITHUB_SERVER_URL = 'https://github.localhost'
expect(config.isGhes()).toBe(false)
})
it('should return true when the request domain is specific to an enterprise', () => {
process.env.GITHUB_SERVER_URL = 'https://my-enterprise.github.com'
expect(config.isGhes()).toBe(true)
})
})
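// A hypothetical sketch, consistent with the expectations above (not the
// actual src/internal/shared/config implementation): any host other than
// github.com, *.ghe.com, or *.localhost is treated as GitHub Enterprise Server
const isGhesSketch = (): boolean => {
  const url = new URL(process.env.GITHUB_SERVER_URL || 'https://github.com')
  const host = url.hostname.toUpperCase()
  return (
    host !== 'GITHUB.COM' &&
    !host.endsWith('.GHE.COM') &&
    !host.endsWith('.LOCALHOST')
  )
}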
describe('uploadChunkTimeoutEnv', () => {
it('should return default 300000 when no env set', () => {
expect(config.getUploadChunkTimeout()).toBe(300000)
})
it('should return value set in ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS', () => {
process.env.ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS = '150000'
expect(config.getUploadChunkTimeout()).toBe(150000)
})
it('should throw if value set in ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS is invalid', () => {
process.env.ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS = 'abc'
expect(() => {
config.getUploadChunkTimeout()
}).toThrow()
})
})
describe('uploadConcurrencyEnv', () => {
it('Concurrency default to 5', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
expect(config.getConcurrency()).toBe(5)
})
it('Concurrency max out at 300 on systems with many CPUs', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(32))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '301'
expect(config.getConcurrency()).toBe(300)
})
it('Concurrency can be set to 32 when cpu num is <= 4', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '32'
expect(config.getConcurrency()).toBe(32)
})
it('Concurrency can be set 16 * num of cpu when cpu num is > 4', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(6))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '96'
expect(config.getConcurrency()).toBe(96)
})
it('Concurrency can be overridden by env var ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '10'
expect(config.getConcurrency()).toBe(10)
})
it('should throw with invalid value of ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = 'abc'
expect(() => {
config.getConcurrency()
}).toThrow()
})
it('should throw if ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY is < 1', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '0'
expect(() => {
config.getConcurrency()
}).toThrow()
})
})
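Taken together, the concurrency tests above encode a small set of rules. A hypothetical sketch consistent with them (not the actual config module):
```ts
import os from 'os'

// Sketch: default is 5; ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY can raise it,
// capped at 32 for <= 4 CPUs or 16 * cpus otherwise, and never above 300
const getConcurrencySketch = (): number => {
  const numCPUs = os.cpus().length
  const cap = numCPUs > 4 ? Math.min(16 * numCPUs, 300) : 32
  const override = process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY
  if (!override) return 5
  const concurrency = Number(override)
  if (!Number.isInteger(concurrency) || concurrency < 1) {
    throw new Error('Invalid ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY value')
  }
  return Math.min(concurrency, cap)
}
```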

View File

@@ -1,57 +0,0 @@
import CRC64, {CRC64DigestEncoding} from '../src/internal/crc64'
const fixtures = {
data:
'🚀 👉😎👉 Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.\nUt enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.\nDuis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur.\nExcepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.\n',
expected: {
hex: '846CE4ADAD6223ED',
base64: '7SNira3kbIQ=',
buffer: Buffer.from([0xed, 0x23, 0x62, 0xad, 0xad, 0xe4, 0x6c, 0x84])
}
}
function assertEncodings(crc: CRC64): void {
const encodings = Object.keys(fixtures.expected) as CRC64DigestEncoding[]
for (const encoding of encodings) {
expect(crc.digest(encoding)).toEqual(fixtures.expected[encoding])
}
}
describe('@actions/artifact/src/internal/crc64', () => {
it('CRC64 from string', async () => {
const crc = new CRC64()
crc.update(fixtures.data)
assertEncodings(crc)
})
it('CRC64 from buffer', async () => {
const crc = new CRC64()
const buf = Buffer.from(fixtures.data)
crc.update(buf)
assertEncodings(crc)
})
it('CRC64 from split data', async () => {
const crc = new CRC64()
const splits = fixtures.data.split('\n').slice(0, -1)
for (const split of splits) {
crc.update(`${split}\n`)
}
assertEncodings(crc)
})
it('flips 64 bits', async () => {
const tests = [
[BigInt(0), BigInt('0xffffffffffffffff')],
[BigInt('0xffffffffffffffff'), BigInt(0)],
[BigInt('0xdeadbeef'), BigInt('0xffffffff21524110')]
]
for (const [input, expected] of tests) {
expect(CRC64.flip64Bits(input)).toEqual(expected)
}
})
})

View File

@@ -0,0 +1,192 @@
import * as github from '@actions/github'
import type {RestEndpointMethods} from '@octokit/plugin-rest-endpoint-methods/dist-types/generated/method-types'
import type {RequestInterface} from '@octokit/types'
import {
deleteArtifactInternal,
deleteArtifactPublic
} from '../src/internal/delete/delete-artifact'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON, Timestamp} from '../src/generated'
import * as util from '../src/internal/shared/util'
import {noopLogs} from './common'
type MockedRequest = jest.MockedFunction<RequestInterface<object>>
type MockedDeleteArtifact = jest.MockedFunction<
RestEndpointMethods['actions']['deleteArtifact']
>
jest.mock('@actions/github', () => ({
getOctokit: jest.fn().mockReturnValue({
request: jest.fn(),
rest: {
actions: {
deleteArtifact: jest.fn()
}
}
})
}))
const fixtures = {
repo: 'toolkit',
owner: 'actions',
token: 'ghp_1234567890',
runId: 123,
backendIds: {
workflowRunBackendId: 'c4d7c21f-ba3f-4ddc-a8c8-6f2f626f8422',
workflowJobRunBackendId: '760803a1-f890-4d25-9a6e-a3fc01a0c7cf'
},
artifacts: [
{
id: 1,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-01')
},
{
id: 2,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-02')
}
]
}
describe('delete-artifact', () => {
beforeAll(() => {
noopLogs()
})
describe('public', () => {
it('should delete an artifact', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: [
{
name: fixtures.artifacts[0].name,
id: fixtures.artifacts[0].id,
size_in_bytes: fixtures.artifacts[0].size,
created_at: fixtures.artifacts[0].createdAt.toISOString()
}
]
}
})
const mockDeleteArtifact = github.getOctokit(fixtures.token).rest.actions
.deleteArtifact as MockedDeleteArtifact
mockDeleteArtifact.mockResolvedValueOnce({
status: 204,
headers: {},
url: '',
data: null as never
})
const response = await deleteArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
expect(response).toEqual({
id: fixtures.artifacts[0].id
})
})
it('should fail if non-200 response', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: [
{
name: fixtures.artifacts[0].name,
id: fixtures.artifacts[0].id,
size_in_bytes: fixtures.artifacts[0].size,
created_at: fixtures.artifacts[0].createdAt.toISOString()
}
]
}
})
const mockDeleteArtifact = github.getOctokit(fixtures.token).rest.actions
.deleteArtifact as MockedDeleteArtifact
mockDeleteArtifact.mockRejectedValue(new Error('boom'))
await expect(
deleteArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
).rejects.toThrow('boom')
})
})
describe('internal', () => {
beforeEach(() => {
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIds)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
})
it('should delete an artifact', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'DeleteArtifact')
.mockResolvedValue({
ok: true,
artifactId: fixtures.artifacts[0].id.toString()
})
const response = await deleteArtifactInternal(fixtures.artifacts[0].name)
expect(response).toEqual({
id: fixtures.artifacts[0].id
})
})
it('should fail if non-200 response', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'DeleteArtifact')
.mockRejectedValue(new Error('boom'))
await expect(
deleteArtifactInternal(fixtures.artifacts[0].id)
).rejects.toThrow('boom')
})
})
})

View File

@@ -0,0 +1,614 @@
import fs from 'fs'
import * as http from 'http'
import * as net from 'net'
import * as path from 'path'
import * as github from '@actions/github'
import {HttpClient} from '@actions/http-client'
import type {RestEndpointMethods} from '@octokit/plugin-rest-endpoint-methods/dist-types/generated/method-types'
import archiver from 'archiver'
import {
downloadArtifactInternal,
downloadArtifactPublic,
streamExtractExternal
} from '../src/internal/download/download-artifact'
import {getUserAgentString} from '../src/internal/shared/user-agent'
import {noopLogs} from './common'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON} from '../src/generated'
import * as util from '../src/internal/shared/util'
type MockedDownloadArtifact = jest.MockedFunction<
RestEndpointMethods['actions']['downloadArtifact']
>
const testDir = path.join(__dirname, '_temp', 'download-artifact')
const fixtures = {
workspaceDir: path.join(testDir, 'workspace'),
exampleArtifact: {
path: path.join(testDir, 'artifact.zip'),
files: [
{
path: 'hello.txt',
content: 'Hello World!'
},
{
path: 'goodbye.txt',
content: 'Goodbye World!'
}
]
},
artifactID: 1234,
artifactName: 'my-artifact',
artifactSize: 123456,
repositoryOwner: 'actions',
repositoryName: 'toolkit',
token: 'ghp_1234567890',
blobStorageUrl: 'https://blob-storage.local?signed=true',
backendIds: {
workflowRunBackendId: 'c4d7c21f-ba3f-4ddc-a8c8-6f2f626f8422',
workflowJobRunBackendId: '760803a1-f890-4d25-9a6e-a3fc01a0c7cf'
}
}
jest.mock('@actions/github', () => ({
getOctokit: jest.fn().mockReturnValue({
rest: {
actions: {
downloadArtifact: jest.fn()
}
}
})
}))
jest.mock('@actions/http-client')
// Create a zip archive with the contents of the example artifact
const createTestArchive = async (): Promise<void> => {
const archive = archiver('zip', {
zlib: {level: 9}
})
for (const file of fixtures.exampleArtifact.files) {
archive.append(file.content, {name: file.path})
}
archive.finalize()
return new Promise((resolve, reject) => {
archive.pipe(fs.createWriteStream(fixtures.exampleArtifact.path))
archive.on('error', reject)
archive.on('finish', resolve)
})
}
const expectExtractedArchive = async (dir: string): Promise<void> => {
for (const file of fixtures.exampleArtifact.files) {
const filePath = path.join(dir, file.path)
expect(fs.readFileSync(filePath, 'utf8')).toEqual(file.content)
}
}
const setup = async (): Promise<void> => {
noopLogs()
await fs.promises.mkdir(testDir, {recursive: true})
await createTestArchive()
process.env['GITHUB_WORKSPACE'] = fixtures.workspaceDir
}
const cleanup = async (): Promise<void> => {
jest.restoreAllMocks()
await fs.promises.rm(testDir, {recursive: true})
delete process.env['GITHUB_WORKSPACE']
}
const mockGetArtifactSuccess = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 200
message.push(fs.readFileSync(fixtures.exampleArtifact.path))
message.push(null)
return {
message
}
})
const mockGetArtifactFailure = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 500
message.push('Internal Server Error')
message.push(null)
return {
message
}
})
const mockGetArtifactMalicious = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 200
message.push(fs.readFileSync(path.join(__dirname, 'fixtures', 'evil.zip'))) // evil.zip contains files that are formatted x/../../etc/hosts
message.push(null)
return {
message
}
})
describe('download-artifact', () => {
describe('public', () => {
beforeEach(setup)
afterEach(cleanup)
it('should successfully download an artifact to $GITHUB_WORKSPACE', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactSuccess
}
}
)
const response = await downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactSuccess).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
await expectExtractedArchive(fixtures.workspaceDir)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
})
it('should not allow path traversal from malicious artifacts', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactMalicious
}
}
)
const response = await downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactMalicious).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
// ensure path traversal was not possible
expect(
fs.existsSync(path.join(fixtures.workspaceDir, 'x/etc/hosts'))
).toBe(true)
expect(
fs.existsSync(path.join(fixtures.workspaceDir, 'y/etc/hosts'))
).toBe(true)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
})
it('should successfully download an artifact to user defined path', async () => {
const customPath = path.join(testDir, 'custom')
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactSuccess
}
}
)
const response = await downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token,
{
path: customPath
}
)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactSuccess).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
await expectExtractedArchive(customPath)
expect(response.downloadPath).toBe(customPath)
})
it('should fail if download artifact API does not respond with location', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {},
status: 302,
url: '',
data: Buffer.from('')
})
await expect(
downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
).rejects.toBeInstanceOf(Error)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
})
it('should fail if blob storage storage chunk does not respond within 30s', async () => {
// mock http client to delay response data by 30s
const msg = new http.IncomingMessage(new net.Socket())
msg.statusCode = 200
const mockGet = jest.fn(async () => {
return new Promise((resolve, reject) => {
// Reject with an error after 31 seconds
setTimeout(() => {
reject(new Error('Request timeout'))
}, 31000) // Timeout after 31 seconds
})
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGet
}
}
)
await expect(
streamExtractExternal(fixtures.blobStorageUrl, fixtures.workspaceDir)
).rejects.toBeInstanceOf(Error)
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
}, 35000) // add longer timeout to allow for timer to run out
it('should fail if blob storage response is non-200 after 5 retries', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactFailure
}
}
)
await expect(
downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
).rejects.toBeInstanceOf(Error)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactFailure).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
expect(mockGetArtifactFailure).toHaveBeenCalledTimes(5)
}, 38000)
it('should retry if blob storage response is non-200 and then succeed with a 200', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockGetArtifact = jest
.fn(mockGetArtifactSuccess)
.mockImplementationOnce(mockGetArtifactFailure)
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifact
}
}
)
const response = await downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactFailure).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
expect(mockGetArtifactFailure).toHaveBeenCalledTimes(1)
expect(mockGetArtifactSuccess).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
expect(mockGetArtifactSuccess).toHaveBeenCalledTimes(1)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
}, 28000)
})
describe('internal', () => {
beforeEach(async () => {
await setup()
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIds)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
})
afterEach(async () => {
await cleanup()
})
it('should successfully download an artifact to $GITHUB_WORKSPACE', async () => {
const mockListArtifacts = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: [
{
...fixtures.backendIds,
databaseId: fixtures.artifactID.toString(),
name: fixtures.artifactName,
size: fixtures.artifactSize.toString()
}
]
})
const mockGetSignedArtifactURL = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'GetSignedArtifactURL')
.mockReturnValue(
Promise.resolve({
signedUrl: fixtures.blobStorageUrl
})
)
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactSuccess
}
}
)
const response = await downloadArtifactInternal(fixtures.artifactID)
await expectExtractedArchive(fixtures.workspaceDir)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockListArtifacts).toHaveBeenCalledWith({
idFilter: {
value: fixtures.artifactID.toString()
},
...fixtures.backendIds
})
expect(mockGetSignedArtifactURL).toHaveBeenCalledWith({
...fixtures.backendIds,
name: fixtures.artifactName
})
})
it('should successfully download an artifact to user defined path', async () => {
const customPath = path.join(testDir, 'custom')
const mockListArtifacts = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: [
{
...fixtures.backendIds,
databaseId: fixtures.artifactID.toString(),
name: fixtures.artifactName,
size: fixtures.artifactSize.toString()
}
]
})
const mockGetSignedArtifactURL = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'GetSignedArtifactURL')
.mockReturnValue(
Promise.resolve({
signedUrl: fixtures.blobStorageUrl
})
)
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactSuccess
}
}
)
const response = await downloadArtifactInternal(fixtures.artifactID, {
path: customPath
})
await expectExtractedArchive(customPath)
expect(response.downloadPath).toBe(customPath)
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockListArtifacts).toHaveBeenCalledWith({
idFilter: {
value: fixtures.artifactID.toString()
},
...fixtures.backendIds
})
expect(mockGetSignedArtifactURL).toHaveBeenCalledWith({
...fixtures.backendIds,
name: fixtures.artifactName
})
})
it('should fail if download artifact API does not respond with location', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockRejectedValue(new Error('boom'))
await expect(
downloadArtifactInternal(fixtures.artifactID)
).rejects.toBeInstanceOf(Error)
})
it('should fail if blob storage response is non-200', async () => {
const mockListArtifacts = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: [
{
...fixtures.backendIds,
databaseId: fixtures.artifactID.toString(),
name: fixtures.artifactName,
size: fixtures.artifactSize.toString()
}
]
})
const mockGetSignedArtifactURL = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'GetSignedArtifactURL')
.mockReturnValue(
Promise.resolve({
signedUrl: fixtures.blobStorageUrl
})
)
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactFailure
}
}
)
await expect(
downloadArtifactInternal(fixtures.artifactID)
).rejects.toBeInstanceOf(Error)
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockListArtifacts).toHaveBeenCalledWith({
idFilter: {
value: fixtures.artifactID.toString()
},
...fixtures.backendIds
})
expect(mockGetSignedArtifactURL).toHaveBeenCalledWith({
...fixtures.backendIds,
name: fixtures.artifactName
})
})
})
})

View File

@@ -1,552 +0,0 @@
import * as path from 'path'
import * as core from '@actions/core'
import {URL} from 'url'
import {getDownloadSpecification} from '../src/internal/download-specification'
import {ContainerEntry} from '../src/internal/contracts'
const artifact1Name = 'my-artifact'
const artifact2Name = 'my-artifact-extra'
// Populating with only the information that is necessary
function getPartialContainerEntry(): ContainerEntry {
return {
containerId: 10,
scopeIdentifier: '00000000-0000-0000-0000-000000000000',
path: 'ADD_INFORMATION',
itemType: 'ADD_INFORMATION',
status: 'created',
dateCreated: '2020-02-06T22:13:35.373Z',
dateLastModified: '2020-02-06T22:13:35.453Z',
createdBy: '82f0bf89-6e55-4e5a-b8b6-f75eb992578c',
lastModifiedBy: '82f0bf89-6e55-4e5a-b8b6-f75eb992578c',
itemLocation: 'ADD_INFORMATION',
contentLocation: 'ADD_INFORMATION',
contentId: '',
fileLength: 100
}
}
function createFileEntry(entryPath: string): ContainerEntry {
const newFileEntry = getPartialContainerEntry()
newFileEntry.path = entryPath
newFileEntry.itemType = 'file'
newFileEntry.itemLocation = createItemLocation(entryPath)
newFileEntry.contentLocation = createContentLocation(entryPath)
return newFileEntry
}
function createDirectoryEntry(directoryPath: string): ContainerEntry {
const newDirectoryEntry = getPartialContainerEntry()
newDirectoryEntry.path = directoryPath
newDirectoryEntry.itemType = 'folder'
newDirectoryEntry.itemLocation = createItemLocation(directoryPath)
newDirectoryEntry.contentLocation = createContentLocation(directoryPath)
return newDirectoryEntry
}
function createItemLocation(relativePath: string): string {
const itemLocation = new URL(
'https://testing/_apis/resources/Containers/10000'
)
itemLocation.searchParams.append('itemPath', relativePath)
itemLocation.searchParams.append('metadata', 'true')
return itemLocation.toString()
}
function createContentLocation(relativePath: string): string {
const itemLocation = new URL(
'https://testing/_apis/resources/Containers/10000'
)
itemLocation.searchParams.append('itemPath', relativePath)
return itemLocation.toString()
}
/*
Represents a set of container entries for two artifacts with the following directory structure
/my-artifact
/file1.txt
/file2.txt
/dir1
/file3.txt
/dir2
/dir3
/dir4
file4.txt
file5.txt (no length property)
file6.txt (empty file)
/my-artifact-extra
/file1.txt
*/
// main artifact
const file1Path = path.join(artifact1Name, 'file1.txt')
const file2Path = path.join(artifact1Name, 'file2.txt')
const dir1Path = path.join(artifact1Name, 'dir1')
const file3Path = path.join(dir1Path, 'file3.txt')
const dir2Path = path.join(dir1Path, 'dir2')
const dir3Path = path.join(dir2Path, 'dir3')
const dir4Path = path.join(dir3Path, 'dir4')
const file4Path = path.join(dir4Path, 'file4.txt')
const file5Path = path.join(dir4Path, 'file5.txt')
const file6Path = path.join(dir4Path, 'file6.txt')
const rootDirectoryEntry = createDirectoryEntry(artifact1Name)
const directoryEntry1 = createDirectoryEntry(dir1Path)
const directoryEntry2 = createDirectoryEntry(dir2Path)
const directoryEntry3 = createDirectoryEntry(dir3Path)
const directoryEntry4 = createDirectoryEntry(dir4Path)
const fileEntry1 = createFileEntry(file1Path)
const fileEntry2 = createFileEntry(file2Path)
const fileEntry3 = createFileEntry(file3Path)
const fileEntry4 = createFileEntry(file4Path)
const missingLengthFileEntry = createFileEntry(file5Path)
missingLengthFileEntry.fileLength = undefined // one file does not have a fileLength
const emptyLengthFileEntry = createFileEntry(file6Path)
emptyLengthFileEntry.fileLength = 0 // empty file path
// extra artifact
const artifact2File1Path = path.join(artifact2Name, 'file1.txt')
const rootDirectoryEntry2 = createDirectoryEntry(artifact2Name)
const extraFileEntry = createFileEntry(artifact2File1Path)
const artifactContainerEntries: ContainerEntry[] = [
rootDirectoryEntry,
fileEntry1,
fileEntry2,
directoryEntry1,
fileEntry3,
directoryEntry2,
directoryEntry3,
directoryEntry4,
fileEntry4,
missingLengthFileEntry,
emptyLengthFileEntry,
rootDirectoryEntry2,
extraFileEntry
]
describe('Search', () => {
beforeAll(async () => {
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
})
it('Download Specification - Absolute Path with no root directory', () => {
const testDownloadPath = path.join(
__dirname,
'some',
'destination',
'folder'
)
const specification = getDownloadSpecification(
artifact1Name,
artifactContainerEntries,
testDownloadPath,
false
)
expect(specification.rootDownloadLocation).toEqual(testDownloadPath)
expect(specification.filesToDownload.length).toEqual(5)
const item1ExpectedTargetPath = path.join(testDownloadPath, 'file1.txt')
const item2ExpectedTargetPath = path.join(testDownloadPath, 'file2.txt')
const item3ExpectedTargetPath = path.join(
testDownloadPath,
'dir1',
'file3.txt'
)
const item4ExpectedTargetPath = path.join(
testDownloadPath,
'dir1',
'dir2',
'dir3',
'dir4',
'file4.txt'
)
const item5ExpectedTargetPath = path.join(
testDownloadPath,
'dir1',
'dir2',
'dir3',
'dir4',
'file5.txt'
)
const item6ExpectedTargetPath = path.join(
testDownloadPath,
'dir1',
'dir2',
'dir3',
'dir4',
'file6.txt'
)
const targetLocations = specification.filesToDownload.map(
item => item.targetPath
)
expect(targetLocations).toContain(item1ExpectedTargetPath)
expect(targetLocations).toContain(item2ExpectedTargetPath)
expect(targetLocations).toContain(item3ExpectedTargetPath)
expect(targetLocations).toContain(item4ExpectedTargetPath)
expect(targetLocations).toContain(item5ExpectedTargetPath)
for (const downloadItem of specification.filesToDownload) {
if (downloadItem.targetPath === item1ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file1Path)
)
} else if (downloadItem.targetPath === item2ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file2Path)
)
} else if (downloadItem.targetPath === item3ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file3Path)
)
} else if (downloadItem.targetPath === item4ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file4Path)
)
} else if (downloadItem.targetPath === item5ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file5Path)
)
} else {
throw new Error('this should never be reached')
}
}
expect(specification.directoryStructure.length).toEqual(3)
expect(specification.directoryStructure).toContain(testDownloadPath)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, 'dir1')
)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, 'dir1', 'dir2', 'dir3', 'dir4')
)
expect(specification.emptyFilesToCreate.length).toEqual(1)
expect(specification.emptyFilesToCreate).toContain(item6ExpectedTargetPath)
})
it('Download Specification - Relative Path with no root directory', () => {
const testDownloadPath = path.join('some', 'destination', 'folder')
const specification = getDownloadSpecification(
artifact1Name,
artifactContainerEntries,
testDownloadPath,
false
)
expect(specification.rootDownloadLocation).toEqual(testDownloadPath)
expect(specification.filesToDownload.length).toEqual(5)
const item1ExpectedTargetPath = path.join(testDownloadPath, 'file1.txt')
const item2ExpectedTargetPath = path.join(testDownloadPath, 'file2.txt')
const item3ExpectedTargetPath = path.join(
testDownloadPath,
'dir1',
'file3.txt'
)
const item4ExpectedTargetPath = path.join(
testDownloadPath,
'dir1',
'dir2',
'dir3',
'dir4',
'file4.txt'
)
const item5ExpectedTargetPath = path.join(
testDownloadPath,
'dir1',
'dir2',
'dir3',
'dir4',
'file5.txt'
)
const item6ExpectedTargetPath = path.join(
testDownloadPath,
'dir1',
'dir2',
'dir3',
'dir4',
'file6.txt'
)
const targetLocations = specification.filesToDownload.map(
item => item.targetPath
)
expect(targetLocations).toContain(item1ExpectedTargetPath)
expect(targetLocations).toContain(item2ExpectedTargetPath)
expect(targetLocations).toContain(item3ExpectedTargetPath)
expect(targetLocations).toContain(item4ExpectedTargetPath)
expect(targetLocations).toContain(item5ExpectedTargetPath)
for (const downloadItem of specification.filesToDownload) {
if (downloadItem.targetPath === item1ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file1Path)
)
} else if (downloadItem.targetPath === item2ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file2Path)
)
} else if (downloadItem.targetPath === item3ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file3Path)
)
} else if (downloadItem.targetPath === item4ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file4Path)
)
} else if (downloadItem.targetPath === item5ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file5Path)
)
} else {
throw new Error('this should never be reached')
}
}
expect(specification.directoryStructure.length).toEqual(3)
expect(specification.directoryStructure).toContain(testDownloadPath)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, 'dir1')
)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, 'dir1', 'dir2', 'dir3', 'dir4')
)
expect(specification.emptyFilesToCreate.length).toEqual(1)
expect(specification.emptyFilesToCreate).toContain(item6ExpectedTargetPath)
})
it('Download Specification - Absolute Path with root directory', () => {
const testDownloadPath = path.join(
__dirname,
'some',
'destination',
'folder'
)
const specification = getDownloadSpecification(
artifact1Name,
artifactContainerEntries,
testDownloadPath,
true
)
expect(specification.rootDownloadLocation).toEqual(
path.join(testDownloadPath, artifact1Name)
)
expect(specification.filesToDownload.length).toEqual(5)
const item1ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'file1.txt'
)
const item2ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'file2.txt'
)
const item3ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'dir1',
'file3.txt'
)
const item4ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'dir1',
'dir2',
'dir3',
'dir4',
'file4.txt'
)
const item5ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'dir1',
'dir2',
'dir3',
'dir4',
'file5.txt'
)
const item6ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'dir1',
'dir2',
'dir3',
'dir4',
'file6.txt'
)
const targetLocations = specification.filesToDownload.map(
item => item.targetPath
)
expect(targetLocations).toContain(item1ExpectedTargetPath)
expect(targetLocations).toContain(item2ExpectedTargetPath)
expect(targetLocations).toContain(item3ExpectedTargetPath)
expect(targetLocations).toContain(item4ExpectedTargetPath)
expect(targetLocations).toContain(item5ExpectedTargetPath)
for (const downloadItem of specification.filesToDownload) {
if (downloadItem.targetPath === item1ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file1Path)
)
} else if (downloadItem.targetPath === item2ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file2Path)
)
} else if (downloadItem.targetPath === item3ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file3Path)
)
} else if (downloadItem.targetPath === item4ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file4Path)
)
} else if (downloadItem.targetPath === item5ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file5Path)
)
} else {
throw new Error('this should never be reached')
}
}
expect(specification.directoryStructure.length).toEqual(3)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, artifact1Name)
)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, dir1Path)
)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, dir4Path)
)
expect(specification.emptyFilesToCreate.length).toEqual(1)
expect(specification.emptyFilesToCreate).toContain(item6ExpectedTargetPath)
})
it('Download Specification - Relative Path with root directory', () => {
const testDownloadPath = path.join('some', 'destination', 'folder')
const specification = getDownloadSpecification(
artifact1Name,
artifactContainerEntries,
testDownloadPath,
true
)
expect(specification.rootDownloadLocation).toEqual(
path.join(testDownloadPath, artifact1Name)
)
expect(specification.filesToDownload.length).toEqual(5)
const item1ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'file1.txt'
)
const item2ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'file2.txt'
)
const item3ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'dir1',
'file3.txt'
)
const item4ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'dir1',
'dir2',
'dir3',
'dir4',
'file4.txt'
)
const item5ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'dir1',
'dir2',
'dir3',
'dir4',
'file5.txt'
)
const item6ExpectedTargetPath = path.join(
testDownloadPath,
artifact1Name,
'dir1',
'dir2',
'dir3',
'dir4',
'file6.txt'
)
const targetLocations = specification.filesToDownload.map(
item => item.targetPath
)
expect(targetLocations).toContain(item1ExpectedTargetPath)
expect(targetLocations).toContain(item2ExpectedTargetPath)
expect(targetLocations).toContain(item3ExpectedTargetPath)
expect(targetLocations).toContain(item4ExpectedTargetPath)
expect(targetLocations).toContain(item5ExpectedTargetPath)
for (const downloadItem of specification.filesToDownload) {
if (downloadItem.targetPath === item1ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file1Path)
)
} else if (downloadItem.targetPath === item2ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file2Path)
)
} else if (downloadItem.targetPath === item3ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file3Path)
)
} else if (downloadItem.targetPath === item4ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file4Path)
)
} else if (downloadItem.targetPath === item5ExpectedTargetPath) {
expect(downloadItem.sourceLocation).toEqual(
createContentLocation(file5Path)
)
} else {
throw new Error('this should never be reached')
}
}
expect(specification.directoryStructure.length).toEqual(3)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, artifact1Name)
)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, dir1Path)
)
expect(specification.directoryStructure).toContain(
path.join(testDownloadPath, dir4Path)
)
expect(specification.emptyFilesToCreate.length).toEqual(1)
expect(specification.emptyFilesToCreate).toContain(item6ExpectedTargetPath)
})
})
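
Taken together, the four cases above pin down how a download specification is derived from container entries: regular file entries become filesToDownload items keyed by their content location, folder entries feed directoryStructure, zero-length files are listed in emptyFilesToCreate, and the artifact name is prefixed to the download root only when requested. A minimal sketch of that mapping, with the simplified ContainerEntry shape and the helper name buildDownloadSpecification as assumptions rather than the toolkit's actual types:

import * as path from 'path'

// Hypothetical, trimmed-down shapes; the toolkit's real container entries carry more fields.
interface ContainerEntry {
  path: string // container path, e.g. 'my-artifact/dir1/file3.txt'
  itemType: 'file' | 'folder'
  fileLength: number
  contentLocation: string
}

interface DownloadSpecification {
  rootDownloadLocation: string
  directoryStructure: string[]
  emptyFilesToCreate: string[]
  filesToDownload: {sourceLocation: string; targetPath: string}[]
}

function buildDownloadSpecification(
  artifactName: string,
  entries: ContainerEntry[],
  downloadPath: string,
  includeRootDirectory: boolean
): DownloadSpecification {
  // with includeRootDirectory, everything lands under <downloadPath>/<artifactName>
  const root = includeRootDirectory
    ? path.join(downloadPath, artifactName)
    : downloadPath
  const directories = new Set<string>([root])
  const emptyFilesToCreate: string[] = []
  const filesToDownload: {sourceLocation: string; targetPath: string}[] = []
  for (const entry of entries) {
    // drop the leading '<artifactName>/' segment from the container path
    const relativePath = entry.path.slice(artifactName.length + 1)
    const targetPath = path.join(root, relativePath)
    if (entry.itemType === 'folder') {
      directories.add(targetPath)
    } else if (entry.fileLength === 0) {
      // zero-length files are created locally rather than downloaded
      emptyFilesToCreate.push(targetPath)
    } else {
      filesToDownload.push({sourceLocation: entry.contentLocation, targetPath})
    }
  }
  return {
    rootDownloadLocation: root,
    directoryStructure: [...directories],
    emptyFilesToCreate,
    filesToDownload
  }
}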

View File

@ -1,490 +0,0 @@
import * as core from '@actions/core'
import * as http from 'http'
import * as io from '../../io/src/io'
import * as net from 'net'
import * as path from 'path'
import * as configVariables from '../src/internal/config-variables'
import {promises as fs} from 'fs'
import {DownloadItem} from '../src/internal/download-specification'
import {HttpClient, HttpClientResponse} from '@actions/http-client'
import {DownloadHttpClient} from '../src/internal/download-http-client'
import {
ListArtifactsResponse,
QueryArtifactResponse
} from '../src/internal/contracts'
import * as stream from 'stream'
import {gzip} from 'zlib'
import {promisify} from 'util'
const root = path.join(__dirname, '_temp', 'artifact-download-tests')
const defaultEncoding = 'utf8'
jest.mock('../src/internal/config-variables')
jest.mock('@actions/http-client')
describe('Download Tests', () => {
beforeAll(async () => {
await io.rmRF(root)
await fs.mkdir(path.join(root), {
recursive: true
})
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
jest.spyOn(core, 'error').mockImplementation(() => {})
})
/**
* Test Listing Artifacts
*/
it('List Artifacts - Success', async () => {
setupSuccessfulListArtifactsResponse()
const downloadHttpClient = new DownloadHttpClient()
const artifacts = await downloadHttpClient.listArtifacts()
expect(artifacts.count).toEqual(2)
const artifactNames = artifacts.value.map(item => item.name)
expect(artifactNames).toContain('artifact1-name')
expect(artifactNames).toContain('artifact2-name')
for (const artifact of artifacts.value) {
if (artifact.name === 'artifact1-name') {
expect(artifact.url).toEqual(
`${configVariables.getRuntimeUrl()}_apis/pipelines/1/runs/1/artifacts?artifactName=artifact1-name`
)
} else if (artifact.name === 'artifact2-name') {
expect(artifact.url).toEqual(
`${configVariables.getRuntimeUrl()}_apis/pipelines/1/runs/1/artifacts?artifactName=artifact2-name`
)
} else {
throw new Error(
'Invalid artifact combination. This should never be reached'
)
}
}
})
it('List Artifacts - Failure', async () => {
setupFailedResponse()
const downloadHttpClient = new DownloadHttpClient()
await expect(downloadHttpClient.listArtifacts()).rejects.toThrow(
'List Artifacts failed: Artifact service responded with 400'
)
})
/**
* Test Container Items
*/
it('Container Items - Success', async () => {
setupSuccessfulContainerItemsResponse()
const downloadHttpClient = new DownloadHttpClient()
const response = await downloadHttpClient.getContainerItems(
'artifact-name',
configVariables.getRuntimeUrl()
)
expect(response.count).toEqual(2)
const itemPaths = response.value.map(item => item.path)
expect(itemPaths).toContain('artifact-name')
expect(itemPaths).toContain('artifact-name/file1.txt')
for (const containerEntry of response.value) {
if (containerEntry.path === 'artifact-name') {
expect(containerEntry.itemType).toEqual('folder')
} else if (containerEntry.path === 'artifact-name/file1.txt') {
expect(containerEntry.itemType).toEqual('file')
} else {
throw new Error(
'Invalid container combination. This should never be reached'
)
}
}
})
it('Container Items - Failure', async () => {
setupFailedResponse()
const downloadHttpClient = new DownloadHttpClient()
await expect(
downloadHttpClient.getContainerItems(
'artifact-name',
configVariables.getRuntimeUrl()
)
).rejects.toThrow(
`Get Container Items failed: Artifact service responded with 400`
)
})
it('Test downloading an individual artifact with gzip', async () => {
const fileContents = Buffer.from(
'gzip worked on the first try\n',
defaultEncoding
)
const targetPath = path.join(root, 'FileA.txt')
setupDownloadItemResponse(fileContents, true, 200, false, false)
const downloadHttpClient = new DownloadHttpClient()
const items: DownloadItem[] = []
items.push({
sourceLocation: `${configVariables.getRuntimeUrl()}_apis/resources/Containers/13?itemPath=my-artifact%2FFileA.txt`,
targetPath
})
await expect(
downloadHttpClient.downloadSingleArtifact(items)
).resolves.not.toThrow()
await checkDestinationFile(targetPath, fileContents)
})
it('Test downloading an individual artifact without gzip', async () => {
const fileContents = Buffer.from(
'plaintext worked on the first try\n',
defaultEncoding
)
const targetPath = path.join(root, 'FileB.txt')
setupDownloadItemResponse(fileContents, false, 200, false, false)
const downloadHttpClient = new DownloadHttpClient()
const items: DownloadItem[] = []
items.push({
sourceLocation: `${configVariables.getRuntimeUrl()}_apis/resources/Containers/13?itemPath=my-artifact%2FFileB.txt`,
targetPath
})
await expect(
downloadHttpClient.downloadSingleArtifact(items)
).resolves.not.toThrow()
await checkDestinationFile(targetPath, fileContents)
})
it('Test retryable status codes during artifact download', async () => {
// The first HTTP response should return a retryable status code while the subsequent call should return a 200 so
// that the download successfully finishes
const retryableStatusCodes = [429, 500, 502, 503, 504]
for (const statusCode of retryableStatusCodes) {
const fileContents = Buffer.from('try, try again\n', defaultEncoding)
const targetPath = path.join(root, `FileC-${statusCode}.txt`)
setupDownloadItemResponse(fileContents, false, statusCode, false, true)
const downloadHttpClient = new DownloadHttpClient()
const items: DownloadItem[] = []
items.push({
sourceLocation: `${configVariables.getRuntimeUrl()}_apis/resources/Containers/13?itemPath=my-artifact%2FFileC.txt`,
targetPath
})
await expect(
downloadHttpClient.downloadSingleArtifact(items)
).resolves.not.toThrow()
await checkDestinationFile(targetPath, fileContents)
}
})
it('Test retry on truncated response with gzip', async () => {
const fileContents = Buffer.from(
'Sometimes gunzip fails on the first try\n',
defaultEncoding
)
const targetPath = path.join(root, 'FileD.txt')
setupDownloadItemResponse(fileContents, true, 200, true, true)
const downloadHttpClient = new DownloadHttpClient()
const items: DownloadItem[] = []
items.push({
sourceLocation: `${configVariables.getRuntimeUrl()}_apis/resources/Containers/13?itemPath=my-artifact%2FFileD.txt`,
targetPath
})
await expect(
downloadHttpClient.downloadSingleArtifact(items)
).resolves.not.toThrow()
await checkDestinationFile(targetPath, fileContents)
})
it('Test retry on truncated response without gzip', async () => {
const fileContents = Buffer.from(
'You have to inspect the content-length header to know if you got everything\n',
defaultEncoding
)
const targetPath = path.join(root, 'FileE.txt')
setupDownloadItemResponse(fileContents, false, 200, true, true)
const downloadHttpClient = new DownloadHttpClient()
const items: DownloadItem[] = []
items.push({
sourceLocation: `${configVariables.getRuntimeUrl()}_apis/resources/Containers/13?itemPath=my-artifact%2FFileD.txt`,
targetPath
})
await expect(
downloadHttpClient.downloadSingleArtifact(items)
).resolves.not.toThrow()
await checkDestinationFile(targetPath, fileContents)
})
/**
* Helper used to set up mocking for the HttpClient
*/
async function emptyMockReadBody(): Promise<string> {
return new Promise(resolve => {
resolve()
})
}
/**
* Sets up the HTTP GET response for a successful listArtifacts() call
*/
function setupSuccessfulListArtifactsResponse(): void {
jest.spyOn(HttpClient.prototype, 'get').mockImplementationOnce(async () => {
const mockMessage = new http.IncomingMessage(new net.Socket())
let mockReadBody = emptyMockReadBody
mockMessage.statusCode = 201
const response: ListArtifactsResponse = {
count: 2,
value: [
{
containerId: '13',
size: -1,
signedContent: 'false',
fileContainerResourceUrl: `${configVariables.getRuntimeUrl()}_apis/resources/Containers/13`,
type: 'actions_storage',
name: 'artifact1-name',
url: `${configVariables.getRuntimeUrl()}_apis/pipelines/1/runs/1/artifacts?artifactName=artifact1-name`
},
{
containerId: '13',
size: -1,
signedContent: 'false',
fileContainerResourceUrl: `${configVariables.getRuntimeUrl()}_apis/resources/Containers/13`,
type: 'actions_storage',
name: 'artifact2-name',
url: `${configVariables.getRuntimeUrl()}_apis/pipelines/1/runs/1/artifacts?artifactName=artifact2-name`
}
]
}
const returnData: string = JSON.stringify(response, null, 2)
mockReadBody = async function(): Promise<string> {
return new Promise(resolve => {
resolve(returnData)
})
}
return new Promise<HttpClientResponse>(resolve => {
resolve({
message: mockMessage,
readBody: mockReadBody
})
})
})
}
/**
* Sets up the HTTP GET response for downloading items
* @param fileContents the file contents that the mocked response should return
* @param isGzip whether the downloaded item is gzip encoded
* @param firstHttpResponseCode the HTTP response code that the first response should return
* @param truncateFirstResponse whether the first response body should be truncated to force a retry
* @param retryExpected whether a second, successful response should be mocked for the retry
*/
function setupDownloadItemResponse(
fileContents: Buffer,
isGzip: boolean,
firstHttpResponseCode: number,
truncateFirstResponse: boolean,
retryExpected: boolean
): void {
const spyInstance = jest
.spyOn(HttpClient.prototype, 'get')
.mockImplementationOnce(async () => {
if (firstHttpResponseCode === 200) {
const fullResponse = await constructResponse(isGzip, fileContents)
const actualResponse = truncateFirstResponse
? fullResponse.subarray(0, 3)
: fullResponse
return {
message: getDownloadResponseMessage(
firstHttpResponseCode,
isGzip,
fullResponse.length,
actualResponse
),
readBody: emptyMockReadBody
}
} else {
return {
message: getDownloadResponseMessage(
firstHttpResponseCode,
false,
0,
null
),
readBody: emptyMockReadBody
}
}
})
// set up a second mock only if we expect a retry. Otherwise this mock will affect other tests.
if (retryExpected) {
spyInstance.mockImplementationOnce(async () => {
// chained response, if the HTTP GET function gets called again, return a successful response
const fullResponse = await constructResponse(isGzip, fileContents)
return {
message: getDownloadResponseMessage(
200,
isGzip,
fullResponse.length,
fullResponse
),
readBody: emptyMockReadBody
}
})
}
}
async function constructResponse(
isGzip: boolean,
plaintext: Buffer | string
): Promise<Buffer> {
if (isGzip) {
return await promisify(gzip)(plaintext)
} else if (typeof plaintext === 'string') {
return Buffer.from(plaintext, defaultEncoding)
} else {
return plaintext
}
}
function getDownloadResponseMessage(
httpResponseCode: number,
isGzip: boolean,
contentLength: number,
response: Buffer | null
): http.IncomingMessage {
let readCallCount = 0
const mockMessage = <http.IncomingMessage>new stream.Readable({
read(size) {
switch (readCallCount++) {
case 0:
if (!!response && response.byteLength > size) {
throw new Error(
`test response larger than requested size (${size})`
)
}
this.push(response)
break
default:
// end the stream
this.push(null)
break
}
}
})
mockMessage.statusCode = httpResponseCode
mockMessage.headers = {
'content-length': contentLength.toString()
}
if (isGzip) {
mockMessage.headers['content-encoding'] = 'gzip'
}
return mockMessage
}
/**
* Sets up the HTTP GET response when querying for container items
*/
function setupSuccessfulContainerItemsResponse(): void {
jest.spyOn(HttpClient.prototype, 'get').mockImplementationOnce(async () => {
const mockMessage = new http.IncomingMessage(new net.Socket())
let mockReadBody = emptyMockReadBody
mockMessage.statusCode = 201
const response: QueryArtifactResponse = {
count: 2,
value: [
{
containerId: 10000,
scopeIdentifier: '00000000-0000-0000-0000-000000000000',
path: 'artifact-name',
itemType: 'folder',
status: 'created',
dateCreated: '2020-02-06T22:13:35.373Z',
dateLastModified: '2020-02-06T22:13:35.453Z',
createdBy: '82f0bf89-6e55-4e5a-b8b6-f75eb992578c',
lastModifiedBy: '82f0bf89-6e55-4e5a-b8b6-f75eb992578c',
itemLocation: `${configVariables.getRuntimeUrl()}/_apis/resources/Containers/10000?itemPath=artifact-name&metadata=True`,
contentLocation: `${configVariables.getRuntimeUrl()}/_apis/resources/Containers/10000?itemPath=artifact-name`,
contentId: ''
},
{
containerId: 10000,
scopeIdentifier: '00000000-0000-0000-0000-000000000000',
path: 'artifact-name/file1.txt',
itemType: 'file',
status: 'created',
dateCreated: '2020-02-06T22:13:35.373Z',
dateLastModified: '2020-02-06T22:13:35.453Z',
createdBy: '82f0bf89-6e55-4e5a-b8b6-f75eb992578c',
lastModifiedBy: '82f0bf89-6e55-4e5a-b8b6-f75eb992578c',
itemLocation: `${configVariables.getRuntimeUrl()}/_apis/resources/Containers/10000?itemPath=artifact-name%2Ffile1.txt&metadata=True`,
contentLocation: `${configVariables.getRuntimeUrl()}/_apis/resources/Containers/10000?itemPath=artifact-name%2Ffile1.txt`,
contentId: ''
}
]
}
const returnData: string = JSON.stringify(response, null, 2)
mockReadBody = async function(): Promise<string> {
return new Promise(resolve => {
resolve(returnData)
})
}
return new Promise<HttpClientResponse>(resolve => {
resolve({
message: mockMessage,
readBody: mockReadBody
})
})
})
}
/**
* Sets up the HTTP GET response for a generic failed request
*/
function setupFailedResponse(): void {
jest.spyOn(HttpClient.prototype, 'get').mockImplementationOnce(async () => {
const mockMessage = new http.IncomingMessage(new net.Socket())
mockMessage.statusCode = 400
return new Promise<HttpClientResponse>(resolve => {
resolve({
message: mockMessage,
readBody: emptyMockReadBody
})
})
})
}
async function checkDestinationFile(
targetPath: string,
expectedContents: Buffer
): Promise<void> {
const fileContents = await fs.readFile(targetPath)
expect(fileContents.byteLength).toEqual(expectedContents.byteLength)
expect(fileContents.equals(expectedContents)).toBeTruthy()
}
})

Binary file not shown.

View File

@ -0,0 +1,239 @@
import * as github from '@actions/github'
import type {RequestInterface} from '@octokit/types'
import {
getArtifactInternal,
getArtifactPublic
} from '../src/internal/find/get-artifact'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON, Timestamp} from '../src/generated'
import * as util from '../src/internal/shared/util'
import {noopLogs} from './common'
import {
ArtifactNotFoundError,
InvalidResponseError
} from '../src/internal/shared/errors'
type MockedRequest = jest.MockedFunction<RequestInterface<object>>
jest.mock('@actions/github', () => ({
getOctokit: jest.fn().mockReturnValue({
request: jest.fn()
})
}))
const fixtures = {
repo: 'toolkit',
owner: 'actions',
token: 'ghp_1234567890',
runId: 123,
backendIds: {
workflowRunBackendId: 'c4d7c21f-ba3f-4ddc-a8c8-6f2f626f8422',
workflowJobRunBackendId: '760803a1-f890-4d25-9a6e-a3fc01a0c7cf'
},
artifacts: [
{
id: 1,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-01')
},
{
id: 2,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-02')
}
]
}
describe('get-artifact', () => {
beforeAll(() => {
noopLogs()
})
describe('public', () => {
it('should return the artifact if it is found', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: [
{
name: fixtures.artifacts[0].name,
id: fixtures.artifacts[0].id,
size_in_bytes: fixtures.artifacts[0].size,
created_at: fixtures.artifacts[0].createdAt.toISOString()
}
]
}
})
const response = await getArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
expect(response).toEqual({
artifact: fixtures.artifacts[0]
})
})
it('should return the latest artifact if multiple are found', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: fixtures.artifacts.map(artifact => ({
name: artifact.name,
id: artifact.id,
size_in_bytes: artifact.size,
created_at: artifact.createdAt.toISOString()
}))
}
})
const response = await getArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
expect(response).toEqual({
artifact: fixtures.artifacts[1]
})
})
it('should fail if no artifacts are found', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: []
}
})
const response = getArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
await expect(response).rejects.toThrowError(ArtifactNotFoundError)
})
it('should fail if non-200 response', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 404,
headers: {},
url: '',
data: {}
})
const response = getArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
await expect(response).rejects.toThrowError(InvalidResponseError)
})
})
describe('internal', () => {
beforeEach(() => {
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIds)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
})
it('should return the artifact if it is found', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: [
{
...fixtures.backendIds,
databaseId: fixtures.artifacts[0].id.toString(),
name: fixtures.artifacts[0].name,
size: fixtures.artifacts[0].size.toString(),
createdAt: Timestamp.fromDate(fixtures.artifacts[0].createdAt)
}
]
})
const response = await getArtifactInternal(fixtures.artifacts[0].name)
expect(response).toEqual({
artifact: fixtures.artifacts[0]
})
})
it('should return the latest artifact if multiple are found', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
const response = await getArtifactInternal(fixtures.artifacts[0].name)
expect(response).toEqual({
artifact: fixtures.artifacts[1]
})
})
it('should fail if no artifacts are found', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: []
})
const response = getArtifactInternal(fixtures.artifacts[0].name)
await expect(response).rejects.toThrowError(ArtifactNotFoundError)
})
it('should fail if non-200 response', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockRejectedValue(new Error('boom'))
const response = getArtifactInternal(fixtures.artifacts[0].name)
await expect(response).rejects.toThrow()
})
})
})
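
Both describe blocks above rely on the same resolution rule, which reappears in the list-artifacts tests below: when several artifacts share a name, the one with the newest createdAt wins. A one-function sketch of that selection, with the Artifact shape trimmed to the fields the fixtures use and latestArtifact as a hypothetical name:

interface Artifact {
  id: number
  name: string
  size: number
  createdAt?: Date
}

// Newest createdAt first; artifacts without a timestamp sort last.
function latestArtifact(artifacts: Artifact[]): Artifact | undefined {
  return [...artifacts].sort(
    (a, b) => (b.createdAt?.getTime() ?? 0) - (a.createdAt?.getTime() ?? 0)
  )[0]
}

Applied to the fixtures, this picks the id: 2 artifact, matching the expectations in both test suites.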

View File

@ -0,0 +1,241 @@
import * as github from '@actions/github'
import type {RestEndpointMethodTypes} from '@octokit/plugin-rest-endpoint-methods/dist-types/generated/parameters-and-response-types'
import {
listArtifactsInternal,
listArtifactsPublic
} from '../src/internal/find/list-artifacts'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON, Timestamp} from '../src/generated'
import * as util from '../src/internal/shared/util'
import {noopLogs} from './common'
import {Artifact} from '../src/internal/shared/interfaces'
import {RequestInterface} from '@octokit/types'
type MockedRequest = jest.MockedFunction<RequestInterface<object>>
jest.mock('@actions/github', () => ({
getOctokit: jest.fn().mockReturnValue({
request: jest.fn(),
rest: {
actions: {
listWorkflowRunArtifacts: jest.fn()
}
}
})
}))
const artifactsToListResponse = (
artifacts: Artifact[]
): RestEndpointMethodTypes['actions']['listWorkflowRunArtifacts']['response']['data'] => {
return {
total_count: artifacts.length,
artifacts: artifacts.map(artifact => ({
name: artifact.name,
id: artifact.id,
size_in_bytes: artifact.size,
created_at: artifact.createdAt?.toISOString() || '',
run_id: fixtures.runId,
// unused fields for tests
url: '',
archive_download_url: '',
expired: false,
expires_at: '',
node_id: '',
run_url: '',
type: '',
updated_at: ''
}))
}
}
const fixtures = {
repo: 'toolkit',
owner: 'actions',
token: 'ghp_1234567890',
runId: 123,
backendIds: {
workflowRunBackendId: 'c4d7c21f-ba3f-4ddc-a8c8-6f2f626f8422',
workflowJobRunBackendId: '760803a1-f890-4d25-9a6e-a3fc01a0c7cf'
},
artifacts: [
{
id: 1,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-01')
},
{
id: 2,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-02')
}
]
}
describe('list-artifact', () => {
beforeAll(() => {
noopLogs()
})
describe('public', () => {
it('should return a list of artifacts', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: artifactsToListResponse(fixtures.artifacts)
})
const response = await listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
false
)
expect(response).toEqual({
artifacts: fixtures.artifacts
})
})
it('should return the latest artifact when latest is specified', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: artifactsToListResponse(fixtures.artifacts)
})
const response = await listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
true
)
expect(response).toEqual({
artifacts: [fixtures.artifacts[1]]
})
})
it('can return empty artifacts', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
total_count: 0,
artifacts: []
}
})
const response = await listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
true
)
expect(response).toEqual({
artifacts: []
})
})
it('should fail if non-200 response', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockRejectedValueOnce(new Error('boom'))
await expect(
listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
false
)
).rejects.toThrow('boom')
})
})
describe('internal', () => {
beforeEach(() => {
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIds)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
})
it('should return a list of artifacts', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
const response = await listArtifactsInternal(false)
expect(response).toEqual({
artifacts: fixtures.artifacts
})
})
it('should return the latest artifact when latest is specified', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
const response = await listArtifactsInternal(true)
expect(response).toEqual({
artifacts: [fixtures.artifacts[1]]
})
})
it('can return empty artifacts', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: []
})
const response = await listArtifactsInternal(false)
expect(response).toEqual({
artifacts: []
})
})
it('should fail if non-200 response', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockRejectedValue(new Error('boom'))
await expect(listArtifactsInternal(false)).rejects.toThrow('boom')
})
})
})

View File

@ -1,16 +1,13 @@
 import {
-  checkArtifactName,
-  checkArtifactFilePath
-} from '../src/internal/path-and-artifact-name-validation'
-import * as core from '@actions/core'
+  validateArtifactName,
+  validateFilePath
+} from '../src/internal/upload/path-and-artifact-name-validation'
+import {noopLogs} from './common'
 describe('Path and artifact name validation', () => {
   beforeAll(() => {
-    // mock all output so that there is less noise when running tests
-    jest.spyOn(console, 'log').mockImplementation(() => {})
-    jest.spyOn(core, 'debug').mockImplementation(() => {})
-    jest.spyOn(core, 'info').mockImplementation(() => {})
-    jest.spyOn(core, 'warning').mockImplementation(() => {})
+    noopLogs()
   })
   it('Check Artifact Name for any invalid characters', () => {
@ -28,7 +25,7 @@ describe('Path and artifact name validation', () => {
     ]
     for (const invalidName of invalidNames) {
       expect(() => {
-        checkArtifactName(invalidName)
+        validateArtifactName(invalidName)
       }).toThrow()
     }
@ -39,7 +36,7 @@ describe('Path and artifact name validation', () => {
     ]
     for (const validName of validNames) {
       expect(() => {
-        checkArtifactName(validName)
+        validateArtifactName(validName)
       }).not.toThrow()
     }
   })
@ -60,7 +57,7 @@ describe('Path and artifact name validation', () => {
     ]
     for (const invalidName of invalidNames) {
       expect(() => {
-        checkArtifactFilePath(invalidName)
+        validateFilePath(invalidName)
       }).toThrow()
     }
@ -71,7 +68,7 @@ describe('Path and artifact name validation', () => {
     ]
     for (const validName of validNames) {
       expect(() => {
-        checkArtifactFilePath(validName)
+        validateFilePath(validName)
       }).not.toThrow()
     }
   })

View File

@ -0,0 +1,65 @@
import {Timestamp} from '../src/generated'
import * as retention from '../src/internal/upload/retention'
describe('retention', () => {
beforeEach(() => {
delete process.env['GITHUB_RETENTION_DAYS']
})
it('should return the inputted retention days if it is less than the max retention days', () => {
// setup
const mockDate = new Date('2020-01-01')
jest.useFakeTimers().setSystemTime(mockDate)
process.env['GITHUB_RETENTION_DAYS'] = '90'
const exp = retention.getExpiration(30)
expect(exp).toBeDefined()
if (exp) {
const expDate = Timestamp.toDate(exp)
const expected = new Date()
expected.setDate(expected.getDate() + 30)
expect(expDate).toEqual(expected)
}
})
it('should return the max retention days if the inputted retention days is greater than the max retention days', () => {
// setup
const mockDate = new Date('2020-01-01')
jest.useFakeTimers().setSystemTime(mockDate)
process.env['GITHUB_RETENTION_DAYS'] = '90'
const exp = retention.getExpiration(120)
expect(exp).toBeDefined()
if (exp) {
const expDate = Timestamp.toDate(exp) // we check whether exp is defined above
const expected = new Date()
expected.setDate(expected.getDate() + 90)
expect(expDate).toEqual(expected)
}
})
it('should return undefined if the inputted retention days is undefined', () => {
const exp = retention.getExpiration()
expect(exp).toBeUndefined()
})
it('should return the inputted retention days if there is no max retention days', () => {
// setup
const mockDate = new Date('2020-01-01')
jest.useFakeTimers().setSystemTime(mockDate)
const exp = retention.getExpiration(30)
expect(exp).toBeDefined()
if (exp) {
const expDate = Timestamp.toDate(exp) // we check whether exp is defined above
const expected = new Date()
expected.setDate(expected.getDate() + 30)
expect(expDate).toEqual(expected)
}
})
})
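
The behavior these four cases pin down is a clamp: the requested retention is honored unless GITHUB_RETENTION_DAYS imposes a lower maximum, and no expiration is produced when no retention is requested. A sketch of that logic under two assumptions, the helper name getExpirationDate and a plain Date standing in for the generated Timestamp type:

function getExpirationDate(retentionDays?: number): Date | undefined {
  if (retentionDays === undefined) {
    return undefined
  }
  // GITHUB_RETENTION_DAYS is the environment-imposed maximum, as set in the tests above
  const maxDays = parseInt(process.env['GITHUB_RETENTION_DAYS'] || '', 10)
  // clamp the request to that maximum when one is present and valid
  const effectiveDays =
    !isNaN(maxDays) && maxDays < retentionDays ? maxDays : retentionDays
  const expiration = new Date()
  expiration.setDate(expiration.getDate() + effectiveDays)
  return expiration
}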

View File

@ -1,113 +0,0 @@
import * as http from 'http'
import * as net from 'net'
import * as core from '@actions/core'
import * as configVariables from '../src/internal/config-variables'
import {retry} from '../src/internal/requestUtils'
import {HttpClientResponse} from '@actions/http-client'
jest.mock('../src/internal/config-variables')
interface ITestResult {
responseCode: number
errorMessage: string | null
}
async function testRetry(
responseCodes: number[],
expectedResult: ITestResult
): Promise<void> {
const reverse = responseCodes.reverse() // Reverse responses since we pop from end
if (expectedResult.errorMessage) {
// we expect some exception to be thrown
await expect(
retry(
'test',
async () => handleResponse(reverse.pop()),
new Map(), // extra error message for any particular http codes
configVariables.getRetryLimit()
)
).rejects.toThrow(expectedResult.errorMessage)
} else {
// we expect a correct status code to be returned
const actualResult = await retry(
'test',
async () => handleResponse(reverse.pop()),
new Map(), // extra error message for any particular http codes
configVariables.getRetryLimit()
)
expect(actualResult.message.statusCode).toEqual(expectedResult.responseCode)
}
}
async function handleResponse(
testResponseCode: number | undefined
): Promise<HttpClientResponse> {
if (!testResponseCode) {
throw new Error(
'Test incorrectly set up. reverse.pop() was called too many times so not enough test response codes were supplied'
)
}
return setupSingleMockResponse(testResponseCode)
}
beforeAll(async () => {
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
jest.spyOn(core, 'error').mockImplementation(() => {})
})
/**
* Helpers used to set up mocking for the HttpClient
*/
async function emptyMockReadBody(): Promise<string> {
return new Promise(resolve => {
resolve()
})
}
async function setupSingleMockResponse(
statusCode: number
): Promise<HttpClientResponse> {
const mockMessage = new http.IncomingMessage(new net.Socket())
const mockReadBody = emptyMockReadBody
mockMessage.statusCode = statusCode
return new Promise<HttpClientResponse>(resolve => {
resolve({
message: mockMessage,
readBody: mockReadBody
})
})
}
test('retry works on successful response', async () => {
await testRetry([200], {
responseCode: 200,
errorMessage: null
})
})
test('retry works after retryable status code', async () => {
await testRetry([503, 200], {
responseCode: 200,
errorMessage: null
})
})
test('retry fails after exhausting retries', async () => {
// __mocks__/config-variables caps the max retry count in tests to 2
await testRetry([503, 503, 200], {
responseCode: 200,
errorMessage: 'test failed: Artifact service responded with 503'
})
})
test('retry fails after non-retryable status code', async () => {
await testRetry([400, 200], {
responseCode: 400,
errorMessage: 'test failed: Artifact service responded with 400'
})
})
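
These cases fix the retry contract: retryable status codes (429 and the 5xx family) are retried up to a configured limit, non-retryable codes fail immediately, and the final response is returned on success. A compact sketch with that shape, where retrySketch and MinimalResponse are illustrative names rather than the toolkit's API:

interface MinimalResponse {
  message: {statusCode?: number}
}

const isRetryableCode = (code?: number): boolean =>
  code !== undefined && [429, 500, 502, 503, 504].includes(code)

const isSuccessCode = (code?: number): boolean =>
  code !== undefined && code >= 200 && code < 300

async function retrySketch(
  name: string,
  operation: () => Promise<MinimalResponse>,
  maxAttempts: number
): Promise<MinimalResponse> {
  let attempt = 0
  for (;;) {
    attempt++
    const response = await operation()
    const code = response.message.statusCode
    if (isSuccessCode(code)) {
      return response
    }
    if (!isRetryableCode(code) || attempt >= maxAttempts) {
      throw new Error(`${name} failed: Artifact service responded with ${code}`)
    }
    // a production version would back off (e.g. exponentially) before retrying
  }
}

With maxAttempts capped at 2, as the mocked config does, this reproduces all four outcomes asserted above.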

View File

@ -1,27 +0,0 @@
#!/bin/bash
path="$1"
expectedContent="$2"
if [ "$path" == "" ]; then
echo "File path not provided"
exit 1
fi
if [ "$expectedContent" == "" ]; then
echo "Expected file contents not provided"
exit 1
fi
if [ ! -f "$path" ]; then
echo "Expected file $path does not exist"
exit 1
fi
actualContent=$(cat "$path")
if [ "$expectedContent" == "_EMPTY_" ] && [ ! -s "$path" ]; then
exit 0
elif [ "$actualContent" != "$expectedContent" ]; then
echo "File contents are not correct, expected $expectedContent, received $actualContent"
exit 1
fi

View File

@ -0,0 +1,373 @@
import * as uploadZipSpecification from '../src/internal/upload/upload-zip-specification'
import * as zip from '../src/internal/upload/zip'
import * as util from '../src/internal/shared/util'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON} from '../src/generated'
import * as blobUpload from '../src/internal/upload/blob-upload'
import {uploadArtifact} from '../src/internal/upload/upload-artifact'
import {noopLogs} from './common'
import {FilesNotFoundError} from '../src/internal/shared/errors'
import {BlockBlobUploadStreamOptions} from '@azure/storage-blob'
import * as fs from 'fs'
import * as path from 'path'
import unzip from 'unzip-stream'
const uploadStreamMock = jest.fn()
const blockBlobClientMock = jest.fn().mockImplementation(() => ({
uploadStream: uploadStreamMock
}))
jest.mock('@azure/storage-blob', () => ({
BlobClient: jest.fn().mockImplementation(() => {
return {
getBlockBlobClient: blockBlobClientMock
}
})
}))
const fixtures = {
uploadDirectory: path.join(__dirname, '_temp', 'plz-upload'),
files: [
{name: 'file1.txt', content: 'test 1 file content'},
{name: 'file2.txt', content: 'test 2 file content'},
{name: 'file3.txt', content: 'test 3 file content'},
{
name: 'real.txt',
content: 'from a symlink'
},
{
name: 'relative.txt',
content: 'from a symlink',
symlink: 'real.txt',
relative: true
},
{
name: 'absolute.txt',
content: 'from a symlink',
symlink: 'real.txt',
relative: false
}
],
backendIDs: {
workflowRunBackendId: '67dbcc20-e851-4452-a7c3-2cc0d2e0ec67',
workflowJobRunBackendId: '5f49179d-3386-4c38-85f7-00f8138facd0'
},
runtimeToken: 'test-token',
resultsServiceURL: 'http://results.local',
inputs: {
artifactName: 'test-artifact',
files: [
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
rootDirectory: '/home/user/files/plz-upload'
}
}
describe('upload-artifact', () => {
beforeAll(() => {
fs.mkdirSync(fixtures.uploadDirectory, {
recursive: true
})
for (const file of fixtures.files) {
if (file.symlink) {
let symlinkPath = file.symlink
if (!file.relative) {
symlinkPath = path.join(fixtures.uploadDirectory, file.symlink)
}
if (!fs.existsSync(path.join(fixtures.uploadDirectory, file.name))) {
fs.symlinkSync(
symlinkPath,
path.join(fixtures.uploadDirectory, file.name),
'file'
)
}
} else {
fs.writeFileSync(
path.join(fixtures.uploadDirectory, file.name),
file.content
)
}
}
})
beforeEach(() => {
noopLogs()
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockReturnValue()
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIDs)
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockReturnValue(
fixtures.files.map(file => ({
sourcePath: path.join(fixtures.uploadDirectory, file.name),
destinationPath: file.name,
stats: new fs.Stats()
}))
)
jest.spyOn(config, 'getRuntimeToken').mockReturnValue(fixtures.runtimeToken)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue(fixtures.resultsServiceURL)
})
afterEach(() => {
jest.restoreAllMocks()
})
it('should reject if there are no files to upload', async () => {
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockClear()
.mockReturnValue([])
const uploadResp = uploadArtifact(
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
await expect(uploadResp).rejects.toThrowError(FilesNotFoundError)
})
it('should reject if no backend IDs are found', async () => {
jest.spyOn(util, 'getBackendIdsFromToken').mockRestore()
const uploadResp = uploadArtifact(
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
await expect(uploadResp).rejects.toThrow()
})
it('should return false if the creation request fails', async () => {
jest
.spyOn(zip, 'createZipUploadStream')
.mockReturnValue(Promise.resolve(new zip.ZipUploadStream(1)))
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(Promise.resolve({ok: false, signedUploadUrl: ''}))
const uploadResp = uploadArtifact(
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
await expect(uploadResp).rejects.toThrow()
})
it('should return false if blob storage upload is unsuccessful', async () => {
jest
.spyOn(zip, 'createZipUploadStream')
.mockReturnValue(Promise.resolve(new zip.ZipUploadStream(1)))
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
signedUploadUrl: 'https://signed-upload-url.com'
})
)
jest
.spyOn(blobUpload, 'uploadZipToBlobStorage')
.mockReturnValue(Promise.reject(new Error('boom')))
const uploadResp = uploadArtifact(
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
await expect(uploadResp).rejects.toThrow()
})
it('should reject if finalize artifact fails', async () => {
jest
.spyOn(zip, 'createZipUploadStream')
.mockReturnValue(Promise.resolve(new zip.ZipUploadStream(1)))
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
signedUploadUrl: 'https://signed-upload-url.com'
})
)
jest.spyOn(blobUpload, 'uploadZipToBlobStorage').mockReturnValue(
Promise.resolve({
uploadSize: 1234,
sha256Hash: 'test-sha256-hash'
})
)
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'FinalizeArtifact')
.mockReturnValue(Promise.resolve({ok: false, artifactId: ''}))
const uploadResp = uploadArtifact(
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
await expect(uploadResp).rejects.toThrow()
})
it('should successfully upload an artifact', async () => {
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockRestore()
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
signedUploadUrl: 'https://signed-upload-url.local'
})
)
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'FinalizeArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
artifactId: '1'
})
)
let loadedBytes = 0
const uploadedZip = path.join(
fixtures.uploadDirectory,
'..',
'uploaded.zip'
)
uploadStreamMock.mockImplementation(
async (
stream: NodeJS.ReadableStream,
bufferSize?: number,
maxConcurrency?: number,
options?: BlockBlobUploadStreamOptions
) => {
const {onProgress} = options || {}
if (fs.existsSync(uploadedZip)) {
fs.unlinkSync(uploadedZip)
}
const uploadedZipStream = fs.createWriteStream(uploadedZip)
onProgress?.({loadedBytes: 0})
return new Promise((resolve, reject) => {
stream.on('data', chunk => {
loadedBytes += chunk.length
uploadedZipStream.write(chunk)
onProgress?.({loadedBytes})
})
stream.on('end', () => {
onProgress?.({loadedBytes})
uploadedZipStream.end()
resolve({})
})
stream.on('error', err => {
reject(err)
})
})
}
)
const {id, size, digest} = await uploadArtifact(
fixtures.inputs.artifactName,
fixtures.files.map(file =>
path.join(fixtures.uploadDirectory, file.name)
),
fixtures.uploadDirectory
)
expect(id).toBe(1)
expect(size).toBe(loadedBytes)
expect(digest).toBeDefined()
expect(digest).toHaveLength(64)
const extractedDirectory = path.join(
fixtures.uploadDirectory,
'..',
'extracted'
)
if (fs.existsSync(extractedDirectory)) {
fs.rmdirSync(extractedDirectory, {recursive: true})
}
const extract = new Promise((resolve, reject) => {
fs.createReadStream(uploadedZip)
.pipe(unzip.Extract({path: extractedDirectory}))
.on('close', () => {
resolve(true)
})
.on('error', err => {
reject(err)
})
})
await expect(extract).resolves.toBe(true)
for (const file of fixtures.files) {
const filePath = path.join(extractedDirectory, file.name)
expect(fs.existsSync(filePath)).toBe(true)
expect(fs.readFileSync(filePath, 'utf8')).toBe(file.content)
}
})
it('should throw an error when blob chunk uploads get delayed', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
signedUploadUrl: 'https://signed-upload-url.local'
})
)
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'FinalizeArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
artifactId: '1'
})
)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
jest.spyOn(config, 'getUploadChunkTimeout').mockReturnValue(2_000)
uploadStreamMock.mockImplementation(
async (
stream: NodeJS.ReadableStream,
bufferSize?: number,
maxConcurrency?: number,
options?: BlockBlobUploadStreamOptions
) => {
const {onProgress, abortSignal} = options || {}
onProgress?.({loadedBytes: 0})
return new Promise(resolve => {
abortSignal?.addEventListener('abort', () => {
resolve({})
})
})
}
)
const uploadResp = uploadArtifact(
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
await expect(uploadResp).rejects.toThrow('Upload progress stalled.')
})
})
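
The stalled-upload case at the end depends on a watchdog pattern: progress callbacks keep pushing a deadline out, and if no bytes move within the chunk timeout the upload is aborted and the operation rejects with 'Upload progress stalled.'. A simplified sketch of that pattern, where uploadWithStallGuard and its start callback are illustrative, not the toolkit's internals:

async function uploadWithStallGuard(
  start: (signal: AbortSignal, onProgress: () => void) => Promise<void>,
  timeoutMs: number
): Promise<void> {
  const controller = new AbortController()
  let timer: NodeJS.Timeout | undefined
  // every progress event resets the deadline; sustained silence aborts the upload
  const resetTimer = (): void => {
    if (timer) clearTimeout(timer)
    timer = setTimeout(() => controller.abort(), timeoutMs)
  }
  resetTimer()
  try {
    await start(controller.signal, resetTimer)
  } finally {
    if (timer) clearTimeout(timer)
  }
  if (controller.signal.aborted) {
    throw new Error('Upload progress stalled.')
  }
}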

View File

@ -1,154 +0,0 @@
import * as core from '@actions/core'
import * as tmp from 'tmp-promise'
import * as path from 'path'
import * as io from '../../io/src/io'
import {promises as fs} from 'fs'
import {createGZipFileOnDisk} from '../src/internal/upload-gzip'
const root = path.join(__dirname, '_temp', 'upload-gzip')
const tempGzFilePath = path.join(root, 'file.gz')
const tempGzipFilePath = path.join(root, 'file.gzip')
const tempTgzFilePath = path.join(root, 'file.tgz')
const tempTazFilePath = path.join(root, 'file.taz')
const tempZFilePath = path.join(root, 'file.Z')
const tempTaZFilePath = path.join(root, 'file.taZ')
const tempBz2FilePath = path.join(root, 'file.bz2')
const tempTbzFilePath = path.join(root, 'file.tbz')
const tempTbz2FilePath = path.join(root, 'file.tbz2')
const tempTz2FilePath = path.join(root, 'file.tz2')
const tempLzFilePath = path.join(root, 'file.lz')
const tempLzmaFilePath = path.join(root, 'file.lzma')
const tempTlzFilePath = path.join(root, 'file.tlz')
const tempLzoFilePath = path.join(root, 'file.lzo')
const tempXzFilePath = path.join(root, 'file.xz')
const tempTxzFilePath = path.join(root, 'file.txz')
const tempZstFilePath = path.join(root, 'file.zst')
const tempZstdFilePath = path.join(root, 'file.zstd')
const tempTzstFilePath = path.join(root, 'file.tzst')
const tempZipFilePath = path.join(root, 'file.zip')
const temp7zFilePath = path.join(root, 'file.7z')
const tempNormalFilePath = path.join(root, 'file.txt')
jest.mock('../src/internal/config-variables')
beforeAll(async () => {
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
jest.spyOn(core, 'error').mockImplementation(() => {})
// clear temp directory and create files that will be "uploaded"
await io.rmRF(root)
await fs.mkdir(root, {recursive: true})
await fs.writeFile(tempGzFilePath, 'a file with a .gz file extension')
await fs.writeFile(tempGzipFilePath, 'a file with a .gzip file extension')
await fs.writeFile(tempTgzFilePath, 'a file with a .tgz file extension')
await fs.writeFile(tempTazFilePath, 'a file with a .taz file extension')
await fs.writeFile(tempZFilePath, 'a file with a .Z file extension')
await fs.writeFile(tempTaZFilePath, 'a file with a .taZ file extension')
await fs.writeFile(tempBz2FilePath, 'a file with a .bz2 file extension')
await fs.writeFile(tempTbzFilePath, 'a file with a .tbz file extension')
await fs.writeFile(tempTbz2FilePath, 'a file with a .tbz2 file extension')
await fs.writeFile(tempTz2FilePath, 'a file with a .tz2 file extension')
await fs.writeFile(tempLzFilePath, 'a file with a .lz file extension')
await fs.writeFile(tempLzmaFilePath, 'a file with a .lzma file extension')
await fs.writeFile(tempTlzFilePath, 'a file with a .tlz file extension')
await fs.writeFile(tempLzoFilePath, 'a file with a .lzo file extension')
await fs.writeFile(tempXzFilePath, 'a file with a .xz file extension')
await fs.writeFile(tempTxzFilePath, 'a file with a .txz file extension')
await fs.writeFile(tempZstFilePath, 'a file with a .zst file extension')
await fs.writeFile(tempZstdFilePath, 'a file with a .zstd file extension')
await fs.writeFile(tempTzstFilePath, 'a file with a .tzst file extension')
await fs.writeFile(tempZipFilePath, 'a file with a .zip file extension')
await fs.writeFile(temp7zFilePath, 'a file with a .7z file extension')
await fs.writeFile(tempNormalFilePath, 'a file with a .txt file extension')
})
test('Number.MAX_SAFE_INTEGER is returned when an existing compressed file is used', async () => {
// create temporary file
const tempFile = await tmp.file()
expect(await createGZipFileOnDisk(tempGzFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempGzipFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTgzFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTazFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempZFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTaZFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempBz2FilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTbzFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTbz2FilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTz2FilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempLzFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempLzmaFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTlzFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempLzoFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempXzFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTxzFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempZstFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempZstdFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempTzstFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(tempZipFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(await createGZipFileOnDisk(temp7zFilePath, tempFile.path)).toEqual(
Number.MAX_SAFE_INTEGER
)
expect(
await createGZipFileOnDisk(tempNormalFilePath, tempFile.path)
).not.toEqual(Number.MAX_SAFE_INTEGER)
})
test('gzip file on disk gets successfully created', async () => {
// create temporary file
const tempFile = await tmp.file()
const gzipFileSize = await createGZipFileOnDisk(
tempNormalFilePath,
tempFile.path
)
const fileStat = await fs.stat(tempNormalFilePath)
const totalFileSize = fileStat.size
// original file and gzip file should not be equal in size
expect(gzipFileSize).not.toEqual(totalFileSize)
})
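
The long assertion list above encodes a simple gate: gzip is skipped for files whose extension marks them as already compressed, with Number.MAX_SAFE_INTEGER serving as the 'do not gzip' sentinel. A sketch of that gate under the same convention, where gzipSizeOrSkip is a hypothetical name and extensions are matched case-insensitively (one way to cover .Z and .taZ):

import * as path from 'path'
import {promises as fs} from 'fs'
import {gzip} from 'zlib'
import {promisify} from 'util'

// Extensions treated as already compressed; gzipping them again rarely helps.
const compressedExtensions = new Set([
  '.gz', '.gzip', '.tgz', '.taz', '.z', '.bz2', '.tbz', '.tbz2', '.tz2',
  '.lz', '.lzma', '.tlz', '.lzo', '.xz', '.txz', '.zst', '.zstd', '.tzst',
  '.zip', '.7z'
])

async function gzipSizeOrSkip(
  filePath: string,
  tempFilePath: string
): Promise<number> {
  if (compressedExtensions.has(path.extname(filePath).toLowerCase())) {
    // sentinel meaning "upload the original file as-is"
    return Number.MAX_SAFE_INTEGER
  }
  const contents = await fs.readFile(filePath)
  const zipped = await promisify(gzip)(contents)
  await fs.writeFile(tempFilePath, zipped)
  return zipped.byteLength
}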

View File

@@ -1,10 +1,12 @@
 import * as io from '../../io/src/io'
 import * as path from 'path'
 import {promises as fs} from 'fs'
-import * as core from '@actions/core'
-import {getUploadSpecification} from '../src/internal/upload-specification'
+import {
+  getUploadZipSpecification,
+  validateRootDirectory
+} from '../src/internal/upload/upload-zip-specification'
+import {noopLogs} from './common'
 
-const artifactName = 'my-artifact'
 const root = path.join(__dirname, '_temp', 'upload-specification')
 const goodItem1Path = path.join(
   root,
@@ -49,11 +51,7 @@ const artifactFilesToUpload = [
 describe('Search', () => {
   beforeAll(async () => {
-    // mock all output so that there is less noise when running tests
-    jest.spyOn(console, 'log').mockImplementation(() => {})
-    jest.spyOn(core, 'debug').mockImplementation(() => {})
-    jest.spyOn(core, 'info').mockImplementation(() => {})
-    jest.spyOn(core, 'warning').mockImplementation(() => {})
+    noopLogs()
 
     // clear temp directory
     await io.rmRF(root)
@@ -125,31 +123,28 @@ describe('Search', () => {
       'upload-specification-invalid'
     )
     expect(() => {
-      getUploadSpecification(
-        artifactName,
-        invalidRootDirectory,
-        artifactFilesToUpload
-      )
-    }).toThrow(`Provided rootDirectory ${invalidRootDirectory} does not exist`)
+      validateRootDirectory(invalidRootDirectory)
+    }).toThrow(
+      `The provided rootDirectory ${invalidRootDirectory} does not exist`
+    )
   })
 
   it('Upload Specification - Fail invalid rootDirectory', async () => {
     expect(() => {
-      getUploadSpecification(artifactName, goodItem1Path, artifactFilesToUpload)
+      validateRootDirectory(goodItem1Path)
     }).toThrow(
-      `Provided rootDirectory ${goodItem1Path} is not a valid directory`
+      `The provided rootDirectory ${goodItem1Path} is not a valid directory`
    )
   })
 
   it('Upload Specification - File does not exist', async () => {
     const fakeFilePath = path.join(
-      artifactName,
       'folder-a',
       'folder-b',
       'non-existent-file.txt'
     )
     expect(() => {
-      getUploadSpecification(artifactName, root, [fakeFilePath])
+      getUploadZipSpecification([fakeFilePath], root)
     }).toThrow(`File ${fakeFilePath} does not exist`)
   })
@@ -162,21 +157,20 @@ describe('Search', () => {
       goodItem5Path
     ]
     expect(() => {
-      getUploadSpecification(artifactName, folderADirectory, artifactFiles)
+      getUploadZipSpecification(artifactFiles, folderADirectory)
     }).toThrow(
       `The rootDirectory: ${folderADirectory} is not a parent directory of the file: ${goodItem5Path}`
     )
   })
 
   it('Upload Specification - Success', async () => {
-    const specifications = getUploadSpecification(
-      artifactName,
-      root,
-      artifactFilesToUpload
+    const specifications = getUploadZipSpecification(
+      artifactFilesToUpload,
+      root
     )
     expect(specifications.length).toEqual(7)
 
-    const absolutePaths = specifications.map(item => item.absoluteFilePath)
+    const absolutePaths = specifications.map(item => item.sourcePath)
     expect(absolutePaths).toContain(goodItem1Path)
     expect(absolutePaths).toContain(goodItem2Path)
     expect(absolutePaths).toContain(goodItem3Path)
@@ -186,45 +180,38 @@ describe('Search', () => {
     expect(absolutePaths).toContain(amazingFileInFolderHPath)
 
     for (const specification of specifications) {
-      if (specification.absoluteFilePath === goodItem1Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(
-            artifactName,
-            'folder-a',
-            'folder-b',
-            'folder-c',
-            'good-item1.txt'
-          )
-        )
-      } else if (specification.absoluteFilePath === goodItem2Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-d', 'good-item2.txt')
-        )
-      } else if (specification.absoluteFilePath === goodItem3Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-d', 'good-item3.txt')
-        )
-      } else if (specification.absoluteFilePath === goodItem4Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-d', 'good-item4.txt')
-        )
-      } else if (specification.absoluteFilePath === goodItem5Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'good-item5.txt')
-        )
-      } else if (specification.absoluteFilePath === extraFileInFolderCPath) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(
-            artifactName,
-            'folder-a',
-            'folder-b',
-            'folder-c',
-            'extra-file-in-folder-c.txt'
-          )
-        )
-      } else if (specification.absoluteFilePath === amazingFileInFolderHPath) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-h', 'amazing-item.txt')
-        )
+      if (specification.sourcePath === goodItem1Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-a', 'folder-b', 'folder-c', 'good-item1.txt')
+        )
+      } else if (specification.sourcePath === goodItem2Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-d', 'good-item2.txt')
+        )
+      } else if (specification.sourcePath === goodItem3Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-d', 'good-item3.txt')
+        )
+      } else if (specification.sourcePath === goodItem4Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-d', 'good-item4.txt')
+        )
+      } else if (specification.sourcePath === goodItem5Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/good-item5.txt')
+        )
+      } else if (specification.sourcePath === extraFileInFolderCPath) {
+        expect(specification.destinationPath).toEqual(
+          path.join(
+            '/folder-a',
+            'folder-b',
+            'folder-c',
+            'extra-file-in-folder-c.txt'
+          )
+        )
+      } else if (specification.sourcePath === amazingFileInFolderHPath) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-h', 'amazing-item.txt')
+        )
       } else {
         throw new Error(
@@ -236,14 +223,13 @@ describe('Search', () => {
   it('Upload Specification - Success with extra slash', async () => {
     const rootWithSlash = `${root}/`
-    const specifications = getUploadSpecification(
-      artifactName,
-      rootWithSlash,
-      artifactFilesToUpload
+    const specifications = getUploadZipSpecification(
+      artifactFilesToUpload,
+      rootWithSlash
     )
     expect(specifications.length).toEqual(7)
 
-    const absolutePaths = specifications.map(item => item.absoluteFilePath)
+    const absolutePaths = specifications.map(item => item.sourcePath)
     expect(absolutePaths).toContain(goodItem1Path)
     expect(absolutePaths).toContain(goodItem2Path)
     expect(absolutePaths).toContain(goodItem3Path)
@@ -253,45 +239,38 @@ describe('Search', () => {
     expect(absolutePaths).toContain(amazingFileInFolderHPath)
 
     for (const specification of specifications) {
-      if (specification.absoluteFilePath === goodItem1Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(
-            artifactName,
-            'folder-a',
-            'folder-b',
-            'folder-c',
-            'good-item1.txt'
-          )
-        )
-      } else if (specification.absoluteFilePath === goodItem2Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-d', 'good-item2.txt')
-        )
-      } else if (specification.absoluteFilePath === goodItem3Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-d', 'good-item3.txt')
-        )
-      } else if (specification.absoluteFilePath === goodItem4Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-d', 'good-item4.txt')
-        )
-      } else if (specification.absoluteFilePath === goodItem5Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'good-item5.txt')
-        )
-      } else if (specification.absoluteFilePath === extraFileInFolderCPath) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(
-            artifactName,
-            'folder-a',
-            'folder-b',
-            'folder-c',
-            'extra-file-in-folder-c.txt'
-          )
-        )
-      } else if (specification.absoluteFilePath === amazingFileInFolderHPath) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-h', 'amazing-item.txt')
-        )
+      if (specification.sourcePath === goodItem1Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-a', 'folder-b', 'folder-c', 'good-item1.txt')
+        )
+      } else if (specification.sourcePath === goodItem2Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-d', 'good-item2.txt')
+        )
+      } else if (specification.sourcePath === goodItem3Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-d', 'good-item3.txt')
+        )
+      } else if (specification.sourcePath === goodItem4Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-d', 'good-item4.txt')
+        )
+      } else if (specification.sourcePath === goodItem5Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/good-item5.txt')
+        )
+      } else if (specification.sourcePath === extraFileInFolderCPath) {
+        expect(specification.destinationPath).toEqual(
+          path.join(
+            '/folder-a',
+            'folder-b',
+            'folder-c',
+            'extra-file-in-folder-c.txt'
+          )
+        )
+      } else if (specification.sourcePath === amazingFileInFolderHPath) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-h', 'amazing-item.txt')
+        )
       } else {
         throw new Error(
@@ -301,47 +280,23 @@ describe('Search', () => {
     }
   })
 
-  it('Upload Specification - Directories should not be included', async () => {
+  it('Upload Specification - Empty Directories are included', async () => {
     const folderEPath = path.join(root, 'folder-a', 'folder-b', 'folder-e')
-    const filesWithDirectory = [
-      goodItem1Path,
-      goodItem4Path,
-      folderEPath,
-      badItem3Path
-    ]
-    const specifications = getUploadSpecification(
-      artifactName,
-      root,
-      filesWithDirectory
-    )
-    expect(specifications.length).toEqual(3)
-    const absolutePaths = specifications.map(item => item.absoluteFilePath)
+    const filesWithDirectory = [goodItem1Path, folderEPath]
+    const specifications = getUploadZipSpecification(filesWithDirectory, root)
+    expect(specifications.length).toEqual(2)
+    const absolutePaths = specifications.map(item => item.sourcePath)
     expect(absolutePaths).toContain(goodItem1Path)
-    expect(absolutePaths).toContain(goodItem4Path)
-    expect(absolutePaths).toContain(badItem3Path)
+    expect(absolutePaths).toContain(null)
 
     for (const specification of specifications) {
-      if (specification.absoluteFilePath === goodItem1Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(
-            artifactName,
-            'folder-a',
-            'folder-b',
-            'folder-c',
-            'good-item1.txt'
-          )
-        )
-      } else if (specification.absoluteFilePath === goodItem2Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-d', 'good-item2.txt')
-        )
-      } else if (specification.absoluteFilePath === goodItem4Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-d', 'good-item4.txt')
-        )
-      } else if (specification.absoluteFilePath === badItem3Path) {
-        expect(specification.uploadFilePath).toEqual(
-          path.join(artifactName, 'folder-f', 'bad-item3.txt')
-        )
+      if (specification.sourcePath === goodItem1Path) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-a', 'folder-b', 'folder-c', 'good-item1.txt')
+        )
+      } else if (specification.sourcePath === null) {
+        expect(specification.destinationPath).toEqual(
+          path.join('/folder-a', 'folder-b', 'folder-e')
+        )
       } else {
         throw new Error(
@@ -350,4 +305,22 @@ describe('Search', () => {
       }
     }
   })
+
+  it('Upload Specification - Includes symlinks', async () => {
+    const targetPath = path.join(root, 'link-dir', 'symlink-me.txt')
+    await fs.mkdir(path.dirname(targetPath), {recursive: true})
+    await fs.writeFile(targetPath, 'symlink file content')
+
+    const uploadPath = path.join(root, 'upload-dir', 'symlink.txt')
+    await fs.mkdir(path.dirname(uploadPath), {recursive: true})
+    await fs.symlink(targetPath, uploadPath, 'file')
+
+    const specifications = getUploadZipSpecification([uploadPath], root)
+    expect(specifications.length).toEqual(1)
+    expect(specifications[0].sourcePath).toEqual(uploadPath)
+    expect(specifications[0].destinationPath).toEqual(
+      path.join('/upload-dir', 'symlink.txt')
+    )
+    expect(specifications[0].stats.isSymbolicLink()).toBe(true)
+  })
 })

View File

@@ -1,551 +0,0 @@
import * as http from 'http'
import * as io from '../../io/src/io'
import * as net from 'net'
import * as path from 'path'
import {mocked} from 'ts-jest/utils'
import {exec, execSync} from 'child_process'
import {createGunzip} from 'zlib'
import {promisify} from 'util'
import {UploadHttpClient} from '../src/internal/upload-http-client'
import * as core from '@actions/core'
import {promises as fs} from 'fs'
import {getRuntimeUrl} from '../src/internal/config-variables'
import {HttpClient, HttpClientResponse} from '@actions/http-client'
import {
ArtifactResponse,
PatchArtifactSizeSuccessResponse
} from '../src/internal/contracts'
import {UploadSpecification} from '../src/internal/upload-specification'
import {getArtifactUrl} from '../src/internal/utils'
import {UploadOptions} from '../src/internal/upload-options'
const root = path.join(__dirname, '_temp', 'artifact-upload')
const file1Path = path.join(root, 'file1.txt')
const file2Path = path.join(root, 'file2.txt')
const file3Path = path.join(root, 'folder1', 'file3.txt')
const file4Path = path.join(root, 'folder1', 'file4.txt')
const file5Path = path.join(root, 'folder1', 'folder2', 'folder3', 'file5.txt')
let file1Size = 0
let file2Size = 0
let file3Size = 0
let file4Size = 0
let file5Size = 0
jest.mock('../src/internal/config-variables')
jest.mock('@actions/http-client')
describe('Upload Tests', () => {
beforeAll(async () => {
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
jest.spyOn(core, 'error').mockImplementation(() => {})
// setup mocking for calls that got through the HttpClient
setupHttpClientMock()
// clear temp directory and create files that will be "uploaded"
await io.rmRF(root)
await fs.mkdir(path.join(root, 'folder1', 'folder2', 'folder3'), {
recursive: true
})
await fs.writeFile(file1Path, 'this is file 1')
await fs.writeFile(file2Path, 'this is file 2')
await fs.writeFile(file3Path, 'this is file 3')
await fs.writeFile(file4Path, 'this is file 4')
await fs.writeFile(file5Path, 'this is file 5')
/*
Directory structure for files that get created:
root/
file1.txt
file2.txt
folder1/
file3.txt
file4.txt
folder2/
folder3/
file5.txt
*/
file1Size = (await fs.stat(file1Path)).size
file2Size = (await fs.stat(file2Path)).size
file3Size = (await fs.stat(file3Path)).size
file4Size = (await fs.stat(file4Path)).size
file5Size = (await fs.stat(file5Path)).size
})
/**
* Artifact Creation Tests
*/
it('Create Artifact - Success', async () => {
const artifactName = 'valid-artifact-name'
const uploadHttpClient = new UploadHttpClient()
const response = await uploadHttpClient.createArtifactInFileContainer(
artifactName
)
expect(response.containerId).toEqual('13')
expect(response.size).toEqual(-1)
expect(response.signedContent).toEqual('false')
expect(response.fileContainerResourceUrl).toEqual(
`${getRuntimeUrl()}_apis/resources/Containers/13`
)
expect(response.type).toEqual('actions_storage')
expect(response.name).toEqual(artifactName)
expect(response.url).toEqual(
`${getRuntimeUrl()}_apis/pipelines/1/runs/1/artifacts?artifactName=${artifactName}`
)
})
it('Create Artifact - Failure', async () => {
const artifactName = 'invalid-artifact-name'
const uploadHttpClient = new UploadHttpClient()
expect(
uploadHttpClient.createArtifactInFileContainer(artifactName)
).rejects.toEqual(
new Error(
`Create Artifact Container failed: The artifact name invalid-artifact-name is not valid. Request URL ${getArtifactUrl()}`
)
)
})
it('Create Artifact - Retention Less Than Min Value Error', async () => {
const artifactName = 'valid-artifact-name'
const options: UploadOptions = {
retentionDays: -1
}
const uploadHttpClient = new UploadHttpClient()
expect(
uploadHttpClient.createArtifactInFileContainer(artifactName, options)
).rejects.toEqual(new Error('Invalid retention, minimum value is 1.'))
})
it('Create Artifact - Storage Quota Error', async () => {
const artifactName = 'storage-quota-hit'
const uploadHttpClient = new UploadHttpClient()
expect(
uploadHttpClient.createArtifactInFileContainer(artifactName)
).rejects.toEqual(
new Error(
'Create Artifact Container failed: Artifact storage quota has been hit. Unable to upload any new artifacts'
)
)
})
/**
* Artifact Upload Tests
*/
it('Upload Artifact - Success', async () => {
/**
* Normally search.findFilesToUpload() would be used for providing information about what to upload. These tests however
* focuses solely on the upload APIs so searchResult[] will be hard-coded
*/
const artifactName = 'successful-artifact'
const uploadSpecification: UploadSpecification[] = [
{
absoluteFilePath: file1Path,
uploadFilePath: `${artifactName}/file1.txt`
},
{
absoluteFilePath: file2Path,
uploadFilePath: `${artifactName}/file2.txt`
},
{
absoluteFilePath: file3Path,
uploadFilePath: `${artifactName}/folder1/file3.txt`
},
{
absoluteFilePath: file4Path,
uploadFilePath: `${artifactName}/folder1/file4.txt`
},
{
absoluteFilePath: file5Path,
uploadFilePath: `${artifactName}/folder1/folder2/folder3/file5.txt`
}
]
const expectedTotalSize =
file1Size + file2Size + file3Size + file4Size + file5Size
const uploadUrl = `${getRuntimeUrl()}_apis/resources/Containers/13`
const uploadHttpClient = new UploadHttpClient()
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
uploadUrl,
uploadSpecification
)
expect(uploadResult.failedItems.length).toEqual(0)
expect(uploadResult.uploadSize).toEqual(expectedTotalSize)
})
function hasMkfifo(): boolean {
try {
// make sure we drain the stdout
return (
process.platform !== 'win32' &&
execSync('which mkfifo').toString().length > 0
)
} catch (e) {
return false
}
}
const withMkfifoIt = hasMkfifo() ? it : it.skip
withMkfifoIt(
'Upload Artifact with content from named pipe - Success',
async () => {
// create a named pipe 'pipe' with content 'hello pipe'
const content = Buffer.from('hello pipe')
const pipeFilePath = path.join(root, 'pipe')
await promisify(exec)('mkfifo pipe', {cwd: root})
// don't want to await here as that would block until read
fs.writeFile(pipeFilePath, content)
const artifactName = 'successful-artifact'
const uploadSpecification: UploadSpecification[] = [
{
absoluteFilePath: pipeFilePath,
uploadFilePath: `${artifactName}/pipe`
}
]
const uploadUrl = `${getRuntimeUrl()}_apis/resources/Containers/13`
const uploadHttpClient = new UploadHttpClient()
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
uploadUrl,
uploadSpecification
)
// accesses the ReadableStream that was passed into sendStream
// eslint-disable-next-line @typescript-eslint/unbound-method
const stream = mocked(HttpClient.prototype.sendStream).mock.calls[0][2]
expect(stream).not.toBeNull()
// decompresses the passed stream
const data: Buffer[] = []
for await (const chunk of stream.pipe(createGunzip())) {
data.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk as string))
}
const uploaded = Buffer.concat(data)
expect(uploadResult.failedItems.length).toEqual(0)
expect(uploaded).toEqual(content)
}
)
it('Upload Artifact - Failed Single File Upload', async () => {
const uploadSpecification: UploadSpecification[] = [
{
absoluteFilePath: file1Path,
uploadFilePath: `this-file-upload-will-fail`
}
]
const uploadUrl = `${getRuntimeUrl()}_apis/resources/Containers/13`
const uploadHttpClient = new UploadHttpClient()
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
uploadUrl,
uploadSpecification
)
expect(uploadResult.failedItems.length).toEqual(1)
expect(uploadResult.uploadSize).toEqual(0)
})
it('Upload Artifact - Partial Upload Continue On Error', async () => {
const artifactName = 'partial-artifact'
const uploadSpecification: UploadSpecification[] = [
{
absoluteFilePath: file1Path,
uploadFilePath: `${artifactName}/file1.txt`
},
{
absoluteFilePath: file2Path,
uploadFilePath: `${artifactName}/file2.txt`
},
{
absoluteFilePath: file3Path,
uploadFilePath: `${artifactName}/folder1/file3.txt`
},
{
absoluteFilePath: file4Path,
uploadFilePath: `this-file-upload-will-fail`
},
{
absoluteFilePath: file5Path,
uploadFilePath: `${artifactName}/folder1/folder2/folder3/file5.txt`
}
]
const expectedPartialSize = file1Size + file2Size + file4Size + file5Size
const uploadUrl = `${getRuntimeUrl()}_apis/resources/Containers/13`
const uploadHttpClient = new UploadHttpClient()
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
uploadUrl,
uploadSpecification,
{continueOnError: true}
)
expect(uploadResult.failedItems.length).toEqual(1)
expect(uploadResult.uploadSize).toEqual(expectedPartialSize)
})
it('Upload Artifact - Partial Upload Fail Fast', async () => {
const artifactName = 'partial-artifact'
const uploadSpecification: UploadSpecification[] = [
{
absoluteFilePath: file1Path,
uploadFilePath: `${artifactName}/file1.txt`
},
{
absoluteFilePath: file2Path,
uploadFilePath: `${artifactName}/file2.txt`
},
{
absoluteFilePath: file3Path,
uploadFilePath: `${artifactName}/folder1/file3.txt`
},
{
absoluteFilePath: file4Path,
uploadFilePath: `this-file-upload-will-fail`
},
{
absoluteFilePath: file5Path,
uploadFilePath: `${artifactName}/folder1/folder2/folder3/file5.txt`
}
]
const expectedPartialSize = file1Size + file2Size + file3Size
const uploadUrl = `${getRuntimeUrl()}_apis/resources/Containers/13`
const uploadHttpClient = new UploadHttpClient()
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
uploadUrl,
uploadSpecification,
{continueOnError: false}
)
expect(uploadResult.failedItems.length).toEqual(2)
expect(uploadResult.uploadSize).toEqual(expectedPartialSize)
})
it('Upload Artifact - Failed upload with no options', async () => {
const artifactName = 'partial-artifact'
const uploadSpecification: UploadSpecification[] = [
{
absoluteFilePath: file1Path,
uploadFilePath: `${artifactName}/file1.txt`
},
{
absoluteFilePath: file2Path,
uploadFilePath: `${artifactName}/file2.txt`
},
{
absoluteFilePath: file3Path,
uploadFilePath: `${artifactName}/folder1/file3.txt`
},
{
absoluteFilePath: file4Path,
uploadFilePath: `this-file-upload-will-fail`
},
{
absoluteFilePath: file5Path,
uploadFilePath: `${artifactName}/folder1/folder2/folder3/file5.txt`
}
]
const expectedPartialSize = file1Size + file2Size + file3Size + file5Size
const uploadUrl = `${getRuntimeUrl()}_apis/resources/Containers/13`
const uploadHttpClient = new UploadHttpClient()
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
uploadUrl,
uploadSpecification
)
expect(uploadResult.failedItems.length).toEqual(1)
expect(uploadResult.uploadSize).toEqual(expectedPartialSize)
})
it('Upload Artifact - Failed upload with empty options', async () => {
const artifactName = 'partial-artifact'
const uploadSpecification: UploadSpecification[] = [
{
absoluteFilePath: file1Path,
uploadFilePath: `${artifactName}/file1.txt`
},
{
absoluteFilePath: file2Path,
uploadFilePath: `${artifactName}/file2.txt`
},
{
absoluteFilePath: file3Path,
uploadFilePath: `${artifactName}/folder1/file3.txt`
},
{
absoluteFilePath: file4Path,
uploadFilePath: `this-file-upload-will-fail`
},
{
absoluteFilePath: file5Path,
uploadFilePath: `${artifactName}/folder1/folder2/folder3/file5.txt`
}
]
const expectedPartialSize = file1Size + file2Size + file3Size + file5Size
const uploadUrl = `${getRuntimeUrl()}_apis/resources/Containers/13`
const uploadHttpClient = new UploadHttpClient()
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
uploadUrl,
uploadSpecification,
{}
)
expect(uploadResult.failedItems.length).toEqual(1)
expect(uploadResult.uploadSize).toEqual(expectedPartialSize)
})
/**
* Artifact Association Tests
*/
it('Associate Artifact - Success', async () => {
const uploadHttpClient = new UploadHttpClient()
expect(async () => {
uploadHttpClient.patchArtifactSize(130, 'my-artifact')
}).not.toThrow()
})
it('Associate Artifact - Not Found', async () => {
const uploadHttpClient = new UploadHttpClient()
expect(
uploadHttpClient.patchArtifactSize(100, 'non-existent-artifact')
).rejects.toThrow(
'An Artifact with the name non-existent-artifact was not found'
)
})
it('Associate Artifact - Error', async () => {
const uploadHttpClient = new UploadHttpClient()
expect(
uploadHttpClient.patchArtifactSize(-2, 'my-artifact')
).rejects.toThrow(
'Finalize artifact upload failed: Artifact service responded with 400'
)
})
/**
* Helpers used to setup mocking for the HttpClient
*/
async function emptyMockReadBody(): Promise<string> {
return new Promise(resolve => {
resolve()
})
}
function setupHttpClientMock(): void {
/**
* Mocks Post calls that are used during Artifact Creation tests
*
* Simulates success and non-success status codes depending on the artifact name along with an appropriate
* payload that represents an expected response
*/
jest
.spyOn(HttpClient.prototype, 'post')
.mockImplementation(async (requestdata, data) => {
// parse the input data and use the provided artifact name as part of the response
const inputData = JSON.parse(data)
const mockMessage = new http.IncomingMessage(new net.Socket())
let mockReadBody = emptyMockReadBody
if (inputData.Name === 'invalid-artifact-name') {
mockMessage.statusCode = 400
} else if (inputData.Name === 'storage-quota-hit') {
mockMessage.statusCode = 403
} else {
mockMessage.statusCode = 201
const response: ArtifactResponse = {
containerId: '13',
size: -1,
signedContent: 'false',
fileContainerResourceUrl: `${getRuntimeUrl()}_apis/resources/Containers/13`,
type: 'actions_storage',
name: inputData.Name,
url: `${getRuntimeUrl()}_apis/pipelines/1/runs/1/artifacts?artifactName=${
inputData.Name
}`
}
const returnData: string = JSON.stringify(response, null, 2)
mockReadBody = async function(): Promise<string> {
return new Promise(resolve => {
resolve(returnData)
})
}
}
return new Promise<HttpClientResponse>(resolve => {
resolve({
message: mockMessage,
readBody: mockReadBody
})
})
})
/**
* Mocks SendStream calls that are made during Artifact Upload tests
*
* A 500 response is used to simulate a failed upload stream. The uploadUrl can be set to
* include 'fail' to specify that the upload should fail
*/
jest
.spyOn(HttpClient.prototype, 'sendStream')
.mockImplementation(async (verb, requestUrl) => {
const mockMessage = new http.IncomingMessage(new net.Socket())
mockMessage.statusCode = 200
if (requestUrl.includes('fail')) {
mockMessage.statusCode = 500
}
return new Promise<HttpClientResponse>(resolve => {
resolve({
message: mockMessage,
readBody: emptyMockReadBody
})
})
})
/**
* Mocks Patch calls that are made during Artifact Association tests
*
* Simulates success and non-success status codes depending on the input size along with an appropriate
* payload that represents an expected response
*/
jest
.spyOn(HttpClient.prototype, 'patch')
.mockImplementation(async (requestdata, data) => {
const inputData = JSON.parse(data)
const mockMessage = new http.IncomingMessage(new net.Socket())
// Get the name from the end of requestdata. Will be something like https://www.example.com/_apis/pipelines/workflows/15/artifacts?api-version=6.0-preview&artifactName=my-artifact
const artifactName = requestdata.split('=')[2]
let mockReadBody = emptyMockReadBody
if (inputData.Size < 1) {
mockMessage.statusCode = 400
} else if (artifactName === 'non-existent-artifact') {
mockMessage.statusCode = 404
} else {
mockMessage.statusCode = 200
const response: PatchArtifactSizeSuccessResponse = {
containerId: 13,
size: inputData.Size,
signedContent: 'false',
type: 'actions_storage',
name: artifactName,
url: `${getRuntimeUrl()}_apis/pipelines/1/runs/1/artifacts?artifactName=${artifactName}`,
uploadUrl: `${getRuntimeUrl()}_apis/resources/Containers/13`
}
const returnData: string = JSON.stringify(response, null, 2)
mockReadBody = async function(): Promise<string> {
return new Promise(resolve => {
resolve(returnData)
})
}
}
return new Promise<HttpClientResponse>(resolve => {
resolve({
message: mockMessage,
readBody: mockReadBody
})
})
})
}
})

View File

@@ -1,239 +1,219 @@
-import * as fs from 'fs'
-import * as io from '../../io/src/io'
-import * as path from 'path'
-import * as utils from '../src/internal/utils'
-import * as core from '@actions/core'
-import {HttpCodes} from '@actions/http-client'
-import {
-  getRuntimeUrl,
-  getWorkFlowRunId,
-  getInitialRetryIntervalInMilliseconds,
-  getRetryMultiplier
-} from '../src/internal/config-variables'
-import {Readable} from 'stream'
-
-jest.mock('../src/internal/config-variables')
-
-describe('Utils', () => {
-  beforeAll(() => {
-    // mock all output so that there is less noise when running tests
-    jest.spyOn(console, 'log').mockImplementation(() => {})
-    jest.spyOn(core, 'debug').mockImplementation(() => {})
-    jest.spyOn(core, 'info').mockImplementation(() => {})
-    jest.spyOn(core, 'warning').mockImplementation(() => {})
-  })
-
-  it('Check exponential retry range', () => {
-    // No retries should return the initial retry interval
-    const retryWaitTime0 = utils.getExponentialRetryTimeInMilliseconds(0)
-    expect(retryWaitTime0).toEqual(getInitialRetryIntervalInMilliseconds())
-
-    const testMinMaxRange = (retryCount: number): void => {
-      const retryWaitTime = utils.getExponentialRetryTimeInMilliseconds(
-        retryCount
-      )
-      const minRange =
-        getInitialRetryIntervalInMilliseconds() *
-        getRetryMultiplier() *
-        retryCount
-      const maxRange = minRange * getRetryMultiplier()
-
-      expect(retryWaitTime).toBeGreaterThanOrEqual(minRange)
-      expect(retryWaitTime).toBeLessThan(maxRange)
-    }
-
-    for (let i = 1; i < 10; i++) {
-      testMinMaxRange(i)
-    }
-  })
-
-  it('Test negative artifact retention throws', () => {
-    expect(() => {
-      utils.getProperRetention(-1, undefined)
-    }).toThrow()
-  })
-
-  it('Test no setting specified takes artifact retention input', () => {
-    expect(utils.getProperRetention(180, undefined)).toEqual(180)
-  })
-
-  it('Test artifact retention must conform to max allowed', () => {
-    expect(utils.getProperRetention(180, '45')).toEqual(45)
-  })
-
-  it('Test constructing artifact URL', () => {
-    const runtimeUrl = getRuntimeUrl()
-    const runId = getWorkFlowRunId()
-    const artifactUrl = utils.getArtifactUrl()
-    expect(artifactUrl).toEqual(
-      `${runtimeUrl}_apis/pipelines/workflows/${runId}/artifacts?api-version=${utils.getApiVersion()}`
-    )
-  })
-
-  it('Test constructing upload headers with all optional parameters', () => {
-    const contentType = 'application/octet-stream'
-    const size = 24
-    const uncompressedLength = 100
-    const range = 'bytes 0-199/200'
-    const digest = {
-      crc64: 'bSzITYnW/P8=',
-      md5: 'Xiv1fT9AxLbfadrxk2y3ZvgyN0tPwCWafL/wbi9w8mk='
-    }
-    const headers = utils.getUploadHeaders(
-      contentType,
-      true,
-      true,
-      uncompressedLength,
-      size,
-      range,
-      digest
-    )
-    expect(Object.keys(headers).length).toEqual(10)
-    expect(headers['Accept']).toEqual(
-      `application/json;api-version=${utils.getApiVersion()}`
-    )
-    expect(headers['Content-Type']).toEqual(contentType)
-    expect(headers['Connection']).toEqual('Keep-Alive')
-    expect(headers['Keep-Alive']).toEqual('10')
-    expect(headers['Content-Encoding']).toEqual('gzip')
-    expect(headers['x-tfs-filelength']).toEqual(uncompressedLength)
-    expect(headers['Content-Length']).toEqual(size)
-    expect(headers['Content-Range']).toEqual(range)
-    expect(headers['x-actions-results-crc64']).toEqual(digest.crc64)
-    expect(headers['x-actions-results-md5']).toEqual(digest.md5)
-  })
-
-  it('Test constructing upload headers with only required parameter', () => {
-    const headers = utils.getUploadHeaders('application/octet-stream')
-    expect(Object.keys(headers).length).toEqual(2)
-    expect(headers['Accept']).toEqual(
-      `application/json;api-version=${utils.getApiVersion()}`
-    )
-    expect(headers['Content-Type']).toEqual('application/octet-stream')
-  })
-
-  it('Test constructing download headers with all optional parameters', () => {
-    const contentType = 'application/json'
-    const headers = utils.getDownloadHeaders(contentType, true, true)
-    expect(Object.keys(headers).length).toEqual(5)
-    expect(headers['Content-Type']).toEqual(contentType)
-    expect(headers['Connection']).toEqual('Keep-Alive')
-    expect(headers['Keep-Alive']).toEqual('10')
-    expect(headers['Accept-Encoding']).toEqual('gzip')
-    expect(headers['Accept']).toEqual(
-      `application/octet-stream;api-version=${utils.getApiVersion()}`
-    )
-  })
-
-  it('Test constructing download headers with only required parameter', () => {
-    const headers = utils.getDownloadHeaders('application/octet-stream')
-    expect(Object.keys(headers).length).toEqual(2)
-    expect(headers['Content-Type']).toEqual('application/octet-stream')
-    // check for default accept type
-    expect(headers['Accept']).toEqual(
-      `application/json;api-version=${utils.getApiVersion()}`
-    )
-  })
-
-  it('Test Success Status Code', () => {
-    expect(utils.isSuccessStatusCode(HttpCodes.OK)).toEqual(true)
-    expect(utils.isSuccessStatusCode(201)).toEqual(true)
-    expect(utils.isSuccessStatusCode(299)).toEqual(true)
-    expect(utils.isSuccessStatusCode(HttpCodes.NotFound)).toEqual(false)
-    expect(utils.isSuccessStatusCode(HttpCodes.BadGateway)).toEqual(false)
-    expect(utils.isSuccessStatusCode(HttpCodes.Forbidden)).toEqual(false)
-  })
-
-  it('Test Retry Status Code', () => {
-    expect(utils.isRetryableStatusCode(HttpCodes.BadGateway)).toEqual(true)
-    expect(utils.isRetryableStatusCode(HttpCodes.ServiceUnavailable)).toEqual(
-      true
-    )
-    expect(utils.isRetryableStatusCode(HttpCodes.GatewayTimeout)).toEqual(true)
-    expect(utils.isRetryableStatusCode(HttpCodes.TooManyRequests)).toEqual(true)
-    expect(utils.isRetryableStatusCode(HttpCodes.OK)).toEqual(false)
-    expect(utils.isRetryableStatusCode(HttpCodes.NotFound)).toEqual(false)
-    expect(utils.isRetryableStatusCode(HttpCodes.Forbidden)).toEqual(false)
-    expect(utils.isRetryableStatusCode(413)).toEqual(true) // Payload Too Large
-  })
-
-  it('Test Throttled Status Code', () => {
-    expect(utils.isThrottledStatusCode(HttpCodes.TooManyRequests)).toEqual(true)
-    expect(utils.isThrottledStatusCode(HttpCodes.InternalServerError)).toEqual(
-      false
-    )
-    expect(utils.isThrottledStatusCode(HttpCodes.BadGateway)).toEqual(false)
-    expect(utils.isThrottledStatusCode(HttpCodes.ServiceUnavailable)).toEqual(
-      false
-    )
-  })
-
-  it('Test Forbidden Status Code', () => {
-    expect(utils.isForbiddenStatusCode(HttpCodes.Forbidden)).toEqual(true)
-    expect(utils.isForbiddenStatusCode(HttpCodes.InternalServerError)).toEqual(
-      false
-    )
-    expect(utils.isForbiddenStatusCode(HttpCodes.TooManyRequests)).toEqual(
-      false
-    )
-    expect(utils.isForbiddenStatusCode(HttpCodes.OK)).toEqual(false)
-  })
-
-  it('Test Creating Artifact Directories', async () => {
-    const root = path.join(__dirname, '_temp', 'artifact-download')
-    // remove directory before starting
-    await io.rmRF(root)
-
-    const directory1 = path.join(root, 'folder2', 'folder3')
-    const directory2 = path.join(directory1, 'folder1')
-
-    // Initially should not exist
-    await expect(fs.promises.access(directory1)).rejects.not.toBeUndefined()
-    await expect(fs.promises.access(directory2)).rejects.not.toBeUndefined()
-    const directoryStructure = [directory1, directory2]
-    await utils.createDirectoriesForArtifact(directoryStructure)
-    // directories should now be created
-    await expect(fs.promises.access(directory1)).resolves.toEqual(undefined)
-    await expect(fs.promises.access(directory2)).resolves.toEqual(undefined)
-  })
-
-  it('Test Creating Empty Files', async () => {
-    const root = path.join(__dirname, '_temp', 'empty-files')
-    await io.rmRF(root)
-
-    const emptyFile1 = path.join(root, 'emptyFile1')
-    const directoryToCreate = path.join(root, 'folder1')
-    const emptyFile2 = path.join(directoryToCreate, 'emptyFile2')
-
-    // empty files should only be created after the directory structure is fully setup
-    // ensure they are first created by using the createDirectoriesForArtifact method
-    const directoryStructure = [root, directoryToCreate]
-    await utils.createDirectoriesForArtifact(directoryStructure)
-    await expect(fs.promises.access(root)).resolves.toEqual(undefined)
-    await expect(fs.promises.access(directoryToCreate)).resolves.toEqual(
-      undefined
-    )
-
-    await expect(fs.promises.access(emptyFile1)).rejects.not.toBeUndefined()
-    await expect(fs.promises.access(emptyFile2)).rejects.not.toBeUndefined()
-
-    const emptyFilesToCreate = [emptyFile1, emptyFile2]
-    await utils.createEmptyFilesForArtifact(emptyFilesToCreate)
-
-    await expect(fs.promises.access(emptyFile1)).resolves.toEqual(undefined)
-    const size1 = (await fs.promises.stat(emptyFile1)).size
-    expect(size1).toEqual(0)
-    await expect(fs.promises.access(emptyFile2)).resolves.toEqual(undefined)
-    const size2 = (await fs.promises.stat(emptyFile2)).size
-    expect(size2).toEqual(0)
-  })
-
-  it('Creates a digest from a readable stream', async () => {
-    const data = 'lorem ipsum'
-    const stream = Readable.from(data)
-    const digest = await utils.digestForStream(stream)
-
-    expect(digest.crc64).toBe('bSzITYnW/P8=')
-    expect(digest.md5).toBe('gKdR/eV3AoZAxBkADjPrpg==')
-  })
-})
+import * as config from '../src/internal/shared/config'
+import * as util from '../src/internal/shared/util'
+import {maskSigUrl, maskSecretUrls} from '../src/internal/shared/util'
+import {setSecret, debug} from '@actions/core'
+
+export const testRuntimeToken =
+  'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwic2NwIjoiQWN0aW9ucy5FeGFtcGxlIEFjdGlvbnMuQW5vdGhlckV4YW1wbGU6dGVzdCBBY3Rpb25zLlJlc3VsdHM6Y2U3ZjU0YzctNjFjNy00YWFlLTg4N2YtMzBkYTQ3NWY1ZjFhOmNhMzk1MDg1LTA0MGEtNTI2Yi0yY2U4LWJkYzg1ZjY5Mjc3NCIsImlhdCI6MTUxNjIzOTAyMn0.XYnI_wHPBlUi1mqYveJnnkJhp4dlFjqxzRmISPsqfw8'
+
+describe('get-backend-ids-from-token', () => {
+  it('should return backend ids when the token is valid', () => {
+    jest.spyOn(config, 'getRuntimeToken').mockReturnValue(testRuntimeToken)
+
+    const backendIds = util.getBackendIdsFromToken()
+    expect(backendIds.workflowRunBackendId).toBe(
+      'ce7f54c7-61c7-4aae-887f-30da475f5f1a'
+    )
+    expect(backendIds.workflowJobRunBackendId).toBe(
+      'ca395085-040a-526b-2ce8-bdc85f692774'
+    )
+  })
+
+  it("should throw an error when the token doesn't have the right scope", () => {
+    jest
+      .spyOn(config, 'getRuntimeToken')
+      .mockReturnValue(
+        'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwic2NwIjoiQWN0aW9ucy5FeGFtcGxlIEFjdGlvbnMuQW5vdGhlckV4YW1wbGU6dGVzdCIsImlhdCI6MTUxNjIzOTAyMn0.K0IEoULZteGevF38G94xiaA8zcZ5UlKWfGfqE6q3dhw'
+      )
+
+    expect(util.getBackendIdsFromToken).toThrowError(
+      'Failed to get backend IDs: The provided JWT token is invalid'
+    )
+  })
+
+  it('should throw an error when the token has a malformed scope', () => {
+    jest
+      .spyOn(config, 'getRuntimeToken')
+      .mockReturnValue(
+        'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwic2NwIjoiQWN0aW9ucy5FeGFtcGxlIEFjdGlvbnMuQW5vdGhlckV4YW1wbGU6dGVzdCBBY3Rpb25zLlJlc3VsdHM6Y2U3ZjU0YzctNjFjNy00YWFlLTg4N2YtMzBkYTQ3NWY1ZjFhIiwiaWF0IjoxNTE2MjM5MDIyfQ.7D0_LRfRFRZFImHQ7GxH2S6ZyFjjZ5U0ujjGCfle1XE'
+      )
+
+    expect(util.getBackendIdsFromToken).toThrowError(
+      'Failed to get backend IDs: The provided JWT token is invalid'
+    )
+  })
+
+  it('should throw an error when the token is in an invalid format', () => {
+    jest.spyOn(config, 'getRuntimeToken').mockReturnValue('token')
+
+    expect(util.getBackendIdsFromToken).toThrowError('Invalid token specified')
+  })
+
+  it("should throw an error when the token doesn't have the right field", () => {
+    jest
+      .spyOn(config, 'getRuntimeToken')
+      .mockReturnValue(
+        'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c'
+      )
+
+    expect(util.getBackendIdsFromToken).toThrowError(
+      'Failed to get backend IDs: The provided JWT token is invalid'
+    )
+  })
+})
+
+jest.mock('@actions/core')
+
+describe('maskSigUrl', () => {
+  beforeEach(() => {
+    jest.clearAllMocks()
+  })
+
+  it('does nothing if no sig parameter is present', () => {
+    const url = 'https://example.com'
+    maskSigUrl(url)
+    expect(setSecret).not.toHaveBeenCalled()
+  })
+
+  it('masks the sig parameter in the middle of the URL and sets it as a secret', () => {
+    const url = 'https://example.com/?param1=value1&sig=12345&param2=value2'
+    maskSigUrl(url)
+    expect(setSecret).toHaveBeenCalledWith('12345')
+    expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('12345'))
+  })
+
+  it('does nothing if the URL is empty', () => {
+    const url = ''
+    maskSigUrl(url)
+    expect(setSecret).not.toHaveBeenCalled()
+  })
+
+  it('handles URLs with fragments', () => {
+    const url = 'https://example.com?sig=12345#fragment'
+    maskSigUrl(url)
+    expect(setSecret).toHaveBeenCalledWith('12345')
+    expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('12345'))
+  })
+})
+
+describe('maskSigUrl handles special characters in signatures', () => {
+  beforeEach(() => {
+    jest.clearAllMocks()
+  })
+
+  it('handles signatures with slashes', () => {
+    const url = 'https://example.com/?sig=abc/123'
+    maskSigUrl(url)
+    expect(setSecret).toHaveBeenCalledWith('abc/123')
+    expect(setSecret).toHaveBeenCalledWith('abc%2F123')
+  })
+
+  it('handles signatures with plus signs', () => {
+    const url = 'https://example.com/?sig=abc+123'
+    maskSigUrl(url)
+    expect(setSecret).toHaveBeenCalledWith('abc 123')
+    expect(setSecret).toHaveBeenCalledWith('abc%20123')
+  })
+
+  it('handles signatures with equals signs', () => {
+    const url = 'https://example.com/?sig=abc=123'
+    maskSigUrl(url)
+    expect(setSecret).toHaveBeenCalledWith('abc=123')
+    expect(setSecret).toHaveBeenCalledWith('abc%3D123')
+  })
+
+  it('handles already percent-encoded signatures', () => {
+    const url = 'https://example.com/?sig=abc%2F123%3D'
+    maskSigUrl(url)
+    expect(setSecret).toHaveBeenCalledWith('abc/123=')
+    expect(setSecret).toHaveBeenCalledWith('abc%2F123%3D')
+  })
+
+  it('handles complex Azure SAS signatures', () => {
+    const url =
+      'https://example.com/container/file.txt?sig=nXyQIUj%2F%2F06Cxt80pBRYiiJlYqtPYg5sz%2FvEh5iHAhw%3D&se=2023-12-31'
+    maskSigUrl(url)
+    expect(setSecret).toHaveBeenCalledWith(
+      'nXyQIUj//06Cxt80pBRYiiJlYqtPYg5sz/vEh5iHAhw='
+    )
+    expect(setSecret).toHaveBeenCalledWith(
+      'nXyQIUj%2F%2F06Cxt80pBRYiiJlYqtPYg5sz%2FvEh5iHAhw%3D'
+    )
+  })
+
+  it('handles signatures with multiple special characters', () => {
+    const url = 'https://example.com/?sig=a/b+c=d&e=f'
+    maskSigUrl(url)
+    expect(setSecret).toHaveBeenCalledWith('a/b c=d')
+    expect(setSecret).toHaveBeenCalledWith('a%2Fb%20c%3Dd')
+  })
+})
+
+describe('maskSecretUrls', () => {
+  beforeEach(() => {
+    jest.clearAllMocks()
+  })
+
+  it('masks sig parameters in signed_upload_url and signed_url', () => {
+    const body = {
+      signed_upload_url: 'https://upload.com?sig=upload123',
+      signed_url: 'https://download.com?sig=download123'
+    }
+    maskSecretUrls(body)
+    expect(setSecret).toHaveBeenCalledWith('upload123')
+    expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('upload123'))
+    expect(setSecret).toHaveBeenCalledWith('download123')
+    expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('download123'))
+  })
+
+  it('handles case where only upload_url is present', () => {
+    const body = {
+      signed_upload_url: 'https://upload.com?sig=upload123'
+    }
+    maskSecretUrls(body)
+    expect(setSecret).toHaveBeenCalledWith('upload123')
+    expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('upload123'))
+  })
+
+  it('handles case where only download_url is present', () => {
+    const body = {
+      signed_url: 'https://download.com?sig=download123'
+    }
+    maskSecretUrls(body)
+    expect(setSecret).toHaveBeenCalledWith('download123')
+    expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('download123'))
+  })
+
+  it('handles case where URLs do not contain sig parameters', () => {
+    const body = {
+      signed_upload_url: 'https://upload.com?token=abc',
+      signed_url: 'https://download.com?token=xyz'
+    }
+    maskSecretUrls(body)
+    expect(setSecret).not.toHaveBeenCalled()
+  })
+
+  it('handles empty string URLs', () => {
+    const body = {
+      signed_upload_url: '',
+      signed_url: ''
+    }
+    maskSecretUrls(body)
+    expect(setSecret).not.toHaveBeenCalled()
+  })
+
+  it('does nothing if body is not an object or is null', () => {
+    maskSecretUrls(null)
+    expect(debug).toHaveBeenCalledWith('body is not an object or is null')
+    expect(setSecret).not.toHaveBeenCalled()
+  })
+
+  it('does nothing if signed_upload_url and signed_url are not strings', () => {
+    const body = {
+      signed_upload_url: 123,
+      signed_url: 456
+    }
+    maskSecretUrls(body)
+    expect(setSecret).not.toHaveBeenCalled()
+  })
+})

View File

@@ -1,53 +0,0 @@
# Additional Information
Extra information
- [Non-Supported Characters](#Non-Supported-Characters)
- [Permission loss](#Permission-Loss)
- [Considerations](#Considerations)
- [Compression](#Is-my-artifact-compressed)
## Non-Supported Characters
When uploading an artifact, the inputted `name` parameter along with the files specified in `files` cannot contain any of the following characters. If any of them are present, the server will reject the request and the upload will fail. These characters are not allowed due to limitations and restrictions with certain file systems such as NTFS. To maintain platform-agnostic behavior, any character that is not supported by an individual filesystem/platform will not be supported on any filesystem/platform.
- "
- :
- <
- \>
- |
- \*
- ?
In addition to the aforementioned characters, the inputted `name` also cannot include the following
- \
- /
## Permission Loss
File permissions are not maintained between uploaded and downloaded artifacts. If file permissions need to be maintained (such as for an executable), consider archiving all of the files using something like `tar` and then uploading the single archive. After downloading the artifact, you can un-tar the individual files and their permissions will be preserved.
```js
const artifact = require('@actions/artifact');
const artifactClient = artifact.create()
const artifactName = 'my-artifact';
const files = [
'/home/user/files/plz-upload/my-archive.tgz',
]
const rootDirectory = '/home/user/files/plz-upload'
const uploadResult = await artifactClient.uploadArtifact(artifactName, files, rootDirectory)
```
## Considerations
During upload, each file is uploaded concurrently in 4MB chunks using a separate HTTPS connection per file. Chunked uploads are used so that in the event of a failure (which is entirely possible because the internet is not perfect), the upload can be retried. If there is an error, a retry will be attempted after a certain period of time.
Uploading will generally be faster if there are fewer, larger files than if there are lots of smaller files. Depending on the types and quantities of files being uploaded, it might be beneficial to separately compress and archive everything into a single archive (using something like `tar` or `zip`) before starting an artifact upload to speed things up.
## Is my artifact compressed?
GZip is used internally to compress individual files before starting an upload. Compression helps reduce the total amount of data that must be uploaded and stored while helping to speed up uploads (this performance benefit is significant especially on self hosted runners). If GZip does not reduce the size of the file that is being uploaded, the original file is uploaded as-is.
Compression using GZip also helps speed up artifact download as part of a workflow. Header information is used to determine if an individual file was uploaded using GZip and if necessary, decompression is used.
When downloading an artifact from the GitHub UI (this differs from downloading an artifact during a workflow), a single Zip file is dynamically created that contains all of the files uploaded as part of an artifact. Any files that were uploaded using GZip will be decompressed on the server before being added to the Zip file with the remaining files.

View File

@@ -0,0 +1,62 @@
# Frequently Asked Questions
- [Frequently Asked Questions](#frequently-asked-questions)
- [Supported Characters](#supported-characters)
- [Compression? ZIP? How is my artifact stored?](#compression-zip-how-is-my-artifact-stored)
- [Which versions of the artifacts packages are compatible?](#which-versions-of-the-artifacts-packages-are-compatible)
- [How long will my artifact be available?](#how-long-will-my-artifact-be-available)
## Supported Characters
When uploading an artifact, the inputted `name` parameter along with the files specified in `files` cannot contain any of the following characters. If they are present in `name` or `files`, the Artifact will be rejected by the server and the upload will fail. These characters are not allowed due to limitations and restrictions with certain file systems such as NTFS. To maintain platform-agnostic behavior, characters that are not supported by an individual filesystem/platform will not be supported on all filesystems/platforms.
- "
- :
- <
- \>
- |
- \*
- ?
In addition to the aforementioned characters, the inputted `name` also cannot include the following
- \
- /
## Compression? ZIP? How is my artifact stored?
When creating an Artifact, the files are dynamically compressed and streamed into a ZIP archive. Since they are stored in a ZIP, they can be compressed by zlib at varying levels.
The value can range from 0 to 9:
- 0: No compression
- 1: Best speed
- 6: Default compression (same as GNU Gzip)
- 9: Best compression
Higher levels will result in better compression, but will take longer to complete.
For large files that are not easily compressed, a value of 0 is recommended for significantly faster uploads.
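As a sketch of how this looks in code (the artifact name and file paths below are placeholders; `compressionLevel` is the option exposed on `UploadArtifactOptions` in the v2 client at the time of writing):

```js
const {DefaultArtifactClient} = require('@actions/artifact')

async function uploadUncompressed() {
  const artifact = new DefaultArtifactClient()
  // compressionLevel: 0 stores the files in the ZIP without compression,
  // which is typically much faster for large binaries that do not compress well
  const {id, size} = await artifact.uploadArtifact(
    'my-large-artifact', // hypothetical artifact name
    ['/tmp/plz-upload/big-file.bin'], // hypothetical file to upload
    '/tmp/plz-upload',
    {compressionLevel: 0}
  )
  console.log(`created artifact ${id} (${size} bytes)`)
}
```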
## Which versions of the artifacts packages are compatible?
[actions/upload-artifact](https://github.com/actions/upload-artifact) and [actions/download-artifact](https://github.com/actions/download-artifact) leverage the [GitHub Actions toolkit](https://github.com/actions/toolkit) and are typically used together to upload and download artifacts in your workflows.
| upload-artifact | download-artifact | toolkit |
|---|---|---|
| v4 | v4 | v2 |
| < v3 | < v3 | < v1 |
Use matching versions of `actions/upload-artifact` and `actions/download-artifact` to ensure compatibility.
In your GitHub Actions workflow YAML file, you specify the version of the actions you want to use. For example:
```yaml
uses: actions/upload-artifact@v4
# ...
uses: actions/download-artifact@v4
# ...
```
**Release Notes:**
Check the release notes for each repository to see if there are any specific notes about compatibility or changes in behavior.
## How long will my artifact be available?
The default retention period is **90 days**. For more information, visit: https://github.com/actions/upload-artifact?tab=readme-ov-file#retention-period

View File

@@ -0,0 +1,43 @@
# @actions/artifact
## Table of contents
### Classes
- [ArtifactNotFoundError](classes/ArtifactNotFoundError.md)
- [DefaultArtifactClient](classes/DefaultArtifactClient.md)
- [FilesNotFoundError](classes/FilesNotFoundError.md)
- [GHESNotSupportedError](classes/GHESNotSupportedError.md)
- [InvalidResponseError](classes/InvalidResponseError.md)
- [NetworkError](classes/NetworkError.md)
- [UsageError](classes/UsageError.md)
### Interfaces
- [Artifact](interfaces/Artifact.md)
- [ArtifactClient](interfaces/ArtifactClient.md)
- [DeleteArtifactResponse](interfaces/DeleteArtifactResponse.md)
- [DownloadArtifactOptions](interfaces/DownloadArtifactOptions.md)
- [DownloadArtifactResponse](interfaces/DownloadArtifactResponse.md)
- [FindOptions](interfaces/FindOptions.md)
- [GetArtifactResponse](interfaces/GetArtifactResponse.md)
- [ListArtifactsOptions](interfaces/ListArtifactsOptions.md)
- [ListArtifactsResponse](interfaces/ListArtifactsResponse.md)
- [UploadArtifactOptions](interfaces/UploadArtifactOptions.md)
- [UploadArtifactResponse](interfaces/UploadArtifactResponse.md)
### Variables
- [default](README.md#default)
## Variables
### default
`Const` **default**: [`ArtifactClient`](interfaces/ArtifactClient.md)
#### Defined in
[src/artifact.ts:7](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/artifact.ts#L7)

View File

@@ -0,0 +1,169 @@
[@actions/artifact](../README.md) / ArtifactNotFoundError
# Class: ArtifactNotFoundError
## Hierarchy
- `Error`
**`ArtifactNotFoundError`**
## Table of contents
### Constructors
- [constructor](ArtifactNotFoundError.md#constructor)
### Properties
- [message](ArtifactNotFoundError.md#message)
- [name](ArtifactNotFoundError.md#name)
- [stack](ArtifactNotFoundError.md#stack)
- [prepareStackTrace](ArtifactNotFoundError.md#preparestacktrace)
- [stackTraceLimit](ArtifactNotFoundError.md#stacktracelimit)
### Methods
- [captureStackTrace](ArtifactNotFoundError.md#capturestacktrace)
## Constructors
### constructor
**new ArtifactNotFoundError**(`message?`): [`ArtifactNotFoundError`](ArtifactNotFoundError.md)
#### Parameters
| Name | Type | Default value |
| :------ | :------ | :------ |
| `message` | `string` | `'Artifact not found'` |
#### Returns
[`ArtifactNotFoundError`](ArtifactNotFoundError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:24](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L24)
## Properties
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4
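## Example

A short usage sketch (the artifact name and helper function are hypothetical): since `getArtifact` throws when no artifact matches, `ArtifactNotFoundError` can be caught to treat a missing artifact as a non-fatal condition:

```js
const {
  DefaultArtifactClient,
  ArtifactNotFoundError
} = require('@actions/artifact')

async function findArtifact(name) {
  const client = new DefaultArtifactClient()
  try {
    // getArtifact throws ArtifactNotFoundError when no artifact matches
    const {artifact} = await client.getArtifact(name)
    return artifact
  } catch (err) {
    if (err instanceof ArtifactNotFoundError) {
      return undefined // treat a missing artifact as optional
    }
    throw err // rethrow anything unexpected
  }
}
```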

View File

@@ -0,0 +1,193 @@
[@actions/artifact](../README.md) / DefaultArtifactClient
# Class: DefaultArtifactClient
The default artifact client that is used by the artifact action(s).
## Implements
- [`ArtifactClient`](../interfaces/ArtifactClient.md)
## Table of contents
### Constructors
- [constructor](DefaultArtifactClient.md#constructor)
### Methods
- [deleteArtifact](DefaultArtifactClient.md#deleteartifact)
- [downloadArtifact](DefaultArtifactClient.md#downloadartifact)
- [getArtifact](DefaultArtifactClient.md#getartifact)
- [listArtifacts](DefaultArtifactClient.md#listartifacts)
- [uploadArtifact](DefaultArtifactClient.md#uploadartifact)
## Constructors
### constructor
**new DefaultArtifactClient**(): [`DefaultArtifactClient`](DefaultArtifactClient.md)
#### Returns
[`DefaultArtifactClient`](DefaultArtifactClient.md)
## Methods
### deleteArtifact
**deleteArtifact**(`artifactName`, `options?`): `Promise`\<[`DeleteArtifactResponse`](../interfaces/DeleteArtifactResponse.md)\>
Delete an Artifact
If `options.findBy` is specified, this will use the public Delete Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#delete-an-artifact
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactName` | `string` | The name of the artifact to delete |
| `options?` | [`FindOptions`](../interfaces/FindOptions.md) | Extra options that allow for the customization of the delete behavior |
#### Returns
`Promise`\<[`DeleteArtifactResponse`](../interfaces/DeleteArtifactResponse.md)\>
single DeleteArtifactResponse object
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[deleteArtifact](../interfaces/ArtifactClient.md#deleteartifact)
#### Defined in
[src/internal/client.ts:248](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L248)
___
### downloadArtifact
**downloadArtifact**(`artifactId`, `options?`): `Promise`\<[`DownloadArtifactResponse`](../interfaces/DownloadArtifactResponse.md)\>
Downloads an artifact and unzips the content.
If `options.findBy` is specified, this will use the public Download Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#download-an-artifact
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactId` | `number` | The id of the artifact to download |
| `options?` | [`DownloadArtifactOptions`](../interfaces/DownloadArtifactOptions.md) & [`FindOptions`](../interfaces/FindOptions.md) | Extra options that allow for the customization of the download behavior |
#### Returns
`Promise`\<[`DownloadArtifactResponse`](../interfaces/DownloadArtifactResponse.md)\>
single DownloadArtifactResponse object
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[downloadArtifact](../interfaces/ArtifactClient.md#downloadartifact)
#### Defined in
[src/internal/client.ts:138](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L138)
___
### getArtifact
**getArtifact**(`artifactName`, `options?`): `Promise`\<[`GetArtifactResponse`](../interfaces/GetArtifactResponse.md)\>
Finds an artifact by name.
If there are multiple artifacts with the same name in the same workflow run, this will return the latest.
If the artifact is not found, it will throw.
If `options.findBy` is specified, this will use the public List Artifacts API with a name filter which can get artifacts from other runs.
https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
`@actions/artifact` v2+ does not allow for creating multiple artifacts with the same name in the same workflow run.
Multiple artifacts with the same name can still exist in the same workflow run if they were created with old versions of upload-artifact (v1, v2, and v3) or `@actions/artifact` < v2, or if the run is a rerun.
If there are multiple artifacts with the same name in the same workflow run, this function will return the latest artifact that matches the name.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactName` | `string` | The name of the artifact to find |
| `options?` | [`FindOptions`](../interfaces/FindOptions.md) | Extra options that allow for the customization of the get behavior |
#### Returns
`Promise`\<[`GetArtifactResponse`](../interfaces/GetArtifactResponse.md)\>
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[getArtifact](../interfaces/ArtifactClient.md#getartifact)
#### Defined in
[src/internal/client.ts:212](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L212)
___
### listArtifacts
**listArtifacts**(`options?`): `Promise`\<[`ListArtifactsResponse`](../interfaces/ListArtifactsResponse.md)\>
Lists all artifacts that are part of the current workflow run.
This function will return at most 1000 artifacts per workflow run.
If `options.findBy` is specified, this will call the public List-Artifacts API which can list from other runs.
https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `options?` | [`ListArtifactsOptions`](../interfaces/ListArtifactsOptions.md) & [`FindOptions`](../interfaces/FindOptions.md) | Extra options that allow for the customization of the list behavior |
#### Returns
`Promise`\<[`ListArtifactsResponse`](../interfaces/ListArtifactsResponse.md)\>
ListArtifactResponse object
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[listArtifacts](../interfaces/ArtifactClient.md#listartifacts)
#### Defined in
[src/internal/client.ts:176](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L176)
___
### uploadArtifact
**uploadArtifact**(`name`, `files`, `rootDirectory`, `options?`): `Promise`\<[`UploadArtifactResponse`](../interfaces/UploadArtifactResponse.md)\>
Uploads an artifact.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `name` | `string` | The name of the artifact, required |
| `files` | `string`[] | A list of absolute or relative paths that denote what files should be uploaded |
| `rootDirectory` | `string` | An absolute or relative file path that denotes the root parent directory of the files being uploaded |
| `options?` | [`UploadArtifactOptions`](../interfaces/UploadArtifactOptions.md) | Extra options for customizing the upload behavior |
#### Returns
`Promise`\<[`UploadArtifactResponse`](../interfaces/UploadArtifactResponse.md)\>
single UploadArtifactResponse object
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[uploadArtifact](../interfaces/ArtifactClient.md#uploadartifact)
#### Defined in
[src/internal/client.ts:113](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L113)
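For illustration, a minimal sketch of how the methods documented above compose (the file paths and artifact name are hypothetical; assumes the code runs inside a workflow step):

```typescript
import {DefaultArtifactClient} from '@actions/artifact'

async function run(): Promise<void> {
  const client = new DefaultArtifactClient()

  // Upload two files rooted at the working directory (paths are illustrative).
  const {id, size} = await client.uploadArtifact(
    'my-artifact',
    ['dist/app.js', 'dist/app.js.map'],
    process.cwd(),
    {retentionDays: 7}
  )
  console.log(`Uploaded artifact ${id} (${size} bytes)`)

  // Download the artifact again by id, then delete it by name.
  if (id !== undefined) {
    const {downloadPath} = await client.downloadArtifact(id, {path: './restored'})
    console.log(`Downloaded to ${downloadPath}`)
  }
  await client.deleteArtifact('my-artifact')
}

run().catch(err => {
  console.error(err)
  process.exitCode = 1
})
```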


@ -0,0 +1,180 @@
[@actions/artifact](../README.md) / FilesNotFoundError
# Class: FilesNotFoundError
## Hierarchy
- `Error`
**`FilesNotFoundError`**
## Table of contents
### Constructors
- [constructor](FilesNotFoundError.md#constructor)
### Properties
- [files](FilesNotFoundError.md#files)
- [message](FilesNotFoundError.md#message)
- [name](FilesNotFoundError.md#name)
- [stack](FilesNotFoundError.md#stack)
- [prepareStackTrace](FilesNotFoundError.md#preparestacktrace)
- [stackTraceLimit](FilesNotFoundError.md#stacktracelimit)
### Methods
- [captureStackTrace](FilesNotFoundError.md#capturestacktrace)
## Constructors
### constructor
**new FilesNotFoundError**(`files?`): [`FilesNotFoundError`](FilesNotFoundError.md)
#### Parameters
| Name | Type | Default value |
| :------ | :------ | :------ |
| `files` | `string`[] | `[]` |
#### Returns
[`FilesNotFoundError`](FilesNotFoundError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:4](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L4)
## Properties
### files
**files**: `string`[]
#### Defined in
[src/internal/shared/errors.ts:2](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L2)
___
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4
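A hedged sketch of how this error might surface during an upload and how the `files` property can be inspected (the path is hypothetical and intentionally missing):

```typescript
import {DefaultArtifactClient, FilesNotFoundError} from '@actions/artifact'

const client = new DefaultArtifactClient()

;(async () => {
  try {
    // 'missing.txt' is a hypothetical path that does not exist on disk.
    await client.uploadArtifact('logs', ['missing.txt'], process.cwd())
  } catch (err) {
    if (err instanceof FilesNotFoundError) {
      // `files` lists the paths that could not be found.
      console.error(`Missing files: ${err.files.join(', ')}`)
    } else {
      throw err
    }
  }
})()
```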


@ -0,0 +1,169 @@
[@actions/artifact](../README.md) / GHESNotSupportedError
# Class: GHESNotSupportedError
## Hierarchy
- `Error`
**`GHESNotSupportedError`**
## Table of contents
### Constructors
- [constructor](GHESNotSupportedError.md#constructor)
### Properties
- [message](GHESNotSupportedError.md#message)
- [name](GHESNotSupportedError.md#name)
- [stack](GHESNotSupportedError.md#stack)
- [prepareStackTrace](GHESNotSupportedError.md#preparestacktrace)
- [stackTraceLimit](GHESNotSupportedError.md#stacktracelimit)
### Methods
- [captureStackTrace](GHESNotSupportedError.md#capturestacktrace)
## Constructors
### constructor
**new GHESNotSupportedError**(`message?`): [`GHESNotSupportedError`](GHESNotSupportedError.md)
#### Parameters
| Name | Type | Default value |
| :------ | :------ | :------ |
| `message` | `string` | `'@actions/artifact v2.0.0+, upload-artifact@v4+ and download-artifact@v4+ are not currently supported on GHES.'` |
#### Returns
[`GHESNotSupportedError`](GHESNotSupportedError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:31](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L31)
## Properties
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4


@ -0,0 +1,169 @@
[@actions/artifact](../README.md) / InvalidResponseError
# Class: InvalidResponseError
## Hierarchy
- `Error`
**`InvalidResponseError`**
## Table of contents
### Constructors
- [constructor](InvalidResponseError.md#constructor)
### Properties
- [message](InvalidResponseError.md#message)
- [name](InvalidResponseError.md#name)
- [stack](InvalidResponseError.md#stack)
- [prepareStackTrace](InvalidResponseError.md#preparestacktrace)
- [stackTraceLimit](InvalidResponseError.md#stacktracelimit)
### Methods
- [captureStackTrace](InvalidResponseError.md#capturestacktrace)
## Constructors
### constructor
**new InvalidResponseError**(`message`): [`InvalidResponseError`](InvalidResponseError.md)
#### Parameters
| Name | Type |
| :------ | :------ |
| `message` | `string` |
#### Returns
[`InvalidResponseError`](InvalidResponseError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:17](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L17)
## Properties
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4


@ -0,0 +1,201 @@
[@actions/artifact](../README.md) / NetworkError
# Class: NetworkError
## Hierarchy
- `Error`
**`NetworkError`**
## Table of contents
### Constructors
- [constructor](NetworkError.md#constructor)
### Properties
- [code](NetworkError.md#code)
- [message](NetworkError.md#message)
- [name](NetworkError.md#name)
- [stack](NetworkError.md#stack)
- [prepareStackTrace](NetworkError.md#preparestacktrace)
- [stackTraceLimit](NetworkError.md#stacktracelimit)
### Methods
- [captureStackTrace](NetworkError.md#capturestacktrace)
- [isNetworkErrorCode](NetworkError.md#isnetworkerrorcode)
## Constructors
### constructor
**new NetworkError**(`code`): [`NetworkError`](NetworkError.md)
#### Parameters
| Name | Type |
| :------ | :------ |
| `code` | `string` |
#### Returns
[`NetworkError`](NetworkError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:42](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L42)
## Properties
### code
**code**: `string`
#### Defined in
[src/internal/shared/errors.ts:40](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L40)
___
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4
___
### isNetworkErrorCode
**isNetworkErrorCode**(`code?`): `boolean`
#### Parameters
| Name | Type |
| :------ | :------ |
| `code?` | `string` |
#### Returns
`boolean`
#### Defined in
[src/internal/shared/errors.ts:49](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L49)
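As a sketch, the static `isNetworkErrorCode` helper can classify low-level socket failures, e.g. when deciding whether a retry is worthwhile; the error codes shown are common Node.js system-error codes used for illustration:

```typescript
import {NetworkError} from '@actions/artifact'

// Node.js system errors carry a string `code` such as 'ECONNRESET'.
function shouldRetry(err: unknown): boolean {
  const code = (err as NodeJS.ErrnoException | undefined)?.code
  return NetworkError.isNetworkErrorCode(code)
}

console.log(shouldRetry({code: 'ECONNRESET'})) // expected true for common socket errors
console.log(shouldRetry(new Error('unrelated'))) // false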


@ -0,0 +1,184 @@
[@actions/artifact](../README.md) / UsageError
# Class: UsageError
## Hierarchy
- `Error`
**`UsageError`**
## Table of contents
### Constructors
- [constructor](UsageError.md#constructor)
### Properties
- [message](UsageError.md#message)
- [name](UsageError.md#name)
- [stack](UsageError.md#stack)
- [prepareStackTrace](UsageError.md#preparestacktrace)
- [stackTraceLimit](UsageError.md#stacktracelimit)
### Methods
- [captureStackTrace](UsageError.md#capturestacktrace)
- [isUsageErrorMessage](UsageError.md#isusageerrormessage)
## Constructors
### constructor
**new UsageError**(): [`UsageError`](UsageError.md)
#### Returns
[`UsageError`](UsageError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:62](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L62)
## Properties
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4
___
### isUsageErrorMessage
**isUsageErrorMessage**(`msg?`): `boolean`
#### Parameters
| Name | Type |
| :------ | :------ |
| `msg?` | `string` |
#### Returns
`boolean`
#### Defined in
[src/internal/shared/errors.ts:68](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L68)
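A small hedged sketch of the static helper, which reports whether a raw server message denotes an exceeded storage quota (the message contents are illustrative):

```typescript
import {UsageError} from '@actions/artifact'

// Map a raw server message to a typed error; UsageError carries a
// descriptive default message about the storage quota.
function toTypedError(rawMessage: string): Error {
  return UsageError.isUsageErrorMessage(rawMessage)
    ? new UsageError()
    : new Error(rawMessage)
}

console.log(toTypedError('some server response').message)
```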


@ -0,0 +1,62 @@
[@actions/artifact](../README.md) / Artifact
# Interface: Artifact
An Actions Artifact
## Table of contents
### Properties
- [createdAt](Artifact.md#createdat)
- [id](Artifact.md#id)
- [name](Artifact.md#name)
- [size](Artifact.md#size)
## Properties
### createdAt
`Optional` **createdAt**: `Date`
The time when the artifact was created
#### Defined in
[src/internal/shared/interfaces.ts:128](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L128)
___
### id
**id**: `number`
The ID of the artifact
#### Defined in
[src/internal/shared/interfaces.ts:118](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L118)
___
### name
**name**: `string`
The name of the artifact
#### Defined in
[src/internal/shared/interfaces.ts:113](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L113)
___
### size
**size**: `number`
The size of the artifact in bytes
#### Defined in
[src/internal/shared/interfaces.ts:123](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L123)


@ -0,0 +1,159 @@
[@actions/artifact](../README.md) / ArtifactClient
# Interface: ArtifactClient
Generic interface for the artifact client.
## Implemented by
- [`DefaultArtifactClient`](../classes/DefaultArtifactClient.md)
## Table of contents
### Methods
- [deleteArtifact](ArtifactClient.md#deleteartifact)
- [downloadArtifact](ArtifactClient.md#downloadartifact)
- [getArtifact](ArtifactClient.md#getartifact)
- [listArtifacts](ArtifactClient.md#listartifacts)
- [uploadArtifact](ArtifactClient.md#uploadartifact)
## Methods
### deleteArtifact
**deleteArtifact**(`artifactName`, `options?`): `Promise`\<[`DeleteArtifactResponse`](DeleteArtifactResponse.md)\>
Delete an Artifact
If `options.findBy` is specified, this will use the public Delete Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#delete-an-artifact
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactName` | `string` | The name of the artifact to delete |
| `options?` | [`FindOptions`](FindOptions.md) | Extra options that allow for the customization of the delete behavior |
#### Returns
`Promise`\<[`DeleteArtifactResponse`](DeleteArtifactResponse.md)\>
single DeleteArtifactResponse object
#### Defined in
[src/internal/client.ts:103](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L103)
___
### downloadArtifact
**downloadArtifact**(`artifactId`, `options?`): `Promise`\<[`DownloadArtifactResponse`](DownloadArtifactResponse.md)\>
Downloads an artifact and unzips the content.
If `options.findBy` is specified, this will use the public Download Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#download-an-artifact
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactId` | `number` | The id of the artifact to download |
| `options?` | [`DownloadArtifactOptions`](DownloadArtifactOptions.md) & [`FindOptions`](FindOptions.md) | Extra options that allow for the customization of the download behavior |
#### Returns
`Promise`\<[`DownloadArtifactResponse`](DownloadArtifactResponse.md)\>
single DownloadArtifactResponse object
#### Defined in
[src/internal/client.ts:89](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L89)
___
### getArtifact
**getArtifact**(`artifactName`, `options?`): `Promise`\<[`GetArtifactResponse`](GetArtifactResponse.md)\>
Finds an artifact by name.
If there are multiple artifacts with the same name in the same workflow run, this will return the latest.
If the artifact is not found, it will throw.
If `options.findBy` is specified, this will use the public List Artifacts API with a name filter which can get artifacts from other runs.
https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
`@actions/artifact` v2+ does not allow for creating multiple artifacts with the same name in the same workflow run.
Multiple artifacts with the same name can still exist in the same workflow run when older versions of upload-artifact (v1, v2 and v3) or `@actions/artifact` < v2 were used, or when the run is a rerun.
If there are multiple artifacts with the same name in the same workflow run, this function will return the first artifact that matches the name.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactName` | `string` | The name of the artifact to find |
| `options?` | [`FindOptions`](FindOptions.md) | Extra options that allow for the customization of the get behavior |
#### Returns
`Promise`\<[`GetArtifactResponse`](GetArtifactResponse.md)\>
#### Defined in
[src/internal/client.ts:75](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L75)
___
### listArtifacts
**listArtifacts**(`options?`): `Promise`\<[`ListArtifactsResponse`](ListArtifactsResponse.md)\>
Lists all artifacts that are part of the current workflow run.
This function will return at most 1000 artifacts per workflow run.
If `options.findBy` is specified, this will call the public List-Artifacts API which can list from other runs.
https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `options?` | [`ListArtifactsOptions`](ListArtifactsOptions.md) & [`FindOptions`](FindOptions.md) | Extra options that allow for the customization of the list behavior |
#### Returns
`Promise`\<[`ListArtifactsResponse`](ListArtifactsResponse.md)\>
ListArtifactResponse object
#### Defined in
[src/internal/client.ts:57](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L57)
___
### uploadArtifact
**uploadArtifact**(`name`, `files`, `rootDirectory`, `options?`): `Promise`\<[`UploadArtifactResponse`](UploadArtifactResponse.md)\>
Uploads an artifact.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `name` | `string` | The name of the artifact, required |
| `files` | `string`[] | A list of absolute or relative paths that denote what files should be uploaded |
| `rootDirectory` | `string` | An absolute or relative file path that denotes the root parent directory of the files being uploaded |
| `options?` | [`UploadArtifactOptions`](UploadArtifactOptions.md) | Extra options for customizing the upload behavior |
#### Returns
`Promise`\<[`UploadArtifactResponse`](UploadArtifactResponse.md)\>
single UploadArtifactResponse object
#### Defined in
[src/internal/client.ts:40](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L40)


@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / DeleteArtifactResponse
# Interface: DeleteArtifactResponse
Response from the server when deleting an artifact
## Table of contents
### Properties
- [id](DeleteArtifactResponse.md#id)
## Properties
### id
**id**: `number`
The id of the artifact that was deleted
#### Defined in
[src/internal/shared/interfaces.ts:163](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L163)


@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / DownloadArtifactOptions
# Interface: DownloadArtifactOptions
Options for downloading an artifact
## Table of contents
### Properties
- [path](DownloadArtifactOptions.md#path)
## Properties
### path
`Optional` **path**: `string`
Denotes where the artifact will be downloaded to. If not specified, the artifact is downloaded to `GITHUB_WORKSPACE`
#### Defined in
[src/internal/shared/interfaces.ts:103](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L103)


@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / DownloadArtifactResponse
# Interface: DownloadArtifactResponse
Response from the server when downloading an artifact
## Table of contents
### Properties
- [downloadPath](DownloadArtifactResponse.md#downloadpath)
## Properties
### downloadPath
`Optional` **downloadPath**: `string`
The path where the artifact was downloaded to
#### Defined in
[src/internal/shared/interfaces.ts:93](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L93)


@ -0,0 +1,30 @@
[@actions/artifact](../README.md) / FindOptions
# Interface: FindOptions
## Table of contents
### Properties
- [findBy](FindOptions.md#findby)
## Properties
### findBy
`Optional` **findBy**: `Object`
The criteria for finding artifact(s) outside the scope of the current run.
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `repositoryName` | `string` | Repository name (eg. 'toolkit') |
| `repositoryOwner` | `string` | Repository owner (eg. 'actions') |
| `token` | `string` | Token with actions:read permissions |
| `workflowRunId` | `number` | ID of the workflow run that the artifact(s) belong to |
#### Defined in
[src/internal/shared/interfaces.ts:136](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L136)
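A hedged sketch of using `findBy` to reach an artifact from a different workflow run; the token source and run id are placeholders:

```typescript
import {DefaultArtifactClient, FindOptions} from '@actions/artifact'

const client = new DefaultArtifactClient()

async function downloadFromOtherRun(): Promise<void> {
  const options: FindOptions = {
    findBy: {
      token: process.env.GITHUB_TOKEN as string, // needs actions:read permission
      workflowRunId: 123456789, // hypothetical run id
      repositoryOwner: 'actions',
      repositoryName: 'toolkit'
    }
  }
  // With findBy set, the public REST APIs are used instead of the current run scope.
  const {artifact} = await client.getArtifact('my-artifact', options)
  await client.downloadArtifact(artifact.id, {...options, path: './from-other-run'})
}
```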


@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / GetArtifactResponse
# Interface: GetArtifactResponse
Response from the server when getting an artifact
## Table of contents
### Properties
- [artifact](GetArtifactResponse.md#artifact)
## Properties
### artifact
**artifact**: [`Artifact`](Artifact.md)
Metadata about the artifact that was found
#### Defined in
[src/internal/shared/interfaces.ts:62](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L62)


@ -0,0 +1,24 @@
[@actions/artifact](../README.md) / ListArtifactsOptions
# Interface: ListArtifactsOptions
Options for listing artifacts
## Table of contents
### Properties
- [latest](ListArtifactsOptions.md#latest)
## Properties
### latest
`Optional` **latest**: `boolean`
Filter the workflow run's artifacts to the latest by name
In the case of reruns, this can be useful to avoid duplicates
#### Defined in
[src/internal/shared/interfaces.ts:73](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L73)
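For illustration, a minimal sketch of deduplicating rerun artifacts with this option:

```typescript
import {DefaultArtifactClient} from '@actions/artifact'

const client = new DefaultArtifactClient()

async function listLatest(): Promise<void> {
  // With `latest: true`, only the newest artifact per name is returned,
  // which avoids duplicates created by reruns.
  const {artifacts} = await client.listArtifacts({latest: true})
  for (const artifact of artifacts) {
    console.log(`${artifact.name} (#${artifact.id}, ${artifact.size} bytes)`)
  }
}
```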


@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / ListArtifactsResponse
# Interface: ListArtifactsResponse
Response from the server when listing artifacts
## Table of contents
### Properties
- [artifacts](ListArtifactsResponse.md#artifacts)
## Properties
### artifacts
**artifacts**: [`Artifact`](Artifact.md)[]
A list of artifacts that were found
#### Defined in
[src/internal/shared/interfaces.ts:83](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L83)


@ -0,0 +1,55 @@
[@actions/artifact](../README.md) / UploadArtifactOptions
# Interface: UploadArtifactOptions
Options for uploading an artifact
## Table of contents
### Properties
- [compressionLevel](UploadArtifactOptions.md#compressionlevel)
- [retentionDays](UploadArtifactOptions.md#retentiondays)
## Properties
### compressionLevel
`Optional` **compressionLevel**: `number`
The level of compression for Zlib to be applied to the artifact archive.
The value can range from 0 to 9:
- 0: No compression
- 1: Best speed
- 6: Default compression (same as GNU Gzip)
- 9: Best compression
Higher levels will result in better compression, but will take longer to complete.
For large files that are not easily compressed, a value of 0 is recommended for significantly faster uploads.
#### Defined in
[src/internal/shared/interfaces.ts:52](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L52)
___
### retentionDays
`Optional` **retentionDays**: `number`
Duration, in days, after which the artifact will expire.
By default, artifacts expire after 90 days:
https://docs.github.com/en/actions/configuring-and-managing-workflows/persisting-workflow-data-using-artifacts#downloading-and-deleting-artifacts-after-a-workflow-run-is-complete
Use this option to override the default expiry.
Min value: 1
Max value: 90, unless changed by a repository setting
If this is set to a value greater than the retention settings allow, the retention on the artifact
will be reduced to match the maximum allowed by the server, and the upload will continue. An
input of 0 assumes the default retention setting.
#### Defined in
[src/internal/shared/interfaces.ts:41](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L41)
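For example (a sketch; the artifact name and path are hypothetical), skipping compression for content that is already compressed while also shortening retention:

```typescript
import {DefaultArtifactClient} from '@actions/artifact'

const client = new DefaultArtifactClient()

async function uploadRecording(): Promise<void> {
  // Video is already compressed, so zlib level 0 trades archive size
  // for a significantly faster upload; keep the artifact for 3 days only.
  await client.uploadArtifact('recordings', ['test-run.mp4'], process.cwd(), {
    compressionLevel: 0,
    retentionDays: 3
  })
}
```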


@ -0,0 +1,50 @@
[@actions/artifact](../README.md) / UploadArtifactResponse
# Interface: UploadArtifactResponse
Response from the server when an artifact is uploaded
## Table of contents
### Properties
- [digest](UploadArtifactResponse.md#digest)
- [id](UploadArtifactResponse.md#id)
- [size](UploadArtifactResponse.md#size)
## Properties
### digest
`Optional` **digest**: `string`
The SHA256 digest of the artifact that was created. Not provided if no artifact was uploaded
#### Defined in
[src/internal/shared/interfaces.ts:19](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L19)
___
### id
`Optional` **id**: `number`
The id of the artifact that was created. Not provided if no artifact was uploaded
This ID can be used as input to other APIs to download, delete or get more information about an artifact: https://docs.github.com/en/rest/actions/artifacts
#### Defined in
[src/internal/shared/interfaces.ts:14](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L14)
___
### size
`Optional` **size**: `number`
Total size of the artifact in bytes. Not provided if no artifact was uploaded
#### Defined in
[src/internal/shared/interfaces.ts:8](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L8)


@ -1,57 +0,0 @@
# Implementation Details
Warning: Implementation details may change at any time without notice. This is meant to serve as a reference to help users understand the package.
## Upload/Compression flow
![image](https://user-images.githubusercontent.com/16109154/79765587-19522b00-8327-11ea-9679-410bb10e1b13.png)
During artifact upload, gzip is used to compress individual files before they are uploaded. This minimizes the amount of data sent, which reduces the total number of HTTP calls (uploads happen in 4MB chunks) and results in considerably faster uploads, especially on self-hosted runners.
If a file is less than 64KB in size, a passthrough stream (readable and writable) is used to convert an in-memory buffer into a readable stream without any extra streams or piping.
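A minimal sketch of that buffer-to-stream technique using Node's built-in `PassThrough` (an illustration of the idea, not the package's exact code):

```typescript
import {PassThrough, Readable} from 'stream'

// Wrap an in-memory buffer (e.g. a small file's gzipped contents) in a
// readable stream without temporary files or extra piping.
function bufferToStream(buffer: Buffer): Readable {
  const passThrough = new PassThrough()
  passThrough.end(buffer) // write the whole buffer, then signal end-of-stream
  return passThrough
}

bufferToStream(Buffer.from('hello artifact')).on('data', chunk =>
  console.log(chunk.toString())
)
```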
## Retry Logic when downloading an individual file
![image](https://user-images.githubusercontent.com/16109154/78555461-5be71400-780d-11ea-9abd-b05b77a95a3f.png)
## Proxy support
This package uses the `@actions/http-client` NPM package internally which supports proxied requests out of the box.
## HttpManager
### `keep-alive` header
When an HTTP call is made to upload or download an individual file, the server will close the HTTP connection after the upload/download is complete and respond with a header indicating `Connection: close`.
[HTTP closed connection header information](https://tools.ietf.org/html/rfc2616#section-14.10)
TCP connections are sometimes not immediately closed by the Node client (Windows, for example, might hold on to the port for an extra period of time before actually releasing it), and a large number of closed connections can cause port exhaustion before ports are released and available again.
VMs hosted by GitHub Actions have 1024 available ports, so uploading 1000+ files very quickly can cause port exhaustion if connections are closed immediately. This can lead to strange undefined behavior and timeouts.
In order for connections to not close immediately, the `keep-alive` header is used to indicate to the server that the connection should stay open. If a `keep-alive` header is used, the connection needs to be disposed of by calling `dispose()` in the `HttpClient`.
[`keep-alive` header information](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Keep-Alive)
[@actions/http-client client disposal](https://github.com/actions/http-client/blob/04e5ad73cd3fd1f5610a32116b0759eddf6570d2/index.ts#L292)
### Multiple HTTP clients
During an artifact upload or download, files are concurrently uploaded or downloaded using `async/await`. When an error or retry is encountered, the `HttpClient` that made the call is disposed of and a new one is created. If a single `HttpClient` were used for all HTTP calls and it had to be disposed of, it could inadvertently affect any other calls happening concurrently.
Any other concurrent uploads or downloads should be left untouched. Because of this, each concurrent upload or download gets its own `HttpClient`. The `http-manager` is used to manage all available clients, and each concurrent upload or download maintains an `httpClientIndex` that keeps track of which client should be used (and potentially disposed of and recycled if necessary).
### Potential resource leaks
When an HTTP response is received, it consists of two parts
- `message`
- `body`
The `message` contains information such as the response code and headers, and it is available immediately. The body, however, is not available immediately; it can be read by calling `await response.readBody()`.
TCP connections consist of an input and an output buffer to manage what is sent and received across a connection. If the body is not read (even if its contents are not needed), the buffers can stay in use even after `dispose()` is called on the `HttpClient`. The buffers get released automatically after a certain period of time, but in order for them to be explicitly cleared, `readBody()` is always called.
### Non-concurrent calls
Neither `upload-http-client` nor `download-http-client` instantiates or creates any HTTP clients (the `HttpManager` has that responsibility). If an HTTP call has to be made that does not require the `keep-alive` header (such as when calling `listArtifacts` or `patchArtifactSize`), the first `HttpClient` in the `HttpManager` is used. The number of available clients is equal to the upload or download concurrency, and there will always be at least one available.
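As a hedged sketch of the lifecycle described above, using `@actions/http-client` directly (the URL is a placeholder): always drain the body, then dispose of the client to release the keep-alive socket.

```typescript
import {HttpClient} from '@actions/http-client'

async function fetchOnce(): Promise<void> {
  // keepAlive asks the server to keep the TCP connection open between calls.
  const client = new HttpClient('my-user-agent', [], {keepAlive: true})
  try {
    const response = await client.get('https://example.com/resource') // placeholder URL
    // Always read the body, even if unused, so the TCP buffers are released.
    const body = await response.readBody()
    console.log(response.message.statusCode, body.length)
  } finally {
    // Explicitly close any sockets held open by keep-alive.
    client.dispose()
  }
}
```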

File diff suppressed because it is too large


@ -1,6 +1,6 @@
 {
   "name": "@actions/artifact",
-  "version": "1.1.1",
+  "version": "2.3.3",
   "preview": true,
   "description": "Actions artifact lib",
   "keywords": [
@ -10,8 +10,8 @@
   ],
   "homepage": "https://github.com/actions/toolkit/tree/main/packages/artifact",
   "license": "MIT",
-  "main": "lib/artifact-client.js",
-  "types": "lib/artifact-client.d.ts",
+  "main": "lib/artifact.js",
+  "types": "lib/artifact.d.ts",
   "directories": {
     "lib": "lib",
     "test": "__tests__"
@ -30,20 +30,35 @@
   },
   "scripts": {
     "audit-moderate": "npm install && npm audit --json --audit-level=moderate > audit.json",
-    "test": "echo \"Error: run tests from root\" && exit 1",
-    "tsc": "tsc"
+    "test": "cd ../../ && npm run test ./packages/artifact",
+    "bootstrap": "cd ../../ && npm run bootstrap",
+    "tsc-run": "tsc",
+    "tsc": "npm run bootstrap && npm run tsc-run",
+    "gen:docs": "typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"
   },
   "bugs": {
     "url": "https://github.com/actions/toolkit/issues"
   },
   "dependencies": {
-    "@actions/core": "^1.9.1",
-    "@actions/http-client": "^2.0.1",
-    "tmp": "^0.2.1",
-    "tmp-promise": "^3.0.2"
+    "@actions/core": "^1.10.0",
+    "@actions/github": "^6.0.1",
+    "@actions/http-client": "^2.1.0",
+    "@azure/storage-blob": "^12.15.0",
+    "@octokit/core": "^5.2.1",
+    "@octokit/plugin-request-log": "^1.0.4",
+    "@octokit/plugin-retry": "^3.0.9",
+    "@octokit/request": "^8.4.1",
+    "@octokit/request-error": "^5.1.1",
+    "@protobuf-ts/plugin": "^2.2.3-alpha.1",
+    "archiver": "^7.0.1",
+    "jwt-decode": "^3.1.2",
+    "unzip-stream": "^0.3.1"
   },
   "devDependencies": {
-    "@types/tmp": "^0.2.1",
-    "typescript": "^3.8.3"
+    "@types/archiver": "^5.3.2",
+    "@types/unzip-stream": "^0.3.4",
+    "typedoc": "^0.25.4",
+    "typedoc-plugin-markdown": "^3.17.1",
+    "typescript": "^5.2.2"
   }
 }


@ -1,20 +0,0 @@
import {UploadOptions} from './internal/upload-options'
import {UploadResponse} from './internal/upload-response'
import {DownloadOptions} from './internal/download-options'
import {DownloadResponse} from './internal/download-response'
import {ArtifactClient, DefaultArtifactClient} from './internal/artifact-client'
export {
ArtifactClient,
UploadResponse,
UploadOptions,
DownloadResponse,
DownloadOptions
}
/**
* Constructs an ArtifactClient
*/
export function create(): ArtifactClient {
return DefaultArtifactClient.create()
}


@ -0,0 +1,8 @@
import {ArtifactClient, DefaultArtifactClient} from './internal/client'
export * from './internal/shared/interfaces'
export * from './internal/shared/errors'
export * from './internal/client'
const client: ArtifactClient = new DefaultArtifactClient()
export default client


@ -0,0 +1,277 @@
// @generated by protobuf-ts 2.9.1 with parameter long_type_string,client_none,generate_dependencies
// @generated from protobuf file "google/protobuf/timestamp.proto" (package "google.protobuf", syntax proto3)
// tslint:disable
//
// Protocol Buffers - Google's data interchange format
// Copyright 2008 Google Inc. All rights reserved.
// https://developers.google.com/protocol-buffers/
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are
// met:
//
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above
// copyright notice, this list of conditions and the following disclaimer
// in the documentation and/or other materials provided with the
// distribution.
// * Neither the name of Google Inc. nor the names of its
// contributors may be used to endorse or promote products derived from
// this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
//
import type { BinaryWriteOptions } from "@protobuf-ts/runtime";
import type { IBinaryWriter } from "@protobuf-ts/runtime";
import { WireType } from "@protobuf-ts/runtime";
import type { BinaryReadOptions } from "@protobuf-ts/runtime";
import type { IBinaryReader } from "@protobuf-ts/runtime";
import { UnknownFieldHandler } from "@protobuf-ts/runtime";
import type { PartialMessage } from "@protobuf-ts/runtime";
import { reflectionMergePartial } from "@protobuf-ts/runtime";
import { MESSAGE_TYPE } from "@protobuf-ts/runtime";
import { typeofJsonValue } from "@protobuf-ts/runtime";
import type { JsonValue } from "@protobuf-ts/runtime";
import type { JsonReadOptions } from "@protobuf-ts/runtime";
import type { JsonWriteOptions } from "@protobuf-ts/runtime";
import { PbLong } from "@protobuf-ts/runtime";
import { MessageType } from "@protobuf-ts/runtime";
/**
* A Timestamp represents a point in time independent of any time zone
* or calendar, represented as seconds and fractions of seconds at
* nanosecond resolution in UTC Epoch time. It is encoded using the
* Proleptic Gregorian Calendar which extends the Gregorian calendar
* backwards to year one. It is encoded assuming all minutes are 60
* seconds long, i.e. leap seconds are "smeared" so that no leap second
* table is needed for interpretation. Range is from
* 0001-01-01T00:00:00Z to 9999-12-31T23:59:59.999999999Z.
* By restricting to that range, we ensure that we can convert to
* and from RFC 3339 date strings.
* See [https://www.ietf.org/rfc/rfc3339.txt](https://www.ietf.org/rfc/rfc3339.txt).
*
* # Examples
*
* Example 1: Compute Timestamp from POSIX `time()`.
*
* Timestamp timestamp;
* timestamp.set_seconds(time(NULL));
* timestamp.set_nanos(0);
*
* Example 2: Compute Timestamp from POSIX `gettimeofday()`.
*
* struct timeval tv;
* gettimeofday(&tv, NULL);
*
* Timestamp timestamp;
* timestamp.set_seconds(tv.tv_sec);
* timestamp.set_nanos(tv.tv_usec * 1000);
*
* Example 3: Compute Timestamp from Win32 `GetSystemTimeAsFileTime()`.
*
* FILETIME ft;
* GetSystemTimeAsFileTime(&ft);
* UINT64 ticks = (((UINT64)ft.dwHighDateTime) << 32) | ft.dwLowDateTime;
*
* // A Windows tick is 100 nanoseconds. Windows epoch 1601-01-01T00:00:00Z
* // is 11644473600 seconds before Unix epoch 1970-01-01T00:00:00Z.
* Timestamp timestamp;
* timestamp.set_seconds((INT64) ((ticks / 10000000) - 11644473600LL));
* timestamp.set_nanos((INT32) ((ticks % 10000000) * 100));
*
* Example 4: Compute Timestamp from Java `System.currentTimeMillis()`.
*
* long millis = System.currentTimeMillis();
*
* Timestamp timestamp = Timestamp.newBuilder().setSeconds(millis / 1000)
* .setNanos((int) ((millis % 1000) * 1000000)).build();
*
*
* Example 5: Compute Timestamp from current time in Python.
*
* timestamp = Timestamp()
* timestamp.GetCurrentTime()
*
* # JSON Mapping
*
* In JSON format, the Timestamp type is encoded as a string in the
* [RFC 3339](https://www.ietf.org/rfc/rfc3339.txt) format. That is, the
* format is "{year}-{month}-{day}T{hour}:{min}:{sec}[.{frac_sec}]Z"
* where {year} is always expressed using four digits while {month}, {day},
* {hour}, {min}, and {sec} are zero-padded to two digits each. The fractional
* seconds, which can go up to 9 digits (i.e. up to 1 nanosecond resolution),
* are optional. The "Z" suffix indicates the timezone ("UTC"); the timezone
* is required. A proto3 JSON serializer should always use UTC (as indicated by
* "Z") when printing the Timestamp type and a proto3 JSON parser should be
* able to accept both UTC and other timezones (as indicated by an offset).
*
* For example, "2017-01-15T01:30:15.01Z" encodes 15.01 seconds past
* 01:30 UTC on January 15, 2017.
*
* In JavaScript, one can convert a Date object to this format using the
* standard [toISOString()](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/toISOString)
* method. In Python, a standard `datetime.datetime` object can be converted
* to this format using [`strftime`](https://docs.python.org/2/library/time.html#time.strftime)
* with the time format spec '%Y-%m-%dT%H:%M:%S.%fZ'. Likewise, in Java, one
* can use the Joda Time's [`ISODateTimeFormat.dateTime()`](
* http://www.joda.org/joda-time/apidocs/org/joda/time/format/ISODateTimeFormat.html#dateTime--
* ) to obtain a formatter capable of generating timestamps in this format.
*
*
*
* @generated from protobuf message google.protobuf.Timestamp
*/
export interface Timestamp {
/**
* Represents seconds of UTC time since Unix epoch
* 1970-01-01T00:00:00Z. Must be from 0001-01-01T00:00:00Z to
* 9999-12-31T23:59:59Z inclusive.
*
* @generated from protobuf field: int64 seconds = 1;
*/
seconds: string;
/**
* Non-negative fractions of a second at nanosecond resolution. Negative
* second values with fractions must still have non-negative nanos values
* that count forward in time. Must be from 0 to 999,999,999
* inclusive.
*
* @generated from protobuf field: int32 nanos = 2;
*/
nanos: number;
}
// @generated message type with reflection information, may provide speed optimized methods
class Timestamp$Type extends MessageType<Timestamp> {
constructor() {
super("google.protobuf.Timestamp", [
{ no: 1, name: "seconds", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 2, name: "nanos", kind: "scalar", T: 5 /*ScalarType.INT32*/ }
]);
}
/**
* Creates a new `Timestamp` for the current time.
*/
now(): Timestamp {
const msg = this.create();
const ms = Date.now();
msg.seconds = PbLong.from(Math.floor(ms / 1000)).toString();
msg.nanos = (ms % 1000) * 1000000;
return msg;
}
/**
* Converts a `Timestamp` to a JavaScript Date.
*/
toDate(message: Timestamp): Date {
return new Date(PbLong.from(message.seconds).toNumber() * 1000 + Math.ceil(message.nanos / 1000000));
}
/**
* Converts a JavaScript Date to a `Timestamp`.
*/
fromDate(date: Date): Timestamp {
const msg = this.create();
const ms = date.getTime();
msg.seconds = PbLong.from(Math.floor(ms / 1000)).toString();
msg.nanos = (ms % 1000) * 1000000;
return msg;
}
/**
* In JSON format, the `Timestamp` type is encoded as a string
* in the RFC 3339 format.
*/
internalJsonWrite(message: Timestamp, options: JsonWriteOptions): JsonValue {
let ms = PbLong.from(message.seconds).toNumber() * 1000;
if (ms < Date.parse("0001-01-01T00:00:00Z") || ms > Date.parse("9999-12-31T23:59:59Z"))
throw new Error("Unable to encode Timestamp to JSON. Must be from 0001-01-01T00:00:00Z to 9999-12-31T23:59:59Z inclusive.");
if (message.nanos < 0)
throw new Error("Unable to encode invalid Timestamp to JSON. Nanos must not be negative.");
let z = "Z";
if (message.nanos > 0) {
let nanosStr = (message.nanos + 1000000000).toString().substring(1);
if (nanosStr.substring(3) === "000000")
z = "." + nanosStr.substring(0, 3) + "Z";
else if (nanosStr.substring(6) === "000")
z = "." + nanosStr.substring(0, 6) + "Z";
else
z = "." + nanosStr + "Z";
}
return new Date(ms).toISOString().replace(".000Z", z);
}
/**
* In JSON format, the `Timestamp` type is encoded as a string
* in the RFC 3339 format.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: Timestamp): Timestamp {
if (typeof json !== "string")
throw new Error("Unable to parse Timestamp from JSON " + typeofJsonValue(json) + ".");
let matches = json.match(/^([0-9]{4})-([0-9]{2})-([0-9]{2})T([0-9]{2}):([0-9]{2}):([0-9]{2})(?:Z|\.([0-9]{3,9})Z|([+-][0-9][0-9]:[0-9][0-9]))$/);
if (!matches)
throw new Error("Unable to parse Timestamp from JSON. Invalid format.");
let ms = Date.parse(matches[1] + "-" + matches[2] + "-" + matches[3] + "T" + matches[4] + ":" + matches[5] + ":" + matches[6] + (matches[8] ? matches[8] : "Z"));
if (Number.isNaN(ms))
throw new Error("Unable to parse Timestamp from JSON. Invalid value.");
if (ms < Date.parse("0001-01-01T00:00:00Z") || ms > Date.parse("9999-12-31T23:59:59Z"))
throw new globalThis.Error("Unable to parse Timestamp from JSON. Must be from 0001-01-01T00:00:00Z to 9999-12-31T23:59:59Z inclusive.");
if (!target)
target = this.create();
target.seconds = PbLong.from(ms / 1000).toString();
target.nanos = 0;
if (matches[7])
target.nanos = (parseInt("1" + matches[7] + "0".repeat(9 - matches[7].length)) - 1000000000);
return target;
}
create(value?: PartialMessage<Timestamp>): Timestamp {
const message = { seconds: "0", nanos: 0 };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<Timestamp>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: Timestamp): Timestamp {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* int64 seconds */ 1:
message.seconds = reader.int64().toString();
break;
case /* int32 nanos */ 2:
message.nanos = reader.int32();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: Timestamp, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* int64 seconds = 1; */
if (message.seconds !== "0")
writer.tag(1, WireType.Varint).int64(message.seconds);
/* int32 nanos = 2; */
if (message.nanos !== 0)
writer.tag(2, WireType.Varint).int32(message.nanos);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.Timestamp
*/
export const Timestamp = new Timestamp$Type();
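For illustration, the generated helpers above can round-trip between `Timestamp` and a JavaScript `Date`; the relative import path below is hypothetical:

```typescript
// Hypothetical relative import of the generated module shown above.
import {Timestamp} from './timestamp'

// `now()` captures the current time; `seconds` is a decimal string because
// the message was generated with long_type_string.
const ts = Timestamp.now()

// Round-trip through a JavaScript Date using the generated helpers.
const date = Timestamp.toDate(ts)
const back = Timestamp.fromDate(date)
console.log(ts.seconds, ts.nanos, date.toISOString(), back.seconds)
```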


@ -0,0 +1,748 @@
// @generated by protobuf-ts 2.9.1 with parameter long_type_string,client_none,generate_dependencies
// @generated from protobuf file "google/protobuf/wrappers.proto" (package "google.protobuf", syntax proto3)
// tslint:disable
//
// Protocol Buffers - Google's data interchange format
// Copyright 2008 Google Inc. All rights reserved.
// https://developers.google.com/protocol-buffers/
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are
// met:
//
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above
// copyright notice, this list of conditions and the following disclaimer
// in the documentation and/or other materials provided with the
// distribution.
// * Neither the name of Google Inc. nor the names of its
// contributors may be used to endorse or promote products derived from
// this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
//
//
// Wrappers for primitive (non-message) types. These types are useful
// for embedding primitives in the `google.protobuf.Any` type and for places
// where we need to distinguish between the absence of a primitive
// typed field and its default value.
//
import { ScalarType } from "@protobuf-ts/runtime";
import { LongType } from "@protobuf-ts/runtime";
import type { BinaryWriteOptions } from "@protobuf-ts/runtime";
import type { IBinaryWriter } from "@protobuf-ts/runtime";
import { WireType } from "@protobuf-ts/runtime";
import type { BinaryReadOptions } from "@protobuf-ts/runtime";
import type { IBinaryReader } from "@protobuf-ts/runtime";
import { UnknownFieldHandler } from "@protobuf-ts/runtime";
import type { PartialMessage } from "@protobuf-ts/runtime";
import { reflectionMergePartial } from "@protobuf-ts/runtime";
import { MESSAGE_TYPE } from "@protobuf-ts/runtime";
import type { JsonValue } from "@protobuf-ts/runtime";
import type { JsonReadOptions } from "@protobuf-ts/runtime";
import type { JsonWriteOptions } from "@protobuf-ts/runtime";
import { MessageType } from "@protobuf-ts/runtime";
/**
* Wrapper message for `double`.
*
* The JSON representation for `DoubleValue` is JSON number.
*
* @generated from protobuf message google.protobuf.DoubleValue
*/
export interface DoubleValue {
/**
* The double value.
*
* @generated from protobuf field: double value = 1;
*/
value: number;
}
/**
* Wrapper message for `float`.
*
* The JSON representation for `FloatValue` is JSON number.
*
* @generated from protobuf message google.protobuf.FloatValue
*/
export interface FloatValue {
/**
* The float value.
*
* @generated from protobuf field: float value = 1;
*/
value: number;
}
/**
* Wrapper message for `int64`.
*
* The JSON representation for `Int64Value` is JSON string.
*
* @generated from protobuf message google.protobuf.Int64Value
*/
export interface Int64Value {
/**
* The int64 value.
*
* @generated from protobuf field: int64 value = 1;
*/
value: string;
}
/**
* Wrapper message for `uint64`.
*
* The JSON representation for `UInt64Value` is JSON string.
*
* @generated from protobuf message google.protobuf.UInt64Value
*/
export interface UInt64Value {
/**
* The uint64 value.
*
* @generated from protobuf field: uint64 value = 1;
*/
value: string;
}
/**
* Wrapper message for `int32`.
*
* The JSON representation for `Int32Value` is JSON number.
*
* @generated from protobuf message google.protobuf.Int32Value
*/
export interface Int32Value {
/**
* The int32 value.
*
* @generated from protobuf field: int32 value = 1;
*/
value: number;
}
/**
* Wrapper message for `uint32`.
*
* The JSON representation for `UInt32Value` is JSON number.
*
* @generated from protobuf message google.protobuf.UInt32Value
*/
export interface UInt32Value {
/**
* The uint32 value.
*
* @generated from protobuf field: uint32 value = 1;
*/
value: number;
}
/**
* Wrapper message for `bool`.
*
* The JSON representation for `BoolValue` is JSON `true` and `false`.
*
* @generated from protobuf message google.protobuf.BoolValue
*/
export interface BoolValue {
/**
* The bool value.
*
* @generated from protobuf field: bool value = 1;
*/
value: boolean;
}
/**
* Wrapper message for `string`.
*
* The JSON representation for `StringValue` is JSON string.
*
* @generated from protobuf message google.protobuf.StringValue
*/
export interface StringValue {
/**
* The string value.
*
* @generated from protobuf field: string value = 1;
*/
value: string;
}
/**
* Wrapper message for `bytes`.
*
* The JSON representation for `BytesValue` is JSON string.
*
* @generated from protobuf message google.protobuf.BytesValue
*/
export interface BytesValue {
/**
* The bytes value.
*
* @generated from protobuf field: bytes value = 1;
*/
value: Uint8Array;
}
// @generated message type with reflection information, may provide speed optimized methods
class DoubleValue$Type extends MessageType<DoubleValue> {
constructor() {
super("google.protobuf.DoubleValue", [
{ no: 1, name: "value", kind: "scalar", T: 1 /*ScalarType.DOUBLE*/ }
]);
}
/**
* Encode `DoubleValue` to JSON number.
*/
internalJsonWrite(message: DoubleValue, options: JsonWriteOptions): JsonValue {
return this.refJsonWriter.scalar(2, message.value, "value", false, true);
}
/**
* Decode `DoubleValue` from JSON number.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: DoubleValue): DoubleValue {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, 1, undefined, "value") as number;
return target;
}
create(value?: PartialMessage<DoubleValue>): DoubleValue {
const message = { value: 0 };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<DoubleValue>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: DoubleValue): DoubleValue {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* double value */ 1:
message.value = reader.double();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: DoubleValue, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* double value = 1; */
if (message.value !== 0)
writer.tag(1, WireType.Bit64).double(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.DoubleValue
*/
export const DoubleValue = new DoubleValue$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FloatValue$Type extends MessageType<FloatValue> {
constructor() {
super("google.protobuf.FloatValue", [
{ no: 1, name: "value", kind: "scalar", T: 2 /*ScalarType.FLOAT*/ }
]);
}
/**
* Encode `FloatValue` to JSON number.
*/
internalJsonWrite(message: FloatValue, options: JsonWriteOptions): JsonValue {
return this.refJsonWriter.scalar(1, message.value, "value", false, true);
}
/**
* Decode `FloatValue` from JSON number.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: FloatValue): FloatValue {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, 1, undefined, "value") as number;
return target;
}
create(value?: PartialMessage<FloatValue>): FloatValue {
const message = { value: 0 };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<FloatValue>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: FloatValue): FloatValue {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* float value */ 1:
message.value = reader.float();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: FloatValue, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* float value = 1; */
if (message.value !== 0)
writer.tag(1, WireType.Bit32).float(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.FloatValue
*/
export const FloatValue = new FloatValue$Type();
// @generated message type with reflection information, may provide speed optimized methods
class Int64Value$Type extends MessageType<Int64Value> {
constructor() {
super("google.protobuf.Int64Value", [
{ no: 1, name: "value", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
/**
* Encode `Int64Value` to JSON string.
*/
internalJsonWrite(message: Int64Value, options: JsonWriteOptions): JsonValue {
return this.refJsonWriter.scalar(ScalarType.INT64, message.value, "value", false, true);
}
/**
* Decode `Int64Value` from JSON string.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: Int64Value): Int64Value {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, ScalarType.INT64, LongType.STRING, "value") as any;
return target;
}
create(value?: PartialMessage<Int64Value>): Int64Value {
const message = { value: "0" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<Int64Value>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: Int64Value): Int64Value {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* int64 value */ 1:
message.value = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: Int64Value, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* int64 value = 1; */
if (message.value !== "0")
writer.tag(1, WireType.Varint).int64(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.Int64Value
*/
export const Int64Value = new Int64Value$Type();
// @generated message type with reflection information, may provide speed optimized methods
class UInt64Value$Type extends MessageType<UInt64Value> {
constructor() {
super("google.protobuf.UInt64Value", [
{ no: 1, name: "value", kind: "scalar", T: 4 /*ScalarType.UINT64*/ }
]);
}
/**
* Encode `UInt64Value` to JSON string.
*/
internalJsonWrite(message: UInt64Value, options: JsonWriteOptions): JsonValue {
return this.refJsonWriter.scalar(ScalarType.UINT64, message.value, "value", false, true);
}
/**
* Decode `UInt64Value` from JSON string.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: UInt64Value): UInt64Value {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, ScalarType.UINT64, LongType.STRING, "value") as any;
return target;
}
create(value?: PartialMessage<UInt64Value>): UInt64Value {
const message = { value: "0" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<UInt64Value>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: UInt64Value): UInt64Value {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* uint64 value */ 1:
message.value = reader.uint64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: UInt64Value, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* uint64 value = 1; */
if (message.value !== "0")
writer.tag(1, WireType.Varint).uint64(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.UInt64Value
*/
export const UInt64Value = new UInt64Value$Type();
// @generated message type with reflection information, may provide speed optimized methods
class Int32Value$Type extends MessageType<Int32Value> {
constructor() {
super("google.protobuf.Int32Value", [
{ no: 1, name: "value", kind: "scalar", T: 5 /*ScalarType.INT32*/ }
]);
}
/**
 * Encode `Int32Value` to JSON number.
*/
internalJsonWrite(message: Int32Value, options: JsonWriteOptions): JsonValue {
return this.refJsonWriter.scalar(5, message.value, "value", false, true);
}
/**
 * Decode `Int32Value` from JSON number.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: Int32Value): Int32Value {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, 5, undefined, "value") as number;
return target;
}
create(value?: PartialMessage<Int32Value>): Int32Value {
const message = { value: 0 };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<Int32Value>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: Int32Value): Int32Value {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* int32 value */ 1:
message.value = reader.int32();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: Int32Value, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* int32 value = 1; */
if (message.value !== 0)
writer.tag(1, WireType.Varint).int32(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.Int32Value
*/
export const Int32Value = new Int32Value$Type();
// @generated message type with reflection information, may provide speed optimized methods
class UInt32Value$Type extends MessageType<UInt32Value> {
constructor() {
super("google.protobuf.UInt32Value", [
{ no: 1, name: "value", kind: "scalar", T: 13 /*ScalarType.UINT32*/ }
]);
}
/**
 * Encode `UInt32Value` to JSON number.
*/
internalJsonWrite(message: UInt32Value, options: JsonWriteOptions): JsonValue {
return this.refJsonWriter.scalar(13, message.value, "value", false, true);
}
/**
 * Decode `UInt32Value` from JSON number.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: UInt32Value): UInt32Value {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, 13, undefined, "value") as number;
return target;
}
create(value?: PartialMessage<UInt32Value>): UInt32Value {
const message = { value: 0 };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<UInt32Value>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: UInt32Value): UInt32Value {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* uint32 value */ 1:
message.value = reader.uint32();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: UInt32Value, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* uint32 value = 1; */
if (message.value !== 0)
writer.tag(1, WireType.Varint).uint32(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.UInt32Value
*/
export const UInt32Value = new UInt32Value$Type();
// @generated message type with reflection information, may provide speed optimized methods
class BoolValue$Type extends MessageType<BoolValue> {
constructor() {
super("google.protobuf.BoolValue", [
{ no: 1, name: "value", kind: "scalar", T: 8 /*ScalarType.BOOL*/ }
]);
}
/**
* Encode `BoolValue` to JSON bool.
*/
internalJsonWrite(message: BoolValue, options: JsonWriteOptions): JsonValue {
return message.value;
}
/**
* Decode `BoolValue` from JSON bool.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: BoolValue): BoolValue {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, 8, undefined, "value") as boolean;
return target;
}
create(value?: PartialMessage<BoolValue>): BoolValue {
const message = { value: false };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<BoolValue>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: BoolValue): BoolValue {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool value */ 1:
message.value = reader.bool();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: BoolValue, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* bool value = 1; */
if (message.value !== false)
writer.tag(1, WireType.Varint).bool(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.BoolValue
*/
export const BoolValue = new BoolValue$Type();
// @generated message type with reflection information, may provide speed optimized methods
class StringValue$Type extends MessageType<StringValue> {
constructor() {
super("google.protobuf.StringValue", [
{ no: 1, name: "value", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
/**
* Encode `StringValue` to JSON string.
*/
internalJsonWrite(message: StringValue, options: JsonWriteOptions): JsonValue {
return message.value;
}
/**
* Decode `StringValue` from JSON string.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: StringValue): StringValue {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, 9, undefined, "value") as string;
return target;
}
create(value?: PartialMessage<StringValue>): StringValue {
const message = { value: "" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<StringValue>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: StringValue): StringValue {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string value */ 1:
message.value = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: StringValue, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* string value = 1; */
if (message.value !== "")
writer.tag(1, WireType.LengthDelimited).string(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.StringValue
*/
export const StringValue = new StringValue$Type();
// @generated message type with reflection information, may provide speed optimized methods
class BytesValue$Type extends MessageType<BytesValue> {
constructor() {
super("google.protobuf.BytesValue", [
{ no: 1, name: "value", kind: "scalar", T: 12 /*ScalarType.BYTES*/ }
]);
}
/**
* Encode `BytesValue` to JSON string.
*/
internalJsonWrite(message: BytesValue, options: JsonWriteOptions): JsonValue {
return this.refJsonWriter.scalar(12, message.value, "value", false, true);
}
/**
* Decode `BytesValue` from JSON string.
*/
internalJsonRead(json: JsonValue, options: JsonReadOptions, target?: BytesValue): BytesValue {
if (!target)
target = this.create();
target.value = this.refJsonReader.scalar(json, 12, undefined, "value") as Uint8Array;
return target;
}
create(value?: PartialMessage<BytesValue>): BytesValue {
const message = { value: new Uint8Array(0) };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<BytesValue>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: BytesValue): BytesValue {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bytes value */ 1:
message.value = reader.bytes();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: BytesValue, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* bytes value = 1; */
if (message.value.length)
writer.tag(1, WireType.LengthDelimited).bytes(message.value);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message google.protobuf.BytesValue
*/
export const BytesValue = new BytesValue$Type();
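A quick usage sketch of the wrapper types generated above, assuming the relative import path from this package's generated index (`create`, `toJson`, and `fromJson` are standard protobuf-ts methods inherited from `MessageType`):
// Hypothetical round-trip through the StringValue wrapper, which lets a
// field distinguish "unset" from "empty string" on the wire.
import { StringValue } from "./google/protobuf/wrappers";

const wrapped = StringValue.create({ value: "my-artifact" });
// internalJsonWrite returns the bare scalar, so toJson yields "my-artifact".
const json = StringValue.toJson(wrapped);
const restored = StringValue.fromJson(json); // => { value: "my-artifact" }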


@@ -0,0 +1,4 @@
export * from './google/protobuf/timestamp'
export * from './google/protobuf/wrappers'
export * from './results/api/v1/artifact'
export * from './results/api/v1/artifact.twirp-client'

File diff suppressed because it is too large


@@ -0,0 +1,232 @@
import {
CreateArtifactRequest,
CreateArtifactResponse,
FinalizeArtifactRequest,
FinalizeArtifactResponse,
ListArtifactsRequest,
ListArtifactsResponse,
GetSignedArtifactURLRequest,
GetSignedArtifactURLResponse,
DeleteArtifactRequest,
DeleteArtifactResponse,
} from "./artifact";
//==================================//
// Client Code //
//==================================//
interface Rpc {
request(
service: string,
method: string,
contentType: "application/json" | "application/protobuf",
data: object | Uint8Array
): Promise<object | Uint8Array>;
}
export interface ArtifactServiceClient {
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse>;
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(
request: GetSignedArtifactURLRequest
): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(
request: DeleteArtifactRequest
): Promise<DeleteArtifactResponse>;
}
export class ArtifactServiceClientJSON implements ArtifactServiceClient {
private readonly rpc: Rpc;
constructor(rpc: Rpc) {
this.rpc = rpc;
    // assign the bound functions so the methods stay bound when passed detached
    this.CreateArtifact = this.CreateArtifact.bind(this);
    this.FinalizeArtifact = this.FinalizeArtifact.bind(this);
    this.ListArtifacts = this.ListArtifacts.bind(this);
    this.GetSignedArtifactURL = this.GetSignedArtifactURL.bind(this);
    this.DeleteArtifact = this.DeleteArtifact.bind(this);
}
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse> {
const data = CreateArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"CreateArtifact",
"application/json",
data as object
);
return promise.then((data) =>
CreateArtifactResponse.fromJson(data as any, {
ignoreUnknownFields: true,
})
);
}
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse> {
const data = FinalizeArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"FinalizeArtifact",
"application/json",
data as object
);
return promise.then((data) =>
FinalizeArtifactResponse.fromJson(data as any, {
ignoreUnknownFields: true,
})
);
}
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse> {
const data = ListArtifactsRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"ListArtifacts",
"application/json",
data as object
);
return promise.then((data) =>
ListArtifactsResponse.fromJson(data as any, { ignoreUnknownFields: true })
);
}
GetSignedArtifactURL(
request: GetSignedArtifactURLRequest
): Promise<GetSignedArtifactURLResponse> {
const data = GetSignedArtifactURLRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"GetSignedArtifactURL",
"application/json",
data as object
);
return promise.then((data) =>
GetSignedArtifactURLResponse.fromJson(data as any, {
ignoreUnknownFields: true,
})
);
}
DeleteArtifact(
request: DeleteArtifactRequest
): Promise<DeleteArtifactResponse> {
const data = DeleteArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"DeleteArtifact",
"application/json",
data as object
);
return promise.then((data) =>
DeleteArtifactResponse.fromJson(data as any, {
ignoreUnknownFields: true,
})
);
}
}
export class ArtifactServiceClientProtobuf implements ArtifactServiceClient {
private readonly rpc: Rpc;
constructor(rpc: Rpc) {
this.rpc = rpc;
    // assign the bound functions so the methods stay bound when passed detached
    this.CreateArtifact = this.CreateArtifact.bind(this);
    this.FinalizeArtifact = this.FinalizeArtifact.bind(this);
    this.ListArtifacts = this.ListArtifacts.bind(this);
    this.GetSignedArtifactURL = this.GetSignedArtifactURL.bind(this);
    this.DeleteArtifact = this.DeleteArtifact.bind(this);
}
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse> {
const data = CreateArtifactRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"CreateArtifact",
"application/protobuf",
data
);
return promise.then((data) =>
CreateArtifactResponse.fromBinary(data as Uint8Array)
);
}
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse> {
const data = FinalizeArtifactRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"FinalizeArtifact",
"application/protobuf",
data
);
return promise.then((data) =>
FinalizeArtifactResponse.fromBinary(data as Uint8Array)
);
}
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse> {
const data = ListArtifactsRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"ListArtifacts",
"application/protobuf",
data
);
return promise.then((data) =>
ListArtifactsResponse.fromBinary(data as Uint8Array)
);
}
GetSignedArtifactURL(
request: GetSignedArtifactURLRequest
): Promise<GetSignedArtifactURLResponse> {
const data = GetSignedArtifactURLRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"GetSignedArtifactURL",
"application/protobuf",
data
);
return promise.then((data) =>
GetSignedArtifactURLResponse.fromBinary(data as Uint8Array)
);
}
DeleteArtifact(
request: DeleteArtifactRequest
): Promise<DeleteArtifactResponse> {
const data = DeleteArtifactRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"DeleteArtifact",
"application/protobuf",
data
);
return promise.then((data) =>
DeleteArtifactResponse.fromBinary(data as Uint8Array)
);
}
}
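The `Rpc` interface above deliberately leaves the transport to the caller. Below is a minimal sketch of one possible implementation, assuming a Twirp-conventional URL layout (`<baseUrl>/twirp/<service>/<method>`) and a global `fetch` (Node 18+); the base URL and bearer token are illustrative placeholders, not values from this diff:
// Hypothetical fetch-based transport satisfying the Rpc interface above.
class FetchRpc implements Rpc {
  constructor(
    private readonly baseUrl: string,
    private readonly token: string
  ) {}

  async request(
    service: string,
    method: string,
    contentType: "application/json" | "application/protobuf",
    data: object | Uint8Array
  ): Promise<object | Uint8Array> {
    const res = await fetch(`${this.baseUrl}/twirp/${service}/${method}`, {
      method: "POST",
      headers: {
        "Content-Type": contentType,
        Authorization: `Bearer ${this.token}`
      },
      // JSON bodies are serialized; protobuf bodies are sent as raw bytes.
      body:
        contentType === "application/json"
          ? JSON.stringify(data)
          : (data as Uint8Array)
    });
    return contentType === "application/json"
      ? ((await res.json()) as object)
      : new Uint8Array(await res.arrayBuffer());
  }
}

// const client = new ArtifactServiceClientJSON(new FetchRpc(baseUrl, token));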


@@ -1,51 +0,0 @@
/**
* Mocks default limits for easier testing
*/
export function getUploadFileConcurrency(): number {
return 1
}
export function getUploadChunkConcurrency(): number {
return 1
}
export function getUploadChunkSize(): number {
return 4 * 1024 * 1024 // 4 MB Chunks
}
export function getRetryLimit(): number {
return 2
}
export function getRetryMultiplier(): number {
return 1.5
}
export function getInitialRetryIntervalInMilliseconds(): number {
return 10
}
export function getDownloadFileConcurrency(): number {
return 1
}
/**
* Mocks the 'ACTIONS_RUNTIME_TOKEN', 'ACTIONS_RUNTIME_URL' and 'GITHUB_RUN_ID' env variables
 * that are only available from a node context on the runner. This allows tests to run
 * locally without the env variables actually being set.
*/
export function getRuntimeToken(): string {
return 'totally-valid-token'
}
export function getRuntimeUrl(): string {
return 'https://www.example.com/'
}
export function getWorkFlowRunId(): string {
return '15'
}
export function getRetentionDays(): string | undefined {
return '45'
}
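These overrides follow Jest's `__mocks__` convention, so a test presumably activates them with a single `jest.mock` call; the path below is an assumption about this package's layout, not something shown in the diff:
// In a test file (hypothetical path): redirect the real module to the mock above.
jest.mock('../src/internal/config-variables')

// Code under test that calls getRetryLimit(), getRuntimeToken(), etc. now sees
// the mocked values (retry limit 2, dummy token/URL), so tests run locally
// without real runner env variables being set.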


@@ -1,282 +0,0 @@
import * as core from '@actions/core'
import {
UploadSpecification,
getUploadSpecification
} from './upload-specification'
import {UploadHttpClient} from './upload-http-client'
import {UploadResponse} from './upload-response'
import {UploadOptions} from './upload-options'
import {DownloadOptions} from './download-options'
import {DownloadResponse} from './download-response'
import {
createDirectoriesForArtifact,
createEmptyFilesForArtifact
} from './utils'
import {checkArtifactName} from './path-and-artifact-name-validation'
import {DownloadHttpClient} from './download-http-client'
import {getDownloadSpecification} from './download-specification'
import {getWorkSpaceDirectory} from './config-variables'
import {normalize, resolve} from 'path'
export interface ArtifactClient {
/**
* Uploads an artifact
*
* @param name the name of the artifact, required
* @param files a list of absolute or relative paths that denote what files should be uploaded
* @param rootDirectory an absolute or relative file path that denotes the root parent directory of the files being uploaded
* @param options extra options for customizing the upload behavior
* @returns single UploadInfo object
*/
uploadArtifact(
name: string,
files: string[],
rootDirectory: string,
options?: UploadOptions
): Promise<UploadResponse>
/**
* Downloads a single artifact associated with a run
*
* @param name the name of the artifact being downloaded
* @param path optional path that denotes where the artifact will be downloaded to
* @param options extra options that allow for the customization of the download behavior
*/
downloadArtifact(
name: string,
path?: string,
options?: DownloadOptions
): Promise<DownloadResponse>
/**
* Downloads all artifacts associated with a run. Because there are multiple artifacts being downloaded, a folder will be created for each one in the specified or default directory
* @param path optional path that denotes where the artifacts will be downloaded to
*/
downloadAllArtifacts(path?: string): Promise<DownloadResponse[]>
}
export class DefaultArtifactClient implements ArtifactClient {
/**
* Constructs a DefaultArtifactClient
*/
static create(): DefaultArtifactClient {
return new DefaultArtifactClient()
}
/**
* Uploads an artifact
*/
async uploadArtifact(
name: string,
files: string[],
rootDirectory: string,
options?: UploadOptions | undefined
): Promise<UploadResponse> {
core.info(
`Starting artifact upload
For more detailed logs during the artifact upload process, enable step-debugging: https://docs.github.com/actions/monitoring-and-troubleshooting-workflows/enabling-debug-logging#enabling-step-debug-logging`
)
checkArtifactName(name)
// Get specification for the files being uploaded
const uploadSpecification: UploadSpecification[] = getUploadSpecification(
name,
rootDirectory,
files
)
const uploadResponse: UploadResponse = {
artifactName: name,
artifactItems: [],
size: 0,
failedItems: []
}
const uploadHttpClient = new UploadHttpClient()
if (uploadSpecification.length === 0) {
core.warning(`No files found that can be uploaded`)
} else {
// Create an entry for the artifact in the file container
const response = await uploadHttpClient.createArtifactInFileContainer(
name,
options
)
if (!response.fileContainerResourceUrl) {
core.debug(response.toString())
throw new Error(
'No URL provided by the Artifact Service to upload an artifact to'
)
}
core.debug(`Upload Resource URL: ${response.fileContainerResourceUrl}`)
core.info(
`Container for artifact "${name}" successfully created. Starting upload of file(s)`
)
// Upload each of the files that were found concurrently
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
response.fileContainerResourceUrl,
uploadSpecification,
options
)
// Update the size of the artifact to indicate we are done uploading
// The uncompressed size is used for display when downloading a zip of the artifact from the UI
core.info(
`File upload process has finished. Finalizing the artifact upload`
)
await uploadHttpClient.patchArtifactSize(uploadResult.totalSize, name)
if (uploadResult.failedItems.length > 0) {
core.info(
`Upload finished. There were ${uploadResult.failedItems.length} items that failed to upload`
)
} else {
core.info(
`Artifact has been finalized. All files have been successfully uploaded!`
)
}
core.info(
`
The raw size of all the files that were specified for upload is ${uploadResult.totalSize} bytes
The size of all the files that were uploaded is ${uploadResult.uploadSize} bytes. This takes into account any gzip compression used to reduce the upload size, time and storage
Note: The size of downloaded zips can differ significantly from the reported size. For more information see: https://github.com/actions/upload-artifact#zipped-artifact-downloads \r\n`
)
uploadResponse.artifactItems = uploadSpecification.map(
item => item.absoluteFilePath
)
uploadResponse.size = uploadResult.uploadSize
uploadResponse.failedItems = uploadResult.failedItems
}
return uploadResponse
}
async downloadArtifact(
name: string,
path?: string | undefined,
options?: DownloadOptions | undefined
): Promise<DownloadResponse> {
const downloadHttpClient = new DownloadHttpClient()
const artifacts = await downloadHttpClient.listArtifacts()
if (artifacts.count === 0) {
throw new Error(
`Unable to find any artifacts for the associated workflow`
)
}
const artifactToDownload = artifacts.value.find(artifact => {
return artifact.name === name
})
if (!artifactToDownload) {
throw new Error(`Unable to find an artifact with the name: ${name}`)
}
const items = await downloadHttpClient.getContainerItems(
artifactToDownload.name,
artifactToDownload.fileContainerResourceUrl
)
if (!path) {
path = getWorkSpaceDirectory()
}
path = normalize(path)
path = resolve(path)
// During upload, empty directories are rejected by the remote server so there should be no artifacts that consist of only empty directories
const downloadSpecification = getDownloadSpecification(
name,
items.value,
path,
options?.createArtifactFolder || false
)
if (downloadSpecification.filesToDownload.length === 0) {
core.info(
`No downloadable files were found for the artifact: ${artifactToDownload.name}`
)
} else {
// Create all necessary directories recursively before starting any download
await createDirectoriesForArtifact(
downloadSpecification.directoryStructure
)
core.info('Directory structure has been set up for the artifact')
await createEmptyFilesForArtifact(
downloadSpecification.emptyFilesToCreate
)
await downloadHttpClient.downloadSingleArtifact(
downloadSpecification.filesToDownload
)
}
return {
artifactName: name,
downloadPath: downloadSpecification.rootDownloadLocation
}
}
async downloadAllArtifacts(
path?: string | undefined
): Promise<DownloadResponse[]> {
const downloadHttpClient = new DownloadHttpClient()
const response: DownloadResponse[] = []
const artifacts = await downloadHttpClient.listArtifacts()
if (artifacts.count === 0) {
core.info('Unable to find any artifacts for the associated workflow')
return response
}
if (!path) {
path = getWorkSpaceDirectory()
}
path = normalize(path)
path = resolve(path)
let downloadedArtifacts = 0
while (downloadedArtifacts < artifacts.count) {
const currentArtifactToDownload = artifacts.value[downloadedArtifacts]
downloadedArtifacts += 1
core.info(
`starting download of artifact ${currentArtifactToDownload.name} : ${downloadedArtifacts}/${artifacts.count}`
)
// Get container entries for the specific artifact
const items = await downloadHttpClient.getContainerItems(
currentArtifactToDownload.name,
currentArtifactToDownload.fileContainerResourceUrl
)
const downloadSpecification = getDownloadSpecification(
currentArtifactToDownload.name,
items.value,
path,
true
)
if (downloadSpecification.filesToDownload.length === 0) {
core.info(
`No downloadable files were found for any artifact ${currentArtifactToDownload.name}`
)
} else {
await createDirectoriesForArtifact(
downloadSpecification.directoryStructure
)
await createEmptyFilesForArtifact(
downloadSpecification.emptyFilesToCreate
)
await downloadHttpClient.downloadSingleArtifact(
downloadSpecification.filesToDownload
)
}
response.push({
artifactName: currentArtifactToDownload.name,
downloadPath: downloadSpecification.rootDownloadLocation
})
}
return response
}
}
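For context, the removed v1 client above was driven through its static factory; a sketch of the old call pattern (file paths and the options field are illustrative assumptions, not values from this diff):
const client = DefaultArtifactClient.create()
const upload = await client.uploadArtifact(
  'my-artifact',
  ['dist/output.txt'], // absolute or relative paths to upload
  'dist', // root directory the paths are resolved against
  {continueOnError: false} // assumed v1 UploadOptions field
)
console.log(`uploaded ${upload.size} bytes, ${upload.failedItems.length} failed item(s)`)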


@@ -0,0 +1,284 @@
import {warning} from '@actions/core'
import {isGhes} from './shared/config'
import {
UploadArtifactOptions,
UploadArtifactResponse,
DownloadArtifactOptions,
GetArtifactResponse,
ListArtifactsOptions,
ListArtifactsResponse,
DownloadArtifactResponse,
FindOptions,
DeleteArtifactResponse
} from './shared/interfaces'
import {uploadArtifact} from './upload/upload-artifact'
import {
downloadArtifactPublic,
downloadArtifactInternal
} from './download/download-artifact'
import {
deleteArtifactPublic,
deleteArtifactInternal
} from './delete/delete-artifact'
import {getArtifactPublic, getArtifactInternal} from './find/get-artifact'
import {listArtifactsPublic, listArtifactsInternal} from './find/list-artifacts'
import {GHESNotSupportedError} from './shared/errors'
/**
* Generic interface for the artifact client.
*/
export interface ArtifactClient {
/**
* Uploads an artifact.
*
* @param name The name of the artifact, required
* @param files A list of absolute or relative paths that denote what files should be uploaded
* @param rootDirectory An absolute or relative file path that denotes the root parent directory of the files being uploaded
* @param options Extra options for customizing the upload behavior
* @returns single UploadArtifactResponse object
*/
uploadArtifact(
name: string,
files: string[],
rootDirectory: string,
options?: UploadArtifactOptions
): Promise<UploadArtifactResponse>
/**
* Lists all artifacts that are part of the current workflow run.
* This function will return at most 1000 artifacts per workflow run.
*
* If `options.findBy` is specified, this will call the public List-Artifacts API which can list from other runs.
* https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
*
* @param options Extra options that allow for the customization of the list behavior
* @returns ListArtifactResponse object
*/
listArtifacts(
options?: ListArtifactsOptions & FindOptions
): Promise<ListArtifactsResponse>
/**
* Finds an artifact by name.
* If there are multiple artifacts with the same name in the same workflow run, this will return the latest.
* If the artifact is not found, it will throw.
*
* If `options.findBy` is specified, this will use the public List Artifacts API with a name filter which can get artifacts from other runs.
* https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
 * `@actions/artifact` v2+ does not allow creating multiple artifacts with the same name in the same workflow run.
 * Duplicate names can still occur when older versions of upload-artifact (v1, v2, and v3) or `@actions/artifact` < v2 were used, or when the workflow run is a rerun.
 * In that case, this function returns the first artifact that matches the name.
*
* @param artifactName The name of the artifact to find
* @param options Extra options that allow for the customization of the get behavior
*/
getArtifact(
artifactName: string,
options?: FindOptions
): Promise<GetArtifactResponse>
/**
* Downloads an artifact and unzips the content.
*
* If `options.findBy` is specified, this will use the public Download Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#download-an-artifact
*
* @param artifactId The id of the artifact to download
* @param options Extra options that allow for the customization of the download behavior
* @returns single DownloadArtifactResponse object
*/
downloadArtifact(
artifactId: number,
options?: DownloadArtifactOptions & FindOptions
): Promise<DownloadArtifactResponse>
/**
* Delete an Artifact
*
* If `options.findBy` is specified, this will use the public Delete Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#delete-an-artifact
*
* @param artifactName The name of the artifact to delete
* @param options Extra options that allow for the customization of the delete behavior
* @returns single DeleteArtifactResponse object
*/
deleteArtifact(
artifactName: string,
options?: FindOptions
): Promise<DeleteArtifactResponse>
}
/**
* The default artifact client that is used by the artifact action(s).
*/
export class DefaultArtifactClient implements ArtifactClient {
async uploadArtifact(
name: string,
files: string[],
rootDirectory: string,
options?: UploadArtifactOptions
): Promise<UploadArtifactResponse> {
try {
if (isGhes()) {
throw new GHESNotSupportedError()
}
return uploadArtifact(name, files, rootDirectory, options)
} catch (error) {
warning(
`Artifact upload failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions is operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
throw error
}
}
async downloadArtifact(
artifactId: number,
options?: DownloadArtifactOptions & FindOptions
): Promise<DownloadArtifactResponse> {
try {
if (isGhes()) {
throw new GHESNotSupportedError()
}
if (options?.findBy) {
const {
findBy: {repositoryOwner, repositoryName, token},
...downloadOptions
} = options
return downloadArtifactPublic(
artifactId,
repositoryOwner,
repositoryName,
token,
downloadOptions
)
}
return downloadArtifactInternal(artifactId, options)
} catch (error) {
warning(
`Download Artifact failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
throw error
}
}
async listArtifacts(
options?: ListArtifactsOptions & FindOptions
): Promise<ListArtifactsResponse> {
try {
if (isGhes()) {
throw new GHESNotSupportedError()
}
if (options?.findBy) {
const {
findBy: {workflowRunId, repositoryOwner, repositoryName, token}
} = options
return listArtifactsPublic(
workflowRunId,
repositoryOwner,
repositoryName,
token,
options?.latest
)
}
return listArtifactsInternal(options?.latest)
} catch (error: unknown) {
warning(
`Listing Artifacts failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
throw error
}
}
async getArtifact(
artifactName: string,
options?: FindOptions
): Promise<GetArtifactResponse> {
try {
if (isGhes()) {
throw new GHESNotSupportedError()
}
if (options?.findBy) {
const {
findBy: {workflowRunId, repositoryOwner, repositoryName, token}
} = options
return getArtifactPublic(
artifactName,
workflowRunId,
repositoryOwner,
repositoryName,
token
)
}
return getArtifactInternal(artifactName)
} catch (error: unknown) {
warning(
`Get Artifact failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
throw error
}
}
async deleteArtifact(
artifactName: string,
options?: FindOptions
): Promise<DeleteArtifactResponse> {
try {
if (isGhes()) {
throw new GHESNotSupportedError()
}
if (options?.findBy) {
const {
findBy: {repositoryOwner, repositoryName, workflowRunId, token}
} = options
return deleteArtifactPublic(
artifactName,
workflowRunId,
repositoryOwner,
repositoryName,
token
)
}
return deleteArtifactInternal(artifactName)
} catch (error) {
warning(
`Delete Artifact failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
throw error
}
}
}
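A sketch of the replacement v2 surface defined above; the destructured response fields and the `path` download option come from the shared interfaces this file imports, which are not shown in this diff, so treat them as assumptions:
const artifact = new DefaultArtifactClient()
const {id, size} = await artifact.uploadArtifact(
  'my-artifact',
  ['dist/output.txt'],
  'dist'
)
// v2 downloads address artifacts by numeric id rather than by name
if (id !== undefined) {
  await artifact.downloadArtifact(id, {path: '/tmp/my-artifact'})
}
console.log(`uploaded ${size} bytes as artifact ${id}`)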


@@ -1,67 +0,0 @@
// The number of concurrent uploads that happens at the same time
export function getUploadFileConcurrency(): number {
return 2
}
// When uploading large files that can't be uploaded with a single http call, this controls
// the chunk size that is used during upload
export function getUploadChunkSize(): number {
return 8 * 1024 * 1024 // 8 MB Chunks
}
// The maximum number of retries that can be attempted before an upload or download fails
export function getRetryLimit(): number {
return 5
}
// With exponential backoff, the larger the retry count, the longer the wait before another attempt.
// The retry multiplier controls how much the backoff time increases with each retry.
export function getRetryMultiplier(): number {
return 1.5
}
// The initial wait time if an upload or download fails and a retry is being attempted for the first time
export function getInitialRetryIntervalInMilliseconds(): number {
return 3000
}
// The number of concurrent downloads that happens at the same time
export function getDownloadFileConcurrency(): number {
return 2
}
export function getRuntimeToken(): string {
const token = process.env['ACTIONS_RUNTIME_TOKEN']
if (!token) {
throw new Error('Unable to get ACTIONS_RUNTIME_TOKEN env variable')
}
return token
}
export function getRuntimeUrl(): string {
const runtimeUrl = process.env['ACTIONS_RUNTIME_URL']
if (!runtimeUrl) {
throw new Error('Unable to get ACTIONS_RUNTIME_URL env variable')
}
return runtimeUrl
}
export function getWorkFlowRunId(): string {
const workFlowRunId = process.env['GITHUB_RUN_ID']
if (!workFlowRunId) {
throw new Error('Unable to get GITHUB_RUN_ID env variable')
}
return workFlowRunId
}
export function getWorkSpaceDirectory(): string {
const workspaceDirectory = process.env['GITHUB_WORKSPACE']
if (!workspaceDirectory) {
throw new Error('Unable to get GITHUB_WORKSPACE env variable')
}
return workspaceDirectory
}
export function getRetentionDays(): string | undefined {
return process.env['GITHUB_RETENTION_DAYS']
}
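The retry knobs above feed an exponential backoff implemented elsewhere in the package (`getExponentialRetryTimeInMilliseconds` in utils, imported by the download client further down); a sketch of the arithmetic they imply, which may differ from the real implementation (e.g. added jitter):
// With the defaults above: 3000ms, 4500ms, 6750ms, ... for retryCount 1, 2, 3, ...
function backoffMs(retryCount: number): number {
  return (
    getInitialRetryIntervalInMilliseconds() *
    Math.pow(getRetryMultiplier(), retryCount - 1)
  )
}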


@@ -1,76 +0,0 @@
export interface ArtifactResponse {
containerId: string
size: number
signedContent: string
fileContainerResourceUrl: string
type: string
name: string
url: string
}
export interface CreateArtifactParameters {
Type: string
Name: string
RetentionDays?: number
}
export interface PatchArtifactSize {
Size: number
}
export interface PatchArtifactSizeSuccessResponse {
containerId: number
size: number
signedContent: string
type: string
name: string
url: string
uploadUrl: string
}
export interface UploadResults {
/**
* The size in bytes of data that was transferred during the upload process to the actions backend service. This takes into account possible
* gzip compression to reduce the amount of data that needs to be transferred
*/
uploadSize: number
/**
* The raw size of the files that were specified for upload
*/
totalSize: number
/**
* An array of files that failed to upload
*/
failedItems: string[]
}
export interface ListArtifactsResponse {
count: number
value: ArtifactResponse[]
}
export interface QueryArtifactResponse {
count: number
value: ContainerEntry[]
}
export interface ContainerEntry {
containerId: number
scopeIdentifier: string
path: string
itemType: string
status: string
fileLength?: number
fileEncoding?: number
fileType?: number
dateCreated: string
dateLastModified: string
createdBy: string
lastModifiedBy: string
itemLocation: string
contentLocation: string
fileId?: number
contentId: string
}


@@ -1,317 +0,0 @@
/**
* CRC64: cyclic redundancy check, 64-bits
*
 * To verify that artifacts are not corrupted over the wire, this redundancy check lets us
 * detect corruption during transmission. The implementation here is based on Go's hash/crc64 pkg,
* but without the slicing-by-8 optimization: https://cs.opensource.google/go/go/+/master:src/hash/crc64/crc64.go
*
* This implementation uses a pregenerated table based on 0x9A6C9329AC4BC9B5 as the polynomial, the same polynomial that
* is used for Azure Storage: https://github.com/Azure/azure-storage-net/blob/cbe605f9faa01bfc3003d75fc5a16b2eaccfe102/Lib/Common/Core/Util/Crc64.cs#L27
*/
// when transpile target is >= ES2020 (after dropping node 12) these can be changed to bigint literals - ts(2737)
const PREGEN_POLY_TABLE = [
BigInt('0x0000000000000000'),
BigInt('0x7F6EF0C830358979'),
BigInt('0xFEDDE190606B12F2'),
BigInt('0x81B31158505E9B8B'),
BigInt('0xC962E5739841B68F'),
BigInt('0xB60C15BBA8743FF6'),
BigInt('0x37BF04E3F82AA47D'),
BigInt('0x48D1F42BC81F2D04'),
BigInt('0xA61CECB46814FE75'),
BigInt('0xD9721C7C5821770C'),
BigInt('0x58C10D24087FEC87'),
BigInt('0x27AFFDEC384A65FE'),
BigInt('0x6F7E09C7F05548FA'),
BigInt('0x1010F90FC060C183'),
BigInt('0x91A3E857903E5A08'),
BigInt('0xEECD189FA00BD371'),
BigInt('0x78E0FF3B88BE6F81'),
BigInt('0x078E0FF3B88BE6F8'),
BigInt('0x863D1EABE8D57D73'),
BigInt('0xF953EE63D8E0F40A'),
BigInt('0xB1821A4810FFD90E'),
BigInt('0xCEECEA8020CA5077'),
BigInt('0x4F5FFBD87094CBFC'),
BigInt('0x30310B1040A14285'),
BigInt('0xDEFC138FE0AA91F4'),
BigInt('0xA192E347D09F188D'),
BigInt('0x2021F21F80C18306'),
BigInt('0x5F4F02D7B0F40A7F'),
BigInt('0x179EF6FC78EB277B'),
BigInt('0x68F0063448DEAE02'),
BigInt('0xE943176C18803589'),
BigInt('0x962DE7A428B5BCF0'),
BigInt('0xF1C1FE77117CDF02'),
BigInt('0x8EAF0EBF2149567B'),
BigInt('0x0F1C1FE77117CDF0'),
BigInt('0x7072EF2F41224489'),
BigInt('0x38A31B04893D698D'),
BigInt('0x47CDEBCCB908E0F4'),
BigInt('0xC67EFA94E9567B7F'),
BigInt('0xB9100A5CD963F206'),
BigInt('0x57DD12C379682177'),
BigInt('0x28B3E20B495DA80E'),
BigInt('0xA900F35319033385'),
BigInt('0xD66E039B2936BAFC'),
BigInt('0x9EBFF7B0E12997F8'),
BigInt('0xE1D10778D11C1E81'),
BigInt('0x606216208142850A'),
BigInt('0x1F0CE6E8B1770C73'),
BigInt('0x8921014C99C2B083'),
BigInt('0xF64FF184A9F739FA'),
BigInt('0x77FCE0DCF9A9A271'),
BigInt('0x08921014C99C2B08'),
BigInt('0x4043E43F0183060C'),
BigInt('0x3F2D14F731B68F75'),
BigInt('0xBE9E05AF61E814FE'),
BigInt('0xC1F0F56751DD9D87'),
BigInt('0x2F3DEDF8F1D64EF6'),
BigInt('0x50531D30C1E3C78F'),
BigInt('0xD1E00C6891BD5C04'),
BigInt('0xAE8EFCA0A188D57D'),
BigInt('0xE65F088B6997F879'),
BigInt('0x9931F84359A27100'),
BigInt('0x1882E91B09FCEA8B'),
BigInt('0x67EC19D339C963F2'),
BigInt('0xD75ADABD7A6E2D6F'),
BigInt('0xA8342A754A5BA416'),
BigInt('0x29873B2D1A053F9D'),
BigInt('0x56E9CBE52A30B6E4'),
BigInt('0x1E383FCEE22F9BE0'),
BigInt('0x6156CF06D21A1299'),
BigInt('0xE0E5DE5E82448912'),
BigInt('0x9F8B2E96B271006B'),
BigInt('0x71463609127AD31A'),
BigInt('0x0E28C6C1224F5A63'),
BigInt('0x8F9BD7997211C1E8'),
BigInt('0xF0F5275142244891'),
BigInt('0xB824D37A8A3B6595'),
BigInt('0xC74A23B2BA0EECEC'),
BigInt('0x46F932EAEA507767'),
BigInt('0x3997C222DA65FE1E'),
BigInt('0xAFBA2586F2D042EE'),
BigInt('0xD0D4D54EC2E5CB97'),
BigInt('0x5167C41692BB501C'),
BigInt('0x2E0934DEA28ED965'),
BigInt('0x66D8C0F56A91F461'),
BigInt('0x19B6303D5AA47D18'),
BigInt('0x980521650AFAE693'),
BigInt('0xE76BD1AD3ACF6FEA'),
BigInt('0x09A6C9329AC4BC9B'),
BigInt('0x76C839FAAAF135E2'),
BigInt('0xF77B28A2FAAFAE69'),
BigInt('0x8815D86ACA9A2710'),
BigInt('0xC0C42C4102850A14'),
BigInt('0xBFAADC8932B0836D'),
BigInt('0x3E19CDD162EE18E6'),
BigInt('0x41773D1952DB919F'),
BigInt('0x269B24CA6B12F26D'),
BigInt('0x59F5D4025B277B14'),
BigInt('0xD846C55A0B79E09F'),
BigInt('0xA72835923B4C69E6'),
BigInt('0xEFF9C1B9F35344E2'),
BigInt('0x90973171C366CD9B'),
BigInt('0x1124202993385610'),
BigInt('0x6E4AD0E1A30DDF69'),
BigInt('0x8087C87E03060C18'),
BigInt('0xFFE938B633338561'),
BigInt('0x7E5A29EE636D1EEA'),
BigInt('0x0134D92653589793'),
BigInt('0x49E52D0D9B47BA97'),
BigInt('0x368BDDC5AB7233EE'),
BigInt('0xB738CC9DFB2CA865'),
BigInt('0xC8563C55CB19211C'),
BigInt('0x5E7BDBF1E3AC9DEC'),
BigInt('0x21152B39D3991495'),
BigInt('0xA0A63A6183C78F1E'),
BigInt('0xDFC8CAA9B3F20667'),
BigInt('0x97193E827BED2B63'),
BigInt('0xE877CE4A4BD8A21A'),
BigInt('0x69C4DF121B863991'),
BigInt('0x16AA2FDA2BB3B0E8'),
BigInt('0xF86737458BB86399'),
BigInt('0x8709C78DBB8DEAE0'),
BigInt('0x06BAD6D5EBD3716B'),
BigInt('0x79D4261DDBE6F812'),
BigInt('0x3105D23613F9D516'),
BigInt('0x4E6B22FE23CC5C6F'),
BigInt('0xCFD833A67392C7E4'),
BigInt('0xB0B6C36E43A74E9D'),
BigInt('0x9A6C9329AC4BC9B5'),
BigInt('0xE50263E19C7E40CC'),
BigInt('0x64B172B9CC20DB47'),
BigInt('0x1BDF8271FC15523E'),
BigInt('0x530E765A340A7F3A'),
BigInt('0x2C608692043FF643'),
BigInt('0xADD397CA54616DC8'),
BigInt('0xD2BD67026454E4B1'),
BigInt('0x3C707F9DC45F37C0'),
BigInt('0x431E8F55F46ABEB9'),
BigInt('0xC2AD9E0DA4342532'),
BigInt('0xBDC36EC59401AC4B'),
BigInt('0xF5129AEE5C1E814F'),
BigInt('0x8A7C6A266C2B0836'),
BigInt('0x0BCF7B7E3C7593BD'),
BigInt('0x74A18BB60C401AC4'),
BigInt('0xE28C6C1224F5A634'),
BigInt('0x9DE29CDA14C02F4D'),
BigInt('0x1C518D82449EB4C6'),
BigInt('0x633F7D4A74AB3DBF'),
BigInt('0x2BEE8961BCB410BB'),
BigInt('0x548079A98C8199C2'),
BigInt('0xD53368F1DCDF0249'),
BigInt('0xAA5D9839ECEA8B30'),
BigInt('0x449080A64CE15841'),
BigInt('0x3BFE706E7CD4D138'),
BigInt('0xBA4D61362C8A4AB3'),
BigInt('0xC52391FE1CBFC3CA'),
BigInt('0x8DF265D5D4A0EECE'),
BigInt('0xF29C951DE49567B7'),
BigInt('0x732F8445B4CBFC3C'),
BigInt('0x0C41748D84FE7545'),
BigInt('0x6BAD6D5EBD3716B7'),
BigInt('0x14C39D968D029FCE'),
BigInt('0x95708CCEDD5C0445'),
BigInt('0xEA1E7C06ED698D3C'),
BigInt('0xA2CF882D2576A038'),
BigInt('0xDDA178E515432941'),
BigInt('0x5C1269BD451DB2CA'),
BigInt('0x237C997575283BB3'),
BigInt('0xCDB181EAD523E8C2'),
BigInt('0xB2DF7122E51661BB'),
BigInt('0x336C607AB548FA30'),
BigInt('0x4C0290B2857D7349'),
BigInt('0x04D364994D625E4D'),
BigInt('0x7BBD94517D57D734'),
BigInt('0xFA0E85092D094CBF'),
BigInt('0x856075C11D3CC5C6'),
BigInt('0x134D926535897936'),
BigInt('0x6C2362AD05BCF04F'),
BigInt('0xED9073F555E26BC4'),
BigInt('0x92FE833D65D7E2BD'),
BigInt('0xDA2F7716ADC8CFB9'),
BigInt('0xA54187DE9DFD46C0'),
BigInt('0x24F29686CDA3DD4B'),
BigInt('0x5B9C664EFD965432'),
BigInt('0xB5517ED15D9D8743'),
BigInt('0xCA3F8E196DA80E3A'),
BigInt('0x4B8C9F413DF695B1'),
BigInt('0x34E26F890DC31CC8'),
BigInt('0x7C339BA2C5DC31CC'),
BigInt('0x035D6B6AF5E9B8B5'),
BigInt('0x82EE7A32A5B7233E'),
BigInt('0xFD808AFA9582AA47'),
BigInt('0x4D364994D625E4DA'),
BigInt('0x3258B95CE6106DA3'),
BigInt('0xB3EBA804B64EF628'),
BigInt('0xCC8558CC867B7F51'),
BigInt('0x8454ACE74E645255'),
BigInt('0xFB3A5C2F7E51DB2C'),
BigInt('0x7A894D772E0F40A7'),
BigInt('0x05E7BDBF1E3AC9DE'),
BigInt('0xEB2AA520BE311AAF'),
BigInt('0x944455E88E0493D6'),
BigInt('0x15F744B0DE5A085D'),
BigInt('0x6A99B478EE6F8124'),
BigInt('0x224840532670AC20'),
BigInt('0x5D26B09B16452559'),
BigInt('0xDC95A1C3461BBED2'),
BigInt('0xA3FB510B762E37AB'),
BigInt('0x35D6B6AF5E9B8B5B'),
BigInt('0x4AB846676EAE0222'),
BigInt('0xCB0B573F3EF099A9'),
BigInt('0xB465A7F70EC510D0'),
BigInt('0xFCB453DCC6DA3DD4'),
BigInt('0x83DAA314F6EFB4AD'),
BigInt('0x0269B24CA6B12F26'),
BigInt('0x7D0742849684A65F'),
BigInt('0x93CA5A1B368F752E'),
BigInt('0xECA4AAD306BAFC57'),
BigInt('0x6D17BB8B56E467DC'),
BigInt('0x12794B4366D1EEA5'),
BigInt('0x5AA8BF68AECEC3A1'),
BigInt('0x25C64FA09EFB4AD8'),
BigInt('0xA4755EF8CEA5D153'),
BigInt('0xDB1BAE30FE90582A'),
BigInt('0xBCF7B7E3C7593BD8'),
BigInt('0xC399472BF76CB2A1'),
BigInt('0x422A5673A732292A'),
BigInt('0x3D44A6BB9707A053'),
BigInt('0x759552905F188D57'),
BigInt('0x0AFBA2586F2D042E'),
BigInt('0x8B48B3003F739FA5'),
BigInt('0xF42643C80F4616DC'),
BigInt('0x1AEB5B57AF4DC5AD'),
BigInt('0x6585AB9F9F784CD4'),
BigInt('0xE436BAC7CF26D75F'),
BigInt('0x9B584A0FFF135E26'),
BigInt('0xD389BE24370C7322'),
BigInt('0xACE74EEC0739FA5B'),
BigInt('0x2D545FB4576761D0'),
BigInt('0x523AAF7C6752E8A9'),
BigInt('0xC41748D84FE75459'),
BigInt('0xBB79B8107FD2DD20'),
BigInt('0x3ACAA9482F8C46AB'),
BigInt('0x45A459801FB9CFD2'),
BigInt('0x0D75ADABD7A6E2D6'),
BigInt('0x721B5D63E7936BAF'),
BigInt('0xF3A84C3BB7CDF024'),
BigInt('0x8CC6BCF387F8795D'),
BigInt('0x620BA46C27F3AA2C'),
BigInt('0x1D6554A417C62355'),
BigInt('0x9CD645FC4798B8DE'),
BigInt('0xE3B8B53477AD31A7'),
BigInt('0xAB69411FBFB21CA3'),
BigInt('0xD407B1D78F8795DA'),
BigInt('0x55B4A08FDFD90E51'),
BigInt('0x2ADA5047EFEC8728')
]
export type CRC64DigestEncoding = 'hex' | 'base64' | 'buffer'
class CRC64 {
private _crc: bigint
constructor() {
this._crc = BigInt(0)
}
update(data: Buffer | string): void {
const buffer = typeof data === 'string' ? Buffer.from(data) : data
let crc = CRC64.flip64Bits(this._crc)
for (const dataByte of buffer) {
const crcByte = Number(crc & BigInt(0xff))
crc = PREGEN_POLY_TABLE[crcByte ^ dataByte] ^ (crc >> BigInt(8))
}
this._crc = CRC64.flip64Bits(crc)
}
digest(encoding?: CRC64DigestEncoding): string | Buffer {
switch (encoding) {
case 'hex':
return this._crc.toString(16).toUpperCase()
case 'base64':
return this.toBuffer().toString('base64')
default:
return this.toBuffer()
}
}
private toBuffer(): Buffer {
return Buffer.from(
[0, 8, 16, 24, 32, 40, 48, 56].map(s =>
Number((this._crc >> BigInt(s)) & BigInt(0xff))
)
)
}
static flip64Bits(n: bigint): bigint {
return (BigInt(1) << BigInt(64)) - BigInt(1) - n
}
}
export default CRC64
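A short usage sketch of the class above:
const crc = new CRC64()
crc.update('hello world') // strings are converted to Buffers internally
crc.update(Buffer.from([0x21])) // updates are incremental
const hex = crc.digest('hex') // uppercase hex string
const b64 = crc.digest('base64') // base64 of the little-endian bytes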


@@ -0,0 +1,109 @@
import {info, debug} from '@actions/core'
import {getOctokit} from '@actions/github'
import {DeleteArtifactResponse} from '../shared/interfaces'
import {getUserAgentString} from '../shared/user-agent'
import {getRetryOptions} from '../find/retry-options'
import {defaults as defaultGitHubOptions} from '@actions/github/lib/utils'
import {requestLog} from '@octokit/plugin-request-log'
import {retry} from '@octokit/plugin-retry'
import {OctokitOptions} from '@octokit/core/dist-types/types'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {getBackendIdsFromToken} from '../shared/util'
import {
DeleteArtifactRequest,
ListArtifactsRequest,
StringValue
} from '../../generated'
import {getArtifactPublic} from '../find/get-artifact'
import {ArtifactNotFoundError, InvalidResponseError} from '../shared/errors'
export async function deleteArtifactPublic(
artifactName: string,
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string
): Promise<DeleteArtifactResponse> {
const [retryOpts, requestOpts] = getRetryOptions(defaultGitHubOptions)
const opts: OctokitOptions = {
log: undefined,
userAgent: getUserAgentString(),
previews: undefined,
retry: retryOpts,
request: requestOpts
}
const github = getOctokit(token, opts, retry, requestLog)
const getArtifactResp = await getArtifactPublic(
artifactName,
workflowRunId,
repositoryOwner,
repositoryName,
token
)
const deleteArtifactResp = await github.rest.actions.deleteArtifact({
owner: repositoryOwner,
repo: repositoryName,
artifact_id: getArtifactResp.artifact.id
})
if (deleteArtifactResp.status !== 204) {
throw new InvalidResponseError(
`Invalid response from GitHub API: ${deleteArtifactResp.status} (${deleteArtifactResp?.headers?.['x-github-request-id']})`
)
}
return {
id: getArtifactResp.artifact.id
}
}
export async function deleteArtifactInternal(
  artifactName: string
): Promise<DeleteArtifactResponse> {
const artifactClient = internalArtifactTwirpClient()
const {workflowRunBackendId, workflowJobRunBackendId} =
getBackendIdsFromToken()
const listReq: ListArtifactsRequest = {
workflowRunBackendId,
workflowJobRunBackendId,
nameFilter: StringValue.create({value: artifactName})
}
const listRes = await artifactClient.ListArtifacts(listReq)
if (listRes.artifacts.length === 0) {
throw new ArtifactNotFoundError(
`Artifact not found for name: ${artifactName}`
)
}
let artifact = listRes.artifacts[0]
if (listRes.artifacts.length > 1) {
artifact = listRes.artifacts.sort(
(a, b) => Number(b.databaseId) - Number(a.databaseId)
)[0]
debug(
`More than one artifact found for a single name, returning newest (id: ${artifact.databaseId})`
)
}
const req: DeleteArtifactRequest = {
workflowRunBackendId: artifact.workflowRunBackendId,
workflowJobRunBackendId: artifact.workflowJobRunBackendId,
name: artifact.name
}
const res = await artifactClient.DeleteArtifact(req)
info(`Artifact '${artifactName}' (ID: ${res.artifactId}) deleted`)
return {
id: Number(res.artifactId)
}
}
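
Illustrative sketch, not part of the diff: calling the public variant from outside the current run, assuming a module path of './delete-artifact' and a token with sufficient scope; all values are placeholders.

import {deleteArtifactPublic} from './delete-artifact' // path is an assumption

async function example(): Promise<void> {
  const {id} = await deleteArtifactPublic(
    'my-artifact',                     // artifactName (placeholder)
    1234567890,                        // workflowRunId (placeholder)
    'actions',                         // repositoryOwner (placeholder)
    'toolkit',                         // repositoryName (placeholder)
    process.env.GITHUB_TOKEN as string // token with actions:write scope
  )
  console.log(`Deleted artifact ${id}`)
}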

View File

@@ -1,362 +0,0 @@
import * as fs from 'fs'
import * as core from '@actions/core'
import * as zlib from 'zlib'
import {
getArtifactUrl,
getDownloadHeaders,
isSuccessStatusCode,
isRetryableStatusCode,
isThrottledStatusCode,
getExponentialRetryTimeInMilliseconds,
tryGetRetryAfterValueTimeInMilliseconds,
displayHttpDiagnostics,
getFileSize,
rmFile,
sleep
} from './utils'
import {URL} from 'url'
import {StatusReporter} from './status-reporter'
import {performance} from 'perf_hooks'
import {ListArtifactsResponse, QueryArtifactResponse} from './contracts'
import {HttpClientResponse} from '@actions/http-client'
import {HttpManager} from './http-manager'
import {DownloadItem} from './download-specification'
import {getDownloadFileConcurrency, getRetryLimit} from './config-variables'
import {IncomingHttpHeaders} from 'http'
import {retryHttpClientRequest} from './requestUtils'
export class DownloadHttpClient {
// http manager is used for concurrent connections when downloading multiple files at once
private downloadHttpManager: HttpManager
private statusReporter: StatusReporter
constructor() {
this.downloadHttpManager = new HttpManager(
getDownloadFileConcurrency(),
'@actions/artifact-download'
)
// downloads are usually significantly faster than uploads so display status information every second
this.statusReporter = new StatusReporter(1000)
}
/**
* Gets a list of all artifacts that are in a specific container
*/
async listArtifacts(): Promise<ListArtifactsResponse> {
const artifactUrl = getArtifactUrl()
// use the first client from the httpManager, `keep-alive` is not used so the connection will close immediately
const client = this.downloadHttpManager.getClient(0)
const headers = getDownloadHeaders('application/json')
const response = await retryHttpClientRequest('List Artifacts', async () =>
client.get(artifactUrl, headers)
)
const body: string = await response.readBody()
return JSON.parse(body)
}
/**
* Fetches a set of container items that describe the contents of an artifact
* @param artifactName the name of the artifact
* @param containerUrl the artifact container URL for the run
*/
async getContainerItems(
artifactName: string,
containerUrl: string
): Promise<QueryArtifactResponse> {
// the itemPath search parameter controls which containers will be returned
const resourceUrl = new URL(containerUrl)
resourceUrl.searchParams.append('itemPath', artifactName)
// use the first client from the httpManager, `keep-alive` is not used so the connection will close immediately
const client = this.downloadHttpManager.getClient(0)
const headers = getDownloadHeaders('application/json')
const response = await retryHttpClientRequest(
'Get Container Items',
async () => client.get(resourceUrl.toString(), headers)
)
const body: string = await response.readBody()
return JSON.parse(body)
}
/**
* Concurrently downloads all the files that are part of an artifact
* @param downloadItems information about what items to download and where to save them
*/
async downloadSingleArtifact(downloadItems: DownloadItem[]): Promise<void> {
const DOWNLOAD_CONCURRENCY = getDownloadFileConcurrency()
// limit the number of files downloaded at a single time
core.debug(`Download file concurrency is set to ${DOWNLOAD_CONCURRENCY}`)
const parallelDownloads = [...new Array(DOWNLOAD_CONCURRENCY).keys()]
let currentFile = 0
let downloadedFiles = 0
core.info(
`Total number of files that will be downloaded: ${downloadItems.length}`
)
this.statusReporter.setTotalNumberOfFilesToProcess(downloadItems.length)
this.statusReporter.start()
await Promise.all(
parallelDownloads.map(async index => {
while (currentFile < downloadItems.length) {
const currentFileToDownload = downloadItems[currentFile]
currentFile += 1
const startTime = performance.now()
await this.downloadIndividualFile(
index,
currentFileToDownload.sourceLocation,
currentFileToDownload.targetPath
)
if (core.isDebug()) {
core.debug(
`File: ${++downloadedFiles}/${downloadItems.length}. ${
currentFileToDownload.targetPath
} took ${(performance.now() - startTime).toFixed(
3
)} milliseconds to finish downloading`
)
}
this.statusReporter.incrementProcessedCount()
}
})
)
.catch(error => {
throw new Error(`Unable to download the artifact: ${error}`)
})
.finally(() => {
this.statusReporter.stop()
// safety dispose all connections
this.downloadHttpManager.disposeAndReplaceAllClients()
})
}
/**
* Downloads an individual file
* @param httpClientIndex the index of the http client that is used to make all of the calls
* @param artifactLocation origin location where a file will be downloaded from
* @param downloadPath destination location for the file being downloaded
*/
private async downloadIndividualFile(
httpClientIndex: number,
artifactLocation: string,
downloadPath: string
): Promise<void> {
let retryCount = 0
const retryLimit = getRetryLimit()
let destinationStream = fs.createWriteStream(downloadPath)
const headers = getDownloadHeaders('application/json', true, true)
// a single GET request is used to download a file
const makeDownloadRequest = async (): Promise<HttpClientResponse> => {
const client = this.downloadHttpManager.getClient(httpClientIndex)
return await client.get(artifactLocation, headers)
}
// check the response headers to determine if the file was compressed using gzip
const isGzip = (incomingHeaders: IncomingHttpHeaders): boolean => {
return (
'content-encoding' in incomingHeaders &&
incomingHeaders['content-encoding'] === 'gzip'
)
}
// Increments the current retry count and then checks if the retry limit has been reached.
// If there have been too many retries, fail so the download stops. If a retryAfterValue is
// provided, it is used to wait instead of the exponential backoff
const backOff = async (retryAfterValue?: number): Promise<void> => {
retryCount++
if (retryCount > retryLimit) {
return Promise.reject(
new Error(
`Retry limit has been reached. Unable to download ${artifactLocation}`
)
)
} else {
this.downloadHttpManager.disposeAndReplaceClient(httpClientIndex)
if (retryAfterValue) {
// Back off by waiting the specified time denoted by the retry-after header
core.info(
`Backoff due to too many requests, retry #${retryCount}. Waiting for ${retryAfterValue} milliseconds before continuing the download`
)
await sleep(retryAfterValue)
} else {
// Back off using an exponential value that depends on the retry count
const backoffTime = getExponentialRetryTimeInMilliseconds(retryCount)
core.info(
`Exponential backoff for retry #${retryCount}. Waiting for ${backoffTime} milliseconds before continuing the download`
)
await sleep(backoffTime)
}
core.info(
`Finished backoff for retry #${retryCount}, continuing with download`
)
}
}
const isAllBytesReceived = (
expected?: string,
received?: number
): boolean => {
// be lenient, if any input is missing, assume success, i.e. not truncated
if (
!expected ||
!received ||
process.env['ACTIONS_ARTIFACT_SKIP_DOWNLOAD_VALIDATION']
) {
core.info('Skipping download validation.')
return true
}
return parseInt(expected) === received
}
const resetDestinationStream = async (
fileDownloadPath: string
): Promise<void> => {
destinationStream.close()
// wait for the destination stream to close; on Node 15 and up, fs.createWriteStream may not have created the file yet
await new Promise<void>(resolve => {
destinationStream.on('close', resolve)
if (destinationStream.writableFinished) {
resolve()
}
})
await rmFile(fileDownloadPath)
destinationStream = fs.createWriteStream(fileDownloadPath)
}
// keep trying to download a file until a retry limit has been reached
while (retryCount <= retryLimit) {
let response: HttpClientResponse
try {
response = await makeDownloadRequest()
} catch (error) {
// if an error is caught, it is usually indicative of a timeout so retry the download
core.info('An error occurred while attempting to download a file')
// eslint-disable-next-line no-console
console.log(error)
// increment the retryCount and use exponential backoff to wait before making the next request
await backOff()
continue
}
let forceRetry = false
if (isSuccessStatusCode(response.message.statusCode)) {
// The body contains the contents of the file however calling response.readBody() causes all the content to be converted to a string
// which can cause some gzip encoded data to be lost
// Instead of using response.readBody(), response.message is a readableStream that can be directly used to get the raw body contents
try {
const isGzipped = isGzip(response.message.headers)
await this.pipeResponseToFile(response, destinationStream, isGzipped)
if (
isGzipped ||
isAllBytesReceived(
response.message.headers['content-length'],
await getFileSize(downloadPath)
)
) {
return
} else {
forceRetry = true
}
} catch (error) {
// retry on error, most likely streams were corrupted
forceRetry = true
}
}
if (forceRetry || isRetryableStatusCode(response.message.statusCode)) {
core.info(
`A ${response.message.statusCode} response code has been received while attempting to download an artifact`
)
await resetDestinationStream(downloadPath)
// if a throttled status code is received, try to get the retryAfter header value, else defer to standard exponential backoff
isThrottledStatusCode(response.message.statusCode)
? await backOff(
tryGetRetryAfterValueTimeInMilliseconds(response.message.headers)
)
: await backOff()
} else {
// Some unexpected response code, fail immediately and stop the download
displayHttpDiagnostics(response)
return Promise.reject(
new Error(
`Unexpected http ${response.message.statusCode} during download for ${artifactLocation}`
)
)
}
}
}
/**
* Pipes the response from downloading an individual file to the appropriate destination stream while decoding gzip content if necessary
* @param response the http response received when downloading a file
* @param destinationStream the stream where the file should be written to
* @param isGzip a boolean denoting if the content is compressed using gzip and if we need to decode it
*/
async pipeResponseToFile(
response: HttpClientResponse,
destinationStream: fs.WriteStream,
isGzip: boolean
): Promise<void> {
await new Promise<void>((resolve, reject) => {
if (isGzip) {
const gunzip = zlib.createGunzip()
response.message
.on('error', error => {
core.info(
`An error occurred while attempting to read the response stream`
)
gunzip.close()
destinationStream.close()
reject(error)
})
.pipe(gunzip)
.on('error', error => {
core.info(
`An error occurred while attempting to decompress the response stream`
)
destinationStream.close()
reject(error)
})
.pipe(destinationStream)
.on('close', () => {
resolve()
})
.on('error', error => {
core.info(
`An error occurred while writing a downloaded file to ${destinationStream.path}`
)
reject(error)
})
} else {
response.message
.on('error', error => {
core.info(
`An error occurred while attempting to read the response stream`
)
destinationStream.close()
reject(error)
})
.pipe(destinationStream)
.on('close', () => {
resolve()
})
.on('error', error => {
core.info(
`An error occurred while writing a downloaded file to ${destinationStream.path}`
)
reject(error)
})
}
})
return
}
}

View File

@@ -1,7 +0,0 @@
export interface DownloadOptions {
/**
* Specifies if a folder is created for the artifact that is downloaded (contents downloaded into this folder),
* defaults to false if not specified
* */
createArtifactFolder?: boolean
}

View File

@@ -1,11 +0,0 @@
export interface DownloadResponse {
/**
* The name of the artifact that was downloaded
*/
artifactName: string
/**
* The full path to where the artifact was downloaded
*/
downloadPath: string
}

View File

@@ -1,87 +0,0 @@
import * as path from 'path'
import {ContainerEntry} from './contracts'
export interface DownloadSpecification {
// root download location for the artifact
rootDownloadLocation: string
// directories that need to be created for all the items in the artifact
directoryStructure: string[]
// empty files that are part of the artifact that don't require any downloading
emptyFilesToCreate: string[]
// individual files that need to be downloaded as part of the artifact
filesToDownload: DownloadItem[]
}
export interface DownloadItem {
// Url that denotes where to download the item from
sourceLocation: string
// Information about where the file should be downloaded to
targetPath: string
}
/**
* Creates a specification for a set of files that will be downloaded
* @param artifactName the name of the artifact
* @param artifactEntries a set of container entries that describe the files that make up an artifact
* @param downloadPath the path where the artifact will be downloaded to
* @param includeRootDirectory specifies if there should be an extra directory (denoted by the artifact name) where the artifact files should be downloaded to
*/
export function getDownloadSpecification(
artifactName: string,
artifactEntries: ContainerEntry[],
downloadPath: string,
includeRootDirectory: boolean
): DownloadSpecification {
// use a set for the directory paths so that there are no duplicates
const directories = new Set<string>()
const specifications: DownloadSpecification = {
rootDownloadLocation: includeRootDirectory
? path.join(downloadPath, artifactName)
: downloadPath,
directoryStructure: [],
emptyFilesToCreate: [],
filesToDownload: []
}
for (const entry of artifactEntries) {
// Ignore artifacts in the container that don't begin with the same name
if (
entry.path.startsWith(`${artifactName}/`) ||
entry.path.startsWith(`${artifactName}\\`)
) {
// normalize all separators to the local OS
const normalizedPathEntry = path.normalize(entry.path)
// entry.path always starts with the artifact name, if includeRootDirectory is false, remove the name from the beginning of the path
const filePath = path.join(
downloadPath,
includeRootDirectory
? normalizedPathEntry
: normalizedPathEntry.replace(artifactName, '')
)
// The backend maintains a case-insensitive folder structure, and not every folder is created,
// so the 'folder' itemType cannot be relied upon. The files must be used to determine the directory structure
if (entry.itemType === 'file') {
// Get the directories that we need to create from the filePath for each individual file
directories.add(path.dirname(filePath))
if (entry.fileLength === 0) {
// An empty file was uploaded, create the empty files locally so that no extra http calls are made
specifications.emptyFilesToCreate.push(filePath)
} else {
specifications.filesToDownload.push({
sourceLocation: entry.contentLocation,
targetPath: filePath
})
}
}
}
}
specifications.directoryStructure = Array.from(directories)
return specifications
}
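
Illustrative sketch, not part of the diff: building a specification from two container entries; the entry shapes are assumed from how ContainerEntry is used above, and paths are placeholders.

import {getDownloadSpecification} from './download-specification' // path assumed
import {ContainerEntry} from './contracts'

const entries = [
  {
    path: 'my-artifact/dir/file.txt',
    itemType: 'file',
    fileLength: 42,
    contentLocation: 'https://example.com/file.txt'
  },
  {
    path: 'my-artifact/dir/empty.txt',
    itemType: 'file',
    fileLength: 0,
    contentLocation: 'https://example.com/empty.txt'
  }
] as ContainerEntry[]

const spec = getDownloadSpecification(
  'my-artifact',
  entries,
  '/tmp/download',
  false // no extra root folder named after the artifact
)
// spec.filesToDownload[0].targetPath === '/tmp/download/dir/file.txt'
// spec.emptyFilesToCreate[0]        === '/tmp/download/dir/empty.txt'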

View File

@@ -0,0 +1,254 @@
import fs from 'fs/promises'
import * as crypto from 'crypto'
import * as stream from 'stream'
import * as github from '@actions/github'
import * as core from '@actions/core'
import * as httpClient from '@actions/http-client'
import unzip from 'unzip-stream'
import {
DownloadArtifactOptions,
DownloadArtifactResponse,
StreamExtractResponse
} from '../shared/interfaces'
import {getUserAgentString} from '../shared/user-agent'
import {getGitHubWorkspaceDir} from '../shared/config'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {
GetSignedArtifactURLRequest,
Int64Value,
ListArtifactsRequest
} from '../../generated'
import {getBackendIdsFromToken} from '../shared/util'
import {ArtifactNotFoundError} from '../shared/errors'
const scrubQueryParameters = (url: string): string => {
const parsed = new URL(url)
parsed.search = ''
return parsed.toString()
}
async function exists(path: string): Promise<boolean> {
try {
await fs.access(path)
return true
} catch (error) {
if (error.code === 'ENOENT') {
return false
} else {
throw error
}
}
}
async function streamExtract(
url: string,
directory: string
): Promise<StreamExtractResponse> {
let retryCount = 0
while (retryCount < 5) {
try {
return await streamExtractExternal(url, directory)
} catch (error) {
retryCount++
core.debug(
`Failed to download artifact after ${retryCount} retries due to ${error.message}. Retrying in 5 seconds...`
)
// wait 5 seconds before retrying
await new Promise(resolve => setTimeout(resolve, 5000))
}
}
throw new Error(`Artifact download failed after ${retryCount} retries.`)
}
export async function streamExtractExternal(
url: string,
directory: string
): Promise<StreamExtractResponse> {
const client = new httpClient.HttpClient(getUserAgentString())
const response = await client.get(url)
if (response.message.statusCode !== 200) {
throw new Error(
`Unexpected HTTP response from blob storage: ${response.message.statusCode} ${response.message.statusMessage}`
)
}
const timeout = 30 * 1000 // 30 seconds
let sha256Digest: string | undefined = undefined
return new Promise((resolve, reject) => {
const timerFn = (): void => {
response.message.destroy(
new Error(`Blob storage chunk did not respond in ${timeout}ms`)
)
}
const timer = setTimeout(timerFn, timeout)
const hashStream = crypto.createHash('sha256').setEncoding('hex')
const passThrough = new stream.PassThrough()
response.message.pipe(passThrough)
passThrough.pipe(hashStream)
const extractStream = passThrough
extractStream
.on('data', () => {
timer.refresh()
})
.on('error', (error: Error) => {
core.debug(
`response.message: Artifact download failed: ${error.message}`
)
clearTimeout(timer)
reject(error)
})
.pipe(unzip.Extract({path: directory}))
.on('close', () => {
clearTimeout(timer)
if (hashStream) {
hashStream.end()
sha256Digest = hashStream.read() as string
core.info(`SHA256 digest of downloaded artifact is ${sha256Digest}`)
}
resolve({sha256Digest: `sha256:${sha256Digest}`})
})
.on('error', (error: Error) => {
reject(error)
})
})
}
export async function downloadArtifactPublic(
artifactId: number,
repositoryOwner: string,
repositoryName: string,
token: string,
options?: DownloadArtifactOptions
): Promise<DownloadArtifactResponse> {
const downloadPath = await resolveOrCreateDirectory(options?.path)
const api = github.getOctokit(token)
let digestMismatch = false
core.info(
`Downloading artifact '${artifactId}' from '${repositoryOwner}/${repositoryName}'`
)
const {headers, status} = await api.rest.actions.downloadArtifact({
owner: repositoryOwner,
repo: repositoryName,
artifact_id: artifactId,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
if (status !== 302) {
throw new Error(`Unable to download artifact. Unexpected status: ${status}`)
}
const {location} = headers
if (!location) {
throw new Error(`Unable to redirect to artifact download url`)
}
core.info(
`Redirecting to blob download url: ${scrubQueryParameters(location)}`
)
try {
core.info(`Starting download of artifact to: ${downloadPath}`)
const extractResponse = await streamExtract(location, downloadPath)
core.info(`Artifact download completed successfully.`)
if (options?.expectedHash) {
if (options?.expectedHash !== extractResponse.sha256Digest) {
digestMismatch = true
core.debug(`Computed digest: ${extractResponse.sha256Digest}`)
core.debug(`Expected digest: ${options.expectedHash}`)
}
}
} catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`)
}
return {downloadPath, digestMismatch}
}
export async function downloadArtifactInternal(
artifactId: number,
options?: DownloadArtifactOptions
): Promise<DownloadArtifactResponse> {
const downloadPath = await resolveOrCreateDirectory(options?.path)
const artifactClient = internalArtifactTwirpClient()
let digestMismatch = false
const {workflowRunBackendId, workflowJobRunBackendId} =
getBackendIdsFromToken()
const listReq: ListArtifactsRequest = {
workflowRunBackendId,
workflowJobRunBackendId,
idFilter: Int64Value.create({value: artifactId.toString()})
}
const {artifacts} = await artifactClient.ListArtifacts(listReq)
if (artifacts.length === 0) {
throw new ArtifactNotFoundError(
`No artifacts found for ID: ${artifactId}\nAre you trying to download from a different run? Try specifying a github-token with \`actions:read\` scope.`
)
}
if (artifacts.length > 1) {
core.warning('Multiple artifacts found, defaulting to first.')
}
const signedReq: GetSignedArtifactURLRequest = {
workflowRunBackendId: artifacts[0].workflowRunBackendId,
workflowJobRunBackendId: artifacts[0].workflowJobRunBackendId,
name: artifacts[0].name
}
const {signedUrl} = await artifactClient.GetSignedArtifactURL(signedReq)
core.info(
`Redirecting to blob download url: ${scrubQueryParameters(signedUrl)}`
)
try {
core.info(`Starting download of artifact to: ${downloadPath}`)
const extractResponse = await streamExtract(signedUrl, downloadPath)
core.info(`Artifact download completed successfully.`)
if (options?.expectedHash) {
if (options?.expectedHash !== extractResponse.sha256Digest) {
digestMismatch = true
core.debug(`Computed digest: ${extractResponse.sha256Digest}`)
core.debug(`Expected digest: ${options.expectedHash}`)
}
}
} catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`)
}
return {downloadPath, digestMismatch}
}
async function resolveOrCreateDirectory(
downloadPath = getGitHubWorkspaceDir()
): Promise<string> {
if (!(await exists(downloadPath))) {
core.debug(
`Artifact destination folder does not exist, creating: ${downloadPath}`
)
await fs.mkdir(downloadPath, {recursive: true})
} else {
core.debug(`Artifact destination folder already exists: ${downloadPath}`)
}
return downloadPath
}
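
Illustrative sketch, not part of the diff: downloading an artifact from a public repository into a chosen directory and checking the digest; the module path, token, IDs, and hash below are all placeholders.

import {downloadArtifactPublic} from './download-artifact' // path assumed

async function example(): Promise<void> {
  const {downloadPath, digestMismatch} = await downloadArtifactPublic(
    123456,                             // artifactId (placeholder)
    'actions',                          // repositoryOwner (placeholder)
    'toolkit',                          // repositoryName (placeholder)
    process.env.GITHUB_TOKEN as string, // token with actions:read scope
    {
      path: '/tmp/artifact', // defaults to GITHUB_WORKSPACE when omitted
      expectedHash:
        'sha256:0000000000000000000000000000000000000000000000000000000000000000' // placeholder digest
    }
  )
  console.log(`Downloaded to ${downloadPath}; digest mismatch: ${digestMismatch}`)
}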

View File

@@ -0,0 +1,125 @@
import {getOctokit} from '@actions/github'
import {retry} from '@octokit/plugin-retry'
import * as core from '@actions/core'
import {OctokitOptions} from '@octokit/core/dist-types/types'
import {defaults as defaultGitHubOptions} from '@actions/github/lib/utils'
import {getRetryOptions} from './retry-options'
import {requestLog} from '@octokit/plugin-request-log'
import {GetArtifactResponse} from '../shared/interfaces'
import {getBackendIdsFromToken} from '../shared/util'
import {getUserAgentString} from '../shared/user-agent'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {ListArtifactsRequest, StringValue, Timestamp} from '../../generated'
import {ArtifactNotFoundError, InvalidResponseError} from '../shared/errors'
export async function getArtifactPublic(
artifactName: string,
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string
): Promise<GetArtifactResponse> {
const [retryOpts, requestOpts] = getRetryOptions(defaultGitHubOptions)
const opts: OctokitOptions = {
log: undefined,
userAgent: getUserAgentString(),
previews: undefined,
retry: retryOpts,
request: requestOpts
}
const github = getOctokit(token, opts, retry, requestLog)
const getArtifactResp = await github.request(
'GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts{?name}',
{
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
name: artifactName
}
)
if (getArtifactResp.status !== 200) {
throw new InvalidResponseError(
`Invalid response from GitHub API: ${getArtifactResp.status} (${getArtifactResp?.headers?.['x-github-request-id']})`
)
}
if (getArtifactResp.data.artifacts.length === 0) {
throw new ArtifactNotFoundError(
`Artifact not found for name: ${artifactName}
Please ensure that your artifact is not expired and the artifact was uploaded using a compatible version of toolkit/upload-artifact.
For more information, visit the GitHub Artifacts FAQ: https://github.com/actions/toolkit/blob/main/packages/artifact/docs/faq.md`
)
}
let artifact = getArtifactResp.data.artifacts[0]
if (getArtifactResp.data.artifacts.length > 1) {
artifact = getArtifactResp.data.artifacts.sort((a, b) => b.id - a.id)[0]
core.debug(
`More than one artifact found for a single name, returning newest (id: ${artifact.id})`
)
}
return {
artifact: {
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
}
}
}
export async function getArtifactInternal(
artifactName: string
): Promise<GetArtifactResponse> {
const artifactClient = internalArtifactTwirpClient()
const {workflowRunBackendId, workflowJobRunBackendId} =
getBackendIdsFromToken()
const req: ListArtifactsRequest = {
workflowRunBackendId,
workflowJobRunBackendId,
nameFilter: StringValue.create({value: artifactName})
}
const res = await artifactClient.ListArtifacts(req)
if (res.artifacts.length === 0) {
throw new ArtifactNotFoundError(
`Artifact not found for name: ${artifactName}
Please ensure that your artifact is not expired and the artifact was uploaded using a compatible version of toolkit/upload-artifact.
For more information, visit the GitHub Artifacts FAQ: https://github.com/actions/toolkit/blob/main/packages/artifact/docs/faq.md`
)
}
let artifact = res.artifacts[0]
if (res.artifacts.length > 1) {
artifact = res.artifacts.sort(
(a, b) => Number(b.databaseId) - Number(a.databaseId)
)[0]
core.debug(
`More than one artifact found for a single name, returning newest (id: ${artifact.databaseId})`
)
}
return {
artifact: {
name: artifact.name,
id: Number(artifact.databaseId),
size: Number(artifact.size),
createdAt: artifact.createdAt
? Timestamp.toDate(artifact.createdAt)
: undefined,
digest: artifact.digest?.value
}
}
}

View File

@@ -0,0 +1,187 @@
import {info, warning, debug} from '@actions/core'
import {getOctokit} from '@actions/github'
import {ListArtifactsResponse, Artifact} from '../shared/interfaces'
import {getUserAgentString} from '../shared/user-agent'
import {getRetryOptions} from './retry-options'
import {defaults as defaultGitHubOptions} from '@actions/github/lib/utils'
import {requestLog} from '@octokit/plugin-request-log'
import {retry} from '@octokit/plugin-retry'
import {OctokitOptions} from '@octokit/core/dist-types/types'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {getBackendIdsFromToken} from '../shared/util'
import {ListArtifactsRequest, Timestamp} from '../../generated'
// Limiting to 1000 for perf reasons
const maximumArtifactCount = 1000
const paginationCount = 100
const maxNumberOfPages = maximumArtifactCount / paginationCount
export async function listArtifactsPublic(
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string,
latest = false
): Promise<ListArtifactsResponse> {
info(
`Fetching artifact list for workflow run ${workflowRunId} in repository ${repositoryOwner}/${repositoryName}`
)
let artifacts: Artifact[] = []
const [retryOpts, requestOpts] = getRetryOptions(defaultGitHubOptions)
const opts: OctokitOptions = {
log: undefined,
userAgent: getUserAgentString(),
previews: undefined,
retry: retryOpts,
request: requestOpts
}
const github = getOctokit(token, opts, retry, requestLog)
let currentPageNumber = 1
const {data: listArtifactResponse} = await github.request(
'GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts',
{
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
per_page: paginationCount,
page: currentPageNumber
}
)
let numberOfPages = Math.ceil(
listArtifactResponse.total_count / paginationCount
)
const totalArtifactCount = listArtifactResponse.total_count
if (totalArtifactCount > maximumArtifactCount) {
warning(
`Workflow run ${workflowRunId} has more than 1000 artifacts. Results will be incomplete as only the first ${maximumArtifactCount} artifacts will be returned`
)
numberOfPages = maxNumberOfPages
}
// Iterate over the first page
for (const artifact of listArtifactResponse.artifacts) {
artifacts.push({
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: (artifact as ArtifactResponse).digest
})
}
// Move to the next page
currentPageNumber++
// Iterate over any remaining pages
for (
currentPageNumber;
currentPageNumber <= numberOfPages; // inclusive, so the final page is also fetched
currentPageNumber++
) {
debug(`Fetching page ${currentPageNumber} of artifact list`)
const {data: listArtifactResponse} = await github.request(
'GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts',
{
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
per_page: paginationCount,
page: currentPageNumber
}
)
for (const artifact of listArtifactResponse.artifacts) {
artifacts.push({
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: (artifact as ArtifactResponse).digest
})
}
}
if (latest) {
artifacts = filterLatest(artifacts)
}
info(`Found ${artifacts.length} artifact(s)`)
return {
artifacts
}
}
export async function listArtifactsInternal(
latest = false
): Promise<ListArtifactsResponse> {
const artifactClient = internalArtifactTwirpClient()
const {workflowRunBackendId, workflowJobRunBackendId} =
getBackendIdsFromToken()
const req: ListArtifactsRequest = {
workflowRunBackendId,
workflowJobRunBackendId
}
const res = await artifactClient.ListArtifacts(req)
let artifacts: Artifact[] = res.artifacts.map(artifact => ({
name: artifact.name,
id: Number(artifact.databaseId),
size: Number(artifact.size),
createdAt: artifact.createdAt
? Timestamp.toDate(artifact.createdAt)
: undefined,
digest: artifact.digest?.value
}))
if (latest) {
artifacts = filterLatest(artifacts)
}
info(`Found ${artifacts.length} artifact(s)`)
return {
artifacts
}
}
/**
* This exists so that we don't have to use 'any' when receiving the artifact list from the GitHub API.
* The digest field is not present in OpenAPI/types at time of writing, which necessitates this change.
*/
interface ArtifactResponse {
name: string
id: number
size_in_bytes: number
created_at?: string
digest?: string
}
/**
* Filters a list of artifacts to only include the latest artifact for each name
* @param artifacts The artifacts to filter
* @returns The filtered list of artifacts
*/
function filterLatest(artifacts: Artifact[]): Artifact[] {
artifacts.sort((a, b) => b.id - a.id)
const latestArtifacts: Artifact[] = []
const seenArtifactNames = new Set<string>()
for (const artifact of artifacts) {
if (!seenArtifactNames.has(artifact.name)) {
latestArtifacts.push(artifact)
seenArtifactNames.add(artifact.name)
}
}
return latestArtifacts
}
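
Illustrative sketch, not part of the diff: filterLatest is module-private, but within this module the behavior is that sorting by id descending and keeping the first occurrence per name yields the newest artifact of each name.

const latest = filterLatest([
  {name: 'logs', id: 2, size: 10},
  {name: 'logs', id: 5, size: 12},
  {name: 'dist', id: 3, size: 99}
])
// latest -> [{name: 'logs', id: 5, size: 12}, {name: 'dist', id: 3, size: 99}]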

View File

@@ -0,0 +1,48 @@
import * as core from '@actions/core'
import {OctokitOptions} from '@octokit/core/dist-types/types'
import {RequestRequestOptions} from '@octokit/types'
export type RetryOptions = {
doNotRetry?: number[]
enabled?: boolean
}
// Defaults for fetching artifacts
const defaultMaxRetryNumber = 5
const defaultExemptStatusCodes = [400, 401, 403, 404, 422] // https://github.com/octokit/plugin-retry.js/blob/9a2443746c350b3beedec35cf26e197ea318a261/src/index.ts#L14
export function getRetryOptions(
defaultOptions: OctokitOptions,
retries: number = defaultMaxRetryNumber,
exemptStatusCodes: number[] = defaultExemptStatusCodes
): [RetryOptions, RequestRequestOptions | undefined] {
if (retries <= 0) {
return [{enabled: false}, defaultOptions.request]
}
const retryOptions: RetryOptions = {
enabled: true
}
if (exemptStatusCodes.length > 0) {
retryOptions.doNotRetry = exemptStatusCodes
}
// The GitHub type has some defaults for `options.request`
// see: https://github.com/actions/toolkit/blob/4fbc5c941a57249b19562015edbd72add14be93d/packages/github/src/utils.ts#L15
// We pass these in here so they are not overridden.
const requestOptions: RequestRequestOptions = {
...defaultOptions.request,
retries
}
core.debug(
`GitHub client configured with: (retries: ${
requestOptions.retries
}, retry-exempt-status-code: ${
retryOptions.doNotRetry ?? 'octokit default: [400, 401, 403, 404, 422]'
})`
)
return [retryOptions, requestOptions]
}
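
Illustrative sketch, not part of the diff: the two paths through getRetryOptions, using the imports already present in this module.

const [disabled] = getRetryOptions(defaultGitHubOptions, 0)
// disabled -> {enabled: false}; the default request options pass through unchanged

const [retryOpts, requestOpts] = getRetryOptions(defaultGitHubOptions)
// retryOpts   -> {enabled: true, doNotRetry: [400, 401, 403, 404, 422]}
// requestOpts -> the GitHub defaults plus {retries: 5}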

View File

@@ -1,35 +0,0 @@
import {HttpClient} from '@actions/http-client'
import {createHttpClient} from './utils'
/**
* Used for managing http clients during either upload or download
*/
export class HttpManager {
private clients: HttpClient[]
private userAgent: string
constructor(clientCount: number, userAgent: string) {
if (clientCount < 1) {
throw new Error('There must be at least one client')
}
this.userAgent = userAgent
// create a distinct client per slot so the slots do not share a single connection
this.clients = Array.from({length: clientCount}, () => createHttpClient(userAgent))
}
getClient(index: number): HttpClient {
return this.clients[index]
}
// client disposal is necessary if a keep-alive connection is used to properly close the connection
// for more information see: https://github.com/actions/http-client/blob/04e5ad73cd3fd1f5610a32116b0759eddf6570d2/index.ts#L292
disposeAndReplaceClient(index: number): void {
this.clients[index].dispose()
this.clients[index] = createHttpClient(this.userAgent)
}
disposeAndReplaceAllClients(): void {
for (const [index] of this.clients.entries()) {
this.disposeAndReplaceClient(index)
}
}
}
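
Illustrative sketch, not part of the diff: one client per concurrent slot, recycled when its keep-alive connection should be closed (module path is an assumption).

import {HttpManager} from './http-manager' // path assumed

const manager = new HttpManager(4, '@actions/artifact-download')
const client = manager.getClient(0)
// ...issue requests with client.get(...)...
manager.disposeAndReplaceClient(0)    // close one keep-alive connection, recycle the slot
manager.disposeAndReplaceAllClients() // or recycle every slot at once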

View File

@@ -1,79 +0,0 @@
import {HttpClientResponse} from '@actions/http-client'
import {
isRetryableStatusCode,
isSuccessStatusCode,
sleep,
getExponentialRetryTimeInMilliseconds,
displayHttpDiagnostics
} from './utils'
import * as core from '@actions/core'
import {getRetryLimit} from './config-variables'
export async function retry(
name: string,
operation: () => Promise<HttpClientResponse>,
customErrorMessages: Map<number, string>,
maxAttempts: number
): Promise<HttpClientResponse> {
let response: HttpClientResponse | undefined = undefined
let statusCode: number | undefined = undefined
let isRetryable = false
let errorMessage = ''
let customErrorInformation: string | undefined = undefined
let attempt = 1
while (attempt <= maxAttempts) {
try {
response = await operation()
statusCode = response.message.statusCode
if (isSuccessStatusCode(statusCode)) {
return response
}
// Extra error information that we want to display if a particular response code is hit
if (statusCode) {
customErrorInformation = customErrorMessages.get(statusCode)
}
isRetryable = isRetryableStatusCode(statusCode)
errorMessage = `Artifact service responded with ${statusCode}`
} catch (error) {
isRetryable = true
errorMessage = error.message
}
if (!isRetryable) {
core.info(`${name} - Error is not retryable`)
if (response) {
displayHttpDiagnostics(response)
}
break
}
core.info(
`${name} - Attempt ${attempt} of ${maxAttempts} failed with error: ${errorMessage}`
)
await sleep(getExponentialRetryTimeInMilliseconds(attempt))
attempt++
}
if (response) {
displayHttpDiagnostics(response)
}
if (customErrorInformation) {
throw Error(`${name} failed: ${customErrorInformation}`)
}
throw Error(`${name} failed: ${errorMessage}`)
}
export async function retryHttpClientRequest(
name: string,
method: () => Promise<HttpClientResponse>,
customErrorMessages: Map<number, string> = new Map(),
maxAttempts = getRetryLimit()
): Promise<HttpClientResponse> {
return await retry(name, method, customErrorMessages, maxAttempts)
}
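
Illustrative sketch, not part of the diff: retrying a GET with a custom message for 403 responses; `client`, `artifactUrl`, and `headers` are placeholders for an @actions/http-client instance and its request inputs.

const customErrorMessages = new Map<number, string>([
  [403, 'Permission denied: check that the token has the required scopes'] // placeholder message
])
const response = await retryHttpClientRequest(
  'List Artifacts',
  async () => client.get(artifactUrl, headers),
  customErrorMessages // used if a 403 ends the retry loop
)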

View File

@@ -0,0 +1,198 @@
import {HttpClient, HttpClientResponse, HttpCodes} from '@actions/http-client'
import {BearerCredentialHandler} from '@actions/http-client/lib/auth'
import {info, debug} from '@actions/core'
import {ArtifactServiceClientJSON} from '../../generated'
import {getResultsServiceUrl, getRuntimeToken} from './config'
import {getUserAgentString} from './user-agent'
import {NetworkError, UsageError} from './errors'
import {maskSecretUrls} from './util'
// The twirp http client must implement this interface
interface Rpc {
request(
service: string,
method: string,
contentType: 'application/json' | 'application/protobuf',
data: object | Uint8Array
): Promise<object | Uint8Array>
}
class ArtifactHttpClient implements Rpc {
private httpClient: HttpClient
private baseUrl: string
private maxAttempts = 5
private baseRetryIntervalMilliseconds = 3000
private retryMultiplier = 1.5
constructor(
userAgent: string,
maxAttempts?: number,
baseRetryIntervalMilliseconds?: number,
retryMultiplier?: number
) {
const token = getRuntimeToken()
this.baseUrl = getResultsServiceUrl()
if (maxAttempts) {
this.maxAttempts = maxAttempts
}
if (baseRetryIntervalMilliseconds) {
this.baseRetryIntervalMilliseconds = baseRetryIntervalMilliseconds
}
if (retryMultiplier) {
this.retryMultiplier = retryMultiplier
}
this.httpClient = new HttpClient(userAgent, [
new BearerCredentialHandler(token)
])
}
// This function satisfies the Rpc interface. It is compatible with the
// JSON generated client.
async request(
service: string,
method: string,
contentType: 'application/json' | 'application/protobuf',
data: object | Uint8Array
): Promise<object | Uint8Array> {
const url = new URL(`/twirp/${service}/${method}`, this.baseUrl).href
debug(`[Request] ${method} ${url}`)
const headers = {
'Content-Type': contentType
}
try {
const {body} = await this.retryableRequest(async () =>
this.httpClient.post(url, JSON.stringify(data), headers)
)
return body
} catch (error) {
throw new Error(`Failed to ${method}: ${error.message}`)
}
}
async retryableRequest(
operation: () => Promise<HttpClientResponse>
): Promise<{response: HttpClientResponse; body: object}> {
let attempt = 0
let errorMessage = ''
let rawBody = ''
while (attempt < this.maxAttempts) {
let isRetryable = false
try {
const response = await operation()
const statusCode = response.message.statusCode
rawBody = await response.readBody()
debug(`[Response] - ${response.message.statusCode}`)
debug(`Headers: ${JSON.stringify(response.message.headers, null, 2)}`)
const body = JSON.parse(rawBody)
maskSecretUrls(body)
debug(`Body: ${JSON.stringify(body, null, 2)}`)
if (this.isSuccessStatusCode(statusCode)) {
return {response, body}
}
isRetryable = this.isRetryableHttpStatusCode(statusCode)
errorMessage = `Failed request: (${statusCode}) ${response.message.statusMessage}`
if (body.msg) {
if (UsageError.isUsageErrorMessage(body.msg)) {
throw new UsageError()
}
errorMessage = `${errorMessage}: ${body.msg}`
}
} catch (error) {
if (error instanceof SyntaxError) {
debug(`Raw Body: ${rawBody}`)
}
if (error instanceof UsageError) {
throw error
}
if (NetworkError.isNetworkErrorCode(error?.code)) {
throw new NetworkError(error?.code)
}
isRetryable = true
errorMessage = error.message
}
if (!isRetryable) {
throw new Error(`Received non-retryable error: ${errorMessage}`)
}
if (attempt + 1 === this.maxAttempts) {
throw new Error(
`Failed to make request after ${this.maxAttempts} attempts: ${errorMessage}`
)
}
const retryTimeMilliseconds =
this.getExponentialRetryTimeMilliseconds(attempt)
info(
`Attempt ${attempt + 1} of ${
this.maxAttempts
} failed with error: ${errorMessage}. Retrying request in ${retryTimeMilliseconds} ms...`
)
await this.sleep(retryTimeMilliseconds)
attempt++
}
throw new Error(`Request failed`)
}
isSuccessStatusCode(statusCode?: number): boolean {
if (!statusCode) return false
return statusCode >= 200 && statusCode < 300
}
isRetryableHttpStatusCode(statusCode?: number): boolean {
if (!statusCode) return false
const retryableStatusCodes = [
HttpCodes.BadGateway,
HttpCodes.GatewayTimeout,
HttpCodes.InternalServerError,
HttpCodes.ServiceUnavailable,
HttpCodes.TooManyRequests
]
return retryableStatusCodes.includes(statusCode)
}
async sleep(milliseconds: number): Promise<void> {
return new Promise(resolve => setTimeout(resolve, milliseconds))
}
getExponentialRetryTimeMilliseconds(attempt: number): number {
if (attempt < 0) {
throw new Error('attempt must be a non-negative integer')
}
if (attempt === 0) {
return this.baseRetryIntervalMilliseconds
}
const minTime =
this.baseRetryIntervalMilliseconds * this.retryMultiplier ** attempt
const maxTime = minTime * this.retryMultiplier
// returns a random number between minTime and maxTime (exclusive)
return Math.trunc(Math.random() * (maxTime - minTime) + minTime)
}
}
export function internalArtifactTwirpClient(options?: {
maxAttempts?: number
retryIntervalMs?: number
retryMultiplier?: number
}): ArtifactServiceClientJSON {
const client = new ArtifactHttpClient(
getUserAgentString(),
options?.maxAttempts,
options?.retryIntervalMs,
options?.retryMultiplier
)
return new ArtifactServiceClientJSON(client)
}
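
Illustrative sketch, not part of the diff: with the defaults above (base 3000 ms, multiplier 1.5), attempt 0 waits 3000 ms and attempt 1 waits a random value in [4500, 6750) ms. A tighter policy can be passed in; the backend IDs below come from getBackendIdsFromToken and the request shape matches ListArtifactsRequest as used elsewhere in this diff.

const artifactClient = internalArtifactTwirpClient({
  maxAttempts: 3,        // down from the default of 5
  retryIntervalMs: 1000, // base interval
  retryMultiplier: 2     // steeper exponential growth
})
const {workflowRunBackendId, workflowJobRunBackendId} = getBackendIdsFromToken()
const {artifacts} = await artifactClient.ListArtifacts({
  workflowRunBackendId,
  workflowJobRunBackendId
})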

View File

@@ -0,0 +1,99 @@
import os from 'os'
import {info} from '@actions/core'
// Used for controlling the highWaterMark value of the zip that is being streamed
// The same value is used as the chunk size that is used during upload to blob storage
export function getUploadChunkSize(): number {
return 8 * 1024 * 1024 // 8 MB Chunks
}
export function getRuntimeToken(): string {
const token = process.env['ACTIONS_RUNTIME_TOKEN']
if (!token) {
throw new Error('Unable to get the ACTIONS_RUNTIME_TOKEN env variable')
}
return token
}
export function getResultsServiceUrl(): string {
const resultsUrl = process.env['ACTIONS_RESULTS_URL']
if (!resultsUrl) {
throw new Error('Unable to get the ACTIONS_RESULTS_URL env variable')
}
return new URL(resultsUrl).origin
}
export function isGhes(): boolean {
const ghUrl = new URL(
process.env['GITHUB_SERVER_URL'] || 'https://github.com'
)
const hostname = ghUrl.hostname.trimEnd().toUpperCase()
const isGitHubHost = hostname === 'GITHUB.COM'
const isGheHost = hostname.endsWith('.GHE.COM')
const isLocalHost = hostname.endsWith('.LOCALHOST')
return !isGitHubHost && !isGheHost && !isLocalHost
}
export function getGitHubWorkspaceDir(): string {
const ghWorkspaceDir = process.env['GITHUB_WORKSPACE']
if (!ghWorkspaceDir) {
throw new Error('Unable to get the GITHUB_WORKSPACE env variable')
}
return ghWorkspaceDir
}
// The maximum value of concurrency is 300.
// This value can be changed with ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY variable.
export function getConcurrency(): number {
const numCPUs = os.cpus().length
let concurrencyCap = 32
if (numCPUs > 4) {
const concurrency = 16 * numCPUs
concurrencyCap = concurrency > 300 ? 300 : concurrency
}
const concurrencyOverride = process.env['ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY']
if (concurrencyOverride) {
const concurrency = parseInt(concurrencyOverride)
if (isNaN(concurrency) || concurrency < 1) {
throw new Error(
'Invalid value set for ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY env variable'
)
}
if (concurrency < concurrencyCap) {
info(
`Set concurrency based on the value set in ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY.`
)
return concurrency
}
info(
`ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY is higher than the cap of ${concurrencyCap} based on the number of CPUs. Using the cap instead.`
)
return concurrencyCap
}
// default concurrency to 5
return 5
}
export function getUploadChunkTimeout(): number {
const timeoutVar = process.env['ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS']
if (!timeoutVar) {
return 300000 // 5 minutes
}
const timeout = parseInt(timeoutVar)
if (isNaN(timeout)) {
throw new Error(
'Invalid value set for ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS env variable'
)
}
return timeout
}
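
Illustrative sketch, not part of the diff: how the concurrency override interacts with the CPU-derived cap (16 * numCPUs, at most 300, or 32 on machines with 4 or fewer CPUs).

// With no override, uploads use the conservative default of 5:
delete process.env['ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY']
console.log(getConcurrency()) // 5

// An override below the cap is honored; above it, the cap wins:
process.env['ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY'] = '64'
console.log(getConcurrency()) // 64 on machines whose cap is >= 64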

View File

@@ -0,0 +1,72 @@
export class FilesNotFoundError extends Error {
files: string[]
constructor(files: string[] = []) {
let message = 'No files were found to upload'
if (files.length > 0) {
message += `: ${files.join(', ')}`
}
super(message)
this.files = files
this.name = 'FilesNotFoundError'
}
}
export class InvalidResponseError extends Error {
constructor(message: string) {
super(message)
this.name = 'InvalidResponseError'
}
}
export class ArtifactNotFoundError extends Error {
constructor(message = 'Artifact not found') {
super(message)
this.name = 'ArtifactNotFoundError'
}
}
export class GHESNotSupportedError extends Error {
constructor(
message = '@actions/artifact v2.0.0+, upload-artifact@v4+ and download-artifact@v4+ are not currently supported on GHES.'
) {
super(message)
this.name = 'GHESNotSupportedError'
}
}
export class NetworkError extends Error {
code: string
constructor(code: string) {
const message = `Unable to make request: ${code}\nIf you are using self-hosted runners, please make sure your runner has access to all GitHub endpoints: https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners#communication-between-self-hosted-runners-and-github`
super(message)
this.code = code
this.name = 'NetworkError'
}
static isNetworkErrorCode = (code?: string): boolean => {
if (!code) return false
return [
'ECONNRESET',
'ENOTFOUND',
'ETIMEDOUT',
'ECONNREFUSED',
'EHOSTUNREACH'
].includes(code)
}
}
export class UsageError extends Error {
constructor() {
const message = `Artifact storage quota has been hit. Unable to upload any new artifacts. Usage is recalculated every 6-12 hours.\nMore info on storage limits: https://docs.github.com/en/billing/managing-billing-for-github-actions/about-billing-for-github-actions#calculating-minute-and-storage-spending`
super(message)
this.name = 'UsageError'
}
static isUsageErrorMessage = (msg?: string): boolean => {
if (!msg) return false
return msg.includes('insufficient usage')
}
}
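
Illustrative sketch, not part of the diff: branching on the typed errors exported above; `uploadSomething` is a hypothetical call that may surface them.

async function handle(uploadSomething: () => Promise<void>): Promise<void> {
  try {
    await uploadSomething()
  } catch (error) {
    if (error instanceof UsageError) {
      // storage quota hit; retrying will not help
    } else if (error instanceof NetworkError) {
      console.error(`Network failure: ${error.code}`)
    } else if (error instanceof FilesNotFoundError) {
      console.error(`Nothing to upload: ${error.files.join(', ')}`)
    } else {
      throw error
    }
  }
}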

View File

@@ -0,0 +1,188 @@
/**
* Response from the server when an artifact is uploaded
*/
export interface UploadArtifactResponse {
/**
* Total size of the artifact in bytes. Not provided if no artifact was uploaded
*/
size?: number
/**
* The id of the artifact that was created. Not provided if no artifact was uploaded
* This ID can be used as input to other APIs to download, delete or get more information about an artifact: https://docs.github.com/en/rest/actions/artifacts
*/
id?: number
/**
* The SHA256 digest of the artifact that was created. Not provided if no artifact was uploaded
*/
digest?: string
}
/**
* Options for uploading an artifact
*/
export interface UploadArtifactOptions {
/**
* Number of days after which the artifact will expire.
*
* By default an artifact expires after 90 days:
* https://docs.github.com/en/actions/configuring-and-managing-workflows/persisting-workflow-data-using-artifacts#downloading-and-deleting-artifacts-after-a-workflow-run-is-complete
*
* Use this option to override the default expiry.
*
* Min value: 1
* Max value: 90 unless changed by repository setting
*
* If this is set to a value greater than the retention setting allows, the retention on artifacts
* will be reduced to match the maximum allowed on the server, and the upload process will continue. An
* input of 0 assumes the default retention setting.
*/
retentionDays?: number
/**
* The level of compression for Zlib to be applied to the artifact archive.
* The value can range from 0 to 9:
* - 0: No compression
* - 1: Best speed
* - 6: Default compression (same as GNU Gzip)
* - 9: Best compression
* Higher levels will result in better compression, but will take longer to complete.
* For large files that are not easily compressed, a value of 0 is recommended for significantly faster uploads.
*/
compressionLevel?: number
}
/**
* Response from the server when getting an artifact
*/
export interface GetArtifactResponse {
/**
* Metadata about the artifact that was found
*/
artifact: Artifact
}
/**
* Options for listing artifacts
*/
export interface ListArtifactsOptions {
/**
* Filter the workflow run's artifacts to the latest by name
* In the case of reruns, this can be useful to avoid duplicates
*/
latest?: boolean
}
/**
* Response from the server when listing artifacts
*/
export interface ListArtifactsResponse {
/**
* A list of artifacts that were found
*/
artifacts: Artifact[]
}
/**
* Response from the server when downloading an artifact
*/
export interface DownloadArtifactResponse {
/**
* The path where the artifact was downloaded to
*/
downloadPath?: string
/**
* Returns true if the digest of the downloaded artifact does not match the expected hash
*/
digestMismatch?: boolean
}
/**
* Options for downloading an artifact
*/
export interface DownloadArtifactOptions {
/**
* Denotes where the artifact will be downloaded to. If not specified, the artifact is downloaded to GITHUB_WORKSPACE
*/
path?: string
/**
* The hash that was computed for the artifact during upload. If provided, the outcome of the download
* will provide a digestMismatch property indicating whether the hash of the downloaded artifact
* matches the expected hash.
*/
expectedHash?: string
}
export interface StreamExtractResponse {
/**
* The SHA256 hash of the downloaded file
*/
sha256Digest?: string
}
/**
* An Actions Artifact
*/
export interface Artifact {
/**
* The name of the artifact
*/
name: string
/**
* The ID of the artifact
*/
id: number
/**
* The size of the artifact in bytes
*/
size: number
/**
* The time when the artifact was created
*/
createdAt?: Date
/**
* The digest of the artifact, computed at time of upload.
*/
digest?: string
}
// FindOptions are for fetching Artifact(s) out of the scope of the current run.
export interface FindOptions {
/**
* The criteria for finding Artifact(s) out of the scope of the current run.
*/
findBy?: {
/**
* Token with actions:read permissions
*/
token: string
/**
* WorkflowRun of the artifact(s) to lookup
*/
workflowRunId: number
/**
* Repository owner (eg. 'actions')
*/
repositoryOwner: string
/**
* Repository name (eg. 'toolkit')
*/
repositoryName: string
}
}
/**
* Response from the server when deleting an artifact
*/
export interface DeleteArtifactResponse {
/**
* The id of the artifact that was deleted
*/
id: number
}
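
Illustrative sketch, not part of the diff: options objects shaped by the interfaces above; all values are placeholders.

const uploadOptions: UploadArtifactOptions = {
  retentionDays: 7,
  compressionLevel: 0 // fastest; sensible for already-compressed content
}

const findOptions: FindOptions = {
  findBy: {
    token: process.env.GITHUB_TOKEN as string, // needs actions:read
    workflowRunId: 1234567890,
    repositoryOwner: 'actions',
    repositoryName: 'toolkit'
  }
}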

View File

@@ -0,0 +1,9 @@
// eslint-disable-next-line @typescript-eslint/no-var-requires, @typescript-eslint/no-require-imports
const packageJson = require('../../../package.json')
/**
* Ensure that this User Agent String is used in all HTTP calls so that we can monitor telemetry between different versions of this package
*/
export function getUserAgentString(): string {
return `@actions/artifact-${packageJson.version}`
}

View File

@@ -0,0 +1,145 @@
import * as core from '@actions/core'
import {getRuntimeToken} from './config'
import jwt_decode from 'jwt-decode'
import {debug, setSecret} from '@actions/core'
export interface BackendIds {
workflowRunBackendId: string
workflowJobRunBackendId: string
}
interface ActionsToken {
scp: string
}
const InvalidJwtError = new Error(
'Failed to get backend IDs: The provided JWT token is invalid and/or missing claims'
)
// uses the JWT token claims to get the
// workflow run and workflow job run backend ids
export function getBackendIdsFromToken(): BackendIds {
const token = getRuntimeToken()
const decoded = jwt_decode<ActionsToken>(token)
if (!decoded.scp) {
throw InvalidJwtError
}
/*
* example decoded:
* {
* scp: "Actions.ExampleScope Actions.Results:ce7f54c7-61c7-4aae-887f-30da475f5f1a:ca395085-040a-526b-2ce8-bdc85f692774"
* }
*/
const scpParts = decoded.scp.split(' ')
if (scpParts.length === 0) {
throw InvalidJwtError
}
/*
* example scpParts:
* ["Actions.ExampleScope", "Actions.Results:ce7f54c7-61c7-4aae-887f-30da475f5f1a:ca395085-040a-526b-2ce8-bdc85f692774"]
*/
for (const scopes of scpParts) {
const scopeParts = scopes.split(':')
if (scopeParts?.[0] !== 'Actions.Results') {
// not the Actions.Results scope
continue
}
/*
* example scopeParts:
* ["Actions.Results", "ce7f54c7-61c7-4aae-887f-30da475f5f1a", "ca395085-040a-526b-2ce8-bdc85f692774"]
*/
if (scopeParts.length !== 3) {
// missing expected number of claims
throw InvalidJwtError
}
const ids = {
workflowRunBackendId: scopeParts[1],
workflowJobRunBackendId: scopeParts[2]
}
core.debug(`Workflow Run Backend ID: ${ids.workflowRunBackendId}`)
core.debug(`Workflow Job Run Backend ID: ${ids.workflowJobRunBackendId}`)
return ids
}
throw InvalidJwtError
}
/**
* Masks the `sig` parameter in a URL and sets it as a secret.
*
* @param url - The URL containing the signature parameter to mask
* @remarks
* This function attempts to parse the provided URL and identify the 'sig' query parameter.
* If found, it registers both the raw and URL-encoded signature values as secrets using
* the Actions `setSecret` API, which prevents them from being displayed in logs.
*
* The function handles errors gracefully if URL parsing fails, logging them as debug messages.
*
* @example
* ```typescript
* // Mask a signature in an Azure SAS token URL
* maskSigUrl('https://example.blob.core.windows.net/container/file.txt?sig=abc123&se=2023-01-01');
* ```
*/
export function maskSigUrl(url: string): void {
if (!url) return
try {
const parsedUrl = new URL(url)
const signature = parsedUrl.searchParams.get('sig')
if (signature) {
setSecret(signature)
setSecret(encodeURIComponent(signature))
}
} catch (error) {
debug(
`Failed to parse URL: ${url} ${
error instanceof Error ? error.message : String(error)
}`
)
}
}
/**
* Masks sensitive information in URLs containing signature parameters.
* Currently supports masking 'sig' parameters in the 'signed_upload_url'
* and 'signed_download_url' properties of the provided object.
*
* @param body - The object should contain a signature
* @remarks
* This function extracts URLs from the object properties and calls maskSigUrl
* on each one to redact sensitive signature information. The function doesn't
* modify the original object; it only marks the signatures as secrets for
* logging purposes.
*
* @example
* ```typescript
* const responseBody = {
* signed_upload_url: 'https://example.com?sig=abc123',
* signed_download_url: 'https://example.com?sig=def456'
* };
* maskSecretUrls(responseBody);
* ```
*/
export function maskSecretUrls(body: Record<string, unknown> | null): void {
if (typeof body !== 'object' || body === null) {
debug('body is not an object or is null')
return
}
if (
'signed_upload_url' in body &&
typeof body.signed_upload_url === 'string'
) {
maskSigUrl(body.signed_upload_url)
}
if ('signed_url' in body && typeof body.signed_url === 'string') {
maskSigUrl(body.signed_url)
}
}

Some files were not shown because too many files have changed in this diff.