Compare commits

..

28 Commits

Author SHA1 Message Date
Bethany 704facf57e
Merge pull request #1236 from actions/bethanyj28/bump-version
Bump action version to 3.3.2
2023-09-07 16:31:19 -04:00
bethanyj28 17e2888746 Add to RELEASES.md 2023-09-06 14:41:49 -04:00
bethanyj28 667d8fdfa2 bump action version to 3.3.2 2023-09-06 14:15:33 -04:00
Chad Kimes f7ebb81a3f
Consume latest toolkit and fix dangling promise bug (#1217)
* Consume latest toolkit and fix dangling promise bug

* Pass earlyExit parameter to run method so tests don't hang

* Pass earlyExit parameter to run method so tests don't hang

* Refactor restore files to have better patterns for testing

* style
2023-08-09 15:36:51 +01:00
Johanan Idicula 67b839edb6
Merge pull request #1187 from jorendorff/jorendorff/rm-add-to-project
Remove actions to add new PRs and issues to a project board
2023-06-12 15:14:21 -04:00
Jason Orendorff 57f0e3f198 Remove actions to add new PRs and issues to a project board
The project doesn't seem to exist, so this always fails.
2023-06-12 13:10:52 -05:00
Vipul 04f198bf0b
Merge pull request #1132 from vorburger/bazel-example
Bazel example (Take #2️⃣)
2023-03-21 12:15:41 +05:30
Vipul bd9b49b6c3
Merge branch 'main' into bazel-example 2023-03-21 12:10:48 +05:30
Lovepreet Singh ea0503788c
Merge pull request #1122 from actions/pdotl-patch-1
Update Cross-OS Caching tips
2023-03-17 13:09:54 +05:30
Lovepreet Singh 6a1a45d49b
Merge branch 'main' into pdotl-patch-1 2023-03-17 12:28:02 +05:30
Vipul 9c7b3e90bd
Merge pull request #1131 from actions/bishal-pdMSFT-patch-4
Change two new actions mention as quoted text
2023-03-13 19:07:22 +05:30
Michael Vorburger ⛑️ 8f2671f18e
Merge branch 'main' into bazel-example 2023-03-13 06:27:49 -07:00
Michael Vorburger ⛑️ 6f1f1e10f3
Clarify that macos-latest image has bazelisk 2023-03-13 14:26:31 +01:00
Bishal Prasad 5cb4bb86c0
Merge branch 'main' into bishal-pdMSFT-patch-4 2023-03-13 18:54:39 +05:30
Sankalp Kotewar 84995e0d91
Updated description of the lookup-only input for main action (#1130)
* Updated description of the lookup-only input for main action

* Update README.md

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update README.md

---------

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>
2023-03-13 16:43:13 +05:30
Bishal Prasad bf96a3f9d8
Merge branch 'main' into bishal-pdMSFT-patch-4 2023-03-13 15:41:54 +05:30
Michael Vorburger ⛑️ 4b8460cbff
Create separate Linux/macOS examples for Bazel 2023-03-13 10:52:18 +01:00
Sankalp Kotewar 57014a2baa
Readme fixes (#1134)
* Update README.md

* Update README.md
2023-03-13 12:02:23 +05:30
Sankalp Kotewar cb865c1889
Fixed readme with new segment timeout values (#1133) 2023-03-13 11:02:55 +05:30
Bishal Prasad 4e7c82221f
Merge branch 'main' into bishal-pdMSFT-patch-4 2023-03-13 11:01:00 +05:30
Sankalp Kotewar 88522ab9f3
Reduced download segment size to 128 MB and timeout to 10 minutes (#1129)
* Changed segment size to 128mb & timeout to 10 min

* Updated license

* Updated licenses
2023-03-13 10:32:46 +05:30
Michael Vorburger ef11f54eee Fix example for Bazel 2023-03-11 19:54:11 +01:00
David Bernard 4b381be638 Add example for Bazel 2023-03-11 19:21:37 +01:00
Bishal Prasad 7893481812
Change two new actions mention as quoted text 2023-03-11 21:32:05 +05:30
Marc Mueller 940f3d7cf1
Add `lookup-only` option (#1041)
* Add new actions/cache version (with dryRun support)

* Add dry-run option

* Changes after rebase

* Update readme

* Rename option to lookup-only

* Update test name

* Update package.json + changelog

* Update README

* Update custom package version

* Update custom package version

* Update @actions/cache to 3.2.0

* Code review

* Update log statement

* Move test case

---------

Co-authored-by: Sankalp Kotewar <98868223+kotewar@users.noreply.github.com>
2023-03-09 18:00:28 +05:30
Kotokaze e0d62270e2
docs: Add missing permission in cache delete example (#1123) 2023-02-27 23:40:04 +05:30
Lovepreet Singh 77eb7eb198
Update Cross-OS Caching tips 2023-02-23 11:59:31 +05:30
Lovepreet Singh 69d9d449ac
Merge pull request #1118 from actions/pdotl/zstd-hotfix
Fix zstd not being used after zstd version upgrade to 1.5.4 on hosted runners
2023-02-21 15:32:12 +05:30
38 changed files with 13991 additions and 17077 deletions

View File

@@ -14,9 +14,3 @@ jobs:
       - name: add_assignees
         run: |
           curl -X POST -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.GITHUB_TOKEN}}" https://api.github.com/repos/${{github.repository}}/issues/${{ github.event.issue.number}}/assignees -d '{"assignees":["${{steps.oncall.outputs.CURRENT}}"]}'
-      - uses: actions/add-to-project@v0.4.0
-        name: Add to Project Board
-        with:
-          project-url: https://github.com/orgs/actions/projects/12
-          github-token: ${{ secrets.CACHE_BOARD_TOKEN }}

View File

@@ -18,9 +18,3 @@ jobs:
       - name: Add Assignee
         run: |
           curl -X POST -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.GITHUB_TOKEN}}" https://api.github.com/repos/${{github.repository}}/issues/${{ github.event.pull_request.number}}/assignees -d '{"assignees":["${{steps.oncall.outputs.CURRENT}}"]}'
-      - uses: actions/add-to-project@v0.4.0
-        name: Add to Project Board
-        with:
-          project-url: https://github.com/orgs/actions/projects/12
-          github-token: ${{ secrets.CACHE_BOARD_TOKEN }}

Multiple binary files changed (contents not shown). Two new generated files appear in the listing:

BIN  .licenses/npm/@azure/core-util.dep.yml  (generated, new file)
BIN  .licenses/npm/tslib-2.5.0.dep.yml  (generated, new file)

View File

@@ -2,11 +2,9 @@
 This action allows caching dependencies and build outputs to improve workflow execution time.
-Two other actions are available in addition to the primary `cache` action:
-* [Restore action](./restore/README.md)
-* [Save action](./save/README.md)
+>Two other actions are available in addition to the primary `cache` action:
+>* [Restore action](./restore/README.md)
+>* [Save action](./save/README.md)
 [![Tests](https://github.com/actions/cache/actions/workflows/workflow.yml/badge.svg)](https://github.com/actions/cache/actions/workflows/workflow.yml)
@@ -27,11 +25,13 @@ See ["Caching dependencies to speed up workflows"](https://docs.github.com/en/ac
 * Fixed cache not working with github workspace directory or current directory.
 * Fixed the download stuck problem by introducing a timeout of 1 hour for cache downloads.
 * Fix zstd not working for windows on gnu tar in issues.
-* Allowing users to provide a custom timeout as input for aborting download of a cache segment using an environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. Default is 60 minutes.
+* Allowing users to provide a custom timeout as input for aborting download of a cache segment using an environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. Default is 10 minutes.
 * New actions are available for granular control over caches - [restore](restore/action.yml) and [save](save/action.yml).
 * Support cross-os caching as an opt-in feature. See [Cross OS caching](./tips-and-workarounds.md#cross-os-cache) for more info.
 * Added option to fail job on cache miss. See [Exit workflow on cache miss](./restore/README.md#exit-workflow-on-cache-miss) for more info.
 * Fix zstd not being used after zstd version upgrade to 1.5.4 on hosted runners
+* Added option to lookup cache without downloading it.
+* Reduced segment size to 128MB and segment timeout to 10 minutes to fail fast in case the cache download is stuck.
 See the [v2 README.md](https://github.com/actions/cache/blob/v2/README.md) for older updates.
@@ -52,10 +52,11 @@ If you are using a `self-hosted` Windows runner, `GNU tar` and `zstd` are requir
 * `restore-keys` - An ordered list of prefix-matched keys to use for restoring stale cache if no cache hit occurred for key.
 * `enableCrossOsArchive` - An optional boolean when enabled, allows Windows runners to save or restore caches that can be restored or saved respectively on other platforms. Default: `false`
 * `fail-on-cache-miss` - Fail the workflow if cache entry is not found. Default: `false`
+* `lookup-only` - If true, only checks if cache entry exists and skips download. Does not change save cache behavior. Default: `false`
 #### Environment Variables
-* `SEGMENT_DOWNLOAD_TIMEOUT_MINS` - Segment download timeout (in minutes, default `60`) to abort download of the segment if not completed in the defined number of minutes. [Read more](https://github.com/actions/cache/blob/main/tips-and-workarounds.md#cache-segment-restore-timeout)
+* `SEGMENT_DOWNLOAD_TIMEOUT_MINS` - Segment download timeout (in minutes, default `10`) to abort download of the segment if not completed in the defined number of minutes. [Read more](https://github.com/actions/cache/blob/main/tips-and-workarounds.md#cache-segment-restore-timeout)
 ### Outputs
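
Taken together, the README hunks above document the new `lookup-only` input and the `SEGMENT_DOWNLOAD_TIMEOUT_MINS` default dropping from 60 to 10 minutes. A minimal sketch of how a workflow might use both (the path, key, and values here are illustrative, not taken from this diff):

```yaml
- uses: actions/cache@v3
  env:
    # Optional override of the segment download timeout (new default: 10 minutes)
    SEGMENT_DOWNLOAD_TIMEOUT_MINS: '5'
  with:
    path: ~/.npm
    key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
    # Only check whether an entry exists for the key; skip the download
    lookup-only: 'true'
    fail-on-cache-miss: 'false'
```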

View File

@@ -1,78 +1,113 @@
 # Releases
 ### 3.0.0
 - Updated minimum runner version support from node 12 -> node 16
 ### 3.0.1
 - Added support for caching from GHES 3.5.
 - Fixed download issue for files > 2GB during restore.
 ### 3.0.2
 - Added support for dynamic cache size cap on GHES.
 ### 3.0.3
 - Fixed avoiding empty cache save when no files are available for caching. ([issue](https://github.com/actions/cache/issues/624))
 ### 3.0.4
 - Fixed tar creation error while trying to create tar with path as `~/` home folder on `ubuntu-latest`. ([issue](https://github.com/actions/cache/issues/689))
 ### 3.0.5
 - Removed error handling by consuming actions/cache 3.0 toolkit, Now cache server error handling will be done by toolkit. ([PR](https://github.com/actions/cache/pull/834))
 ### 3.0.6
 - Fixed [#809](https://github.com/actions/cache/issues/809) - zstd -d: no such file or directory error
 - Fixed [#833](https://github.com/actions/cache/issues/833) - cache doesn't work with github workspace directory
 ### 3.0.7
 - Fixed [#810](https://github.com/actions/cache/issues/810) - download stuck issue. A new timeout is introduced in the download process to abort the download if it gets stuck and doesn't finish within an hour.
 ### 3.0.8
 - Fix zstd not working for windows on gnu tar in issues [#888](https://github.com/actions/cache/issues/888) and [#891](https://github.com/actions/cache/issues/891).
 - Allowing users to provide a custom timeout as input for aborting download of a cache segment using an environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. Default is 60 minutes.
 ### 3.0.9
 - Enhanced the warning message for cache unavailablity in case of GHES.
 ### 3.0.10
 - Fix a bug with sorting inputs.
 - Update definition for restore-keys in README.md
 ### 3.0.11
 - Update toolkit version to 3.0.5 to include `@actions/core@^1.10.0`
 - Update `@actions/cache` to use updated `saveState` and `setOutput` functions from `@actions/core@^1.10.0`
 ### 3.1.0-beta.1
 - Update `@actions/cache` on windows to use gnu tar and zstd by default and fallback to bsdtar and zstd if gnu tar is not available. ([issue](https://github.com/actions/cache/issues/984))
 ### 3.1.0-beta.2
 - Added support for fallback to gzip to restore old caches on windows.
 ### 3.1.0-beta.3
 - Bug fixes for bsdtar fallback if gnutar not available and gzip fallback if cache saved using old cache action on windows.
 ### 3.2.0-beta.1
 - Added two new actions - [restore](restore/action.yml) and [save](save/action.yml) for granular control on cache.
 ### 3.2.0
 - Released the two new actions - [restore](restore/action.yml) and [save](save/action.yml) for granular control on cache
 ### 3.2.1
 - Update `@actions/cache` on windows to use gnu tar and zstd by default and fallback to bsdtar and zstd if gnu tar is not available. ([issue](https://github.com/actions/cache/issues/984))
 - Added support for fallback to gzip to restore old caches on windows.
 - Added logs for cache version in case of a cache miss.
 ### 3.2.2
 - Reverted the changes made in 3.2.1 to use gnu tar and zstd by default on windows.
 ### 3.2.3
 - Support cross os caching on Windows as an opt-in feature.
 - Fix issue with symlink restoration on Windows for cross-os caches.
 ### 3.2.4
 - Added option to fail job on cache miss.
 ### 3.2.5
 - Added fix to prevent from setting MYSYS environment variable globally.
 ### 3.2.6
 - Fix zstd not being used after zstd version upgrade to 1.5.4 on hosted runners.
+### 3.3.0
+- Added option to lookup cache without downloading it.
+### 3.3.1
+- Reduced segment size to 128MB and segment timeout to 10 minutes to fail fast in case the cache download is stuck.
+### 3.3.2
+- Fixes bug with Azure SDK causing blob downloads to get stuck.

View File

@@ -2,7 +2,7 @@ import * as cache from "@actions/cache";
 import * as core from "@actions/core";
 import { Events, RefKey } from "../src/constants";
-import run from "../src/restore";
+import { restoreRun } from "../src/restoreImpl";
 import * as actionUtils from "../src/utils/actionUtils";
 import * as testUtils from "../src/utils/testUtils";
@@ -71,10 +71,18 @@ test("restore with no cache found", async () => {
 return Promise.resolve(undefined);
 });
-await run();
+await restoreRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
 expect(stateMock).toHaveBeenCalledTimes(1);
@@ -106,14 +114,16 @@ test("restore with restore keys and no cache found", async () => {
 return Promise.resolve(undefined);
 });
-await run();
+await restoreRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );
@@ -146,10 +156,18 @@ test("restore with cache found for key", async () => {
 return Promise.resolve(key);
 });
-await run();
+await restoreRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
 expect(stateMock).toHaveBeenCalledWith("CACHE_RESULT", key);
@@ -183,14 +201,16 @@ test("restore with cache found for restore key", async () => {
 return Promise.resolve(restoreKey);
 });
-await run();
+await restoreRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );
@@ -226,14 +246,16 @@ test("Fail restore when fail on cache miss is enabled and primary + restore keys
 return Promise.resolve(undefined);
 });
-await run();
+await restoreRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );
@@ -267,14 +289,16 @@ test("restore when fail on cache miss is enabled and primary key doesn't match r
 return Promise.resolve(restoreKey);
 });
-await run();
+await restoreRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );
@@ -311,14 +335,16 @@ test("restore with fail on cache miss disabled and no cache found", async () =>
 return Promise.resolve(undefined);
 });
-await run();
+await restoreRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );

View File

@@ -2,7 +2,7 @@ import * as cache from "@actions/cache";
 import * as core from "@actions/core";
 import { Events, Inputs, RefKey } from "../src/constants";
-import run from "../src/restoreImpl";
+import { restoreImpl } from "../src/restoreImpl";
 import { StateProvider } from "../src/stateProvider";
 import * as actionUtils from "../src/utils/actionUtils";
 import * as testUtils from "../src/utils/testUtils";
@@ -60,7 +60,7 @@ test("restore with invalid event outputs warning", async () => {
 const invalidEvent = "commit_comment";
 process.env[Events.Key] = invalidEvent;
 delete process.env[RefKey];
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(logWarningMock).toHaveBeenCalledWith(
 `Event Validation Error: The event type ${invalidEvent} is not supported because it's not tied to a branch or tag ref.`
 );
@@ -76,7 +76,7 @@ test("restore without AC available should no-op", async () => {
 const restoreCacheMock = jest.spyOn(cache, "restoreCache");
 const setCacheHitOutputMock = jest.spyOn(core, "setOutput");
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(0);
 expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1);
@@ -92,7 +92,7 @@ test("restore on GHES without AC available should no-op", async () => {
 const restoreCacheMock = jest.spyOn(cache, "restoreCache");
 const setCacheHitOutputMock = jest.spyOn(core, "setOutput");
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(0);
 expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1);
@@ -119,10 +119,18 @@ test("restore on GHES with AC available ", async () => {
 return Promise.resolve(key);
 });
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
 expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1);
@@ -135,7 +143,7 @@ test("restore on GHES with AC available ", async () => {
 test("restore with no path should fail", async () => {
 const failedMock = jest.spyOn(core, "setFailed");
 const restoreCacheMock = jest.spyOn(cache, "restoreCache");
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(0);
 // this input isn't necessary for restore b/c tarball contains entries relative to workspace
 expect(failedMock).not.toHaveBeenCalledWith(
@@ -147,7 +155,7 @@ test("restore with no key", async () => {
 testUtils.setInput(Inputs.Path, "node_modules");
 const failedMock = jest.spyOn(core, "setFailed");
 const restoreCacheMock = jest.spyOn(cache, "restoreCache");
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(0);
 expect(failedMock).toHaveBeenCalledWith(
 "Input required and not supplied: key"
@@ -166,13 +174,15 @@ test("restore with too many keys should fail", async () => {
 });
 const failedMock = jest.spyOn(core, "setFailed");
 const restoreCacheMock = jest.spyOn(cache, "restoreCache");
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 restoreKeys,
-{},
+{
+lookupOnly: false
+},
 false
 );
 expect(failedMock).toHaveBeenCalledWith(
@@ -190,9 +200,17 @@ test("restore with large key should fail", async () => {
 });
 const failedMock = jest.spyOn(core, "setFailed");
 const restoreCacheMock = jest.spyOn(cache, "restoreCache");
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(failedMock).toHaveBeenCalledWith(
 `Key Validation Error: ${key} cannot be larger than 512 characters.`
 );
@@ -208,9 +226,17 @@ test("restore with invalid key should fail", async () => {
 });
 const failedMock = jest.spyOn(core, "setFailed");
 const restoreCacheMock = jest.spyOn(cache, "restoreCache");
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(failedMock).toHaveBeenCalledWith(
 `Key Validation Error: ${key} cannot contain commas.`
 );
@@ -234,10 +260,18 @@ test("restore with no cache found", async () => {
 return Promise.resolve(undefined);
 });
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
 expect(failedMock).toHaveBeenCalledTimes(0);
@@ -267,14 +301,16 @@ test("restore with restore keys and no cache found", async () => {
 return Promise.resolve(undefined);
 });
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );
@@ -305,10 +341,18 @@ test("restore with cache found for key", async () => {
 return Promise.resolve(key);
 });
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
 expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1);
@@ -339,14 +383,16 @@ test("restore with cache found for restore key", async () => {
 return Promise.resolve(restoreKey);
 });
-await run(new StateProvider());
+await restoreImpl(new StateProvider());
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );
@@ -358,3 +404,48 @@ test("restore with cache found for restore key", async () => {
 );
 expect(failedMock).toHaveBeenCalledTimes(0);
 });
+test("restore with lookup-only set", async () => {
+const path = "node_modules";
+const key = "node-test";
+testUtils.setInputs({
+path: path,
+key,
+lookupOnly: true
+});
+const infoMock = jest.spyOn(core, "info");
+const failedMock = jest.spyOn(core, "setFailed");
+const stateMock = jest.spyOn(core, "saveState");
+const setCacheHitOutputMock = jest.spyOn(core, "setOutput");
+const restoreCacheMock = jest
+.spyOn(cache, "restoreCache")
+.mockImplementationOnce(() => {
+return Promise.resolve(key);
+});
+await restoreImpl(new StateProvider());
+expect(restoreCacheMock).toHaveBeenCalledTimes(1);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: true
+},
+false
+);
+expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
+expect(stateMock).toHaveBeenCalledWith("CACHE_RESULT", key);
+expect(stateMock).toHaveBeenCalledTimes(2);
+expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1);
+expect(setCacheHitOutputMock).toHaveBeenCalledWith("cache-hit", "true");
+expect(infoMock).toHaveBeenCalledWith(
+`Cache found and can be restored from key: ${key}`
+);
+expect(failedMock).toHaveBeenCalledTimes(0);
+});

View File

@@ -2,7 +2,7 @@ import * as cache from "@actions/cache";
 import * as core from "@actions/core";
 import { Events, RefKey } from "../src/constants";
-import run from "../src/restoreOnly";
+import { restoreOnlyRun } from "../src/restoreImpl";
 import * as actionUtils from "../src/utils/actionUtils";
 import * as testUtils from "../src/utils/testUtils";
@@ -72,10 +72,18 @@ test("restore with no cache found", async () => {
 return Promise.resolve(undefined);
 });
-await run();
+await restoreOnlyRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key);
 expect(outputMock).toHaveBeenCalledTimes(1);
@@ -106,14 +114,16 @@ test("restore with restore keys and no cache found", async () => {
 return Promise.resolve(undefined);
 });
-await run();
+await restoreOnlyRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );
@@ -143,10 +153,18 @@ test("restore with cache found for key", async () => {
 return Promise.resolve(key);
 });
-await run();
+await restoreOnlyRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
-expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false);
+expect(restoreCacheMock).toHaveBeenCalledWith(
+[path],
+key,
+[],
+{
+lookupOnly: false
+},
+false
+);
 expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key);
 expect(outputMock).toHaveBeenCalledWith("cache-hit", "true");
@@ -178,14 +196,16 @@ test("restore with cache found for restore key", async () => {
 return Promise.resolve(restoreKey);
 });
-await run();
+await restoreOnlyRun();
 expect(restoreCacheMock).toHaveBeenCalledTimes(1);
 expect(restoreCacheMock).toHaveBeenCalledWith(
 [path],
 key,
 [restoreKey],
-{},
+{
+lookupOnly: false
+},
 false
 );

View File

@@ -22,6 +22,10 @@ inputs:
     description: 'Fail the workflow if cache entry is not found'
     default: 'false'
     required: false
+  lookup-only:
+    description: 'Check if a cache entry exists for the given input(s) (key, restore-keys) without downloading the cache'
+    default: 'false'
+    required: false
 outputs:
   cache-hit:
     description: 'A boolean value to indicate an exact match was found for the primary key'
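
Assuming the hunk above belongs to the `restore` action's metadata (the file name is not preserved in this extract), the new input could be consumed roughly as follows; the step id, key, and paths are placeholders:

```yaml
steps:
  - id: cache-check
    uses: actions/cache/restore@v3
    with:
      path: dist
      key: build-${{ hashFiles('src/**') }}
      # Existence check only; the cached archive is not downloaded
      lookup-only: 'true'
  - name: Rebuild only when no cache entry exists
    if: steps.cache-check.outputs.cache-hit != 'true'
    run: npm run build
```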

File diff suppressed because one or more lines are too long

7566
dist/restore/index.js vendored

File diff suppressed because one or more lines are too long

7498
dist/save-only/index.js vendored

File diff suppressed because one or more lines are too long

7462
dist/save/index.js vendored

File diff suppressed because one or more lines are too long

View File

@@ -39,6 +39,7 @@
 - [Swift, Objective-C - CocoaPods](#swift-objective-c---cocoapods)
 - [Swift - Swift Package Manager](#swift---swift-package-manager)
 - [Swift - Mint](#swift---mint)
+- [* - Bazel](#---bazel)
 ## C# - NuGet
@@ -657,3 +658,35 @@ steps:
     restore-keys: |
       ${{ runner.os }}-mint-
 ```
+## * - Bazel
+[`bazelisk`](https://github.com/bazelbuild/bazelisk) does not have to be separately downloaded and installed because it's already included in GitHub's `ubuntu-latest` and `macos-latest` base images.
+### Linux
+```yaml
+- name: Cache Bazel
+  uses: actions/cache@v3
+  with:
+    path: |
+      ~/.cache/bazel
+    key: ${{ runner.os }}-bazel-${{ hashFiles('.bazelversion', '.bazelrc', 'WORKSPACE', 'WORKSPACE.bazel', 'MODULE.bazel') }}
+    restore-keys: |
+      ${{ runner.os }}-bazel-
+- run: bazelisk test //...
+```
+### macOS
+```yaml
+- name: Cache Bazel
+  uses: actions/cache@v3
+  with:
+    path: |
+      /private/var/tmp/_bazel_runner/
+    key: ${{ runner.os }}-bazel-${{ hashFiles('.bazelversion', '.bazelrc', 'WORKSPACE', 'WORKSPACE.bazel', 'MODULE.bazel') }}
+    restore-keys: |
+      ${{ runner.os }}-bazel-
+- run: bazelisk test //...
+```

304
package-lock.json generated
View File

@ -1,15 +1,15 @@
{ {
"name": "cache", "name": "cache",
"version": "3.2.6", "version": "3.3.2",
"lockfileVersion": 2, "lockfileVersion": 2,
"requires": true, "requires": true,
"packages": { "packages": {
"": { "": {
"name": "cache", "name": "cache",
"version": "3.2.6", "version": "3.3.2",
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"@actions/cache": "^3.1.4", "@actions/cache": "^3.2.2",
"@actions/core": "^1.10.0", "@actions/core": "^1.10.0",
"@actions/exec": "^1.1.1", "@actions/exec": "^1.1.1",
"@actions/io": "^1.1.2" "@actions/io": "^1.1.2"
@ -36,18 +36,18 @@
} }
}, },
"node_modules/@actions/cache": { "node_modules/@actions/cache": {
"version": "3.1.4", "version": "3.2.2",
"resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.4.tgz", "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.2.2.tgz",
"integrity": "sha512-Uh9wsz7SxunfyqF3UY/wfHI81z97CYQrZs4NU+whzYd0N8emTaloB+XtrAq46X2RbQEOBjF6R090jKQpX4coGg==", "integrity": "sha512-6D0Jq5JrLZRQ3VApeQwQkkV20ZZXjXsHNYXd9VjNUdi9E0h93wESpxfMJ2JWLCUCgHNLcfY0v3GjNM+2FdRMlg==",
"dependencies": { "dependencies": {
"@actions/core": "^1.10.0", "@actions/core": "^1.10.0",
"@actions/exec": "^1.0.1", "@actions/exec": "^1.0.1",
"@actions/glob": "^0.1.0", "@actions/glob": "^0.1.0",
"@actions/http-client": "^2.0.1", "@actions/http-client": "^2.1.1",
"@actions/io": "^1.0.1", "@actions/io": "^1.0.1",
"@azure/abort-controller": "^1.1.0", "@azure/abort-controller": "^1.1.0",
"@azure/ms-rest-js": "^2.6.0", "@azure/ms-rest-js": "^2.6.0",
"@azure/storage-blob": "^12.8.0", "@azure/storage-blob": "^12.13.0",
"semver": "^6.1.0", "semver": "^6.1.0",
"uuid": "^3.3.3" "uuid": "^3.3.3"
} }
@ -87,9 +87,9 @@
} }
}, },
"node_modules/@actions/http-client": { "node_modules/@actions/http-client": {
"version": "2.0.1", "version": "2.1.1",
"resolved": "https://registry.npmjs.org/@actions/http-client/-/http-client-2.0.1.tgz", "resolved": "https://registry.npmjs.org/@actions/http-client/-/http-client-2.1.1.tgz",
"integrity": "sha512-PIXiMVtz6VvyaRsGY268qvj57hXQEpsYogYOu2nrQhlf+XCGmZstmuZBbAybUl1nQGnvS1k1eEsQ69ZoD7xlSw==", "integrity": "sha512-qhrkRMB40bbbLo7gF+0vu+X+UawOvQQqNAA/5Unx774RS8poaOhThDOG6BGmxvAnxhQnDp2BG/ZUm65xZILTpw==",
"dependencies": { "dependencies": {
"tunnel": "^0.0.6" "tunnel": "^0.0.6"
} }
@ -127,14 +127,6 @@
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw=="
}, },
"node_modules/@azure/core-asynciterator-polyfill": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/@azure/core-asynciterator-polyfill/-/core-asynciterator-polyfill-1.0.2.tgz",
"integrity": "sha512-3rkP4LnnlWawl0LZptJOdXNrT/fHp2eQMadoasa6afspXdpGrtPZuAQc2PD0cpgyuoXtUWyC3tv7xfntjGS5Dw==",
"engines": {
"node": ">=12.0.0"
}
},
"node_modules/@azure/core-auth": { "node_modules/@azure/core-auth": {
"version": "1.3.2", "version": "1.3.2",
"resolved": "https://registry.npmjs.org/@azure/core-auth/-/core-auth-1.3.2.tgz", "resolved": "https://registry.npmjs.org/@azure/core-auth/-/core-auth-1.3.2.tgz",
@ -153,28 +145,27 @@
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw=="
}, },
"node_modules/@azure/core-http": { "node_modules/@azure/core-http": {
"version": "2.2.4", "version": "3.0.0",
"resolved": "https://registry.npmjs.org/@azure/core-http/-/core-http-2.2.4.tgz", "resolved": "https://registry.npmjs.org/@azure/core-http/-/core-http-3.0.0.tgz",
"integrity": "sha512-QmmJmexXKtPyc3/rsZR/YTLDvMatzbzAypJmLzvlfxgz/SkgnqV/D4f6F2LsK6tBj1qhyp8BoXiOebiej0zz3A==", "integrity": "sha512-BxI2SlGFPPz6J1XyZNIVUf0QZLBKFX+ViFjKOkzqD18J1zOINIQ8JSBKKr+i+v8+MB6LacL6Nn/sP/TE13+s2Q==",
"dependencies": { "dependencies": {
"@azure/abort-controller": "^1.0.0", "@azure/abort-controller": "^1.0.0",
"@azure/core-asynciterator-polyfill": "^1.0.0",
"@azure/core-auth": "^1.3.0", "@azure/core-auth": "^1.3.0",
"@azure/core-tracing": "1.0.0-preview.13", "@azure/core-tracing": "1.0.0-preview.13",
"@azure/core-util": "^1.1.1",
"@azure/logger": "^1.0.0", "@azure/logger": "^1.0.0",
"@types/node-fetch": "^2.5.0", "@types/node-fetch": "^2.5.0",
"@types/tunnel": "^0.0.3", "@types/tunnel": "^0.0.3",
"form-data": "^4.0.0", "form-data": "^4.0.0",
"node-fetch": "^2.6.7", "node-fetch": "^2.6.7",
"process": "^0.11.10", "process": "^0.11.10",
"tough-cookie": "^4.0.0",
"tslib": "^2.2.0", "tslib": "^2.2.0",
"tunnel": "^0.0.6", "tunnel": "^0.0.6",
"uuid": "^8.3.0", "uuid": "^8.3.0",
"xml2js": "^0.4.19" "xml2js": "^0.4.19"
}, },
"engines": { "engines": {
"node": ">=12.0.0" "node": ">=14.0.0"
} }
}, },
"node_modules/@azure/core-http/node_modules/form-data": { "node_modules/@azure/core-http/node_modules/form-data": {
@ -190,23 +181,10 @@
"node": ">= 6" "node": ">= 6"
} }
}, },
"node_modules/@azure/core-http/node_modules/tough-cookie": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-4.0.0.tgz",
"integrity": "sha512-tHdtEpQCMrc1YLrMaqXXcj6AxhYi/xgit6mZu1+EDWUn+qhUf8wMQoFIy9NXuq23zAwtcB0t/MjACGR18pcRbg==",
"dependencies": {
"psl": "^1.1.33",
"punycode": "^2.1.1",
"universalify": "^0.1.2"
},
"engines": {
"node": ">=6"
}
},
"node_modules/@azure/core-http/node_modules/tslib": { "node_modules/@azure/core-http/node_modules/tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
}, },
"node_modules/@azure/core-http/node_modules/uuid": { "node_modules/@azure/core-http/node_modules/uuid": {
"version": "8.3.2", "version": "8.3.2",
@ -217,40 +195,38 @@
} }
}, },
"node_modules/@azure/core-lro": { "node_modules/@azure/core-lro": {
"version": "2.2.4", "version": "2.5.1",
"resolved": "https://registry.npmjs.org/@azure/core-lro/-/core-lro-2.2.4.tgz", "resolved": "https://registry.npmjs.org/@azure/core-lro/-/core-lro-2.5.1.tgz",
"integrity": "sha512-e1I2v2CZM0mQo8+RSix0x091Av493e4bnT22ds2fcQGslTHzM2oTbswkB65nP4iEpCxBrFxOSDPKExmTmjCVtQ==", "integrity": "sha512-JHQy/bA3NOz2WuzOi5zEk6n/TJdAropupxUT521JIJvW7EXV2YN2SFYZrf/2RHeD28QAClGdynYadZsbmP+nyQ==",
"dependencies": { "dependencies": {
"@azure/abort-controller": "^1.0.0", "@azure/abort-controller": "^1.0.0",
"@azure/core-tracing": "1.0.0-preview.13",
"@azure/logger": "^1.0.0", "@azure/logger": "^1.0.0",
"tslib": "^2.2.0" "tslib": "^2.2.0"
}, },
"engines": { "engines": {
"node": ">=12.0.0" "node": ">=14.0.0"
} }
}, },
"node_modules/@azure/core-lro/node_modules/tslib": { "node_modules/@azure/core-lro/node_modules/tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
}, },
"node_modules/@azure/core-paging": { "node_modules/@azure/core-paging": {
"version": "1.2.1", "version": "1.5.0",
"resolved": "https://registry.npmjs.org/@azure/core-paging/-/core-paging-1.2.1.tgz", "resolved": "https://registry.npmjs.org/@azure/core-paging/-/core-paging-1.5.0.tgz",
"integrity": "sha512-UtH5iMlYsvg+nQYIl4UHlvvSrsBjOlRF4fs0j7mxd3rWdAStrKYrh2durOpHs5C9yZbVhsVDaisoyaf/lL1EVA==", "integrity": "sha512-zqWdVIt+2Z+3wqxEOGzR5hXFZ8MGKK52x4vFLw8n58pR6ZfKRx3EXYTxTaYxYHc/PexPUTyimcTWFJbji9Z6Iw==",
"dependencies": { "dependencies": {
"@azure/core-asynciterator-polyfill": "^1.0.0",
"tslib": "^2.2.0" "tslib": "^2.2.0"
}, },
"engines": { "engines": {
"node": ">=12.0.0" "node": ">=14.0.0"
} }
}, },
"node_modules/@azure/core-paging/node_modules/tslib": { "node_modules/@azure/core-paging/node_modules/tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
}, },
"node_modules/@azure/core-tracing": { "node_modules/@azure/core-tracing": {
"version": "1.0.0-preview.13", "version": "1.0.0-preview.13",
@ -265,25 +241,42 @@
} }
}, },
"node_modules/@azure/core-tracing/node_modules/tslib": { "node_modules/@azure/core-tracing/node_modules/tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
},
"node_modules/@azure/core-util": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@azure/core-util/-/core-util-1.2.0.tgz",
"integrity": "sha512-ffGIw+Qs8bNKNLxz5UPkz4/VBM/EZY07mPve1ZYFqYUdPwFqRj0RPk0U7LZMOfT7GCck9YjuT1Rfp1PApNl1ng==",
"dependencies": {
"@azure/abort-controller": "^1.0.0",
"tslib": "^2.2.0"
},
"engines": {
"node": ">=14.0.0"
}
},
"node_modules/@azure/core-util/node_modules/tslib": {
"version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
}, },
"node_modules/@azure/logger": { "node_modules/@azure/logger": {
"version": "1.0.3", "version": "1.0.4",
"resolved": "https://registry.npmjs.org/@azure/logger/-/logger-1.0.3.tgz", "resolved": "https://registry.npmjs.org/@azure/logger/-/logger-1.0.4.tgz",
"integrity": "sha512-aK4s3Xxjrx3daZr3VylxejK3vG5ExXck5WOHDJ8in/k9AqlfIyFMMT1uG7u8mNjX+QRILTIn0/Xgschfh/dQ9g==", "integrity": "sha512-ustrPY8MryhloQj7OWGe+HrYx+aoiOxzbXTtgblbV3xwCqpzUK36phH3XNHQKj3EPonyFUuDTfR3qFhTEAuZEg==",
"dependencies": { "dependencies": {
"tslib": "^2.2.0" "tslib": "^2.2.0"
}, },
"engines": { "engines": {
"node": ">=12.0.0" "node": ">=14.0.0"
} }
}, },
"node_modules/@azure/logger/node_modules/tslib": { "node_modules/@azure/logger/node_modules/tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
}, },
"node_modules/@azure/ms-rest-js": { "node_modules/@azure/ms-rest-js": {
"version": "2.6.1", "version": "2.6.1",
@ -310,12 +303,12 @@
} }
}, },
"node_modules/@azure/storage-blob": { "node_modules/@azure/storage-blob": {
"version": "12.9.0", "version": "12.13.0",
"resolved": "https://registry.npmjs.org/@azure/storage-blob/-/storage-blob-12.9.0.tgz", "resolved": "https://registry.npmjs.org/@azure/storage-blob/-/storage-blob-12.13.0.tgz",
"integrity": "sha512-ank38FdCLfJ+EoeMzCz3hkYJuZAd63ARvDKkxZYRDb+beBYf+/+gx8jNTqkq/hfyUl4dJQ/a7tECU0Y0F98CHg==", "integrity": "sha512-t3Q2lvBMJucgTjQcP5+hvEJMAsJSk0qmAnjDLie2td017IiduZbbC9BOcFfmwzR6y6cJdZOuewLCNFmEx9IrXA==",
"dependencies": { "dependencies": {
"@azure/abort-controller": "^1.0.0", "@azure/abort-controller": "^1.0.0",
"@azure/core-http": "^2.0.0", "@azure/core-http": "^3.0.0",
"@azure/core-lro": "^2.2.0", "@azure/core-lro": "^2.2.0",
"@azure/core-paging": "^1.1.1", "@azure/core-paging": "^1.1.1",
"@azure/core-tracing": "1.0.0-preview.13", "@azure/core-tracing": "1.0.0-preview.13",
@ -324,13 +317,13 @@
"tslib": "^2.2.0" "tslib": "^2.2.0"
}, },
"engines": { "engines": {
"node": ">=12.0.0" "node": ">=14.0.0"
} }
}, },
"node_modules/@azure/storage-blob/node_modules/tslib": { "node_modules/@azure/storage-blob/node_modules/tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
}, },
"node_modules/@babel/code-frame": { "node_modules/@babel/code-frame": {
"version": "7.16.7", "version": "7.16.7",
@ -2649,9 +2642,9 @@
} }
}, },
"node_modules/@opentelemetry/api": { "node_modules/@opentelemetry/api": {
"version": "1.0.4", "version": "1.4.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.0.4.tgz", "resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.4.0.tgz",
"integrity": "sha512-BuJuXRSJNQ3QoKA6GWWDyuLpOUck+9hAXNMCnrloc1aWVoy6Xq6t9PUV08aBZ4Lutqq2LEHM486bpZqoViScog==", "integrity": "sha512-IgMK9i3sFGNUqPMbjABm0G26g0QCKCUBfglhQ7rQq6WcxbKfEHRcmwsoER4hZcuYqJgkYn2OeuoJIv7Jsftp7g==",
"engines": { "engines": {
"node": ">=8.0.0" "node": ">=8.0.0"
} }
@ -2792,9 +2785,9 @@
"integrity": "sha512-jh6m0QUhIRcZpNv7Z/rpN+ZWXOicUUQbSoWks7Htkbb9IjFQj4kzcX/xFCkjstCj5flMsN8FiSvt+q+Tcs4Llg==" "integrity": "sha512-jh6m0QUhIRcZpNv7Z/rpN+ZWXOicUUQbSoWks7Htkbb9IjFQj4kzcX/xFCkjstCj5flMsN8FiSvt+q+Tcs4Llg=="
}, },
"node_modules/@types/node-fetch": { "node_modules/@types/node-fetch": {
"version": "2.6.1", "version": "2.6.2",
"resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.1.tgz", "resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.2.tgz",
"integrity": "sha512-oMqjURCaxoSIsHSr1E47QHzbmzNR5rK8McHuNb11BOM9cHcIK3Avy0s/b2JlXHoQGTYS3NsvWzV1M0iK7l0wbA==", "integrity": "sha512-DHqhlq5jeESLy19TYhLakJ07kNumXWjcDdxXsLUMJZ6ue8VZJj4kLPQVE/2mdHh3xZziNF1xppu5lwmS53HR+A==",
"dependencies": { "dependencies": {
"@types/node": "*", "@types/node": "*",
"form-data": "^3.0.0" "form-data": "^3.0.0"
@ -8811,7 +8804,7 @@
"node_modules/process": { "node_modules/process": {
"version": "0.11.10", "version": "0.11.10",
"resolved": "https://registry.npmjs.org/process/-/process-0.11.10.tgz", "resolved": "https://registry.npmjs.org/process/-/process-0.11.10.tgz",
"integrity": "sha1-czIwDoQBYb2j5podHZGn1LwW8YI=", "integrity": "sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A==",
"engines": { "engines": {
"node": ">= 0.6.0" "node": ">= 0.6.0"
} }
@ -9521,14 +9514,6 @@
"url": "https://github.com/sponsors/ljharb" "url": "https://github.com/sponsors/ljharb"
} }
}, },
"node_modules/universalify": {
"version": "0.1.2",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-0.1.2.tgz",
"integrity": "sha512-rBJeI5CXAlmy1pV+617WB9J63U6XcazHHF2f2dbJix4XzpUF0RS3Zbj0FGIOCAva5P/d/GBOYaACQ1w+0azUkg==",
"engines": {
"node": ">= 4.0.0"
}
},
"node_modules/uri-js": { "node_modules/uri-js": {
"version": "4.4.1", "version": "4.4.1",
"resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz", "resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz",
@ -9722,18 +9707,18 @@
}, },
"dependencies": { "dependencies": {
"@actions/cache": { "@actions/cache": {
"version": "3.1.4", "version": "3.2.2",
"resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.4.tgz", "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.2.2.tgz",
"integrity": "sha512-Uh9wsz7SxunfyqF3UY/wfHI81z97CYQrZs4NU+whzYd0N8emTaloB+XtrAq46X2RbQEOBjF6R090jKQpX4coGg==", "integrity": "sha512-6D0Jq5JrLZRQ3VApeQwQkkV20ZZXjXsHNYXd9VjNUdi9E0h93wESpxfMJ2JWLCUCgHNLcfY0v3GjNM+2FdRMlg==",
"requires": { "requires": {
"@actions/core": "^1.10.0", "@actions/core": "^1.10.0",
"@actions/exec": "^1.0.1", "@actions/exec": "^1.0.1",
"@actions/glob": "^0.1.0", "@actions/glob": "^0.1.0",
"@actions/http-client": "^2.0.1", "@actions/http-client": "^2.1.1",
"@actions/io": "^1.0.1", "@actions/io": "^1.0.1",
"@azure/abort-controller": "^1.1.0", "@azure/abort-controller": "^1.1.0",
"@azure/ms-rest-js": "^2.6.0", "@azure/ms-rest-js": "^2.6.0",
"@azure/storage-blob": "^12.8.0", "@azure/storage-blob": "^12.13.0",
"semver": "^6.1.0", "semver": "^6.1.0",
"uuid": "^3.3.3" "uuid": "^3.3.3"
} }
@ -9772,9 +9757,9 @@
} }
}, },
"@actions/http-client": { "@actions/http-client": {
"version": "2.0.1", "version": "2.1.1",
"resolved": "https://registry.npmjs.org/@actions/http-client/-/http-client-2.0.1.tgz", "resolved": "https://registry.npmjs.org/@actions/http-client/-/http-client-2.1.1.tgz",
"integrity": "sha512-PIXiMVtz6VvyaRsGY268qvj57hXQEpsYogYOu2nrQhlf+XCGmZstmuZBbAybUl1nQGnvS1k1eEsQ69ZoD7xlSw==", "integrity": "sha512-qhrkRMB40bbbLo7gF+0vu+X+UawOvQQqNAA/5Unx774RS8poaOhThDOG6BGmxvAnxhQnDp2BG/ZUm65xZILTpw==",
"requires": { "requires": {
"tunnel": "^0.0.6" "tunnel": "^0.0.6"
} }
@ -9808,11 +9793,6 @@
} }
} }
}, },
"@azure/core-asynciterator-polyfill": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/@azure/core-asynciterator-polyfill/-/core-asynciterator-polyfill-1.0.2.tgz",
"integrity": "sha512-3rkP4LnnlWawl0LZptJOdXNrT/fHp2eQMadoasa6afspXdpGrtPZuAQc2PD0cpgyuoXtUWyC3tv7xfntjGS5Dw=="
},
"@azure/core-auth": { "@azure/core-auth": {
"version": "1.3.2", "version": "1.3.2",
"resolved": "https://registry.npmjs.org/@azure/core-auth/-/core-auth-1.3.2.tgz", "resolved": "https://registry.npmjs.org/@azure/core-auth/-/core-auth-1.3.2.tgz",
@ -9830,21 +9810,20 @@
} }
}, },
"@azure/core-http": { "@azure/core-http": {
"version": "2.2.4", "version": "3.0.0",
"resolved": "https://registry.npmjs.org/@azure/core-http/-/core-http-2.2.4.tgz", "resolved": "https://registry.npmjs.org/@azure/core-http/-/core-http-3.0.0.tgz",
"integrity": "sha512-QmmJmexXKtPyc3/rsZR/YTLDvMatzbzAypJmLzvlfxgz/SkgnqV/D4f6F2LsK6tBj1qhyp8BoXiOebiej0zz3A==", "integrity": "sha512-BxI2SlGFPPz6J1XyZNIVUf0QZLBKFX+ViFjKOkzqD18J1zOINIQ8JSBKKr+i+v8+MB6LacL6Nn/sP/TE13+s2Q==",
"requires": { "requires": {
"@azure/abort-controller": "^1.0.0", "@azure/abort-controller": "^1.0.0",
"@azure/core-asynciterator-polyfill": "^1.0.0",
"@azure/core-auth": "^1.3.0", "@azure/core-auth": "^1.3.0",
"@azure/core-tracing": "1.0.0-preview.13", "@azure/core-tracing": "1.0.0-preview.13",
"@azure/core-util": "^1.1.1",
"@azure/logger": "^1.0.0", "@azure/logger": "^1.0.0",
"@types/node-fetch": "^2.5.0", "@types/node-fetch": "^2.5.0",
"@types/tunnel": "^0.0.3", "@types/tunnel": "^0.0.3",
"form-data": "^4.0.0", "form-data": "^4.0.0",
"node-fetch": "^2.6.7", "node-fetch": "^2.6.7",
"process": "^0.11.10", "process": "^0.11.10",
"tough-cookie": "^4.0.0",
"tslib": "^2.2.0", "tslib": "^2.2.0",
"tunnel": "^0.0.6", "tunnel": "^0.0.6",
"uuid": "^8.3.0", "uuid": "^8.3.0",
@ -9861,20 +9840,10 @@
"mime-types": "^2.1.12" "mime-types": "^2.1.12"
} }
}, },
"tough-cookie": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-4.0.0.tgz",
"integrity": "sha512-tHdtEpQCMrc1YLrMaqXXcj6AxhYi/xgit6mZu1+EDWUn+qhUf8wMQoFIy9NXuq23zAwtcB0t/MjACGR18pcRbg==",
"requires": {
"psl": "^1.1.33",
"punycode": "^2.1.1",
"universalify": "^0.1.2"
}
},
"tslib": { "tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
}, },
"uuid": { "uuid": {
"version": "8.3.2", "version": "8.3.2",
@ -9884,36 +9853,34 @@
} }
}, },
"@azure/core-lro": { "@azure/core-lro": {
"version": "2.2.4", "version": "2.5.1",
"resolved": "https://registry.npmjs.org/@azure/core-lro/-/core-lro-2.2.4.tgz", "resolved": "https://registry.npmjs.org/@azure/core-lro/-/core-lro-2.5.1.tgz",
"integrity": "sha512-e1I2v2CZM0mQo8+RSix0x091Av493e4bnT22ds2fcQGslTHzM2oTbswkB65nP4iEpCxBrFxOSDPKExmTmjCVtQ==", "integrity": "sha512-JHQy/bA3NOz2WuzOi5zEk6n/TJdAropupxUT521JIJvW7EXV2YN2SFYZrf/2RHeD28QAClGdynYadZsbmP+nyQ==",
"requires": { "requires": {
"@azure/abort-controller": "^1.0.0", "@azure/abort-controller": "^1.0.0",
"@azure/core-tracing": "1.0.0-preview.13",
"@azure/logger": "^1.0.0", "@azure/logger": "^1.0.0",
"tslib": "^2.2.0" "tslib": "^2.2.0"
}, },
"dependencies": { "dependencies": {
"tslib": { "tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
} }
} }
}, },
"@azure/core-paging": { "@azure/core-paging": {
"version": "1.2.1", "version": "1.5.0",
"resolved": "https://registry.npmjs.org/@azure/core-paging/-/core-paging-1.2.1.tgz", "resolved": "https://registry.npmjs.org/@azure/core-paging/-/core-paging-1.5.0.tgz",
"integrity": "sha512-UtH5iMlYsvg+nQYIl4UHlvvSrsBjOlRF4fs0j7mxd3rWdAStrKYrh2durOpHs5C9yZbVhsVDaisoyaf/lL1EVA==", "integrity": "sha512-zqWdVIt+2Z+3wqxEOGzR5hXFZ8MGKK52x4vFLw8n58pR6ZfKRx3EXYTxTaYxYHc/PexPUTyimcTWFJbji9Z6Iw==",
"requires": { "requires": {
"@azure/core-asynciterator-polyfill": "^1.0.0",
"tslib": "^2.2.0" "tslib": "^2.2.0"
}, },
"dependencies": { "dependencies": {
"tslib": { "tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
} }
} }
}, },
@ -9927,24 +9894,40 @@
}, },
"dependencies": { "dependencies": {
"tslib": { "tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
}
}
},
"@azure/core-util": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@azure/core-util/-/core-util-1.2.0.tgz",
"integrity": "sha512-ffGIw+Qs8bNKNLxz5UPkz4/VBM/EZY07mPve1ZYFqYUdPwFqRj0RPk0U7LZMOfT7GCck9YjuT1Rfp1PApNl1ng==",
"requires": {
"@azure/abort-controller": "^1.0.0",
"tslib": "^2.2.0"
},
"dependencies": {
"tslib": {
"version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
} }
} }
}, },
"@azure/logger": { "@azure/logger": {
"version": "1.0.3", "version": "1.0.4",
"resolved": "https://registry.npmjs.org/@azure/logger/-/logger-1.0.3.tgz", "resolved": "https://registry.npmjs.org/@azure/logger/-/logger-1.0.4.tgz",
"integrity": "sha512-aK4s3Xxjrx3daZr3VylxejK3vG5ExXck5WOHDJ8in/k9AqlfIyFMMT1uG7u8mNjX+QRILTIn0/Xgschfh/dQ9g==", "integrity": "sha512-ustrPY8MryhloQj7OWGe+HrYx+aoiOxzbXTtgblbV3xwCqpzUK36phH3XNHQKj3EPonyFUuDTfR3qFhTEAuZEg==",
"requires": { "requires": {
"tslib": "^2.2.0" "tslib": "^2.2.0"
}, },
"dependencies": { "dependencies": {
"tslib": { "tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
} }
} }
}, },
@ -9972,12 +9955,12 @@
} }
}, },
"@azure/storage-blob": { "@azure/storage-blob": {
"version": "12.9.0", "version": "12.13.0",
"resolved": "https://registry.npmjs.org/@azure/storage-blob/-/storage-blob-12.9.0.tgz", "resolved": "https://registry.npmjs.org/@azure/storage-blob/-/storage-blob-12.13.0.tgz",
"integrity": "sha512-ank38FdCLfJ+EoeMzCz3hkYJuZAd63ARvDKkxZYRDb+beBYf+/+gx8jNTqkq/hfyUl4dJQ/a7tECU0Y0F98CHg==", "integrity": "sha512-t3Q2lvBMJucgTjQcP5+hvEJMAsJSk0qmAnjDLie2td017IiduZbbC9BOcFfmwzR6y6cJdZOuewLCNFmEx9IrXA==",
"requires": { "requires": {
"@azure/abort-controller": "^1.0.0", "@azure/abort-controller": "^1.0.0",
"@azure/core-http": "^2.0.0", "@azure/core-http": "^3.0.0",
"@azure/core-lro": "^2.2.0", "@azure/core-lro": "^2.2.0",
"@azure/core-paging": "^1.1.1", "@azure/core-paging": "^1.1.1",
"@azure/core-tracing": "1.0.0-preview.13", "@azure/core-tracing": "1.0.0-preview.13",
@ -9987,9 +9970,9 @@
}, },
"dependencies": { "dependencies": {
"tslib": { "tslib": {
"version": "2.3.1", "version": "2.5.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz", "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
"integrity": "sha512-77EbyPPpMz+FRFRuAFlWMtmgUWGe9UOG2Z25NqCwiIjRhOf5iKGuzSe5P2w1laq+FkRy4p+PCuVkJSGkzTEKVw==" "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
} }
} }
}, },
@ -11837,9 +11820,9 @@
} }
}, },
"@opentelemetry/api": { "@opentelemetry/api": {
"version": "1.0.4", "version": "1.4.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.0.4.tgz", "resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.4.0.tgz",
"integrity": "sha512-BuJuXRSJNQ3QoKA6GWWDyuLpOUck+9hAXNMCnrloc1aWVoy6Xq6t9PUV08aBZ4Lutqq2LEHM486bpZqoViScog==" "integrity": "sha512-IgMK9i3sFGNUqPMbjABm0G26g0QCKCUBfglhQ7rQq6WcxbKfEHRcmwsoER4hZcuYqJgkYn2OeuoJIv7Jsftp7g=="
}, },
"@sinclair/typebox": { "@sinclair/typebox": {
"version": "0.24.51", "version": "0.24.51",
@ -11976,9 +11959,9 @@
"integrity": "sha512-jh6m0QUhIRcZpNv7Z/rpN+ZWXOicUUQbSoWks7Htkbb9IjFQj4kzcX/xFCkjstCj5flMsN8FiSvt+q+Tcs4Llg==" "integrity": "sha512-jh6m0QUhIRcZpNv7Z/rpN+ZWXOicUUQbSoWks7Htkbb9IjFQj4kzcX/xFCkjstCj5flMsN8FiSvt+q+Tcs4Llg=="
}, },
"@types/node-fetch": { "@types/node-fetch": {
"version": "2.6.1", "version": "2.6.2",
"resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.1.tgz", "resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.2.tgz",
"integrity": "sha512-oMqjURCaxoSIsHSr1E47QHzbmzNR5rK8McHuNb11BOM9cHcIK3Avy0s/b2JlXHoQGTYS3NsvWzV1M0iK7l0wbA==", "integrity": "sha512-DHqhlq5jeESLy19TYhLakJ07kNumXWjcDdxXsLUMJZ6ue8VZJj4kLPQVE/2mdHh3xZziNF1xppu5lwmS53HR+A==",
"requires": { "requires": {
"@types/node": "*", "@types/node": "*",
"form-data": "^3.0.0" "form-data": "^3.0.0"
@ -16554,7 +16537,7 @@
"process": { "process": {
"version": "0.11.10", "version": "0.11.10",
"resolved": "https://registry.npmjs.org/process/-/process-0.11.10.tgz", "resolved": "https://registry.npmjs.org/process/-/process-0.11.10.tgz",
"integrity": "sha1-czIwDoQBYb2j5podHZGn1LwW8YI=" "integrity": "sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A=="
}, },
"prompts": { "prompts": {
"version": "2.4.2", "version": "2.4.2",
@ -17050,11 +17033,6 @@
"which-boxed-primitive": "^1.0.2" "which-boxed-primitive": "^1.0.2"
} }
}, },
"universalify": {
"version": "0.1.2",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-0.1.2.tgz",
"integrity": "sha512-rBJeI5CXAlmy1pV+617WB9J63U6XcazHHF2f2dbJix4XzpUF0RS3Zbj0FGIOCAva5P/d/GBOYaACQ1w+0azUkg=="
},
"uri-js": { "uri-js": {
"version": "4.4.1", "version": "4.4.1",
"resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz", "resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz",

@ -1,6 +1,6 @@
{ {
"name": "cache", "name": "cache",
"version": "3.2.6", "version": "3.3.2",
"private": true, "private": true,
"description": "Cache dependencies and build outputs", "description": "Cache dependencies and build outputs",
"main": "dist/restore/index.js", "main": "dist/restore/index.js",
@ -23,7 +23,7 @@
"author": "GitHub", "author": "GitHub",
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"@actions/cache": "^3.1.4", "@actions/cache": "^3.2.2",
"@actions/core": "^1.10.0", "@actions/core": "^1.10.0",
"@actions/exec": "^1.1.1", "@actions/exec": "^1.1.1",
"@actions/io": "^1.1.2" "@actions/io": "^1.1.2"

@ -9,7 +9,8 @@ The restore action restores a cache. It works similarly to the `cache` action ex
* `key` - An explicit key for a cache entry. See [creating a cache key](../README.md#creating-a-cache-key). * `key` - An explicit key for a cache entry. See [creating a cache key](../README.md#creating-a-cache-key).
* `path` - A list of files, directories, and wildcard patterns to restore. See [`@actions/glob`](https://github.com/actions/toolkit/tree/main/packages/glob) for supported patterns. * `path` - A list of files, directories, and wildcard patterns to restore. See [`@actions/glob`](https://github.com/actions/toolkit/tree/main/packages/glob) for supported patterns.
* `restore-keys` - An ordered list of prefix-matched keys to use for restoring stale cache if no cache hit occurred for key. * `restore-keys` - An ordered list of prefix-matched keys to use for restoring stale cache if no cache hit occurred for key.
* `fail-on-cache-miss` - Fail the workflow if cache entry is not found. Default: false * `fail-on-cache-miss` - Fail the workflow if cache entry is not found. Default: `false`
* `lookup-only` - If true, only checks whether a cache entry exists for the given inputs and skips downloading it (see the example below). Default: `false`
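For example, a minimal sketch of a job step that uses `lookup-only` to probe for a cache before deciding whether to do expensive work; the `path`, `key`, and install command here are illustrative and not taken from this repository:

```yaml
- name: Check whether a cache entry already exists
  id: cache-probe
  uses: actions/cache/restore@v3
  with:
    path: ~/.npm                                        # illustrative path
    key: npm-${{ hashFiles('**/package-lock.json') }}   # illustrative key
    lookup-only: true

# Only do the expensive work when no cache entry was found
- name: Install dependencies
  if: steps.cache-probe.outputs.cache-hit != 'true'
  run: npm ci
```

With `lookup-only: true` the entry is only checked, so no time is spent downloading a cache whose contents are not needed in the current job.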
### Outputs ### Outputs
@ -22,7 +23,7 @@ The restore action restores a cache. It works similarly to the `cache` action ex
### Environment Variables ### Environment Variables
* `SEGMENT_DOWNLOAD_TIMEOUT_MINS` - Segment download timeout (in minutes, default `60`) to abort download of the segment if not completed in the defined number of minutes. [Read more](https://github.com/actions/cache/blob/main/tips-and-workarounds.md#cache-segment-restore-timeout) * `SEGMENT_DOWNLOAD_TIMEOUT_MINS` - Segment download timeout (in minutes, default `10`) to abort download of the segment if not completed in the defined number of minutes. [Read more](https://github.com/actions/cache/blob/main/tips-and-workarounds.md#cache-segment-restore-timeout)
## Use cases ## Use cases

@ -19,6 +19,10 @@ inputs:
description: 'Fail the workflow if cache entry is not found' description: 'Fail the workflow if cache entry is not found'
default: 'false' default: 'false'
required: false required: false
lookup-only:
description: 'Check if a cache entry exists for the given input(s) (key, restore-keys) without downloading the cache'
default: 'false'
required: false
outputs: outputs:
cache-hit: cache-hit:
description: 'A boolean value to indicate an exact match was found for the primary key' description: 'A boolean value to indicate an exact match was found for the primary key'

@ -4,7 +4,8 @@ export enum Inputs {
RestoreKeys = "restore-keys", // Input for cache, restore action RestoreKeys = "restore-keys", // Input for cache, restore action
UploadChunkSize = "upload-chunk-size", // Input for cache, save action UploadChunkSize = "upload-chunk-size", // Input for cache, save action
EnableCrossOsArchive = "enableCrossOsArchive", // Input for cache, restore, save action EnableCrossOsArchive = "enableCrossOsArchive", // Input for cache, restore, save action
FailOnCacheMiss = "fail-on-cache-miss" // Input for cache, restore action FailOnCacheMiss = "fail-on-cache-miss", // Input for cache, restore action
LookupOnly = "lookup-only" // Input for cache, restore action
} }
export enum Outputs { export enum Outputs {

@ -1,10 +1,3 @@
import restoreImpl from "./restoreImpl"; import { restoreRun } from "./restoreImpl";
import { StateProvider } from "./stateProvider";
async function run(): Promise<void> { restoreRun(true);
await restoreImpl(new StateProvider());
}
run();
export default run;

@ -2,10 +2,14 @@ import * as cache from "@actions/cache";
import * as core from "@actions/core"; import * as core from "@actions/core";
import { Events, Inputs, Outputs, State } from "./constants"; import { Events, Inputs, Outputs, State } from "./constants";
import { IStateProvider } from "./stateProvider"; import {
IStateProvider,
NullStateProvider,
StateProvider
} from "./stateProvider";
import * as utils from "./utils/actionUtils"; import * as utils from "./utils/actionUtils";
async function restoreImpl( export async function restoreImpl(
stateProvider: IStateProvider stateProvider: IStateProvider
): Promise<string | undefined> { ): Promise<string | undefined> {
try { try {
@ -35,12 +39,13 @@ async function restoreImpl(
Inputs.EnableCrossOsArchive Inputs.EnableCrossOsArchive
); );
const failOnCacheMiss = utils.getInputAsBool(Inputs.FailOnCacheMiss); const failOnCacheMiss = utils.getInputAsBool(Inputs.FailOnCacheMiss);
const lookupOnly = utils.getInputAsBool(Inputs.LookupOnly);
const cacheKey = await cache.restoreCache( const cacheKey = await cache.restoreCache(
cachePaths, cachePaths,
primaryKey, primaryKey,
restoreKeys, restoreKeys,
{}, { lookupOnly: lookupOnly },
enableCrossOsArchive enableCrossOsArchive
); );
@ -69,7 +74,11 @@ async function restoreImpl(
); );
core.setOutput(Outputs.CacheHit, isExactKeyMatch.toString()); core.setOutput(Outputs.CacheHit, isExactKeyMatch.toString());
if (lookupOnly) {
core.info(`Cache found and can be restored from key: ${cacheKey}`);
} else {
core.info(`Cache restored from key: ${cacheKey}`); core.info(`Cache restored from key: ${cacheKey}`);
}
return cacheKey; return cacheKey;
} catch (error: unknown) { } catch (error: unknown) {
@ -77,4 +86,37 @@ async function restoreImpl(
} }
} }
export default restoreImpl; async function run(
stateProvider: IStateProvider,
earlyExit: boolean | undefined
): Promise<void> {
try {
await restoreImpl(stateProvider);
} catch (err) {
console.error(err);
if (earlyExit) {
process.exit(1);
}
}
// node will stay alive if any promises are not resolved,
// which is a possibility if HTTP requests are dangling
// due to retries or timeouts. We know that if we got here
// that all promises that we care about have successfully
// resolved, so simply exit with success.
if (earlyExit) {
process.exit(0);
}
}
export async function restoreOnlyRun(
earlyExit?: boolean | undefined
): Promise<void> {
await run(new NullStateProvider(), earlyExit);
}
export async function restoreRun(
earlyExit?: boolean | undefined
): Promise<void> {
await run(new StateProvider(), earlyExit);
}

@ -1,10 +1,3 @@
import restoreImpl from "./restoreImpl"; import { restoreOnlyRun } from "./restoreImpl";
import { NullStateProvider } from "./stateProvider";
async function run(): Promise<void> { restoreOnlyRun(true);
await restoreImpl(new NullStateProvider());
}
run();
export default run;

@ -15,6 +15,7 @@ interface CacheInput {
restoreKeys?: string[]; restoreKeys?: string[];
enableCrossOsArchive?: boolean; enableCrossOsArchive?: boolean;
failOnCacheMiss?: boolean; failOnCacheMiss?: boolean;
lookupOnly?: boolean;
} }
export function setInputs(input: CacheInput): void { export function setInputs(input: CacheInput): void {
@ -29,6 +30,8 @@ export function setInputs(input: CacheInput): void {
); );
input.failOnCacheMiss !== undefined && input.failOnCacheMiss !== undefined &&
setInput(Inputs.FailOnCacheMiss, input.failOnCacheMiss.toString()); setInput(Inputs.FailOnCacheMiss, input.failOnCacheMiss.toString());
input.lookupOnly !== undefined &&
setInput(Inputs.LookupOnly, input.lookupOnly.toString());
} }
export function clearInputs(): void { export function clearInputs(): void {
@ -38,4 +41,5 @@ export function clearInputs(): void {
delete process.env[getInputName(Inputs.UploadChunkSize)]; delete process.env[getInputName(Inputs.UploadChunkSize)];
delete process.env[getInputName(Inputs.EnableCrossOsArchive)]; delete process.env[getInputName(Inputs.EnableCrossOsArchive)];
delete process.env[getInputName(Inputs.FailOnCacheMiss)]; delete process.env[getInputName(Inputs.FailOnCacheMiss)];
delete process.env[getInputName(Inputs.LookupOnly)];
} }

@ -1,10 +1,15 @@
# Tips and workarounds
## Cache segment restore timeout ## Cache segment restore timeout
A cache gets downloaded in multiple segments of fixed sizes (`1GB` for a `32-bit` runner and `2GB` for a `64-bit` runner). Sometimes a segment download gets stuck, which causes the workflow job to hang and eventually fail. Version `v3.0.8` of `actions/cache` introduces a segment download timeout, which allows a stuck segment download to be aborted so the job can proceed with a cache miss. A cache gets downloaded in multiple segments of fixed sizes (`1GB` for a `32-bit` runner and `2GB` for a `64-bit` runner). Sometimes a segment download gets stuck, which causes the workflow job to hang and eventually fail. Version `v3.0.8` of `actions/cache` introduces a segment download timeout, which allows a stuck segment download to be aborted so the job can proceed with a cache miss.
Default value of this timeout is 60 minutes and can be customized by specifying an [environment variable](https://docs.github.com/en/actions/learn-github-actions/environment-variables) named `SEGMENT_DOWNLOAD_TIMEOUT_MINS` with timeout value in minutes. Default value of this timeout is 10 minutes and can be customized by specifying an [environment variable](https://docs.github.com/en/actions/learn-github-actions/environment-variables) named `SEGMENT_DOWNLOAD_TIMEOUT_MINS` with timeout value in minutes.
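For example, a sketch of a caching step that lowers the timeout via the environment variable; the `path` and `key` values are illustrative:

```yaml
- uses: actions/cache@v3
  env:
    # abort a stuck segment download after 5 minutes instead of the default
    SEGMENT_DOWNLOAD_TIMEOUT_MINS: 5
  with:
    path: ~/.npm
    key: npm-${{ hashFiles('**/package-lock.json') }}
```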
## Update a cache ## Update a cache
A cache today is immutable and cannot be updated. But some use cases require the cache to be saved even though there was a "hit" during restore. To do so, use a `key` which is unique for every run and use `restore-keys` to restore the nearest cache. For example: A cache today is immutable and cannot be updated. But some use cases require the cache to be saved even though there was a "hit" during restore. To do so, use a `key` which is unique for every run and use `restore-keys` to restore the nearest cache. For example:
```yaml ```yaml
- name: update cache on every commit - name: update cache on every commit
uses: actions/cache@v3 uses: actions/cache@v3
@ -14,19 +19,24 @@ A cache today is immutable and cannot be updated. But some use cases require the
restore-keys: | restore-keys: |
primes-${{ runner.os }} primes-${{ runner.os }}
``` ```
Please note that this will create a new cache on every run and hence will consume the cache [quota](./README.md#cache-limits). Please note that this will create a new cache on every run and hence will consume the cache [quota](./README.md#cache-limits).
## Use cache across feature branches ## Use cache across feature branches
Reusing cache across feature branches is not allowed today to provide cache [isolation](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache). However if both feature branches are from the default branch, a good way to achieve this is to ensure that the default branch has a cache. This cache will then be consumable by both feature branches. Reusing cache across feature branches is not allowed today to provide cache [isolation](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache). However if both feature branches are from the default branch, a good way to achieve this is to ensure that the default branch has a cache. This cache will then be consumable by both feature branches.
## Cross OS cache ## Cross OS cache
From `v3.2.3`, the cache is cross-OS compatible when the `enableCrossOsArchive` input is passed as true. This means that a cache created on `ubuntu-latest` or `macos-latest` can be used by `windows-latest` and vice versa, provided the workflow that runs on `windows-latest` also sets the `enableCrossOsArchive` input to true. This is useful for caching dependencies that are independent of the runner platform, which reduces cache quota consumption and lets multiple platforms build from the same cache. Things to keep in mind while using this feature (a sketch follows the list below): From `v3.2.3`, the cache is cross-OS compatible when the `enableCrossOsArchive` input is passed as true. This means that a cache created on `ubuntu-latest` or `macos-latest` can be used by `windows-latest` and vice versa, provided the workflow that runs on `windows-latest` also sets the `enableCrossOsArchive` input to true. This is useful for caching dependencies that are independent of the runner platform, which reduces cache quota consumption and lets multiple platforms build from the same cache. Things to keep in mind while using this feature (a sketch follows the list below):
- Only cache those files which are compatible across OSs.
- Caching symlinks might cause issues while restoration as they work differently on different OSs. - Only cache files that are compatible across OSs.
- Only cache files from within your github workspace directory. - Caching symlinks might cause issues while restoring them as they behave differently on different OSs.
- Avoid using directory pointers such as `${{ github.workspace }}` or `~` (home) which eventually evaluate to an absolute path and will not match across OSs. - Be mindful when caching files from outside your github workspace directory as the directory is located at different places across OS.
- Avoid using directory pointers such as `${{ github.workspace }}` or `~` (home) which eventually evaluate to an absolute path that does not match across OSs.
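Putting the points above together, a minimal sketch of a cross-OS friendly caching step; the relative `path` and the `key` are illustrative:

```yaml
- uses: actions/cache@v3
  with:
    # relative path inside the workspace, so it resolves consistently on every OS
    path: node_modules
    # omit ${{ runner.os }} from the key so all platforms share the same entry
    key: deps-${{ hashFiles('**/package-lock.json') }}
    enableCrossOsArchive: true
```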
## Force deletion of caches overriding default cache eviction policy ## Force deletion of caches overriding default cache eviction policy
Caches have a [branch scope restriction](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache) in place. This means that if caches for a specific branch use a lot of storage quota, more frequently used caches from the `default` branch may get thrashed. For example, if many pull requests on a repository are creating caches, those caches cannot be used in the default branch scope but will still occupy a lot of space until they are cleaned up by the [eviction policy](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy). Sometimes we want to clean them up on a faster cadence to ensure the default branch is not thrashed. To achieve this, the [gh-actions-cache CLI](https://github.com/actions/gh-actions-cache/) can be used to delete caches for specific branches. Caches have a [branch scope restriction](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache) in place. This means that if caches for a specific branch use a lot of storage quota, more frequently used caches from the `default` branch may get thrashed. For example, if many pull requests on a repository are creating caches, those caches cannot be used in the default branch scope but will still occupy a lot of space until they are cleaned up by the [eviction policy](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy). Sometimes we want to clean them up on a faster cadence to ensure the default branch is not thrashed. To achieve this, the [gh-actions-cache CLI](https://github.com/actions/gh-actions-cache/) can be used to delete caches for specific branches.
This workflow uses `gh-actions-cache` to delete all the caches created by a branch. This workflow uses `gh-actions-cache` to delete all the caches created by a branch.
@ -44,6 +54,11 @@ on:
jobs: jobs:
cleanup: cleanup:
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions:
# `actions:write` permission is required to delete caches
# See also: https://docs.github.com/en/rest/actions/cache?apiVersion=2022-11-28#delete-a-github-actions-cache-for-a-repository-using-a-cache-id
actions: write
contents: read
steps: steps:
- name: Check out code - name: Check out code
uses: actions/checkout@v3 uses: actions/checkout@v3
@ -69,4 +84,5 @@ jobs:
env: env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
``` ```
</details> </details>