Compare commits

...

34 Commits

Author SHA1 Message Date
fbcb66e70c release: cut the v10.1.2 release 2020-09-16 14:38:02 -07:00
3d94919800 refactor(dev-infra): refactor commit-message files (#38845)
Refactor the commit-message files to be consistent with how other ng-dev tooling
is structured.

PR Close #38845
2020-09-15 16:05:46 -07:00
af3b401e15 docs: add ngc to glossary (#36781)
The `ngc` Angular compiler was not mentioned in the glossary.
The glossary should contain the relevant Angular terms that are
otherwise hard to find, so this adds a short definition of
`ngc` to the glossary.

PR Close #36781
2020-09-15 11:29:56 -07:00
e4c12c8f9e test(ngcc): load standard files only once (#38840)
In the integration test suite of ngcc, we load a set of files from
`node_modules` into memory. This includes the `typescript` package and
`@angular` scoped packages, which account for a large number of large
files that need to be loaded from disk. This commit moves this work
to the top-level, such that it doesn't have to be repeated in all tests.

PR Close #38840
2020-09-15 11:23:28 -07:00
ea36466060 perf(ngcc): reduce maximum worker count (#38840)
Recent optimizations to ngcc have significantly reduced the total time
it takes to process `node_modules`, to such an extent that sharding across
multiple processes has become less effective. Previously, running
ngcc asynchronously would allow for up to 8 workers to be allocated,
however these workers have to repeat work that could otherwise be shared.
Because ngcc is now able to reuse more shared computation, the relative
overhead of multiple workers increases, making them less effective.
As an additional benefit, having fewer workers requires less memory and
less startup time.

To give an idea, using the following test setup:

```bash
npx @angular/cli new perf-test
cd perf-test
yarn ng add @angular/material
./node_modules/.bin/ngcc --properties es2015 module main \
  --first-only --create-ivy-entry-points
```

We observe the following figures on CI:

|                   | 10.1.1    | PR #38840 |
| ----------------- | --------- | --------- |
| Sync              | 85s       | 25s       |
| Async (8 workers) | 22s       | 16s       |
| Async (4 workers) | -         | 11s       |

In addition to changing the default number of workers, ngcc now honors
the environment variable `NGCC_MAX_WORKERS`, which can be set to either
reduce or increase the number of workers.
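
For example, to cap an asynchronous run at two workers in the test setup above (a usage sketch based on the environment variable named here; the worker count of two is arbitrary):

```bash
NGCC_MAX_WORKERS=2 ./node_modules/.bin/ngcc --properties es2015 module main \
  --first-only --create-ivy-entry-points
```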

PR Close #38840
2020-09-15 11:23:24 -07:00
58411e7ad9 perf(ngcc): introduce cache for sharing data across entry-points (#38840)
ngcc creates typically two `ts.Program` instances for each entry-point,
one for processing sources and another one for processing the typings.
The creation of these programs is somewhat expensive, as it concerns
module resolution and parsing of source files.

This commit implements several layers of caching to optimize the
creation of programs:

1. A shared module resolution cache across all entry-points within a
   single invocation of ngcc. Both the sources and typings program
   benefit from this cache.
2. Sharing the parsed `ts.SourceFile` for a single entry-point between
   the sources and typings program.
3. Sharing parsed `ts.SourceFile`s of TypeScript's default libraries
   across all entry-points within a single invocation. Some of these
   default library typings are large and therefore expensive to parse,
   so sharing the parsed source files across all entry-points offers
   a significant performance improvement.

Using a bare CLI app created using `ng new` + `ng add @angular/material`,
the above changes offer a 3-4x improvement in ngcc's processing time
when running synchronously and ~2x improvement for asynchronous runs.
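
As a rough illustration of the caching layers described above (a minimal sketch using TypeScript's public compiler APIs; the `EntryPointCache` name and structure are assumptions, not ngcc's actual implementation):

```ts
import * as ts from 'typescript';

// Minimal sketch of the shared caches described above; names and structure
// are illustrative, not ngcc's actual implementation.
class EntryPointCache {
  // (1) Module resolution cache shared across all entry-points.
  readonly moduleResolutionCache = ts.createModuleResolutionCache(
      process.cwd(), fileName => fileName);

  // (3) Parsed default-library files (lib.*.d.ts), shared across entry-points.
  private readonly defaultLibCache = new Map<string, ts.SourceFile>();

  getDefaultLib(fileName: string, text: string): ts.SourceFile {
    let sf = this.defaultLibCache.get(fileName);
    if (sf === undefined) {
      sf = ts.createSourceFile(fileName, text, ts.ScriptTarget.ES2015);
      this.defaultLibCache.set(fileName, sf);
    }
    return sf;
  }
}

// (2) Per entry-point, the parsed sources can likewise be kept in a Map so the
// typings program reuses the `ts.SourceFile`s created for the sources program.
```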

PR Close #38840
2020-09-15 11:23:18 -07:00
84bd1a233d feat(dev-infra): include CI status check in the caretaker check (#38779)
Add a CI status check in the ng-dev caretaker check command.

PR Close #38779
2020-09-15 08:45:06 -07:00
ef13d8f33a fix(dev-infra): remove ANSI escape codes from log file outputs (#38792)
Remove the ANSI codes from the log file outputs to make the ng-dev log files
more readable.

PR Close #38792
2020-09-15 08:44:04 -07:00
dc4f85888e build: create temporary script for symbol extractor tests (#38819)
Creates a temporary script to run all symbol extractor tests.

PR Close #38819
2020-09-14 16:54:45 -07:00
2bdfb14be0 build: add tag to symbol-extractor tests (#38819)
Add a tag to symbol-extractor tests to allow for bazel querying.

PR Close #38819
2020-09-14 16:54:42 -07:00
26deef2d3e feat(dev-infra): allow local ng-dev configuration to error on invalid commit messages (#38784)
As part of the commit message conformance check, local commit message checks
now produce warnings rather than failures. An additional local configuration
option allows commit message validation failures to be treated as errors
instead.
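
For illustration only (the `.ng-dev.user.ts` file name and the export shape are assumptions; the `errorOnInvalidMessage` flag matches the option read via `getUserConfig()` later in this diff):

```ts
// Hypothetical local user configuration sketch -- file name and export shape
// are assumptions; the flag corresponds to
// getUserConfig().commitMessage?.errorOnInvalidMessage as used by the
// pre-commit-validate command.
export const commitMessage = {
  errorOnInvalidMessage: true,
};
```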

PR Close #38784
2020-09-14 14:34:55 -07:00
f52a494248 docs(elements): convert the ng add command to code-example (#38834)
The command was hardly visible as plain text;
changing it to a shell code-example for visibility.

PR Close #38834
2020-09-14 14:31:10 -07:00
ebede67433 perf(compiler-cli): optimize computation of type-check scope information (#38843)
When type-checking a component, the declaring NgModule scope is used
to create a directive matcher that contains flattened directive metadata,
i.e. the metadata of a directive and its base classes. This computation
is done for all components, whereas the type-check scope is constant per
NgModule. Additionally, the flattening of metadata is constant per
directive instance, so it doesn't necessarily have to be recomputed for
each component.

This commit introduces a `TypeCheckScopes` class that is responsible
for flattening directives and computing the scope per NgModule. It
caches the computed results as appropriate to avoid repeated computation.
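
A minimal sketch of the per-NgModule caching idea (the `TypeCheckScopes` name comes from the commit message, but the field types and method signatures here are simplified assumptions, not the compiler's real interfaces):

```ts
// Simplified sketch of per-NgModule scope caching; the real TypeCheckScopes
// class in the compiler operates on richer metadata types.
interface FlattenedDirectiveMeta {
  name: string;
  inputs: string[];
  outputs: string[];
}

class TypeCheckScopes {
  // Scope computed once per NgModule, reused for every component it declares.
  private readonly scopeCache = new Map<string, FlattenedDirectiveMeta[]>();
  // Flattened metadata computed once per directive (including base classes).
  private readonly flattenCache = new Map<string, FlattenedDirectiveMeta>();

  getScope(ngModule: string, directivesInScope: string[]): FlattenedDirectiveMeta[] {
    let scope = this.scopeCache.get(ngModule);
    if (scope === undefined) {
      scope = directivesInScope.map(dir => this.flatten(dir));
      this.scopeCache.set(ngModule, scope);
    }
    return scope;
  }

  private flatten(directive: string): FlattenedDirectiveMeta {
    let meta = this.flattenCache.get(directive);
    if (meta === undefined) {
      // In the real compiler this walks the directive and its base classes.
      meta = {name: directive, inputs: [], outputs: []};
      this.flattenCache.set(directive, meta);
    }
    return meta;
  }
}
```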

PR Close #38843
2020-09-14 13:07:08 -07:00
565840515c perf(compiler-cli): only emit directive/pipe references that are used (#38843)
For the compilation of a component, the compiler has to prepare some
information about the directives and pipes that are used in the template.
This information includes an expression for directives/pipes, for usage
within the compilation output. For large NgModule compilation scopes
this has been shown to introduce a performance hotspot, as the generation of
expressions is quite expensive. This commit reduces the performance
overhead by only generating expressions for the directives/pipes that
are actually used within the template, significantly cutting down on
the compiler's resolve phase.
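
A rough sketch of the idea (illustrative only; the `ScopeMember` shape and the `usedInTemplate` set are assumptions, and the real compiler works with `R3TargetBinder` results rather than plain string sets):

```ts
// Illustrative sketch: generate reference expressions only for directives and
// pipes that the template actually uses, instead of the whole NgModule scope.
interface ScopeMember {
  name: string;
  generateExpression(): string;
}

function emitUsedReferences(
    scope: ScopeMember[], usedInTemplate: Set<string>): string[] {
  return scope
      .filter(member => usedInTemplate.has(member.name))
      .map(member => member.generateExpression());  // expensive step, now limited
}
```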

PR Close #38843
2020-09-14 13:07:03 -07:00
c62d1cb80a refactor(dev-infra): do not validate config file multiple times (#38808)
Currently we validate the configuration file on each `getConfig`
invocation. We only need to validate it once, since the configuration
is cached.

While at it, this also renames the cache variables to lower-case, as they
do not represent constants (which are upper-case by convention).

PR Close #38808
2020-09-14 08:34:19 -07:00
aa43cbf8c5 fix(dev-infra): correct build setup for dev-infra (#38815)
Corrects the missing and incorrect build dependencies for the
dev-infra Bazel targets.

PR Close #38815
2020-09-11 13:59:35 -07:00
b05d79d14a build(router): update symbols for routing app (#38817)
This commit updates the golden symbol files for the routing app.

PR Close #38817
2020-09-11 13:20:39 -07:00
04c2bb9580 build: create sample router app (#38714)
This commit creates a sample router test application to introduce the
symbol tests. It serves as a guard to ensure that any future work on the
router package does not unintentionally increase the payload size.

PR Close #38714
2020-09-11 12:10:49 -07:00
ec2dbe7fb4 fix(compiler): detect pipes in ICUs in template binder (#38810)
Recent work on compiler internals in #38539 led to an unexpected failure,
where a pipe used exclusively inside of an ICU would no longer be
emitted into the compilation output. This caused runtime errors due to
missing pipes.

The issue occurred because the change in #38539 would determine the set
of used pipes up-front, independent from the template compilation using
the `R3TargetBinder`. However, `R3TargetBinder` did not consider
expressions within ICUs, so any pipe usages within those expressions
would not be detected. This fix unblocks #38539 and also concerns
upcoming linker work, given that prelink compilations would not go
through full template compilation but only `R3TargetBinder`.
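
For illustration, a template of the following shape (a hypothetical component, not taken from the affected code) uses the `uppercase` pipe only inside an ICU expression, which is exactly the pattern the binder previously missed:

```ts
import {Component} from '@angular/core';

// Hypothetical component: the `uppercase` pipe (provided by CommonModule)
// appears only inside the ICU, so a binder that skips ICU expressions would
// not record the pipe as used.
@Component({
  selector: 'app-inbox-count',
  template: `
    <span i18n>
      {count, plural,
        =0 {no messages}
        other {{{label | uppercase}}: {{count}} messages}
      }
    </span>
  `,
})
export class InboxCountComponent {
  count = 3;
  label = 'inbox';
}
```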

PR Close #38810
2020-09-11 12:07:43 -07:00
8096c63c66 docs: clarify what fixes are merged to LTS versions (#38788)
PR Close #38788
2020-09-11 08:45:46 -07:00
6ff9e6e2bd refactor(dev-infra): add default params to runBenchmark (#38748)
* Use '' as the default for 'url'
* Use [] as the default for 'params'
* Use true as the default for 'ignoreBrowserSynchronization'

PR Close #38748
2020-09-11 08:44:38 -07:00
31d0ee4cbf docs: Move Displaying data topic and ToH tutorial (#38774)
Move the "Displaying data topic" into the Tutorials section.
Move the ToH tutorial to the top of the tutorials section.

PR Close #38774
2020-09-10 14:32:28 -07:00
a47383d1e8 fix(localize): ensure that formatOptions is optional (#38787)
Some lower-level APIs are used by the CLI, and requiring
the `formatOptions` argument at that level would be a
breaking change. This commit makes the argument optional
at every level to avoid the breaking change.

PR Close #38787
2020-09-10 10:55:08 -07:00
9078187378 docs: add note about only one label target being allowed to be applied to a PR (#38793)
To clarify the use of GitHub labels for targeting PRs, a note is added about
only one target label being allowed per PR.

PR Close #38793
2020-09-10 09:41:06 -07:00
dcb473db34 docs: update docs to reflect new PR targeting methods for release trains (#38401) (#38793)
As part of the migration to a common strategy/method for branching and releasing across
the main angular repositories, updates need to be made to the documentation. These changes
reflect the updates made and are based on the following document which describes the
merging label expectations: https://docs.google.com/document/d/197kVillDwx-RZtSVOBtPb4BBIAw0E9RT3q3v6DZkykU

PR Close #38401

PR Close #38793
2020-09-10 09:41:00 -07:00
edb7f90363 fix(core): clear the RefreshTransplantedView when detached (#38768)
The `RefreshTransplantedView` flag is used to indicate that the view or one of its children
is transplanted and dirty, so it should still be refreshed as part of change detection.
This flag is set on the transplanted view itself, and a counter is also set
on its parents.
When a transplanted view is detached and still has this flag, it means
it got detached before it was refreshed. This can happen for "backwards
references" or transplanted views that are inserted at a location that
was already checked. In this case, we should decrement the parent
counters _and_ clear the flag on the detached view so it's not seen as
"transplanted" anymore (it is detached and has no parent counters to
adjust).

fixes #38619

PR Close #38768
2020-09-10 09:11:41 -07:00
9c51ba321e fix(router): Ensure routes are processed in priority order and only if needed (#38780)
There is a slight difference between `map`...`concatAll` and `concatMap`
in that the latter (`concatMap`) will ensure that the computations are
executed in order and only if needed, while the former may execute the
`map` body for all items if they do not emit immediately. That is, if the stream is
`from([a, b, c]).pipe(map(v => of(v).pipe(delay(1))), concatAll(), first())`
the `map` body will execute for all of `a`, `b`, and `c`.
However, the following will only execute the `concatMap` body for `a`
`from([a, b, c]).pipe(concatMap(v => of(v).pipe(delay(1))), first())`

See https://stackblitz.com/edit/rxjs-cvwxyx
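
To make the behavioral difference concrete, a small runnable RxJS sketch (plain RxJS, separate from the router code itself):

```ts
import {from, of} from 'rxjs';
import {concatAll, concatMap, delay, first, map} from 'rxjs/operators';

// With map + concatAll, the map projection runs eagerly for a, b and c,
// because `from` emits all values synchronously before concatAll drains them.
from(['a', 'b', 'c'])
  .pipe(
    map(v => {
      console.log(`map body: ${v}`);        // logs a, b and c
      return of(v).pipe(delay(1));
    }),
    concatAll(),
    first(),
  )
  .subscribe(v => console.log(`result: ${v}`));  // result: a

// With concatMap, the projection runs lazily: only for a, because first()
// unsubscribes before the buffered b and c are ever needed.
from(['a', 'b', 'c'])
  .pipe(
    concatMap(v => {
      console.log(`concatMap body: ${v}`);  // logs only a
      return of(v).pipe(delay(1));
    }),
    first(),
  )
  .subscribe(v => console.log(`result: ${v}`));  // result: a
```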

fixes #38691

PR Close #38780
2020-09-10 08:54:39 -07:00
d8714d045d refactor(core): _reset() remove nextRecord (#38752)
The `nextRecord` is not necessary, so remove it and use `record._nextMoved` to iterate.

PR Close #38752
2020-09-10 08:52:54 -07:00
5de2ac3e1b fix(upgrade): add try/catch when downgrading injectables (#38671)
This commit improves the error thrown by the downgrade module with a more
descriptive message about why the downgrade is failing.

Closes #37579

PR Close #38671
2020-09-10 08:50:18 -07:00
7669bd856f feat(dev-infra): Allow local user ng-dev configuration to disable commit message wizard (#38701)
Not all users will want to rely on the commit message wizard, particularly
contributors who contribute consistently and already have a deep understanding
of our commit message guidelines. We therefore allow a user to opt out of using
the wizard during commit message creation.

PR Close #38701
2020-09-09 16:31:17 -07:00
18d911d807 feat(dev-infra): Add support for local user ng-dev configuration (#38701)
Create a utility for loading a local user configuration object to describe
local configuration values, such as skipping the commit message wizard.

PR Close #38701
2020-09-09 16:31:17 -07:00
38ff66dc32 docs: put docs style guide back in left nav (#38683)
PR Close #38683
2020-09-09 16:22:58 -07:00
5672aba2f9 refactor(dev-infra): update commit message validation to return validation result (#38703)
Previously, the validateCommitMessage function ran validation and logged the results.
The validateCommitMessage function now returns an object containing the validation
results and the cli action functions are instead responsible for logging the results.

This is being done as a prefactor for a change which allows for commit message
hook validation to be either a blocking error or a warning.

PR Close #38703
2020-09-09 16:22:36 -07:00
5567bdc48e test(language-service): [ivy] remove all markers from test (#38777)
In the test project there are no longer reference markers and location
markers, so there's no need to "pre-process" the source files to remove
them. This will make the Ivy tests cleaner and faster.

PR Close #38777
2020-09-09 16:21:56 -07:00
74 changed files with 4135 additions and 683 deletions

3
.gitignore vendored
View File

@ -40,6 +40,9 @@ yarn-error.log
# User specific bazel settings
.bazelrc.user
# User specific ng-dev settings
.ng-dev.user*
.notes.md
baseline.json

View File

@ -1,3 +1,25 @@
<a name="10.1.2"></a>
## 10.1.2 (2020-09-16)
### Bug Fixes
* **compiler:** detect pipes in ICUs in template binder ([#38810](https://github.com/angular/angular/issues/38810)) ([ec2dbe7](https://github.com/angular/angular/commit/ec2dbe7)), closes [#38539](https://github.com/angular/angular/issues/38539) [#38539](https://github.com/angular/angular/issues/38539) [#38539](https://github.com/angular/angular/issues/38539)
* **core:** clear the `RefreshTransplantedView` when detached ([#38768](https://github.com/angular/angular/issues/38768)) ([edb7f90](https://github.com/angular/angular/commit/edb7f90)), closes [#38619](https://github.com/angular/angular/issues/38619)
* **localize:** ensure that `formatOptions` is optional ([#38787](https://github.com/angular/angular/issues/38787)) ([a47383d](https://github.com/angular/angular/commit/a47383d))
* **router:** Ensure routes are processed in priority order and only if needed ([#38780](https://github.com/angular/angular/issues/38780)) ([9c51ba3](https://github.com/angular/angular/commit/9c51ba3)), closes [#38691](https://github.com/angular/angular/issues/38691)
* **upgrade:** add try/catch when downgrading injectables ([#38671](https://github.com/angular/angular/issues/38671)) ([5de2ac3](https://github.com/angular/angular/commit/5de2ac3)), closes [#37579](https://github.com/angular/angular/issues/37579)
### Performance Improvements
* **compiler-cli:** only emit directive/pipe references that are used ([#38843](https://github.com/angular/angular/issues/38843)) ([5658405](https://github.com/angular/angular/commit/5658405))
* **compiler-cli:** optimize computation of type-check scope information ([#38843](https://github.com/angular/angular/issues/38843)) ([ebede67](https://github.com/angular/angular/commit/ebede67))
* **ngcc:** introduce cache for sharing data across entry-points ([#38840](https://github.com/angular/angular/issues/38840)) ([58411e7](https://github.com/angular/angular/commit/58411e7))
* **ngcc:** reduce maximum worker count ([#38840](https://github.com/angular/angular/issues/38840)) ([ea36466](https://github.com/angular/angular/commit/ea36466))
<a name="10.1.1"></a>
## 10.1.1 (2020-09-09)

View File

@ -119,7 +119,14 @@ The recently-developed [custom elements](https://developer.mozilla.org/en-US/doc
In browsers that support Custom Elements natively, the specification requires developers use ES2015 classes to define Custom Elements - developers can opt-in to this by setting the `target: "es2015"` property in their project's [TypeScript configuration file](/guide/typescript-configuration). As Custom Element and ES2015 support may not be available in all browsers, developers can instead choose to use a polyfill to support older browsers and ES5 code.
Use the [Angular CLI](cli) to automatically set up your project with the correct polyfill: `ng add @angular/elements --project=*your_project_name*`.
Use the [Angular CLI](cli) to automatically set up your project with the correct polyfill:
<code-example language="sh">
ng add @angular/elements --project=*your_project_name*
</code-example>
- For more information about polyfills, see [polyfill documentation](https://www.webcomponents.org/polyfills).
- For more information about Angular browser support, see [Browser Support](guide/browser-support).

View File

@ -627,6 +627,11 @@ The [npm package manager](https://docs.npmjs.com/getting-started/what-is-npm) is
Learn more about how Angular uses [Npm Packages](guide/npm-packages).
{@ ngc}
## ngc
`ngc` is a TypeScript-to-JavaScript transpiler that processes Angular decorators, metadata, and templates, and emits JavaScript code.
The most recent implementation is internally referred to as `ngtsc` because it's a minimalistic wrapper around the TypeScript compiler `tsc` that adds a transform for processing Angular code.
{@a O}
{@a observable}

View File

@ -94,7 +94,7 @@ All of our major releases are supported for 18 months.
* 6 months of *active support*, during which regularly-scheduled updates and patches are released.
* 12 months of *long-term support (LTS)*, during which only critical fixes and security patches are released.
* 12 months of *long-term support (LTS)*, during which only [critical fixes and security patches](#lts-fixes) are released.
The following table provides the status for Angular versions under support.
@ -107,6 +107,13 @@ Version | Status | Released | Active Ends | LTS Ends
Angular versions ^4.0.0, ^5.0.0, ^6.0.0 and ^7.0.0 are no longer under support.
### LTS fixes
As a general rule, a fix is considered for an LTS version if it resolves one of:
* a newly identified security vulnerability,
* a regression, since the start of LTS, caused by a 3rd party change, such as a new browser version.
{@a deprecation}
## Deprecation practices

View File

@ -101,11 +101,6 @@
"title": "Components",
"tooltip": "Building dynamic views with data binding",
"children": [
{
"url": "guide/displaying-data",
"title": "Data binding",
"tooltip": "Property binding helps show app data in the UI."
},
{
"url": "guide/user-input",
"title": "User Input",
@ -542,27 +537,6 @@
"title": "Tutorials",
"tooltip": "End-to-end tutorials for learning Angular concepts and patterns.",
"children": [
{
"title": "Routing",
"tooltip": "End-to-end tutorials for learning about Angular's router.",
"children": [
{
"url": "guide/router-tutorial",
"title": "Using Angular Routes in a Single-page Application",
"tooltip": "A tutorial that covers many patterns associated with Angular routing."
},
{
"url": "guide/router-tutorial-toh",
"title": "Router tutorial: tour of heroes",
"tooltip": "Explore how to use Angular's router. Based on the Tour of Heroes example."
}
]
},
{
"url": "guide/forms",
"title": "Building a Template-driven Form",
"tooltip": "Create a template-driven form using directives and Angular template syntax."
},
{
"title": "Tutorial: Tour of Heroes",
"tooltip": "The Tour of Heroes app is used as a reference point in many Angular examples.",
@ -609,6 +583,32 @@
}
]
},
{
"title": "Routing",
"tooltip": "End-to-end tutorials for learning about Angular's router.",
"children": [
{
"url": "guide/router-tutorial",
"title": "Using Angular Routes in a Single-page Application",
"tooltip": "A tutorial that covers many patterns associated with Angular routing."
},
{
"url": "guide/router-tutorial-toh",
"title": "Router tutorial: tour of heroes",
"tooltip": "Explore how to use Angular's router. Based on the Tour of Heroes example."
}
]
},
{
"url": "guide/forms",
"title": "Building a Template-driven Form",
"tooltip": "Create a template-driven form using directives and Angular template syntax."
},
{
"url": "guide/displaying-data",
"title": "Data binding",
"tooltip": "Property binding helps show app data in the UI."
},
{
"url": "guide/web-worker",
"title": "Web Workers",
@ -954,6 +954,11 @@
"url": "guide/styleguide",
"title": "Coding Style Guide",
"tooltip": "Guidelines for writing Angular code."
},
{
"url": "guide/docs-style-guide",
"title": "Documentation Style Guide",
"tooltip": "Style guide for documentation authors."
}
]
}

View File

@ -23,27 +23,36 @@ const globalOptions = {
const runner = createBenchpressRunner();
export async function runBenchmark(config: {
export async function runBenchmark({
id,
url = '',
params = [],
ignoreBrowserSynchronization = true,
microMetrics,
work,
prepare,
setup,
}: {
id: string,
url: string,
params: {name: string, value: any}[],
ignoreBrowserSynchronization?: boolean,
microMetrics?: {[key: string]: string},
work?: () => void,
prepare?: () => void,
setup?: () => void
work?: (() => void)|(() => Promise<unknown>),
prepare?: (() => void)|(() => Promise<unknown>),
setup?: (() => void)|(() => Promise<unknown>),
}): Promise<any> {
openBrowser(config);
if (config.setup) {
await config.setup();
openBrowser({url, params, ignoreBrowserSynchronization});
if (setup) {
await setup();
}
const description: {[key: string]: any} = {};
config.params.forEach((param) => description[param.name] = param.value);
params.forEach((param) => description[param.name] = param.value);
return runner.sample({
id: config.id,
execute: config.work,
prepare: config.prepare,
microMetrics: config.microMetrics,
id,
execute: work,
prepare,
microMetrics,
providers: [{provide: Options.SAMPLE_DESCRIPTION, useValue: {}}]
});
}

View File

@ -2,25 +2,20 @@ load("@npm_bazel_typescript//:index.bzl", "ts_library")
ts_library(
name = "caretaker",
srcs = [
"cli.ts",
],
srcs = glob([
"**/*.ts",
]),
module_name = "@angular/dev-infra-private/caretaker",
visibility = ["//dev-infra:__subpackages__"],
deps = [
"//dev-infra/caretaker/check",
"//dev-infra/utils",
"@npm//@types/node",
"@npm//@types/node-fetch",
"@npm//@types/yargs",
"@npm//multimatch",
"@npm//node-fetch",
"@npm//typed-graphqlify",
"@npm//yaml",
"@npm//yargs",
],
)
ts_library(
name = "config",
srcs = [
"config.ts",
],
visibility = ["//dev-infra:__subpackages__"],
deps = [
"//dev-infra/utils",
],
)

View File

@ -1,21 +0,0 @@
load("@npm_bazel_typescript//:index.bzl", "ts_library")
ts_library(
name = "check",
srcs = glob(["*.ts"]),
module_name = "@angular/dev-infra-private/caretaker/service-statuses",
visibility = ["//dev-infra:__subpackages__"],
deps = [
"//dev-infra/caretaker:config",
"//dev-infra/utils",
"@npm//@types/fs-extra",
"@npm//@types/node",
"@npm//@types/node-fetch",
"@npm//@types/yargs",
"@npm//multimatch",
"@npm//node-fetch",
"@npm//typed-graphqlify",
"@npm//yaml",
"@npm//yargs",
],
)

View File

@ -9,6 +9,7 @@
import {GitClient} from '../../utils/git';
import {getCaretakerConfig} from '../config';
import {printCiStatus} from './ci';
import {printG3Comparison} from './g3';
import {printGithubTasks} from './github';
import {printServiceStatuses} from './services';
@ -21,7 +22,9 @@ export async function checkServiceStatuses(githubToken: string) {
/** The GitClient for interacting with git and Github. */
const git = new GitClient(githubToken, config);
// TODO(josephperrott): Allow these checks to be loaded in parallel.
await printServiceStatuses();
await printGithubTasks(git, config.caretaker);
await printG3Comparison(git);
await printCiStatus(git);
}

View File

@ -0,0 +1,59 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import fetch from 'node-fetch';
import {bold, green, info, red} from '../../utils/console';
import {GitClient} from '../../utils/git';
/** The results of checking the status of CI. */
interface StatusCheckResult {
status: 'success'|'failed'|'canceled'|'infrastructure_fail'|'timedout'|'failed'|'no_tests';
timestamp: Date;
buildUrl: string;
}
/** Retrieve and log status of CI for the project. */
export async function printCiStatus(git: GitClient) {
info.group(bold(`CI`));
// TODO(josephperrott): Expand list of branches checked to all active branches.
await printStatus(git, 'master');
info.groupEnd();
info();
}
/** Log the status of CI for a given branch to the console. */
async function printStatus(git: GitClient, branch: string) {
const result = await getStatusOfBranch(git, branch);
const branchName = branch.padEnd(10);
if (result === null) {
info(`${branchName} was not found on CircleCI`);
} else if (result.status === 'success') {
info(`${branchName}`);
} else {
info(`${branchName} ❌ (Ran at: ${result.timestamp.toLocaleString()})`);
}
}
/** Get the CI status of a given branch from CircleCI. */
async function getStatusOfBranch(git: GitClient, branch: string): Promise<StatusCheckResult|null> {
const {owner, name} = git.remoteConfig;
const url = `https://circleci.com/api/v1.1/project/gh/${owner}/${name}/tree/${
branch}?limit=1&filter=completed&shallow=true`;
const result = (await fetch(url).then(result => result.json()))?.[0];
if (result) {
return {
status: result.outcome,
timestamp: new Date(result.stop_time),
buildUrl: result.build_url
};
}
return null;
}

View File

@ -6,7 +6,7 @@
* found in the LICENSE file at https://angular.io/license
*/
import {existsSync, readFileSync} from 'fs-extra';
import {existsSync, readFileSync} from 'fs';
import * as multimatch from 'multimatch';
import {join} from 'path';
import {parse as parseYaml} from 'yaml';

View File

@ -3,18 +3,10 @@ load("@npm_bazel_typescript//:index.bzl", "ts_library")
ts_library(
name = "commit-message",
srcs = [
"builder.ts",
"cli.ts",
"commit-message-draft.ts",
"config.ts",
"parse.ts",
"restore-commit-message.ts",
"validate.ts",
"validate-file.ts",
"validate-range.ts",
"wizard.ts",
],
srcs = glob(
["**/*.ts"],
exclude = ["**/*.spec.ts"],
),
module_name = "@angular/dev-infra-private/commit-message",
visibility = ["//dev-infra:__subpackages__"],
deps = [
@ -32,11 +24,7 @@ ts_library(
ts_library(
name = "test_lib",
testonly = True,
srcs = [
"builder.spec.ts",
"parse.spec.ts",
"validate.spec.ts",
],
srcs = glob(["**/*.spec.ts"]),
deps = [
":commit-message",
"//dev-infra/utils",

View File

@ -7,104 +7,19 @@
*/
import * as yargs from 'yargs';
import {info} from '../utils/console';
import {restoreCommitMessage} from './restore-commit-message';
import {validateFile} from './validate-file';
import {validateCommitRange} from './validate-range';
import {runWizard} from './wizard';
import {RestoreCommitMessageModule} from './restore-commit-message/cli';
import {ValidateFileModule} from './validate-file/cli';
import {ValidateRangeModule} from './validate-range/cli';
import {WizardModule} from './wizard/cli';
/** Build the parser for the commit-message commands. */
export function buildCommitMessageParser(localYargs: yargs.Argv) {
return localYargs.help()
.strict()
.command(
'restore-commit-message-draft', false,
args => {
return args.option('file-env-variable', {
type: 'string',
array: true,
conflicts: ['file'],
required: true,
description:
'The key for the environment variable which holds the arguments for the\n' +
'prepare-commit-msg hook as described here:\n' +
'https://git-scm.com/docs/githooks#_prepare_commit_msg',
coerce: arg => {
const [file, source] = (process.env[arg] || '').split(' ');
if (!file) {
throw new Error(`Provided environment variable "${arg}" was not found.`);
}
return [file, source];
},
});
},
args => {
restoreCommitMessage(args['file-env-variable'][0], args['file-env-variable'][1] as any);
})
.command(
'wizard <filePath> [source] [commitSha]', '', ((args: any) => {
return args
.positional(
'filePath',
{description: 'The file path to write the generated commit message into'})
.positional('source', {
choices: ['message', 'template', 'merge', 'squash', 'commit'],
description: 'The source of the commit message as described here: ' +
'https://git-scm.com/docs/githooks#_prepare_commit_msg'
})
.positional(
'commitSha', {description: 'The commit sha if source is set to `commit`'});
}),
async (args: any) => {
await runWizard(args);
})
.command(
'pre-commit-validate', 'Validate the most recent commit message', {
'file': {
type: 'string',
conflicts: ['file-env-variable'],
description: 'The path of the commit message file.',
},
'file-env-variable': {
type: 'string',
conflicts: ['file'],
description:
'The key of the environment variable for the path of the commit message file.',
coerce: arg => {
const file = process.env[arg];
if (!file) {
throw new Error(`Provided environment variable "${arg}" was not found.`);
}
return file;
},
}
},
args => {
const file = args.file || args['file-env-variable'] || '.git/COMMIT_EDITMSG';
validateFile(file);
})
.command(
'validate-range', 'Validate a range of commit messages', {
'range': {
description: 'The range of commits to check, e.g. --range abc123..xyz456',
demandOption: ' A range must be provided, e.g. --range abc123..xyz456',
type: 'string',
requiresArg: true,
},
},
argv => {
// If on CI, and not pull request number is provided, assume the branch
// being run on is an upstream branch.
if (process.env['CI'] && process.env['CI_PULL_REQUEST'] === 'false') {
info(`Since valid commit messages are enforced by PR linting on CI, we do not`);
info(`need to validate commit messages on CI runs on upstream branches.`);
info();
info(`Skipping check of provided commit range`);
return;
}
validateCommitRange(argv.range);
});
.command(RestoreCommitMessageModule)
.command(WizardModule)
.command(ValidateFileModule)
.command(ValidateRangeModule);
}
if (require.main == module) {

View File

@ -0,0 +1,13 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
/**
* The source triggering the git commit message creation.
* As described in: https://git-scm.com/docs/githooks#_prepare_commit_msg
*/
export type CommitMsgSource = 'message'|'template'|'merge'|'squash'|'commit';

View File

@ -8,6 +8,7 @@
import {assertNoErrors, getConfig, NgDevConfig} from '../utils/config';
/** Configuration for commit-message commands. */
export interface CommitMessageConfig {
maxLineLength: number;
minBodyLength: number;

View File

@ -0,0 +1,51 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {Arguments, Argv, CommandModule} from 'yargs';
import {CommitMsgSource} from '../commit-message-source';
import {restoreCommitMessage} from './restore-commit-message';
export interface RestoreCommitMessageOptions {
fileEnvVariable: string[];
}
/** Builds the command. */
function builder(yargs: Argv) {
return yargs.option('file-env-variable' as 'fileEnvVariable', {
type: 'string',
array: true,
demandOption: true,
description: 'The key for the environment variable which holds the arguments for the\n' +
'prepare-commit-msg hook as described here:\n' +
'https://git-scm.com/docs/githooks#_prepare_commit_msg',
coerce: arg => {
const [file, source] = (process.env[arg] || '').split(' ');
if (!file) {
throw new Error(`Provided environment variable "${arg}" was not found.`);
}
return [file, source];
},
});
}
/** Handles the command. */
async function handler({fileEnvVariable}: Arguments<RestoreCommitMessageOptions>) {
restoreCommitMessage(fileEnvVariable[0], fileEnvVariable[1] as CommitMsgSource);
}
/** yargs command module describing the command. */
export const RestoreCommitMessageModule: CommandModule<{}, RestoreCommitMessageOptions> = {
handler,
builder,
command: 'restore-commit-message-draft',
// Description: Restore a commit message draft if one has been saved from a failed commit attempt.
// No describe is defined to hide the command from the --help.
describe: false,
};

View File

@ -8,9 +8,10 @@
import {writeFileSync} from 'fs';
import {debug, log} from '../utils/console';
import {debug, log} from '../../utils/console';
import {loadCommitMessageDraft} from './commit-message-draft';
import {loadCommitMessageDraft} from '../commit-message-draft';
import {CommitMsgSource} from '../commit-message-source';
/**
* Restore the commit message draft to the git to be used as the default commit message.
@ -18,8 +19,7 @@ import {loadCommitMessageDraft} from './commit-message-draft';
* The source provided may be one of the sources described in
* https://git-scm.com/docs/githooks#_prepare_commit_msg
*/
export function restoreCommitMessage(
filePath: string, source?: 'message'|'template'|'squash'|'commit') {
export function restoreCommitMessage(filePath: string, source?: CommitMsgSource) {
if (!!source) {
log('Skipping commit message restoration attempt');
if (source === 'message') {

View File

@ -1,30 +0,0 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {readFileSync} from 'fs';
import {resolve} from 'path';
import {getRepoBaseDir} from '../utils/config';
import {info} from '../utils/console';
import {deleteCommitMessageDraft, saveCommitMessageDraft} from './commit-message-draft';
import {validateCommitMessage} from './validate';
/** Validate commit message at the provided file path. */
export function validateFile(filePath: string) {
const commitMessage = readFileSync(resolve(getRepoBaseDir(), filePath), 'utf8');
if (validateCommitMessage(commitMessage)) {
info('√ Valid commit message');
deleteCommitMessageDraft(filePath);
return;
}
// On all invalid commit messages, the commit message should be saved as a draft to be
// restored on the next commit attempt.
saveCommitMessageDraft(filePath, commitMessage);
// If the validation did not return true, exit as a failure.
process.exit(1);
}

View File

@ -0,0 +1,62 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {Arguments, Argv, CommandModule} from 'yargs';
import {getUserConfig} from '../../utils/config';
import {validateFile} from './validate-file';
export interface ValidateFileOptions {
file?: string;
fileEnvVariable?: string;
error: boolean;
}
/** Builds the command. */
function builder(yargs: Argv) {
return yargs
.option('file', {
type: 'string',
conflicts: ['file-env-variable'],
description: 'The path of the commit message file.',
})
.option('file-env-variable' as 'fileEnvVariable', {
type: 'string',
conflicts: ['file'],
description: 'The key of the environment variable for the path of the commit message file.',
coerce: (arg: string) => {
const file = process.env[arg];
if (!file) {
throw new Error(`Provided environment variable "${arg}" was not found.`);
}
return file;
},
})
.option('error', {
type: 'boolean',
description:
'Whether invalid commit messages should be treated as failures rather than a warning',
default: !!getUserConfig().commitMessage?.errorOnInvalidMessage || !!process.env['CI']
});
}
/** Handles the command. */
async function handler({error, file, fileEnvVariable}: Arguments<ValidateFileOptions>) {
const filePath = file || fileEnvVariable || '.git/COMMIT_EDITMSG';
validateFile(filePath, error);
}
/** yargs command module describing the command. */
export const ValidateFileModule: CommandModule<{}, ValidateFileOptions> = {
handler,
builder,
command: 'pre-commit-validate',
describe: 'Validate the most recent commit message',
};

View File

@ -0,0 +1,47 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {readFileSync} from 'fs';
import {resolve} from 'path';
import {getRepoBaseDir} from '../../utils/config';
import {error, green, info, log, red, yellow} from '../../utils/console';
import {deleteCommitMessageDraft, saveCommitMessageDraft} from '../commit-message-draft';
import {printValidationErrors, validateCommitMessage} from '../validate';
/** Validate commit message at the provided file path. */
export function validateFile(filePath: string, isErrorMode: boolean) {
const commitMessage = readFileSync(resolve(getRepoBaseDir(), filePath), 'utf8');
const {valid, errors} = validateCommitMessage(commitMessage);
if (valid) {
info(`${green('√')} Valid commit message`);
deleteCommitMessageDraft(filePath);
process.exitCode = 0;
return;
}
/** Function used to print to the console log. */
let printFn = isErrorMode ? error : log;
printFn(`${isErrorMode ? red('✘') : yellow('!')} Invalid commit message`);
printValidationErrors(errors, printFn);
if (isErrorMode) {
printFn(red('Aborting commit attempt due to invalid commit message.'));
printFn(
red('Commit message aborted as failure rather than warning due to local configuration.'));
} else {
printFn(yellow('Before this commit can be merged into the upstream repository, it must be'));
printFn(yellow('amended to follow commit message guidelines.'));
}
// On all invalid commit messages, the commit message should be saved as a draft to be
// restored on the next commit attempt.
saveCommitMessageDraft(filePath, commitMessage);
// Set the correct exit code based on if invalid commit message is an error.
process.exitCode = isErrorMode ? 1 : 0;
}

View File

@ -0,0 +1,50 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {Arguments, Argv, CommandModule} from 'yargs';
import {info} from '../../utils/console';
import {validateCommitRange} from './validate-range';
export interface ValidateRangeOptions {
range: string;
}
/** Builds the command. */
function builder(yargs: Argv) {
return yargs.option('range', {
description: 'The range of commits to check, e.g. --range abc123..xyz456',
demandOption: ' A range must be provided, e.g. --range abc123..xyz456',
type: 'string',
requiresArg: true,
});
}
/** Handles the command. */
async function handler({range}: Arguments<ValidateRangeOptions>) {
// If on CI, and no pull request number is provided, assume the branch
// being run on is an upstream branch.
if (process.env['CI'] && process.env['CI_PULL_REQUEST'] === 'false') {
info(`Since valid commit messages are enforced by PR linting on CI, we do not`);
info(`need to validate commit messages on CI runs on upstream branches.`);
info();
info(`Skipping check of provided commit range`);
return;
}
validateCommitRange(range);
}
/** yargs command module describing the command. */
export const ValidateRangeModule: CommandModule<{}, ValidateRangeOptions> = {
handler,
builder,
command: 'validate-range',
describe: 'Validate a range of commit messages',
};

View File

@ -5,11 +5,11 @@
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {info} from '../utils/console';
import {exec} from '../utils/shelljs';
import {error, info} from '../../utils/console';
import {exec} from '../../utils/shelljs';
import {parseCommitMessage} from './parse';
import {validateCommitMessage, ValidateCommitMessageOptions} from './validate';
import {parseCommitMessage} from '../parse';
import {printValidationErrors, validateCommitMessage, ValidateCommitMessageOptions} from '../validate';
// Whether the provided commit is a fixup commit.
const isNonFixup = (m: string) => !parseCommitMessage(m).isFixup;
@ -19,11 +19,20 @@ const extractCommitHeader = (m: string) => parseCommitMessage(m).header;
/** Validate all commits in a provided git commit range. */
export function validateCommitRange(range: string) {
// A random value is used as a string to allow for a definite split point in the git log result.
/**
* A random value is used as a string to allow for a definite split point in the git log result.
*/
const randomValueSeparator = `${Math.random()}`;
// Custom git log format that provides the commit header and body, separated as expected with
// the custom separator as the trailing value.
/**
* Custom git log format that provides the commit header and body, separated as expected with the
* custom separator as the trailing value.
*/
const gitLogFormat = `%s%n%n%b${randomValueSeparator}`;
/**
* A list of tuples containing a commit header string and the list of error messages for the
* commit.
*/
const errors: [commitHeader: string, errors: string[]][] = [];
// Retrieve the commits in the provided range.
const result = exec(`git log --reverse --format=${gitLogFormat} ${range}`);
@ -45,12 +54,22 @@ export function validateCommitRange(range: string) {
undefined :
commits.slice(0, i).filter(isNonFixup).map(extractCommitHeader)
};
return validateCommitMessage(m, options);
const {valid, errors: localErrors, commit} = validateCommitMessage(m, options);
if (localErrors.length) {
errors.push([commit.header, localErrors]);
}
return valid;
});
if (allCommitsInRangeValid) {
info('√ All commit messages in range valid.');
} else {
error('✘ Invalid commit message');
errors.forEach(([header, validationErrors]) => {
error.group(header);
printValidationErrors(validationErrors);
error.groupEnd();
});
// Exit with a non-zero exit code if invalid commit messages have
// been discovered.
process.exit(1);

View File

@ -8,7 +8,7 @@
// Imports
import * as validateConfig from './config';
import {validateCommitMessage} from './validate';
import {validateCommitMessage, ValidateCommitMessageResult} from './validate';
type CommitMessageConfig = validateConfig.CommitMessageConfig;
@ -31,44 +31,35 @@ const SCOPES = config.commitMessage.scopes.join(', ');
const INVALID = false;
const VALID = true;
function expectValidationResult(
validationResult: ValidateCommitMessageResult, valid: boolean, errors: string[] = []) {
expect(validationResult).toEqual(jasmine.objectContaining({valid, errors}));
}
// TODO(josephperrott): Clean up tests to test script rather than for
// specific commit messages we want to use.
describe('validate-commit-message.js', () => {
let lastError: string = '';
beforeEach(() => {
lastError = '';
spyOn(console, 'error').and.callFake((msg: string) => lastError = msg);
spyOn(validateConfig, 'getCommitMessageConfig')
.and.returnValue(config as ReturnType<typeof validateConfig.getCommitMessageConfig>);
});
describe('validateMessage()', () => {
it('should be valid', () => {
expect(validateCommitMessage('feat(packaging): something')).toBe(VALID);
expect(lastError).toBe('');
expect(validateCommitMessage('fix(packaging): something')).toBe(VALID);
expect(lastError).toBe('');
expect(validateCommitMessage('fixup! fix(packaging): something')).toBe(VALID);
expect(lastError).toBe('');
expect(validateCommitMessage('squash! fix(packaging): something')).toBe(VALID);
expect(lastError).toBe('');
expect(validateCommitMessage('Revert: "fix(packaging): something"')).toBe(VALID);
expect(lastError).toBe('');
expectValidationResult(validateCommitMessage('feat(packaging): something'), VALID);
expectValidationResult(validateCommitMessage('fix(packaging): something'), VALID);
expectValidationResult(validateCommitMessage('fixup! fix(packaging): something'), VALID);
expectValidationResult(validateCommitMessage('squash! fix(packaging): something'), VALID);
expectValidationResult(validateCommitMessage('Revert: "fix(packaging): something"'), VALID);
});
it('should validate max length', () => {
const msg =
'fix(compiler): something super mega extra giga tera long, maybe even longer and longer and longer and longer and longer and longer...';
expect(validateCommitMessage(msg)).toBe(INVALID);
expect(lastError).toContain(`The commit message header is longer than ${
config.commitMessage.maxLineLength} characters`);
expectValidationResult(validateCommitMessage(msg), INVALID, [
`The commit message header is longer than ${config.commitMessage.maxLineLength} characters`
]);
});
it('should skip max length limit for URLs', () => {
@ -77,49 +68,56 @@ describe('validate-commit-message.js', () => {
'limit. For more details see the following super long URL:\n\n' +
'https://github.com/angular/components/commit/e2ace018ddfad10608e0e32932c43dcfef4095d7#diff-9879d6db96fd29134fc802214163b95a';
expect(validateCommitMessage(msg)).toBe(VALID);
expectValidationResult(validateCommitMessage(msg), VALID);
});
it('should validate "<type>(<scope>): <subject>" format', () => {
const msg = 'not correct format';
expect(validateCommitMessage(msg)).toBe(INVALID);
expect(lastError).toContain(`The commit message header does not match the expected format.`);
expectValidationResult(
validateCommitMessage(msg), INVALID,
[`The commit message header does not match the expected format.`]);
});
it('should fail when type is invalid', () => {
const msg = 'weird(core): something';
expect(validateCommitMessage(msg)).toBe(INVALID);
expect(lastError).toContain(`'weird' is not an allowed type.\n => TYPES: ${TYPES}`);
expectValidationResult(
validateCommitMessage(msg), INVALID,
[`'weird' is not an allowed type.\n => TYPES: ${TYPES}`]);
});
it('should fail when scope is invalid', () => {
const errorMessageFor = (scope: string, header: string) =>
`'${scope}' is not an allowed scope.\n => SCOPES: ${SCOPES}`;
expect(validateCommitMessage('fix(Compiler): something')).toBe(INVALID);
expect(lastError).toContain(errorMessageFor('Compiler', 'fix(Compiler): something'));
expectValidationResult(
validateCommitMessage('fix(Compiler): something'), INVALID,
[errorMessageFor('Compiler', 'fix(Compiler): something')]);
expect(validateCommitMessage('feat(bah): something')).toBe(INVALID);
expect(lastError).toContain(errorMessageFor('bah', 'feat(bah): something'));
expectValidationResult(
validateCommitMessage('feat(bah): something'), INVALID,
[errorMessageFor('bah', 'feat(bah): something')]);
expect(validateCommitMessage('fix(webworker): something')).toBe(INVALID);
expect(lastError).toContain(errorMessageFor('webworker', 'fix(webworker): something'));
expectValidationResult(
validateCommitMessage('fix(webworker): something'), INVALID,
[errorMessageFor('webworker', 'fix(webworker): something')]);
expect(validateCommitMessage('refactor(security): something')).toBe(INVALID);
expect(lastError).toContain(errorMessageFor('security', 'refactor(security): something'));
expectValidationResult(
validateCommitMessage('refactor(security): something'), INVALID,
[errorMessageFor('security', 'refactor(security): something')]);
expect(validateCommitMessage('refactor(docs): something')).toBe(INVALID);
expect(lastError).toContain(errorMessageFor('docs', 'refactor(docs): something'));
expectValidationResult(
validateCommitMessage('refactor(docs): something'), INVALID,
[errorMessageFor('docs', 'refactor(docs): something')]);
expect(validateCommitMessage('feat(angular): something')).toBe(INVALID);
expect(lastError).toContain(errorMessageFor('angular', 'feat(angular): something'));
expectValidationResult(
validateCommitMessage('feat(angular): something'), INVALID,
[errorMessageFor('angular', 'feat(angular): something')]);
});
it('should allow empty scope', () => {
expect(validateCommitMessage('build: blablabla')).toBe(VALID);
expect(lastError).toBe('');
expectValidationResult(validateCommitMessage('build: blablabla'), VALID);
});
// We do not want to allow WIP. It is OK to fail the PR build in this case to show that there is
@ -127,30 +125,25 @@ describe('validate-commit-message.js', () => {
it('should not allow "WIP: ..." syntax', () => {
const msg = 'WIP: fix: something';
expect(validateCommitMessage(msg)).toBe(INVALID);
expect(lastError).toContain(`'WIP' is not an allowed type.\n => TYPES: ${TYPES}`);
expectValidationResult(
validateCommitMessage(msg), INVALID,
[`'WIP' is not an allowed type.\n => TYPES: ${TYPES}`]);
});
describe('(revert)', () => {
it('should allow valid "revert: ..." syntaxes', () => {
expect(validateCommitMessage('revert: anything')).toBe(VALID);
expect(lastError).toBe('');
expect(validateCommitMessage('Revert: "anything"')).toBe(VALID);
expect(lastError).toBe('');
expect(validateCommitMessage('revert anything')).toBe(VALID);
expect(lastError).toBe('');
expect(validateCommitMessage('rEvErT anything')).toBe(VALID);
expect(lastError).toBe('');
expectValidationResult(validateCommitMessage('revert: anything'), VALID);
expectValidationResult(validateCommitMessage('Revert: "anything"'), VALID);
expectValidationResult(validateCommitMessage('revert anything'), VALID);
expectValidationResult(validateCommitMessage('rEvErT anything'), VALID);
});
it('should not allow "revert(scope): ..." syntax', () => {
const msg = 'revert(compiler): reduce generated code payload size by 65%';
expect(validateCommitMessage(msg)).toBe(INVALID);
expect(lastError).toContain(`'revert' is not an allowed type.\n => TYPES: ${TYPES}`);
expectValidationResult(
validateCommitMessage(msg), INVALID,
[`'revert' is not an allowed type.\n => TYPES: ${TYPES}`]);
});
// https://github.com/angular/angular/issues/23479
@ -158,28 +151,26 @@ describe('validate-commit-message.js', () => {
const msg =
'Revert "fix(compiler): Pretty print object instead of [Object object] (#22689)" (#23442)';
expect(validateCommitMessage(msg)).toBe(VALID);
expect(lastError).toBe('');
expectValidationResult(validateCommitMessage(msg), VALID);
});
});
describe('(squash)', () => {
describe('without `disallowSquash`', () => {
it('should return commits as valid', () => {
expect(validateCommitMessage('squash! feat(core): add feature')).toBe(VALID);
expect(validateCommitMessage('squash! fix: a bug')).toBe(VALID);
expect(validateCommitMessage('squash! fix a typo')).toBe(VALID);
expectValidationResult(validateCommitMessage('squash! feat(core): add feature'), VALID);
expectValidationResult(validateCommitMessage('squash! fix: a bug'), VALID);
expectValidationResult(validateCommitMessage('squash! fix a typo'), VALID);
});
});
describe('with `disallowSquash`', () => {
it('should fail', () => {
expect(validateCommitMessage('fix(core): something', {disallowSquash: true})).toBe(VALID);
expect(validateCommitMessage('squash! fix(core): something', {
disallowSquash: true
})).toBe(INVALID);
expect(lastError).toContain(
'The commit must be manually squashed into the target commit');
expectValidationResult(
validateCommitMessage('fix(core): something', {disallowSquash: true}), VALID);
expectValidationResult(
validateCommitMessage('squash! fix(core): something', {disallowSquash: true}),
INVALID, ['The commit must be manually squashed into the target commit']);
});
});
});
@ -187,9 +178,9 @@ describe('validate-commit-message.js', () => {
describe('(fixup)', () => {
describe('without `nonFixupCommitHeaders`', () => {
it('should return commits as valid', () => {
expect(validateCommitMessage('fixup! feat(core): add feature')).toBe(VALID);
expect(validateCommitMessage('fixup! fix: a bug')).toBe(VALID);
expect(validateCommitMessage('fixup! fixup! fix: a bug')).toBe(VALID);
expectValidationResult(validateCommitMessage('fixup! feat(core): add feature'), VALID);
expectValidationResult(validateCommitMessage('fixup! fix: a bug'), VALID);
expectValidationResult(validateCommitMessage('fixup! fixup! fix: a bug'), VALID);
});
});
@ -197,36 +188,39 @@ describe('validate-commit-message.js', () => {
it('should check that the fixup commit matches a non-fixup one', () => {
const msg = 'fixup! foo';
expect(validateCommitMessage(
msg, {disallowSquash: false, nonFixupCommitHeaders: ['foo', 'bar', 'baz']}))
.toBe(VALID);
expect(validateCommitMessage(
msg, {disallowSquash: false, nonFixupCommitHeaders: ['bar', 'baz', 'foo']}))
.toBe(VALID);
expect(validateCommitMessage(
msg, {disallowSquash: false, nonFixupCommitHeaders: ['baz', 'foo', 'bar']}))
.toBe(VALID);
expectValidationResult(
validateCommitMessage(
msg, {disallowSquash: false, nonFixupCommitHeaders: ['foo', 'bar', 'baz']}),
VALID);
expectValidationResult(
validateCommitMessage(
msg, {disallowSquash: false, nonFixupCommitHeaders: ['bar', 'baz', 'foo']}),
VALID);
expectValidationResult(
validateCommitMessage(
msg, {disallowSquash: false, nonFixupCommitHeaders: ['baz', 'foo', 'bar']}),
VALID);
expect(validateCommitMessage(
msg, {disallowSquash: false, nonFixupCommitHeaders: ['qux', 'quux', 'quuux']}))
.toBe(INVALID);
expect(lastError).toContain(
'Unable to find match for fixup commit among prior commits: \n' +
expectValidationResult(
validateCommitMessage(
msg, {disallowSquash: false, nonFixupCommitHeaders: ['qux', 'quux', 'quuux']}),
INVALID,
['Unable to find match for fixup commit among prior commits: \n' +
' qux\n' +
' quux\n' +
' quuux');
' quuux']);
});
it('should fail if `nonFixupCommitHeaders` is empty', () => {
expect(validateCommitMessage('refactor(core): make reactive', {
disallowSquash: false,
nonFixupCommitHeaders: []
})).toBe(VALID);
expect(validateCommitMessage(
'fixup! foo', {disallowSquash: false, nonFixupCommitHeaders: []}))
.toBe(INVALID);
expect(lastError).toContain(
`Unable to find match for fixup commit among prior commits: -`);
expectValidationResult(
validateCommitMessage(
'refactor(core): make reactive',
{disallowSquash: false, nonFixupCommitHeaders: []}),
VALID);
expectValidationResult(
validateCommitMessage(
'fixup! foo', {disallowSquash: false, nonFixupCommitHeaders: []}),
INVALID, [`Unable to find match for fixup commit among prior commits: -`]);
});
});
});
@ -246,24 +240,27 @@ describe('validate-commit-message.js', () => {
});
it('should fail validation if the body is shorter than `minBodyLength`', () => {
expect(validateCommitMessage(
'fix(core): something\n\n Explanation of the motivation behind this change'))
.toBe(VALID);
expect(validateCommitMessage('fix(core): something\n\n too short')).toBe(INVALID);
expect(lastError).toContain(
'The commit message body does not meet the minimum length of 30 characters');
expect(validateCommitMessage('fix(core): something')).toBe(INVALID);
expect(lastError).toContain(
'The commit message body does not meet the minimum length of 30 characters');
expectValidationResult(
validateCommitMessage(
'fix(core): something\n\n Explanation of the motivation behind this change'),
VALID);
expectValidationResult(
validateCommitMessage('fix(core): something\n\n too short'), INVALID,
['The commit message body does not meet the minimum length of 30 characters']);
expectValidationResult(validateCommitMessage('fix(core): something'), INVALID, [
'The commit message body does not meet the minimum length of 30 characters'
]);
});
it('should pass validation if the body is shorter than `minBodyLength` but the commit type is in the `minBodyLengthTypeExclusions` list',
() => {
expect(validateCommitMessage('docs: just fixing a typo')).toBe(VALID);
expect(validateCommitMessage('docs(core): just fixing a typo')).toBe(VALID);
expect(validateCommitMessage(
'docs(core): just fixing a typo\n\nThis was just a silly typo.'))
.toBe(VALID);
expectValidationResult(validateCommitMessage('docs: just fixing a typo'), VALID);
expectValidationResult(validateCommitMessage('docs(core): just fixing a typo'), VALID);
expectValidationResult(
validateCommitMessage(
'docs(core): just fixing a typo\n\nThis was just a silly typo.'),
VALID);
});
});
});

View File

@ -8,7 +8,7 @@
import {error} from '../utils/console';
import {COMMIT_TYPES, getCommitMessageConfig, ScopeRequirement} from './config';
import {parseCommitMessage} from './parse';
import {parseCommitMessage, ParsedCommitMessage} from './parse';
/** Options for commit message validation. */
export interface ValidateCommitMessageOptions {
@ -16,27 +16,26 @@ export interface ValidateCommitMessageOptions {
nonFixupCommitHeaders?: string[];
}
/** The result of a commit message validation check. */
export interface ValidateCommitMessageResult {
valid: boolean;
errors: string[];
commit: ParsedCommitMessage;
}
/** Regex matching a URL for an entire commit body line. */
const COMMIT_BODY_URL_LINE_RE = /^https?:\/\/.*$/;
/** Validate a commit message against using the local repo's config. */
export function validateCommitMessage(
commitMsg: string, options: ValidateCommitMessageOptions = {}) {
function printError(errorMessage: string) {
error(
`INVALID COMMIT MSG: \n` +
`${'─'.repeat(40)}\n` +
`${commitMsg}\n` +
`${'─'.repeat(40)}\n` +
`ERROR: \n` +
` ${errorMessage}` +
`\n\n` +
`The expected format for a commit is: \n` +
`<type>(<scope>): <subject>\n\n<body>`);
}
commitMsg: string, options: ValidateCommitMessageOptions = {}): ValidateCommitMessageResult {
const config = getCommitMessageConfig().commitMessage;
const commit = parseCommitMessage(commitMsg);
const errors: string[] = [];
/** Perform the validation checks against the parsed commit. */
function validateCommitAndCollectErrors() {
// TODO(josephperrott): Remove early return calls when commit message errors are found
////////////////////////////////////
// Checking revert, squash, fixup //
@ -51,7 +50,7 @@ export function validateCommitMessage(
// the git history anyway, unless the provided options disallow squash commits.
if (commit.isSquash) {
if (options.disallowSquash) {
printError('The commit must be manually squashed into the target commit');
errors.push('The commit must be manually squashed into the target commit');
return false;
}
return true;
@ -64,7 +63,7 @@ export function validateCommitMessage(
// check.
if (commit.isFixup) {
if (options.nonFixupCommitHeaders && !options.nonFixupCommitHeaders.includes(commit.header)) {
printError(
errors.push(
'Unable to find match for fixup commit among prior commits: ' +
(options.nonFixupCommitHeaders.map(x => `\n ${x}`).join('') || '-'));
return false;
@ -77,19 +76,17 @@ export function validateCommitMessage(
// Checking commit header //
////////////////////////////
if (commit.header.length > config.maxLineLength) {
printError(`The commit message header is longer than ${config.maxLineLength} characters`);
errors.push(`The commit message header is longer than ${config.maxLineLength} characters`);
return false;
}
if (!commit.type) {
printError(`The commit message header does not match the expected format.`);
errors.push(`The commit message header does not match the expected format.`);
return false;
}
if (COMMIT_TYPES[commit.type] === undefined) {
printError(`'${commit.type}' is not an allowed type.\n => TYPES: ${
errors.push(`'${commit.type}' is not an allowed type.\n => TYPES: ${
Object.keys(COMMIT_TYPES).join(', ')}`);
return false;
}
@ -98,19 +95,19 @@ export function validateCommitMessage(
const scopeRequirementForType = COMMIT_TYPES[commit.type].scope;
if (scopeRequirementForType === ScopeRequirement.Forbidden && commit.scope) {
printError(`Scopes are forbidden for commits with type '${commit.type}', but a scope of '${
errors.push(`Scopes are forbidden for commits with type '${commit.type}', but a scope of '${
commit.scope}' was provided.`);
return false;
}
if (scopeRequirementForType === ScopeRequirement.Required && !commit.scope) {
printError(
errors.push(
`Scopes are required for commits with type '${commit.type}', but no scope was provided.`);
return false;
}
if (commit.scope && !config.scopes.includes(commit.scope)) {
printError(
errors.push(
`'${commit.scope}' is not an allowed scope.\n => SCOPES: ${config.scopes.join(', ')}`);
return false;
}
@ -126,7 +123,7 @@ export function validateCommitMessage(
if (!config.minBodyLengthTypeExcludes?.includes(commit.type) &&
commit.bodyWithoutLinking.trim().length < config.minBodyLength) {
printError(`The commit message body does not meet the minimum length of ${
errors.push(`The commit message body does not meet the minimum length of ${
config.minBodyLength} characters`);
return false;
}
@ -139,10 +136,27 @@ export function validateCommitMessage(
});
if (lineExceedsMaxLength) {
printError(
errors.push(
`The commit message body contains lines greater than ${config.maxLineLength} characters`);
return false;
}
return true;
}
return {valid: validateCommitAndCollectErrors(), errors, commit};
}
/** Print the error messages from the commit message validation to the console. */
export function printValidationErrors(errors: string[], print = error) {
print.group(`Error${errors.length === 1 ? '' : 's'}:`);
errors.forEach(line => print(line));
print.groupEnd();
print();
print('The expected format for a commit is: ');
print('<type>(<scope>): <summary>');
print();
print('<body>');
print();
}
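For context, a minimal sketch of how a caller might consume this refactored API; the `'./validate'` import path is an assumption, since the file containing these exports is not named in this diff:

```ts
import {printValidationErrors, validateCommitMessage} from './validate';  // path assumed

const {valid, errors} = validateCommitMessage(
    'fix(core): repair change detection\n\nA body long enough to satisfy the configured minimum length.');
if (!valid) {
  // Print the collected errors instead of aborting on the first failed check.
  printValidationErrors(errors);
  process.exitCode = 1;
}
```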

View File

@ -0,0 +1,54 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {Arguments, Argv, CommandModule} from 'yargs';
import {CommitMsgSource} from '../commit-message-source';
import {runWizard} from './wizard';
export interface WizardOptions {
filePath: string;
commitSha: string|undefined;
source: CommitMsgSource|undefined;
}
/** Builds the command. */
function builder(yargs: Argv) {
return yargs
.positional('filePath', {
description: 'The file path to write the generated commit message into',
type: 'string',
demandOption: true,
})
.positional('source', {
choices: ['message', 'template', 'merge', 'squash', 'commit'] as const,
description: 'The source of the commit message as described here: ' +
'https://git-scm.com/docs/githooks#_prepare_commit_msg'
})
.positional('commitSha', {
description: 'The commit sha if source is set to `commit`',
type: 'string',
});
}
/** Handles the command. */
async function handler(args: Arguments<WizardOptions>) {
await runWizard(args);
}
/** yargs command module describing the command. */
export const WizardModule: CommandModule<{}, WizardOptions> = {
handler,
builder,
command: 'wizard <filePath> [source] [commitSha]',
// Description: Run the wizard to build a base commit message before opening the editor to complete it.
// No describe is defined in order to hide the command from the --help output.
describe: false,
};
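The parent CLI wiring is not part of this diff; as a rough sketch (assuming a standard yargs setup and a `./cli` import path), the module could be registered like so:

```ts
import * as yargs from 'yargs';
import {WizardModule} from './cli';  // path assumed

// Registers `wizard <filePath> [source] [commitSha]`; the command stays hidden from --help
// because `describe` is set to false.
yargs.command(WizardModule).parse();
```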

View File

@ -7,15 +7,12 @@
*/
import {writeFileSync} from 'fs';
import {info} from '../utils/console';
import {getUserConfig} from '../../utils/config';
import {debug, info} from '../../utils/console';
import {buildCommitMessage} from './builder';
import {buildCommitMessage} from '../builder';
import {CommitMsgSource} from '../commit-message-source';
/**
* The source triggering the git commit message creation.
* As described in: https://git-scm.com/docs/githooks#_prepare_commit_msg
*/
export type PrepareCommitMsgHookSource = 'message'|'template'|'merge'|'squash'|'commit';
/** The default commit message used if the wizard does not produce a commit message. */
const defaultCommitMessage = `<type>(<scope>): <summary>
@ -24,11 +21,16 @@ const defaultCommitMessage = `<type>(<scope>): <summary>
# lines at 100 characters.>\n\n`;
export async function runWizard(
args: {filePath: string, source?: PrepareCommitMsgHookSource, commitSha?: string}) {
// TODO(josephperrott): Add support for skipping wizard with local untracked config file
args: {filePath: string, source?: CommitMsgSource, commitSha?: string}) {
if (getUserConfig().commitMessage?.disableWizard) {
debug('Skipping commit message wizard due to enabled `commitMessage.disableWizard` option in');
debug('user config.');
process.exitCode = 0;
return;
}
if (args.source !== undefined) {
info(`Skipping commit message wizard due because the commit was created via '${
info(`Skipping commit message wizard because the commit was created via '${
args.source}' source`);
process.exitCode = 0;
return;

View File

@ -3,6 +3,7 @@ load("@npm_bazel_typescript//:index.bzl", "ts_library")
ts_library(
name = "common",
srcs = glob(["*.ts"]),
module_name = "@angular/dev-infra-private/pr/common",
visibility = ["//dev-infra:__subpackages__"],
deps = [
"//dev-infra/utils",

View File

@ -12,11 +12,13 @@
"@angular/benchpress": "0.2.1",
"@octokit/graphql": "<from-root>",
"@octokit/types": "<from-root>",
"@octokit/rest": "<from-root>",
"brotli": "<from-root>",
"chalk": "<from-root>",
"cli-progress": "<from-root>",
"glob": "<from-root>",
"inquirer": "<from-root>",
"inquirer-autocomplete-prompt": "<from-root>",
"minimatch": "<from-root>",
"multimatch": "<from-root>",
"node-fetch": "<from-root>",
@ -26,9 +28,7 @@
"tslib": "<from-root>",
"typed-graphqlify": "<from-root>",
"yaml": "<from-root>",
"yargs": "<from-root>"
},
"peerDependencies": {
"yargs": "<from-root>",
"@bazel/buildifier": "<from-root>",
"clang-format": "<from-root>",
"protractor": "<from-root>",

View File

@ -12,13 +12,11 @@ ts_library(
"@npm//@octokit/graphql",
"@npm//@octokit/rest",
"@npm//@octokit/types",
"@npm//@types/fs-extra",
"@npm//@types/inquirer",
"@npm//@types/node",
"@npm//@types/shelljs",
"@npm//@types/yargs",
"@npm//chalk",
"@npm//fs-extra",
"@npm//inquirer",
"@npm//inquirer-autocomplete-prompt",
"@npm//shelljs",

View File

@ -9,7 +9,7 @@
import {existsSync} from 'fs';
import {dirname, join} from 'path';
import {error} from './console';
import {debug, error} from './console';
import {exec} from './shelljs';
import {isTsNodeAvailable} from './ts-node';
@ -49,7 +49,16 @@ export type NgDevConfig<T = {}> = CommonConfig&T;
const CONFIG_FILE_PATH = '.ng-dev/config';
/** The configuration for ng-dev. */
let CONFIG: {}|null = null;
let cachedConfig: NgDevConfig|null = null;
/**
* The expected filename for the local user config, without the file extension to allow a TypeScript,
* JavaScript or JSON file to be used.
*/
const USER_CONFIG_FILE_PATH = '.ng-dev.user';
/** The local user configuration for ng-dev. */
let userConfig: {[key: string]: any}|null = null;
/**
* Get the configuration from the file system, returning the already loaded
@ -57,15 +66,15 @@ let CONFIG: {}|null = null;
*/
export function getConfig(): NgDevConfig {
// If the global config is not defined, load it from the file system.
if (CONFIG === null) {
if (cachedConfig === null) {
// The full path to the configuration file.
const configPath = join(getRepoBaseDir(), CONFIG_FILE_PATH);
// Set the global config object.
CONFIG = readConfigFile(configPath);
// Read the configuration and validate it before caching it for the future.
cachedConfig = validateCommonConfig(readConfigFile(configPath));
}
// Return a clone of the global config to ensure that a new instance of the config is returned
// each time, preventing unexpected effects of modifications to the config object.
return validateCommonConfig({...CONFIG});
// Return a clone of the cached global config to ensure that a new instance of the config
// is returned each time, preventing unexpected effects of modifications to the config object.
return {...cachedConfig};
}
/** Validate the common configuration has been met for the ng-dev command. */
@ -86,8 +95,11 @@ function validateCommonConfig(config: Partial<NgDevConfig>) {
return config as NgDevConfig;
}
/** Resolves and reads the specified configuration file. */
function readConfigFile(configPath: string): object {
/**
* Resolves and reads the specified configuration file, optionally returning an empty object if the
* configuration file cannot be read.
*/
function readConfigFile(configPath: string, returnEmptyObjectOnError = false): object {
// If the the `.ts` extension has not been set up already, and a TypeScript based
// version of the given configuration seems to exist, set up `ts-node` if available.
if (require.extensions['.ts'] === undefined && existsSync(`${configPath}.ts`) &&
@ -103,7 +115,12 @@ function readConfigFile(configPath: string): object {
try {
return require(configPath);
} catch (e) {
error('Could not read configuration file.');
if (returnEmptyObjectOnError) {
debug(`Could not read configuration file at ${configPath}, returning empty object instead.`);
debug(e);
return {};
}
error(`Could not read configuration file at ${configPath}.`);
error(e);
process.exit(1);
}
@ -135,3 +152,23 @@ export function getRepoBaseDir() {
}
return baseRepoDir.trim();
}
/**
* Get the local user configuration from the file system, returning the already loaded copy if it is
* defined.
*
* @returns The user configuration object, or an empty object if no user configuration file is
* present. The object is an untyped object as there are no required user configurations.
*/
export function getUserConfig() {
// If the user config is not defined, load it from the file system.
if (userConfig === null) {
// The full path to the configuration file.
const configPath = join(getRepoBaseDir(), USER_CONFIG_FILE_PATH);
// Set the user config object.
userConfig = readConfigFile(configPath, true);
}
// Return a clone of the user config to ensure that a new instance of the config is returned
// each time, preventing unexpected effects of modifications to the config object.
return {...userConfig};
}
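As an illustration of what `getUserConfig()` might load, here is a hypothetical `.ng-dev.user.ts` file; `commitMessage.disableWizard` is the only user option referenced in this change, so any other keys are assumptions:

```ts
// .ng-dev.user.ts — hypothetical, untracked local configuration.
// Only `commitMessage.disableWizard` is referenced in this diff; other keys are untyped.
export const commitMessage = {
  disableWizard: true,
};
```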

View File

@ -7,7 +7,7 @@
*/
import chalk from 'chalk';
import {writeFileSync} from 'fs-extra';
import {writeFileSync} from 'fs';
import {createPromptModule, ListChoiceOptions, prompt} from 'inquirer';
import * as inquirerAutocomplete from 'inquirer-autocomplete-prompt';
import {join} from 'path';
@ -196,6 +196,9 @@ export function captureLogOutputForCommand(argv: Arguments) {
/** Path to the log file location. */
const logFilePath = join(getRepoBaseDir(), '.ng-dev.log');
// Strip ANSI escape codes from log outputs.
LOGGED_TEXT = LOGGED_TEXT.replace(/\x1B\[([0-9]{1,3}(;[0-9]{1,2})?)?[mGK]/g, '');
writeFileSync(logFilePath, LOGGED_TEXT);
// For failure codes greater than 1, the new logged lines should be written to a specific log

View File

@ -4,71 +4,23 @@ Caretaker is responsible for merging PRs into the individual branches and intern
## Responsibilities
- Draining the queue of PRs ready to be merged. (PRs with [`PR action: merge`](https://github.com/angular/angular/pulls?q=is%3Aopen+is%3Apr+label%3A%22PR+action%3A+merge%22) label)
- Draining the queue of PRs ready to be merged. (PRs with [`action: merge`](https://github.com/angular/angular/pulls?q=is%3Aopen+is%3Apr+label%3A%22action%3A+merge%22) label)
- Assigning [new issues](https://github.com/angular/angular/issues?q=is%3Aopen+is%3Aissue+no%3Alabel) to individual component authors.
## Merging the PR
A PR needs to have `PR action: merge` and `PR target: *` labels to be considered
ready to merge. Merging is performed by running `merge-pr` with a PR number to merge.
A PR needs to have `action: merge` and `target: *` labels to be considered
ready to merge. Merging is performed by running `ng-dev pr merge` with a PR number to merge.
The tooling automatically verifies the given PR is ready for merge. If the PR passes the tests, the
tool will automatically merge it based on the applied target label.
To merge a PR run:
```
$ ./scripts/github/merge-pr 1234
$ yarn ng-dev pr merge <pr number>
```
The `merge-pr` script will:
- Ensure that all appropriate labels are on the PR.
- Fetches the latest PR code from the `angular/angular` repo.
- It will `cherry-pick` all of the SHAs from the PR into the current corresponding branches `master` and/or `?.?.x` (patch).
- It will rewrite commit history by automatically adding `Close #1234` and `(#1234)` into the commit message.
NOTE: The `merge-pr` script will land the PR on `master` and/or `?.?.x` (patch) as described by the `PR target: *` label.
### Recovering from failed `merge-pr` due to conflicts
When running `merge-pr` the script will output the commands which it is about to run.
```
$ ./scripts/github/merge-pr 1234
======================
GitHub Merge PR Steps
======================
git cherry-pick angular/pr/1234~1..angular/pr/1234
git filter-branch -f --msg-filter "/home/misko/angular/scripts/github/utils/github.closes 1234" HEAD~1..HEAD
```
If the `cherry-pick` command fails, then resolve the conflicts and use `git cherry-pick --continue` once ready. After the `cherry-pick` is done, cut and paste the `filter-branch` command and run it to properly rewrite the messages.
## Cherry-picking PRs into patch branch
In addition to merging PRs into the master branch, many PRs need to be also merged into a patch branch.
Follow these steps to get patch branch up to date.
1. Check out the most recent patch branch: `git checkout 4.3.x`
2. Get a list of PRs merged into master: `git log master --oneline -n10`
3. For each PR number in the commit message run: `./scripts/github/merge-pr 1234`
- The PR will only merge if the `PR target:` matches the branch.
Once all of the PRs are in the patch branch, push all of the branches and tags to GitHub using the `push-upstream` script.
## Pushing merged PRs into github
Use `push-upstream` script to push all of the branch and tags to github.
```
$ ./scripts/github/push-upstream
git push git@github.com:angular/angular.git master:master 4.3.x:4.3.x
Counting objects: 25, done.
Delta compression using up to 6 threads.
Compressing objects: 100% (17/17), done.
Writing objects: 100% (25/25), 2.22 KiB | 284.00 KiB/s, done.
Total 25 (delta 22), reused 8 (delta 7)
remote: Resolving deltas: 100% (22/22), completed with 18 local objects.
To github.com:angular/angular.git
079d884b6..d1c4a94bb master -> master
git push --tags -f git@github.com:angular/angular.git patch_sync:patch_sync
Everything up-to-date
```
The `ng-dev pr merge` tool will automatically restore to the previous git state when a merge fails.

View File

@ -12,7 +12,7 @@ Change approvals in our monorepo are managed via [PullApprove](https://docs.pull
# Merging
Once a change has all of the required approvals, either the last approver or the PR author (if PR author has the project collaborator status)
should mark the PR with the `PR action: merge` label and the correct [target label](https://github.com/angular/angular/blob/master/docs/TRIAGE_AND_LABELS.md#pr-target).
should mark the PR with the `action: merge` label and the correct [target label](https://github.com/angular/angular/blob/master/docs/TRIAGE_AND_LABELS.md#pr-target).
This signals to the caretaker that the PR should be merged. See [merge instructions](CARETAKER.md).
# Who is the Caretaker?

View File

@ -1,6 +1,6 @@
# Triage Process and GitHub Labels for Angular
This document describes how the Angular team uses labels and milestones to triage issues on github.
This document describes how the Angular team uses labels and milestones to triage issues on GitHub.
The basic idea of the process is that the caretaker only assigns a component (`comp: *`) label.
The owner of the component is then responsible for the secondary / component-level triage.
@ -125,32 +125,32 @@ Triaging PRs is the same as triaging issues, except that the labels `frequency:
PRs also have additional label categories that should be used to signal their state.
Every triaged PR must have a `PR action` label assigned to it:
Every triaged PR must have an `action: *` label assigned to it:
* `PR action: discuss`: Discussion is needed, to be led by the author.
* `action: discuss`: Discussion is needed, to be led by the author.
* _**Who adds it:** Typically the PR author._
* _**Who removes it:** Whoever added it._
* `PR action: review` (optional): One or more reviews are pending. The label is optional, since the review status can be derived from GitHub's Reviewers interface.
* `action: review` (optional): One or more reviews are pending. The label is optional, since the review status can be derived from GitHub's Reviewers interface.
* _**Who adds it:** Any team member. The caretaker can use it to differentiate PRs pending review from merge-ready PRs._
* _**Who removes it:** Whoever added it or the reviewer adding the last missing review._
* `PR action: cleanup`: More work is needed from the author.
* `action: cleanup`: More work is needed from the author.
* _**Who adds it:** The reviewer requesting changes to the PR._
* _**Who removes it:** Either the author (after implementing the requested changes) or the reviewer (after confirming the requested changes have been implemented)._
* `PR action: merge`: The PR author is ready for the changes to be merged by the caretaker as soon as the PR is green (or merge-assistance label is applied and caretaker has deemed it acceptable manually). In other words, this label indicates to "auto submit when ready".
* `action: merge`: The PR author is ready for the changes to be merged by the caretaker as soon as the PR is green (or merge-assistance label is applied and caretaker has deemed it acceptable manually). In other words, this label indicates to "auto submit when ready".
* _**Who adds it:** Typically the PR author._
* _**Who removes it:** Whoever added it._
In addition, PRs can have the following states:
* `PR state: WIP`: PR is experimental or rapidly changing. Not ready for review or triage.
* `state: WIP`: PR is experimental or rapidly changing. Not ready for review or triage.
* _**Who adds it:** The PR author._
* _**Who removes it:** Whoever added it._
* `PR state: blocked`: PR is blocked on an issue or other PR. Not ready for merge.
* `state: blocked`: PR is blocked on an issue or other PR. Not ready for merge.
* _**Who adds it:** Any team member._
* _**Who removes it:** Any team member._
When a PR is ready for review, a review should be requested using the Reviewers interface in Github.
When a PR is ready for review, a review should be requested using the Reviewers interface in GitHub.
## PR Target
@ -160,15 +160,29 @@ In our git workflow, we merge changes either to the `master` branch, the active
The decision about the target must be done by the PR author and/or reviewer.
This decision is then honored when the PR is being merged by the caretaker.
To communicate the target we use the following labels:
To communicate the target, we use GitHub labels; only one target label may be applied to a PR.
* `PR target: master & patch`: the PR should be merged into the master branch and cherry-picked into the most recent patch branch. All PRs with fixes, docs and refactorings should use this target.
* `PR target: master-only`: the PR should be merged only into the `master` branch. All PRs with new features, API changes or high-risk changes should use this target.
* `PR target: patch-only`: the PR should be merged only into the most recent patch branch (e.g. 5.0.x). This target is useful if a `master & patch` PR can't be cleanly cherry-picked into the stable branch and a new PR is needed.
* `PR target: LTS-only`: the PR should be merged only into the active LTS branch(es). Only security and critical fixes are allowed in these branches. Always send a new PR targeting just the LTS branch and request review approval from @IgorMinar.
* `PR target: TBD`: the target is yet to be determined.
Targeting an active release train:
If a PR is missing the `PR target: *` label, or if the label is set to "TBD" when the PR is sent to the caretaker, the caretaker should reject the PR and request the appropriate target label to be applied before the PR is merged.
* `target: major`: Any breaking change
* `target: minor`: Any new feature
* `target: patch`: Bug fixes, refactorings, documentation changes, etc. that pose no or very low risk of adversely
affecting existing applications.
Special Cases:
* `target: rc`: A critical fix for an active release-train while it is in a feature freeze or RC phase
* `target: lts`: A critical fix for a specific release-train that is still within the long-term support phase
Notes:
- To land a change only in a patch/RC branch, without landing it in any other active release-train branch (such
as `master`), the patch/RC branch can be targeted in the GitHub UI with the appropriate
`target: patch`/`target: rc` label.
- `target: lts` PRs must target the specific LTS branch they need to merge into in the GitHub UI. In
cases where a change is desired in multiple LTS branches, individual PRs must be created for each LTS branch.
If a PR is missing a `target: *` label, it will be marked as pending by the Angular robot status checks.
## PR Approvals
@ -182,7 +196,7 @@ In any case, the reviewer should actually look through the code and provide feed
Note that approved state does not mean a PR is ready to be merged.
For example, a reviewer might approve the PR but request a minor tweak that doesn't need further review, e.g., a rebase or small uncontroversial change.
Only the `PR action: merge` label means that the PR is ready for merging.
Only the `action: merge` label means that the PR is ready for merging.
## Special Labels
@ -201,7 +215,7 @@ Only issues with `cla:yes` should be merged into master.
Applying this label to a PR makes the angular.io preview available regardless of the author. [More info](../aio/aio-builds-setup/docs/overview--security-model.md)
### `PR action: merge-assistance`
### `action: merge-assistance`
* _**Who adds it:** Any team member._
* _**Who removes it:** Any team member._
@ -211,7 +225,7 @@ The comment should be formatted like this: `merge-assistance: <explain what kind
For example, the PR owner might not be a Googler and needs help to run g3sync; or one of the checks is failing due to external causes and the PR should still be merged.
### `PR action: rerun CI at HEAD`
### `action: rerun CI at HEAD`
* _**Who adds it:** Any team member._
* _**Who removes it:** The Angular Bot, once it triggers the CI rerun._

View File

@ -1,6 +1,6 @@
{
"name": "angular-srcs",
"version": "10.1.1",
"version": "10.1.2",
"private": true,
"description": "Angular - a web framework for modern web apps",
"homepage": "https://github.com/angular/angular",
@ -35,6 +35,8 @@
"tslint": "tsc -p tools/tsconfig.json && tslint -c tslint.json \"+(dev-infra|packages|modules|scripts|tools)/**/*.+(js|ts)\"",
"public-api:check": "node goldens/public-api/manage.js test",
"public-api:update": "node goldens/public-api/manage.js accept",
"symbol-extractor:check": "node tools/symbol-extractor/run_all_symbols_extractor_tests.js test",
"symbol-extractor:update": "node tools/symbol-extractor/run_all_symbols_extractor_tests.js accept",
"ts-circular-deps": "ts-node --transpile-only -- dev-infra/ts-circular-dependencies/index.ts --config ./packages/circular-deps-test.conf.js",
"ts-circular-deps:check": "yarn -s ts-circular-deps check",
"ts-circular-deps:approve": "yarn -s ts-circular-deps approve",

View File

@ -14,6 +14,7 @@ import {Logger} from '../../../src/ngtsc/logging';
import {ParsedConfiguration} from '../../../src/perform_compile';
import {getEntryPointFormat} from '../packages/entry_point';
import {makeEntryPointBundle} from '../packages/entry_point_bundle';
import {createModuleResolutionCache, SharedFileCache} from '../packages/source_file_cache';
import {PathMappings} from '../path_mappings';
import {FileWriter} from '../writing/file_writer';
@ -30,6 +31,8 @@ export function getCreateCompileFn(
return (beforeWritingFiles, onTaskCompleted) => {
const {Transformer} = require('../packages/transformer');
const transformer = new Transformer(fileSystem, logger, tsConfig);
const sharedFileCache = new SharedFileCache(fileSystem);
const moduleResolutionCache = createModuleResolutionCache(fileSystem);
return (task: Task) => {
const {entryPoint, formatProperty, formatPropertiesToMarkAsProcessed, processDts} = task;
@ -54,8 +57,8 @@ export function getCreateCompileFn(
logger.info(`Compiling ${entryPoint.name} : ${formatProperty} as ${format}`);
const bundle = makeEntryPointBundle(
fileSystem, entryPoint, formatPath, isCore, format, processDts, pathMappings, true,
enableI18nLegacyMessageIdFormat);
fileSystem, entryPoint, sharedFileCache, moduleResolutionCache, formatPath, isCore,
format, processDts, pathMappings, true, enableI18nLegacyMessageIdFormat);
const result = transformer.transform(bundle);
if (result.success) {

View File

@ -8,8 +8,6 @@
/// <reference types="node" />
import * as os from 'os';
import {AbsoluteFsPath, FileSystem, resolve} from '../../src/ngtsc/file_system';
import {Logger} from '../../src/ngtsc/logging';
import {ParsedConfiguration} from '../../src/perform_compile';
@ -35,7 +33,7 @@ import {composeTaskCompletedCallbacks, createLogErrorHandler, createMarkAsProces
import {AsyncLocker} from './locking/async_locker';
import {LockFileWithChildProcess} from './locking/lock_file_with_child_process';
import {SyncLocker} from './locking/sync_locker';
import {AsyncNgccOptions, getSharedSetup, SyncNgccOptions} from './ngcc_options';
import {AsyncNgccOptions, getMaxNumberOfWorkers, getSharedSetup, SyncNgccOptions} from './ngcc_options';
import {NgccConfiguration} from './packages/configuration';
import {EntryPointJsonProperty, SUPPORTED_FORMAT_PROPERTIES} from './packages/entry_point';
import {EntryPointManifest, InvalidatingEntryPointManifest} from './packages/entry_point_manifest';
@ -92,10 +90,9 @@ export function mainNgcc(options: AsyncNgccOptions|SyncNgccOptions): void|Promis
return;
}
// Execute in parallel, if async execution is acceptable and there are more than 2 CPU cores.
// (One CPU core is always reserved for the master process and we need at least 2 worker processes
// in order to run tasks in parallel.)
const inParallel = async && (os.cpus().length > 2);
// Determine the number of workers to use and whether ngcc should run in parallel.
const workerCount = async ? getMaxNumberOfWorkers() : 1;
const inParallel = workerCount > 1;
const analyzeEntryPoints = getAnalyzeEntryPointsFn(
logger, finder, fileSystem, supportedPropertiesToConsider, compileAllFormats,
@ -113,7 +110,7 @@ export function mainNgcc(options: AsyncNgccOptions|SyncNgccOptions): void|Promis
const createTaskCompletedCallback =
getCreateTaskCompletedCallback(pkgJsonUpdater, errorOnFailedEntryPoint, logger, fileSystem);
const executor = getExecutor(
async, inParallel, logger, fileWriter, pkgJsonUpdater, fileSystem, config,
async, workerCount, logger, fileWriter, pkgJsonUpdater, fileSystem, config,
createTaskCompletedCallback);
return executor.execute(analyzeEntryPoints, createCompileFn);
@ -153,7 +150,7 @@ function getCreateTaskCompletedCallback(
}
function getExecutor(
async: boolean, inParallel: boolean, logger: Logger, fileWriter: FileWriter,
async: boolean, workerCount: number, logger: Logger, fileWriter: FileWriter,
pkgJsonUpdater: PackageJsonUpdater, fileSystem: FileSystem, config: NgccConfiguration,
createTaskCompletedCallback: CreateTaskCompletedCallback): Executor {
const lockFile = new LockFileWithChildProcess(fileSystem, logger);
@ -161,9 +158,8 @@ function getExecutor(
// Execute asynchronously (either serially or in parallel)
const {retryAttempts, retryDelay} = config.getLockingConfig();
const locker = new AsyncLocker(lockFile, logger, retryDelay, retryAttempts);
if (inParallel) {
// Execute in parallel. Use up to 8 CPU cores for workers, always reserving one for master.
const workerCount = Math.min(8, os.cpus().length - 1);
if (workerCount > 1) {
// Execute in parallel.
return new ClusterExecutor(
workerCount, fileSystem, logger, fileWriter, pkgJsonUpdater, locker,
createTaskCompletedCallback);

View File

@ -5,6 +5,8 @@
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import * as os from 'os';
import {absoluteFrom, AbsoluteFsPath, FileSystem, getFileSystem} from '../../src/ngtsc/file_system';
import {ConsoleLogger, Logger, LogLevel} from '../../src/ngtsc/logging';
import {ParsedConfiguration, readConfiguration} from '../../src/perform_compile';
@ -254,3 +256,26 @@ function checkForSolutionStyleTsConfig(
` ngcc ... --tsconfig "${fileSystem.relative(projectPath, tsConfig.project)}"`);
}
}
/**
* Determines the maximum number of workers to use for parallel execution. This can be set using the
* NGCC_MAX_WORKERS environment variable, or is computed based on the number of available CPUs. One
* CPU core is always reserved for the master process, so we take the number of CPUs minus one, with
* a maximum of 4 workers. We don't scale the number of workers beyond 4 by default, as it takes
* considerably more memory and CPU cycles while not offering a substantial improvement in time.
*/
export function getMaxNumberOfWorkers(): number {
const maxWorkers = process.env.NGCC_MAX_WORKERS;
if (maxWorkers === undefined) {
// Use up to 4 CPU cores for workers, always reserving one for master.
return Math.max(1, Math.min(4, os.cpus().length - 1));
}
const numericMaxWorkers = +maxWorkers.trim();
if (!Number.isInteger(numericMaxWorkers)) {
throw new Error('NGCC_MAX_WORKERS should be an integer.');
} else if (numericMaxWorkers < 1) {
throw new Error('NGCC_MAX_WORKERS should be at least 1.');
}
return numericMaxWorkers;
}
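As a small illustration (assuming `getMaxNumberOfWorkers` is imported from `./ngcc_options` as exported above), the resolution order works out as follows:

```ts
import {getMaxNumberOfWorkers} from './ngcc_options';

process.env.NGCC_MAX_WORKERS = '6';   // an explicit value wins and is not capped at 4
getMaxNumberOfWorkers();              // -> 6

delete process.env.NGCC_MAX_WORKERS;
getMaxNumberOfWorkers();              // -> min(4, os.cpus().length - 1), but never below 1
```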

View File

@ -6,11 +6,12 @@
* found in the LICENSE file at https://angular.io/license
*/
import * as ts from 'typescript';
import {AbsoluteFsPath, FileSystem, NgtscCompilerHost} from '../../../src/ngtsc/file_system';
import {AbsoluteFsPath, FileSystem} from '../../../src/ngtsc/file_system';
import {PathMappings} from '../path_mappings';
import {BundleProgram, makeBundleProgram} from './bundle_program';
import {EntryPoint, EntryPointFormat} from './entry_point';
import {NgccSourcesCompilerHost} from './ngcc_compiler_host';
import {NgccDtsCompilerHost, NgccSourcesCompilerHost} from './ngcc_compiler_host';
import {EntryPointFileCache, SharedFileCache} from './source_file_cache';
/**
* A bundle of files and paths (and TS programs) that correspond to a particular
@ -31,6 +32,8 @@ export interface EntryPointBundle {
* Get an object that describes a formatted bundle for an entry-point.
* @param fs The current file-system being used.
* @param entryPoint The entry-point that contains the bundle.
* @param sharedFileCache The cache to use for source files that are shared across all entry-points.
* @param moduleResolutionCache The module resolution cache to use.
* @param formatPath The path to the source files for this bundle.
* @param isCore This entry point is the Angular core package.
* @param format The underlying format of the bundle.
@ -42,7 +45,8 @@ export interface EntryPointBundle {
* component templates.
*/
export function makeEntryPointBundle(
fs: FileSystem, entryPoint: EntryPoint, formatPath: string, isCore: boolean,
fs: FileSystem, entryPoint: EntryPoint, sharedFileCache: SharedFileCache,
moduleResolutionCache: ts.ModuleResolutionCache, formatPath: string, isCore: boolean,
format: EntryPointFormat, transformDts: boolean, pathMappings?: PathMappings,
mirrorDtsFromSrc: boolean = false,
enableI18nLegacyMessageIdFormat: boolean = true): EntryPointBundle {
@ -50,8 +54,10 @@ export function makeEntryPointBundle(
const rootDir = entryPoint.packagePath;
const options: ts
.CompilerOptions = {allowJs: true, maxNodeModuleJsDepth: Infinity, rootDir, ...pathMappings};
const srcHost = new NgccSourcesCompilerHost(fs, options, entryPoint.packagePath);
const dtsHost = new NgtscCompilerHost(fs, options);
const entryPointCache = new EntryPointFileCache(fs, sharedFileCache);
const dtsHost = new NgccDtsCompilerHost(fs, options, entryPointCache, moduleResolutionCache);
const srcHost = new NgccSourcesCompilerHost(
fs, options, entryPointCache, moduleResolutionCache, entryPoint.packagePath);
// Create the bundle programs, as necessary.
const absFormatPath = fs.resolve(entryPoint.path, formatPath);

View File

@ -10,6 +10,7 @@ import * as ts from 'typescript';
import {AbsoluteFsPath, FileSystem, NgtscCompilerHost} from '../../../src/ngtsc/file_system';
import {isWithinPackage} from '../analysis/util';
import {isRelativePath} from '../utils';
import {EntryPointFileCache} from './source_file_cache';
/**
* Represents a compiler host that resolves a module import as a JavaScript source file if
@ -18,19 +19,24 @@ import {isRelativePath} from '../utils';
* would otherwise let TypeScript prefer the .d.ts file instead of the JavaScript source file.
*/
export class NgccSourcesCompilerHost extends NgtscCompilerHost {
private cache = ts.createModuleResolutionCache(
this.getCurrentDirectory(), file => this.getCanonicalFileName(file));
constructor(fs: FileSystem, options: ts.CompilerOptions, protected packagePath: AbsoluteFsPath) {
constructor(
fs: FileSystem, options: ts.CompilerOptions, private cache: EntryPointFileCache,
private moduleResolutionCache: ts.ModuleResolutionCache,
protected packagePath: AbsoluteFsPath) {
super(fs, options);
}
getSourceFile(fileName: string, languageVersion: ts.ScriptTarget): ts.SourceFile|undefined {
return this.cache.getCachedSourceFile(fileName, languageVersion);
}
resolveModuleNames(
moduleNames: string[], containingFile: string, reusedNames?: string[],
redirectedReference?: ts.ResolvedProjectReference): Array<ts.ResolvedModule|undefined> {
return moduleNames.map(moduleName => {
const {resolvedModule} = ts.resolveModuleName(
moduleName, containingFile, this.options, this, this.cache, redirectedReference);
moduleName, containingFile, this.options, this, this.moduleResolutionCache,
redirectedReference);
// If the module request originated from a relative import in a JavaScript source file,
// TypeScript may have resolved the module to its .d.ts declaration file if the .js source
@ -59,3 +65,31 @@ export class NgccSourcesCompilerHost extends NgtscCompilerHost {
});
}
}
/**
* A compiler host implementation that is used for the typings program. It leverages the entry-point
* cache for source files and module resolution, as these results can be reused across the sources
* program.
*/
export class NgccDtsCompilerHost extends NgtscCompilerHost {
constructor(
fs: FileSystem, options: ts.CompilerOptions, private cache: EntryPointFileCache,
private moduleResolutionCache: ts.ModuleResolutionCache) {
super(fs, options);
}
getSourceFile(fileName: string, languageVersion: ts.ScriptTarget): ts.SourceFile|undefined {
return this.cache.getCachedSourceFile(fileName, languageVersion);
}
resolveModuleNames(
moduleNames: string[], containingFile: string, reusedNames?: string[],
redirectedReference?: ts.ResolvedProjectReference): Array<ts.ResolvedModule|undefined> {
return moduleNames.map(moduleName => {
const {resolvedModule} = ts.resolveModuleName(
moduleName, containingFile, this.options, this, this.moduleResolutionCache,
redirectedReference);
return resolvedModule;
});
}
}

View File

@ -0,0 +1,197 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import * as ts from 'typescript';
import {AbsoluteFsPath, FileSystem} from '../../../src/ngtsc/file_system';
/**
* A cache that holds on to source files that can be shared for processing all entry-points in a
* single invocation of ngcc. In particular, the following files are shared across all entry-points
* through this cache:
*
* 1. Default library files such as `lib.dom.d.ts` and `lib.es5.d.ts`. These files don't change
* and some are very large, so parsing is expensive. Therefore, the parsed `ts.SourceFile`s for
* the default library files are cached.
* 2. The typings of @angular scoped packages. The typing files for @angular packages are typically
* used in the entry-points that ngcc processes, so they benefit from a single source file cache.
* `@angular/core/core.d.ts` in particular is large and expensive to parse repeatedly. In contrast
* to the default library files, we have to account for these files being invalidated during a single
* invocation of ngcc, as ngcc overwrites the .d.ts files during its processing.
*
* The lifecycle of this cache corresponds with a single invocation of ngcc. Separate invocations,
* e.g. the CLI's synchronous module resolution fallback, will therefore each have their own cache.
* This allows the source file cache to be garbage collected once ngcc processing has completed.
*/
export class SharedFileCache {
private sfCache = new Map<AbsoluteFsPath, ts.SourceFile>();
constructor(private fs: FileSystem) {}
/**
* Loads a `ts.SourceFile` if the provided `fileName` is deemed appropriate to be cached. To
* optimize for memory usage, only files that are generally used in all entry-points are cached.
* If `fileName` is not considered to benefit from caching or the requested file does not exist,
* then `undefined` is returned.
*/
getCachedSourceFile(fileName: string): ts.SourceFile|undefined {
const absPath = this.fs.resolve(fileName);
if (isDefaultLibrary(absPath, this.fs)) {
return this.getStableCachedFile(absPath);
} else if (isAngularDts(absPath, this.fs)) {
return this.getVolatileCachedFile(absPath);
} else {
return undefined;
}
}
/**
* Attempts to load the source file from the cache, or parses the file into a `ts.SourceFile` if
* it's not yet cached. This method assumes that the file will not be modified for the duration
* that this cache is valid for. If that assumption does not hold, the `getVolatileCachedFile`
* method is to be used instead.
*/
private getStableCachedFile(absPath: AbsoluteFsPath): ts.SourceFile|undefined {
if (!this.sfCache.has(absPath)) {
const content = readFile(absPath, this.fs);
if (content === undefined) {
return undefined;
}
const sf = ts.createSourceFile(absPath, content, ts.ScriptTarget.ES2015);
this.sfCache.set(absPath, sf);
}
return this.sfCache.get(absPath)!;
}
/**
* In contrast to `getStableCachedFile`, this method always verifies that the cached source file
* is the same as what's stored on disk. This is done for files that are expected to change during
* ngcc's processing, such as @angular scoped packages for which the .d.ts files are overwritten
* by ngcc. If the contents on disk have changed compared to a previously cached source file, the
* content from disk is re-parsed and the cache entry is replaced.
*/
private getVolatileCachedFile(absPath: AbsoluteFsPath): ts.SourceFile|undefined {
const content = readFile(absPath, this.fs);
if (content === undefined) {
return undefined;
}
if (!this.sfCache.has(absPath) || this.sfCache.get(absPath)!.text !== content) {
const sf = ts.createSourceFile(absPath, content, ts.ScriptTarget.ES2015);
this.sfCache.set(absPath, sf);
}
return this.sfCache.get(absPath)!;
}
}
const DEFAULT_LIB_PATTERN = ['node_modules', 'typescript', 'lib', /^lib\..+\.d\.ts$/];
/**
* Determines whether the provided path corresponds with a default library file inside of the
* typescript package.
*
* @param absPath The path for which to determine if it corresponds with a default library file.
* @param fs The filesystem to use for inspecting the path.
*/
export function isDefaultLibrary(absPath: AbsoluteFsPath, fs: FileSystem): boolean {
return isFile(absPath, DEFAULT_LIB_PATTERN, fs);
}
const ANGULAR_DTS_PATTERN = ['node_modules', '@angular', /./, /\.d\.ts$/];
/**
* Determines whether the provided path corresponds with a .d.ts file inside of an @angular
* scoped package. This logic only accounts for the .d.ts files in the root, which is sufficient
* to find the large, flattened entry-point files that benefit from caching.
*
* @param absPath The path for which to determine if it corresponds with an @angular .d.ts file.
* @param fs The filesystem to use for inspecting the path.
*/
export function isAngularDts(absPath: AbsoluteFsPath, fs: FileSystem): boolean {
return isFile(absPath, ANGULAR_DTS_PATTERN, fs);
}
/**
* Helper function to determine whether a file corresponds with a given pattern of segments.
*
* @param path The path for which to determine if it corresponds with the provided segments.
* @param segments Array of segments; the `path` must have ending segments that match the
* patterns in this array.
* @param fs The filesystem to use for inspecting the path.
*/
function isFile(
path: AbsoluteFsPath, segments: ReadonlyArray<string|RegExp>, fs: FileSystem): boolean {
for (let i = segments.length - 1; i >= 0; i--) {
const pattern = segments[i];
const segment = fs.basename(path);
if (typeof pattern === 'string') {
if (pattern !== segment) {
return false;
}
} else {
if (!pattern.test(segment)) {
return false;
}
}
path = fs.dirname(path);
}
return true;
}
/**
* A cache for processing a single entry-point. This exists to share `ts.SourceFile`s between the
* source and typings programs that are created for a single entry-point.
*/
export class EntryPointFileCache {
private readonly sfCache = new Map<AbsoluteFsPath, ts.SourceFile>();
constructor(private fs: FileSystem, private sharedFileCache: SharedFileCache) {}
/**
* Returns and caches a parsed `ts.SourceFile` for the provided `fileName`. If the `fileName` is
* cached in the shared file cache, that result is used. Otherwise, the source file is cached
* internally. This method returns `undefined` if the requested file does not exist.
*
* @param fileName The path of the file to retrieve a source file for.
* @param languageVersion The language version to use for parsing the file.
*/
getCachedSourceFile(fileName: string, languageVersion: ts.ScriptTarget): ts.SourceFile|undefined {
const staticSf = this.sharedFileCache.getCachedSourceFile(fileName);
if (staticSf !== undefined) {
return staticSf;
}
const absPath = this.fs.resolve(fileName);
if (this.sfCache.has(absPath)) {
return this.sfCache.get(absPath);
}
const content = readFile(absPath, this.fs);
if (content === undefined) {
return undefined;
}
const sf = ts.createSourceFile(fileName, content, languageVersion);
this.sfCache.set(absPath, sf);
return sf;
}
}
function readFile(absPath: AbsoluteFsPath, fs: FileSystem): string|undefined {
if (!fs.exists(absPath) || !fs.stat(absPath).isFile()) {
return undefined;
}
return fs.readFile(absPath);
}
/**
* Creates a `ts.ModuleResolutionCache` that uses the provided filesystem for path operations.
*
* @param fs The filesystem to use for path operations.
*/
export function createModuleResolutionCache(fs: FileSystem): ts.ModuleResolutionCache {
return ts.createModuleResolutionCache(fs.pwd(), fileName => {
return fs.isCaseSensitive() ? fileName : fileName.toLowerCase();
});
}
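Putting the pieces together, a sketch of how these caches could be wired up for one ngcc invocation, mirroring the `create_compile_function.ts` and `entry_point_bundle.ts` changes in this diff (import paths assumed):

```ts
import {getFileSystem} from '../../../src/ngtsc/file_system';
import {createModuleResolutionCache, EntryPointFileCache, SharedFileCache} from './source_file_cache';

const fs = getFileSystem();
const sharedFileCache = new SharedFileCache(fs);               // shared across all entry-points
const moduleResolutionCache = createModuleResolutionCache(fs); // shared module resolution results

// Per entry-point: one file cache reused by both the sources and the typings program.
const entryPointCache = new EntryPointFileCache(fs, sharedFileCache);
```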

View File

@ -14,6 +14,7 @@ import {NgccEntryPointConfig} from '../../src/packages/configuration';
import {EntryPoint, EntryPointFormat} from '../../src/packages/entry_point';
import {EntryPointBundle} from '../../src/packages/entry_point_bundle';
import {NgccSourcesCompilerHost} from '../../src/packages/ngcc_compiler_host';
import {createModuleResolutionCache, EntryPointFileCache, SharedFileCache} from '../../src/packages/source_file_cache';
export type TestConfig = Pick<NgccEntryPointConfig, 'generateDeepReexports'>;
@ -68,7 +69,10 @@ export function makeTestBundleProgram(
const rootDir = fs.dirname(entryPointPath);
const options: ts.CompilerOptions =
{allowJs: true, maxNodeModuleJsDepth: Infinity, checkJs: false, rootDir, rootDirs: [rootDir]};
const host = new NgccSourcesCompilerHost(fs, options, rootDir);
const moduleResolutionCache = createModuleResolutionCache(fs);
const entryPointFileCache = new EntryPointFileCache(fs, new SharedFileCache(fs));
const host =
new NgccSourcesCompilerHost(fs, options, entryPointFileCache, moduleResolutionCache, rootDir);
return makeBundleProgram(
fs, isCore, rootDir, path, 'r3_symbols.js', options, host, additionalFiles);
}

View File

@ -213,13 +213,15 @@ export function compileIntoApf(
fs.resolve(`/node_modules/${pkgName}/package.json`), JSON.stringify(pkgJson, null, 2));
}
const stdFiles = loadStandardTestFiles({fakeCore: false});
/**
* Prepares a mock filesystem that contains all provided source files, which can be used to emit
* compiled code into.
*/
function setupCompileFs(sources: PackageSources): {rootNames: string[], compileFs: FileSystem} {
const compileFs = new MockFileSystemPosix(true);
compileFs.init(loadStandardTestFiles({fakeCore: false}));
compileFs.init(stdFiles);
const rootNames = Object.keys(sources);

View File

@ -5,12 +5,13 @@
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import * as os from 'os';
import {absoluteFrom, AbsoluteFsPath, FileSystem, getFileSystem} from '../../src/ngtsc/file_system';
import {runInEachFileSystem} from '../../src/ngtsc/file_system/testing';
import {MockLogger} from '../../src/ngtsc/logging/testing';
import {clearTsConfigCache, getSharedSetup, NgccOptions} from '../src/ngcc_options';
import {clearTsConfigCache, getMaxNumberOfWorkers, getSharedSetup, NgccOptions} from '../src/ngcc_options';
@ -100,6 +101,67 @@ runInEachFileSystem(() => {
});
});
describe('getMaxNumberOfWorkers', () => {
let processEnv: NodeJS.ProcessEnv;
let cpuSpy: jasmine.Spy;
beforeEach(() => {
processEnv = process.env;
process.env = {...process.env};
cpuSpy = spyOn(os, 'cpus');
});
afterEach(() => {
process.env = processEnv;
});
it('should use NGCC_MAX_WORKERS environment variable if set', () => {
process.env.NGCC_MAX_WORKERS = '16';
expect(getMaxNumberOfWorkers()).toBe(16);
process.env.NGCC_MAX_WORKERS = '8';
expect(getMaxNumberOfWorkers()).toBe(8);
process.env.NGCC_MAX_WORKERS = ' 8 ';
expect(getMaxNumberOfWorkers()).toBe(8);
});
it('should throw an error if NGCC_MAX_WORKERS is less than 1', () => {
process.env.NGCC_MAX_WORKERS = '0';
expect(() => getMaxNumberOfWorkers())
.toThrow(new Error('NGCC_MAX_WORKERS should be at least 1.'));
process.env.NGCC_MAX_WORKERS = '-1';
expect(() => getMaxNumberOfWorkers())
.toThrow(new Error('NGCC_MAX_WORKERS should be at least 1.'));
});
it('should throw an error if NGCC_MAX_WORKERS is not an integer', () => {
process.env.NGCC_MAX_WORKERS = 'a';
expect(() => getMaxNumberOfWorkers())
.toThrow(new Error('NGCC_MAX_WORKERS should be an integer.'));
process.env.NGCC_MAX_WORKERS = '1.5';
expect(() => getMaxNumberOfWorkers())
.toThrow(new Error('NGCC_MAX_WORKERS should be an integer.'));
process.env.NGCC_MAX_WORKERS = '-';
expect(() => getMaxNumberOfWorkers())
.toThrow(new Error('NGCC_MAX_WORKERS should be an integer.'));
});
it('should fallback to the number of cpus, minus one (for the master process), with a maximum of 4 workers',
() => {
simulateNumberOfCpus(1);
expect(getMaxNumberOfWorkers()).toBe(1);
simulateNumberOfCpus(2);
expect(getMaxNumberOfWorkers()).toBe(1);
simulateNumberOfCpus(4);
expect(getMaxNumberOfWorkers()).toBe(3);
simulateNumberOfCpus(6);
expect(getMaxNumberOfWorkers()).toBe(4);
simulateNumberOfCpus(8);
expect(getMaxNumberOfWorkers()).toBe(4);
});
function simulateNumberOfCpus(cpus: number): void {
cpuSpy.and.returnValue(new Array(cpus).fill({model: 'Mock CPU'} as any));
}
});
/**
* This function creates an object that contains the minimal required properties for NgccOptions.
*/

View File

@ -10,6 +10,7 @@ import {runInEachFileSystem} from '../../../src/ngtsc/file_system/testing';
import {loadTestFiles} from '../../../test/helpers';
import {EntryPoint} from '../../src/packages/entry_point';
import {makeEntryPointBundle} from '../../src/packages/entry_point_bundle';
import {createModuleResolutionCache, SharedFileCache} from '../../src/packages/source_file_cache';
runInEachFileSystem(() => {
describe('entry point bundle', () => {
@ -180,7 +181,10 @@ runInEachFileSystem(() => {
ignoreMissingDependencies: false,
generateDeepReexports: false,
};
const esm5bundle = makeEntryPointBundle(fs, entryPoint, './index.js', false, 'esm5', true);
const moduleResolutionCache = createModuleResolutionCache(fs);
const esm5bundle = makeEntryPointBundle(
fs, entryPoint, new SharedFileCache(fs), moduleResolutionCache, './index.js', false,
'esm5', true);
expect(esm5bundle.src.program.getSourceFiles().map(sf => sf.fileName))
.toEqual(jasmine.arrayWithExactContents([
@ -291,8 +295,11 @@ runInEachFileSystem(() => {
ignoreMissingDependencies: false,
generateDeepReexports: false,
};
const moduleResolutionCache = createModuleResolutionCache(fs);
const esm5bundle = makeEntryPointBundle(
fs, entryPoint, './index.js', false, 'esm5', /* transformDts */ true,
fs, entryPoint, new SharedFileCache(fs), moduleResolutionCache, './index.js', false,
'esm5',
/* transformDts */ true,
/* pathMappings */ undefined, /* mirrorDtsFromSrc */ true);
expect(esm5bundle.src.program.getSourceFiles().map(sf => _(sf.fileName)))
@ -328,8 +335,11 @@ runInEachFileSystem(() => {
ignoreMissingDependencies: false,
generateDeepReexports: false,
};
const moduleResolutionCache = createModuleResolutionCache(fs);
const esm5bundle = makeEntryPointBundle(
fs, entryPoint, './index.js', false, 'esm5', /* transformDts */ true,
fs, entryPoint, new SharedFileCache(fs), moduleResolutionCache, './index.js', false,
'esm5',
/* transformDts */ true,
/* pathMappings */ undefined, /* mirrorDtsFromSrc */ true);
expect(esm5bundle.src.program.getSourceFiles().map(sf => sf.fileName))
.toContain(absoluteFrom('/node_modules/test/internal.js'));
@ -351,8 +361,11 @@ runInEachFileSystem(() => {
ignoreMissingDependencies: false,
generateDeepReexports: false,
};
const moduleResolutionCache = createModuleResolutionCache(fs);
const esm5bundle = makeEntryPointBundle(
fs, entryPoint, './esm2015/index.js', false, 'esm2015', /* transformDts */ true,
fs, entryPoint, new SharedFileCache(fs), moduleResolutionCache,
'./esm2015/index.js', false, 'esm2015',
/* transformDts */ true,
/* pathMappings */ undefined, /* mirrorDtsFromSrc */ true);
expect(esm5bundle.src.program.getSourceFiles().map(sf => sf.fileName))
.toContain(absoluteFrom('/node_modules/internal/esm2015/src/internal.js'));
@ -374,8 +387,11 @@ runInEachFileSystem(() => {
ignoreMissingDependencies: false,
generateDeepReexports: false,
};
const moduleResolutionCache = createModuleResolutionCache(fs);
const esm5bundle = makeEntryPointBundle(
fs, entryPoint, './index.js', false, 'esm5', /* transformDts */ true,
fs, entryPoint, new SharedFileCache(fs), moduleResolutionCache, './index.js', false,
'esm5',
/* transformDts */ true,
/* pathMappings */ undefined, /* mirrorDtsFromSrc */ false);
expect(esm5bundle.src.program.getSourceFiles().map(sf => sf.fileName))
.toContain(absoluteFrom('/node_modules/test/internal.js'));
@ -398,8 +414,11 @@ runInEachFileSystem(() => {
ignoreMissingDependencies: false,
generateDeepReexports: false,
};
const moduleResolutionCache = createModuleResolutionCache(fs);
const bundle = makeEntryPointBundle(
fs, entryPoint, './index.js', false, 'esm2015', /* transformDts */ true,
fs, entryPoint, new SharedFileCache(fs), moduleResolutionCache, './index.js', false,
'esm2015',
/* transformDts */ true,
/* pathMappings */ undefined, /* mirrorDtsFromSrc */ true);
expect(bundle.rootDirs).toEqual([absoluteFrom('/node_modules/primary')]);
});

View File

@ -0,0 +1,223 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import * as ts from 'typescript';
import {absoluteFrom, FileSystem, getFileSystem} from '../../../src/ngtsc/file_system';
import {runInEachFileSystem} from '../../../src/ngtsc/file_system/testing';
import {loadTestFiles} from '../../../test/helpers';
import {EntryPointFileCache, isAngularDts, isDefaultLibrary, SharedFileCache} from '../../src/packages/source_file_cache';
runInEachFileSystem(() => {
describe('caching', () => {
let _: typeof absoluteFrom;
let fs: FileSystem;
beforeEach(() => {
_ = absoluteFrom;
fs = getFileSystem();
loadTestFiles([
{
name: _('/node_modules/typescript/lib/lib.es5.d.ts'),
contents: `export declare interface Array {}`,
},
{
name: _('/node_modules/typescript/lib/lib.dom.d.ts'),
contents: `export declare interface Window {}`,
},
{
name: _('/node_modules/@angular/core/core.d.ts'),
contents: `export declare interface Component {}`,
},
{
name: _('/node_modules/@angular/common/common.d.ts'),
contents: `export declare interface NgIf {}`,
},
{
name: _('/index.ts'),
contents: `export const index = true;`,
},
{
name: _('/main.ts'),
contents: `export const main = true;`,
},
]);
});
describe('SharedFileCache', () => {
it('should cache a parsed source file for default libraries', () => {
const cache = new SharedFileCache(fs);
const libEs5 = cache.getCachedSourceFile('/node_modules/typescript/lib/lib.es5.d.ts')!;
expect(libEs5).not.toBeUndefined();
expect(libEs5.text).toContain('Array');
const libDom = cache.getCachedSourceFile('/node_modules/typescript/lib/lib.dom.d.ts')!;
expect(libDom).not.toBeUndefined();
expect(libDom.text).toContain('Window');
const libEs5_2 = cache.getCachedSourceFile('/node_modules/typescript/lib/lib.es5.d.ts')!;
expect(libEs5_2).toBe(libEs5);
const libDom_2 = cache.getCachedSourceFile('/node_modules/typescript/lib/lib.dom.d.ts')!;
expect(libDom_2).toBe(libDom);
});
it('should cache a parsed source file for @angular scoped packages', () => {
const cache = new SharedFileCache(fs);
const core = cache.getCachedSourceFile('/node_modules/@angular/core/core.d.ts')!;
expect(core).not.toBeUndefined();
expect(core.text).toContain('Component');
const common = cache.getCachedSourceFile('/node_modules/@angular/common/common.d.ts')!;
expect(common).not.toBeUndefined();
expect(common.text).toContain('NgIf');
const core_2 = cache.getCachedSourceFile('/node_modules/@angular/core/core.d.ts')!;
expect(core_2).toBe(core);
const common_2 = cache.getCachedSourceFile('/node_modules/@angular/common/common.d.ts')!;
expect(common_2).toBe(common);
});
it('should reparse @angular d.ts files when they change', () => {
const cache = new SharedFileCache(fs);
const core = cache.getCachedSourceFile('/node_modules/@angular/core/core.d.ts')!;
expect(core).not.toBeUndefined();
expect(core.text).toContain('Component');
const common = cache.getCachedSourceFile('/node_modules/@angular/common/common.d.ts')!;
expect(common).not.toBeUndefined();
expect(common.text).toContain('NgIf');
fs.writeFile(
_('/node_modules/@angular/core/core.d.ts'), `export declare interface Directive {}`);
const core_2 = cache.getCachedSourceFile('/node_modules/@angular/core/core.d.ts')!;
expect(core_2).not.toBe(core);
expect(core_2.text).toContain('Directive');
const core_3 = cache.getCachedSourceFile('/node_modules/@angular/core/core.d.ts')!;
expect(core_3).toBe(core_2);
const common_2 = cache.getCachedSourceFile('/node_modules/@angular/common/common.d.ts')!;
expect(common_2).toBe(common);
});
it('should not cache files that are not default library files inside of the typescript package',
() => {
const cache = new SharedFileCache(fs);
expect(cache.getCachedSourceFile('/node_modules/typescript/lib/typescript.d.ts'))
.toBeUndefined();
expect(cache.getCachedSourceFile('/typescript/lib.es5.d.ts')).toBeUndefined();
});
});
describe('isDefaultLibrary()', () => {
it('should accept lib files inside of the typescript package', () => {
expect(isDefaultLibrary(_('/node_modules/typescript/lib/lib.es5.d.ts'), fs)).toBe(true);
expect(isDefaultLibrary(_('/node_modules/typescript/lib/lib.dom.d.ts'), fs)).toBe(true);
expect(isDefaultLibrary(_('/node_modules/typescript/lib/lib.es2015.core.d.ts'), fs))
.toBe(true);
expect(isDefaultLibrary(_('/root/node_modules/typescript/lib/lib.es5.d.ts'), fs))
.toBe(true);
});
it('should reject non lib files inside of the typescript package', () => {
expect(isDefaultLibrary(_('/node_modules/typescript/lib/typescript.d.ts'), fs)).toBe(false);
expect(isDefaultLibrary(_('/node_modules/typescript/lib/lib.es5.ts'), fs)).toBe(false);
expect(isDefaultLibrary(_('/node_modules/typescript/lib/lib.d.ts'), fs)).toBe(false);
expect(isDefaultLibrary(_('/node_modules/typescript/lib.es5.d.ts'), fs)).toBe(false);
});
it('should reject lib files outside of the typescript package', () => {
expect(isDefaultLibrary(_('/node_modules/ttypescript/lib/lib.es5.d.ts'), fs)).toBe(false);
expect(isDefaultLibrary(_('/node_modules/ttypescript/lib/lib.es5.d.ts'), fs)).toBe(false);
expect(isDefaultLibrary(_('/typescript/lib/lib.es5.d.ts'), fs)).toBe(false);
});
});
describe('isAngularDts()', () => {
it('should accept .d.ts files inside of the @angular scope', () => {
expect(isAngularDts(_('/node_modules/@angular/core/core.d.ts'), fs)).toBe(true);
expect(isAngularDts(_('/node_modules/@angular/common/common.d.ts'), fs)).toBe(true);
});
it('should reject non-.d.ts files inside @angular scoped packages', () => {
expect(isAngularDts(_('/node_modules/@angular/common/src/common.ts'), fs)).toBe(false);
});
it('should reject .d.ts files nested deeply inside @angular scoped packages', () => {
expect(isAngularDts(_('/node_modules/@angular/common/src/common.d.ts'), fs)).toBe(false);
});
it('should reject .d.ts files directly inside the @angular scope', () => {
expect(isAngularDts(_('/node_modules/@angular/common.d.ts'), fs)).toBe(false);
});
it('should reject files that are not inside node_modules', () => {
expect(isAngularDts(_('/@angular/core/core.d.ts'), fs)).toBe(false);
});
});
describe('EntryPointFileCache', () => {
let sharedFileCache: SharedFileCache;
beforeEach(() => {
sharedFileCache = new SharedFileCache(fs);
});
it('should prefer source files cached in SharedFileCache', () => {
const cache1 = new EntryPointFileCache(fs, sharedFileCache);
const libEs5_1 = cache1.getCachedSourceFile(
'/node_modules/typescript/lib/lib.es5.d.ts', ts.ScriptTarget.ESNext)!;
expect(libEs5_1).not.toBeUndefined();
expect(libEs5_1.text).toContain('Array');
expect(libEs5_1.languageVersion).toBe(ts.ScriptTarget.ES2015);
const cache2 = new EntryPointFileCache(fs, sharedFileCache);
const libEs5_2 = cache2.getCachedSourceFile(
'/node_modules/typescript/lib/lib.es5.d.ts', ts.ScriptTarget.ESNext)!;
expect(libEs5_1).toBe(libEs5_2);
});
it('should cache source files that are not default library files', () => {
const cache = new EntryPointFileCache(fs, sharedFileCache);
const index = cache.getCachedSourceFile('/index.ts', ts.ScriptTarget.ESNext)!;
expect(index).not.toBeUndefined();
expect(index.text).toContain('index');
expect(index.languageVersion).toBe(ts.ScriptTarget.ESNext);
const main = cache.getCachedSourceFile('/main.ts', ts.ScriptTarget.ESNext)!;
expect(main).not.toBeUndefined();
expect(main.text).toContain('main');
expect(main.languageVersion).toBe(ts.ScriptTarget.ESNext);
const index_2 = cache.getCachedSourceFile('/index.ts', ts.ScriptTarget.ESNext)!;
expect(index_2).toBe(index);
const main_2 = cache.getCachedSourceFile('/main.ts', ts.ScriptTarget.ESNext)!;
expect(main_2).toBe(main);
});
it('should not share non-library files across multiple cache instances', () => {
const cache1 = new EntryPointFileCache(fs, sharedFileCache);
const cache2 = new EntryPointFileCache(fs, sharedFileCache);
const index1 = cache1.getCachedSourceFile('/index.ts', ts.ScriptTarget.ESNext)!;
const index2 = cache2.getCachedSourceFile('/index.ts', ts.ScriptTarget.ESNext)!;
expect(index1).not.toBe(index2);
});
it('should return undefined if the file does not exist', () => {
const cache = new EntryPointFileCache(fs, sharedFileCache);
expect(cache.getCachedSourceFile('/nonexistent.ts', ts.ScriptTarget.ESNext))
.toBeUndefined();
});
it('should return undefined if the path is a directory', () => {
const cache = new EntryPointFileCache(fs, sharedFileCache);
expect(cache.getCachedSourceFile('/node_modules', ts.ScriptTarget.ESNext)).toBeUndefined();
});
});
});
});
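For readers skimming this compare view, the behaviour the specs above pin down can be approximated with plain `fs` and `typescript`. This is an illustrative re-implementation of the caching idea only; the real `SharedFileCache` works against ngcc's `FileSystem` abstraction and restricts itself to default-library and `@angular` `.d.ts` files:

```ts
import {readFileSync} from 'fs';
import * as ts from 'typescript';

class SimpleSharedCache {
  private cache = new Map<string, ts.SourceFile>();

  // Returns a cached ts.SourceFile for the given path, reparsing only when the
  // file's contents have changed (mirroring the reparse-on-change spec above).
  getCachedSourceFile(fileName: string): ts.SourceFile|undefined {
    const text = safeRead(fileName);
    if (text === undefined) {
      return undefined;  // missing files and directories are simply not cached
    }
    const cached = this.cache.get(fileName);
    if (cached !== undefined && cached.text === text) {
      return cached;
    }
    const sf = ts.createSourceFile(fileName, text, ts.ScriptTarget.ES2015);
    this.cache.set(fileName, sf);
    return sf;
  }
}

function safeRead(fileName: string): string|undefined {
  try {
    return readFileSync(fileName, 'utf8');
  } catch {
    return undefined;
  }
}

// Usage: both lookups return the same object as long as the file is unchanged.
const cache = new SimpleSharedCache();
const first = cache.getCachedSourceFile('node_modules/typescript/lib/lib.es5.d.ts');
const second = cache.getCachedSourceFile('node_modules/typescript/lib/lib.es5.d.ts');
console.log(first !== undefined && first === second);  // true
```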

View File

@ -12,6 +12,7 @@ import {loadTestFiles} from '../../../test/helpers';
import {NgccConfiguration} from '../../src/packages/configuration';
import {EntryPoint, EntryPointFormat, EntryPointJsonProperty, getEntryPointInfo, isEntryPoint} from '../../src/packages/entry_point';
import {EntryPointBundle, makeEntryPointBundle} from '../../src/packages/entry_point_bundle';
import {createModuleResolutionCache, SharedFileCache} from '../../src/packages/source_file_cache';
import {FileWriter} from '../../src/writing/file_writer';
import {NewEntryPointFileWriter} from '../../src/writing/new_entry_point_file_writer';
import {DirectPackageJsonUpdater} from '../../src/writing/package_json_updater';
@ -634,7 +635,9 @@ runInEachFileSystem(() => {
function makeTestBundle(
fs: FileSystem, entryPoint: EntryPoint, formatProperty: EntryPointJsonProperty,
format: EntryPointFormat): EntryPointBundle {
const moduleResolutionCache = createModuleResolutionCache(fs);
return makeEntryPointBundle(
fs, entryPoint, entryPoint.packageJson[formatProperty]!, false, format, true);
fs, entryPoint, new SharedFileCache(fs), moduleResolutionCache,
entryPoint.packageJson[formatProperty]!, false, format, true);
}
});

View File

@ -16,7 +16,6 @@ import {DefaultImportRecorder, ModuleResolver, Reference, ReferenceEmitter} from
import {DependencyTracker} from '../../incremental/api';
import {IndexingContext} from '../../indexer';
import {ClassPropertyMapping, DirectiveMeta, DirectiveTypeCheckMeta, extractDirectiveTypeCheckMeta, InjectableClassRegistry, MetadataReader, MetadataRegistry} from '../../metadata';
import {flattenInheritedDirectiveMetadata} from '../../metadata/src/inheritance';
import {EnumValue, PartialEvaluator} from '../../partial_evaluator';
import {ClassDeclaration, Decorator, ReflectionHost, reflectObjectLiteral} from '../../reflection';
import {ComponentScopeReader, LocalModuleScopeRegistry} from '../../scope';
@ -31,6 +30,7 @@ import {createValueHasWrongTypeError, getDirectiveDiagnostics, getProviderDiagno
import {extractDirectiveMetadata, parseFieldArrayValue} from './directive';
import {compileNgFactoryDefField} from './factory';
import {generateSetClassMetadataCall} from './metadata';
import {TypeCheckScopes} from './typecheck_scopes';
import {findAngularDecorator, isAngularCoreReference, isExpressionForwardReference, makeDuplicateDeclarationError, readBaseClass, resolveProvidersRequiringFactory, unwrapExpression, wrapFunctionExpressionsInParens} from './util';
const EMPTY_MAP = new Map<string, Expression>();
@ -95,6 +95,7 @@ export class ComponentDecoratorHandler implements
private literalCache = new Map<Decorator, ts.ObjectLiteralExpression>();
private elementSchemaRegistry = new DomElementSchemaRegistry();
private typeCheckScopes = new TypeCheckScopes(this.scopeReader, this.metaReader);
/**
* During the asynchronous preanalyze phase, it's necessary to parse the template to extract
@ -423,36 +424,15 @@ export class ComponentDecoratorHandler implements
return;
}
const matcher = new SelectorMatcher<DirectiveMeta>();
const pipes = new Map<string, Reference<ClassDeclaration<ts.ClassDeclaration>>>();
let schemas: SchemaMetadata[] = [];
const scope = this.scopeReader.getScopeForComponent(node);
const scope = this.typeCheckScopes.getTypeCheckScope(node);
if (scope === 'error') {
// Don't type-check components that had errors in their scopes.
return;
}
if (scope !== null) {
for (const meta of scope.compilation.directives) {
if (meta.selector !== null) {
const extMeta = flattenInheritedDirectiveMetadata(this.metaReader, meta.ref);
matcher.addSelectables(CssSelector.parse(meta.selector), extMeta);
}
}
for (const {name, ref} of scope.compilation.pipes) {
if (!ts.isClassDeclaration(ref.node)) {
throw new Error(`Unexpected non-class declaration ${
ts.SyntaxKind[ref.node.kind]} for pipe ${ref.debugName}`);
}
pipes.set(name, ref as Reference<ClassDeclaration<ts.ClassDeclaration>>);
}
schemas = scope.schemas;
}
const binder = new R3TargetBinder(matcher);
const binder = new R3TargetBinder(scope.matcher);
ctx.addTemplate(
new Reference(node), binder, meta.template.diagNodes, pipes, schemas,
new Reference(node), binder, meta.template.diagNodes, scope.pipes, scope.schemas,
meta.template.sourceMapping, meta.template.file);
}
@ -495,36 +475,49 @@ export class ComponentDecoratorHandler implements
// Set up the R3TargetBinder, as well as a 'directives' array and a 'pipes' map that are later
// fed to the TemplateDefinitionBuilder. First, a SelectorMatcher is constructed to match
// directives that are in scope.
const matcher = new SelectorMatcher<DirectiveMeta&{expression: Expression}>();
const directives: {selector: string, expression: Expression}[] = [];
type MatchedDirective = DirectiveMeta&{selector: string};
const matcher = new SelectorMatcher<MatchedDirective>();
for (const dir of scope.compilation.directives) {
const {ref, selector} = dir;
if (selector !== null) {
const expression = this.refEmitter.emit(ref, context);
directives.push({selector, expression});
matcher.addSelectables(CssSelector.parse(selector), {...dir, expression});
if (dir.selector !== null) {
matcher.addSelectables(CssSelector.parse(dir.selector), dir as MatchedDirective);
}
}
const pipes = new Map<string, Expression>();
const pipes = new Map<string, Reference<ClassDeclaration>>();
for (const pipe of scope.compilation.pipes) {
pipes.set(pipe.name, this.refEmitter.emit(pipe.ref, context));
pipes.set(pipe.name, pipe.ref);
}
// Next, the component template AST is bound using the R3TargetBinder. This produces an
// Next, the component template AST is bound using the R3TargetBinder. This produces a
// BoundTarget, which is similar to a ts.TypeChecker.
const binder = new R3TargetBinder(matcher);
const bound = binder.bind({template: metadata.template.nodes});
// The BoundTarget knows which directives and pipes matched the template.
const usedDirectives = bound.getUsedDirectives();
const usedPipes = bound.getUsedPipes().map(name => pipes.get(name)!);
const usedDirectives = bound.getUsedDirectives().map(directive => {
return {
selector: directive.selector,
expression: this.refEmitter.emit(directive.ref, context),
};
});
const usedPipes: {pipeName: string, expression: Expression}[] = [];
for (const pipeName of bound.getUsedPipes()) {
if (!pipes.has(pipeName)) {
continue;
}
const pipe = pipes.get(pipeName)!;
usedPipes.push({
pipeName,
expression: this.refEmitter.emit(pipe, context),
});
}
// Scan through the directives/pipes actually used in the template and check whether any
// import which needs to be generated would create a cycle.
const cycleDetected =
usedDirectives.some(dir => this._isCyclicImport(dir.expression, context)) ||
usedPipes.some(pipe => this._isCyclicImport(pipe, context));
usedPipes.some(pipe => this._isCyclicImport(pipe.expression, context));
if (!cycleDetected) {
// No cycle was detected. Record the imports that need to be created in the cycle detector
@ -532,8 +525,8 @@ export class ComponentDecoratorHandler implements
for (const {expression} of usedDirectives) {
this._recordSyntheticImport(expression, context);
}
for (const pipe of usedPipes) {
this._recordSyntheticImport(pipe, context);
for (const {expression} of usedPipes) {
this._recordSyntheticImport(expression, context);
}
// Check whether the directive/pipe arrays in ɵcmp need to be wrapped in closures.
@ -542,16 +535,11 @@ export class ComponentDecoratorHandler implements
const wrapDirectivesAndPipesInClosure =
usedDirectives.some(
dir => isExpressionForwardReference(dir.expression, node.name, context)) ||
usedPipes.some(pipe => isExpressionForwardReference(pipe, node.name, context));
usedPipes.some(
pipe => isExpressionForwardReference(pipe.expression, node.name, context));
// Actual compilation still uses the full scope, not the narrowed scope determined by
// R3TargetBinder. This is a hedge against potential issues with the R3TargetBinder - right
// now the TemplateDefinitionBuilder is the "source of truth" for which directives/pipes are
// actually used (though the two should agree perfectly).
//
// TODO(alxhub): switch TemplateDefinitionBuilder over to using R3TargetBinder directly.
data.directives = directives;
data.pipes = pipes;
data.directives = usedDirectives;
data.pipes = new Map(usedPipes.map(pipe => [pipe.pipeName, pipe.expression]));
data.wrapDirectivesAndPipesInClosure = wrapDirectivesAndPipesInClosure;
} else {
// Declaring the directiveDefs/pipeDefs arrays directly would require imports that would
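For context on the `data.directives` / `data.pipes` change above: both the compile path and the type-check path now derive the used directives and pipes from the `R3TargetBinder`'s bound target. A condensed, self-contained sketch of that flow (an empty `SelectorMatcher` is enough to show pipe tracking; in the handler it is populated with the full compilation scope):

```ts
import {parseTemplate, R3TargetBinder, SelectorMatcher} from '@angular/compiler';

// Bind a template against a (here empty) selector matcher. The resulting
// BoundTarget records which registered directives and which pipes the
// template actually uses.
const binder = new R3TargetBinder(new SelectorMatcher<any>());
const template = parseTemplate(
    '<ng-template [ngIf]="obs | async">{{value | date}}</ng-template>', 'test.html');
const bound = binder.bind({template: template.nodes});

console.log(bound.getUsedPipes());       // e.g. ['async', 'date']
console.log(bound.getUsedDirectives());  // [] (nothing was registered on the matcher)
```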

View File

@ -0,0 +1,105 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {CssSelector, SchemaMetadata, SelectorMatcher} from '@angular/compiler';
import * as ts from 'typescript';
import {Reference} from '../../imports';
import {DirectiveMeta, flattenInheritedDirectiveMetadata, MetadataReader} from '../../metadata';
import {ClassDeclaration} from '../../reflection';
import {ComponentScopeReader} from '../../scope';
/**
* The scope that is used for type-check code generation of a component template.
*/
export interface TypeCheckScope {
/**
* A `SelectorMatcher` instance that contains the flattened directive metadata of all directives
* that are in the compilation scope of the declaring NgModule.
*/
matcher: SelectorMatcher<DirectiveMeta>;
/**
* The pipes that are available in the compilation scope.
*/
pipes: Map<string, Reference<ClassDeclaration<ts.ClassDeclaration>>>;
/**
* The schemas that are used in this scope.
*/
schemas: SchemaMetadata[];
}
/**
* Computes scope information to be used in template type checking.
*/
export class TypeCheckScopes {
/**
* Cache of flattened directive metadata. Because flattened metadata is scope-invariant it's
* cached individually, such that all scopes refer to the same flattened metadata.
*/
private flattenedDirectiveMetaCache = new Map<ClassDeclaration, DirectiveMeta>();
/**
* Cache of the computed type check scope per NgModule declaration.
*/
private scopeCache = new Map<ClassDeclaration, TypeCheckScope>();
constructor(private scopeReader: ComponentScopeReader, private metaReader: MetadataReader) {}
/**
* Computes the type-check scope information for the component declaration. If the NgModule
* contains an error, then 'error' is returned. If the component is not declared in any NgModule,
* an empty type-check scope is returned.
*/
getTypeCheckScope(node: ClassDeclaration): TypeCheckScope|'error' {
const matcher = new SelectorMatcher<DirectiveMeta>();
const pipes = new Map<string, Reference<ClassDeclaration<ts.ClassDeclaration>>>();
const scope = this.scopeReader.getScopeForComponent(node);
if (scope === null) {
return {matcher, pipes, schemas: []};
} else if (scope === 'error') {
return scope;
}
if (this.scopeCache.has(scope.ngModule)) {
return this.scopeCache.get(scope.ngModule)!;
}
for (const meta of scope.compilation.directives) {
if (meta.selector !== null) {
const extMeta = this.getInheritedDirectiveMetadata(meta.ref);
matcher.addSelectables(CssSelector.parse(meta.selector), extMeta);
}
}
for (const {name, ref} of scope.compilation.pipes) {
if (!ts.isClassDeclaration(ref.node)) {
throw new Error(`Unexpected non-class declaration ${
ts.SyntaxKind[ref.node.kind]} for pipe ${ref.debugName}`);
}
pipes.set(name, ref as Reference<ClassDeclaration<ts.ClassDeclaration>>);
}
const typeCheckScope: TypeCheckScope = {matcher, pipes, schemas: scope.schemas};
this.scopeCache.set(scope.ngModule, typeCheckScope);
return typeCheckScope;
}
private getInheritedDirectiveMetadata(ref: Reference<ClassDeclaration>): DirectiveMeta {
const clazz = ref.node;
if (this.flattenedDirectiveMetaCache.has(clazz)) {
return this.flattenedDirectiveMetaCache.get(clazz)!;
}
const meta = flattenInheritedDirectiveMetadata(this.metaReader, ref);
this.flattenedDirectiveMetaCache.set(clazz, meta);
return meta;
}
}
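The two caches above (flattened directive metadata per class, computed scope per NgModule) follow the same memoize-by-key shape. A generic sketch, with string keys standing in for `ClassDeclaration` nodes purely for illustration:

```ts
// Memoize an expensive per-key computation, as getInheritedDirectiveMetadata()
// and the scopeCache lookup do with ClassDeclaration keys.
function memoizeByKey<K, V>(compute: (key: K) => V): (key: K) => V {
  const cache = new Map<K, V>();
  return (key: K) => {
    if (!cache.has(key)) {
      cache.set(key, compute(key));
    }
    return cache.get(key)!;
  };
}

// Usage: flattening runs once per directive class, no matter how many
// NgModule scopes reference that directive.
let computations = 0;
const flatten = memoizeByKey((className: string) => {
  computations++;
  return `flattened(${className})`;
});
flatten('NgIf');
flatten('NgIf');
console.log(computations);  // 1
```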

View File

@ -8,6 +8,7 @@
export * from './src/api';
export {DtsMetadataReader} from './src/dts';
export {flattenInheritedDirectiveMetadata} from './src/inheritance';
export {CompoundMetadataRegistry, LocalMetadataRegistry, InjectableClassRegistry} from './src/registry';
export {extractDirectiveTypeCheckMeta, CompoundMetadataReader} from './src/util';
export {BindingPropertyName, ClassPropertyMapping, ClassPropertyName, InputOrOutput} from './src/property_mapping';

View File

@ -26,6 +26,9 @@ export function flattenInheritedDirectiveMetadata(
if (topMeta === null) {
throw new Error(`Metadata not found for directive: ${dir.debugName}`);
}
if (topMeta.baseClass === null) {
return topMeta;
}
const coercedInputFields = new Set<ClassPropertyName>();
const undeclaredInputFields = new Set<ClassPropertyName>();

View File

@ -26,6 +26,7 @@ export interface LocalNgModuleData {
}
export interface LocalModuleScope extends ExportScope {
ngModule: ClassDeclaration;
compilation: ScopeData;
reexports: Reexport[]|null;
schemas: SchemaMetadata[];
@ -433,7 +434,8 @@ export class LocalModuleScopeRegistry implements MetadataRegistry, ComponentScop
}
// Finally, produce the `LocalModuleScope` with both the compilation and export scopes.
const scope = {
const scope: LocalModuleScope = {
ngModule: ngModule.ref.node,
compilation: {
directives: Array.from(compilationDirectives.values()),
pipes: Array.from(compilationPipes.values()),

View File

@ -434,7 +434,10 @@ class TemplateBinder extends RecursiveAstVisitor implements Visitor {
visitText(text: Text) {}
visitContent(content: Content) {}
visitTextAttribute(attribute: TextAttribute) {}
visitIcu(icu: Icu): void {}
visitIcu(icu: Icu): void {
Object.keys(icu.vars).forEach(key => icu.vars[key].visit(this));
Object.keys(icu.placeholders).forEach(key => icu.placeholders[key].visit(this));
}
// The remaining visitors are concerned with processing AST expressions within template bindings

View File

@ -194,4 +194,38 @@ describe('t2 binding', () => {
expect(consumer).toEqual(el);
});
});
describe('used pipes', () => {
it('should record pipes used in interpolations', () => {
const template = parseTemplate('{{value|date}}', '', {});
const binder = new R3TargetBinder(makeSelectorMatcher());
const res = binder.bind({template: template.nodes});
expect(res.getUsedPipes()).toEqual(['date']);
});
it('should record pipes used in bound attributes', () => {
const template = parseTemplate('<person [age]="age|number"></person>', '', {});
const binder = new R3TargetBinder(makeSelectorMatcher());
const res = binder.bind({template: template.nodes});
expect(res.getUsedPipes()).toEqual(['number']);
});
it('should record pipes used in bound template attributes', () => {
const template = parseTemplate('<ng-template [ngIf]="obs|async"></ng-template>', '', {});
const binder = new R3TargetBinder(makeSelectorMatcher());
const res = binder.bind({template: template.nodes});
expect(res.getUsedPipes()).toEqual(['async']);
});
it('should record pipes used in ICUs', () => {
const template = parseTemplate(
`<span i18n>{count|number, plural,
=1 { {{value|date}} }
}</span>`,
'', {});
const binder = new R3TargetBinder(makeSelectorMatcher());
const res = binder.bind({template: template.nodes});
expect(res.getUsedPipes()).toEqual(['number', 'date']);
});
});
});

View File

@ -236,7 +236,6 @@ export class DefaultIterableDiffer<V> implements IterableDiffer<V>, IterableChan
_reset() {
if (this.isDirty) {
let record: IterableChangeRecord_<V>|null;
let nextRecord: IterableChangeRecord_<V>|null;
for (record = this._previousItHead = this._itHead; record !== null; record = record._next) {
record._nextPrevious = record._next;
@ -247,9 +246,8 @@ export class DefaultIterableDiffer<V> implements IterableDiffer<V>, IterableChan
}
this._additionsHead = this._additionsTail = null;
for (record = this._movesHead; record !== null; record = nextRecord) {
for (record = this._movesHead; record !== null; record = record._nextMoved) {
record.previousIndex = record.currentIndex;
nextRecord = record._nextMoved;
}
this._movesHead = this._movesTail = null;
this._removalsHead = this._removalsTail = null;

View File

@ -300,6 +300,7 @@ function detachMovedView(declarationContainer: LContainer, lView: LView) {
// would be cleared and the counter decremented), we need to decrement the view counter here
// instead.
if (lView[FLAGS] & LViewFlags.RefreshTransplantedView) {
lView[FLAGS] &= ~LViewFlags.RefreshTransplantedView;
updateTransplantedViewCount(insertionLContainer, -1);
}
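The one-line fix above clears a bit on the `LView` flags word. As a generic bit-flag refresher (the flag names below are made up, not Angular's `LViewFlags`):

```ts
// Set, test, and clear a single flag on a numeric flags word.
const enum Flags {
  RefreshView = 1 << 0,
  Dirty = 1 << 1,
}

let flags: number = Flags.Dirty;
flags |= Flags.RefreshView;                        // set the bit
const wasSet = (flags & Flags.RefreshView) !== 0;  // test the bit
flags &= ~Flags.RefreshView;                       // clear it, leaving other bits intact
console.log(wasSet, flags === Flags.Dirty);        // true true
```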

View File

@ -7,7 +7,7 @@
*/
import {CommonModule} from '@angular/common';
import {ChangeDetectionStrategy, ChangeDetectorRef, Component, DoCheck, Input, TemplateRef, Type, ViewChild} from '@angular/core';
import {ChangeDetectionStrategy, ChangeDetectorRef, Component, DoCheck, Input, TemplateRef, Type, ViewChild, ViewContainerRef} from '@angular/core';
import {AfterViewChecked} from '@angular/core/src/core';
import {ComponentFixture, TestBed} from '@angular/core/testing';
import {expect} from '@angular/platform-browser/testing/src/matchers';
@ -594,6 +594,40 @@ describe('change detection for transplanted views', () => {
'SheldonSheldonSheldon',
'Expected transplanted view to be refreshed even when insertion is not dirty');
});
it('should not fail when change detecting detached transplanted view', () => {
@Component({template: '<ng-template>{{incrementChecks()}}</ng-template>'})
class AppComponent {
@ViewChild(TemplateRef) templateRef!: TemplateRef<{}>;
constructor(readonly rootVref: ViewContainerRef, readonly cdr: ChangeDetectorRef) {}
checks = 0;
incrementChecks() {
this.checks++;
}
}
const fixture = TestBed.configureTestingModule({declarations: [AppComponent]})
.createComponent(AppComponent);
const component = fixture.componentInstance;
fixture.detectChanges();
const viewRef = component.templateRef.createEmbeddedView({});
// This `ViewContainerRef` is for the root view
component.rootVref.insert(viewRef);
// `detectChanges` on this `ChangeDetectorRef` will refresh this view and children, not the root
// view that has the transplanted `viewRef` inserted.
component.cdr.detectChanges();
// The template should not have been refreshed because it was inserted "above" the component,
// so `detectChanges` will not refresh it.
expect(component.checks).toEqual(0);
// Detach view, manually call `detectChanges`, and verify the template was refreshed
component.rootVref.detach();
viewRef.detectChanges();
expect(component.checks).toEqual(1);
});
});
function trim(text: string|null): string {

View File

@ -11,3 +11,11 @@ yarn bazel run --config=ivy //packages/core/test/bundling/hello_world:symbol_tes
yarn bazel run --config=ivy //packages/core/test/bundling/injection:symbol_test.accept
yarn bazel run --config=ivy //packages/core/test/bundling/todo:symbol_test.accept
```
## Running all symbol tests
To run all symbol tests with one command, you can use the following scripts:
```
yarn run symbol-extractor:check
yarn run symbol-extractor:update
```

View File

@ -0,0 +1,118 @@
package(default_visibility = ["//visibility:public"])
load("//tools:defaults.bzl", "jasmine_node_test", "ng_module", "ng_rollup_bundle", "ts_devserver", "ts_library")
load("//tools/symbol-extractor:index.bzl", "js_expected_symbol_test")
load("@npm//http-server:index.bzl", "http_server")
ng_module(
name = "router",
srcs = [
"index.ts",
],
tags = [
"ivy-only",
],
deps = [
"//packages/core",
"//packages/platform-browser",
"//packages/router",
],
)
ng_rollup_bundle(
name = "bundle",
entry_point = ":index.ts",
tags = [
"ivy-only",
],
deps = [
":router",
"//packages/core",
"//packages/platform-browser",
"//packages/router",
"@npm//rxjs",
],
)
ts_library(
name = "test_lib",
testonly = True,
srcs = glob(["*_spec.ts"]),
tags = [
"ivy-only",
],
deps = [
"//packages:types",
"//packages/compiler",
"//packages/core",
"//packages/core/testing",
"//packages/private/testing",
],
)
jasmine_node_test(
name = "test",
data = [
":bundle",
":bundle.js",
":bundle.min.js",
":bundle.min_debug.js",
],
tags = [
"ivy-only",
],
deps = [":test_lib"],
)
js_expected_symbol_test(
name = "symbol_test",
src = ":bundle.min_debug.js",
golden = ":bundle.golden_symbols.json",
tags = [
"ivy-aot",
"ivy-only",
],
)
genrule(
name = "tslib",
srcs = [
"@npm//:node_modules/tslib/tslib.js",
],
outs = [
"tslib.js",
],
cmd = "cp $< $@",
tags = [
"ivy-only",
],
)
ts_devserver(
name = "devserver",
entry_module = "@angular/core/test/bundling/router",
scripts = [
"//tools/rxjs:rxjs_umd_modules",
],
serving_path = "/bundle.min.js",
static_files = [
"index.html",
":tslib",
],
tags = [
"ivy-only",
],
deps = [":router"],
)
http_server(
name = "prodserver",
data = [
"index.html",
":bundle.min.js",
":bundle.min_debug.js",
],
tags = [
"ivy-only",
],
)

File diff suppressed because it is too large

View File

@ -0,0 +1,32 @@
<!doctype html>
<html>
<head>
<title>Angular Routing Example</title>
</head>
<body>
<!-- The Angular application will be bootstrapped into this element. -->
<app-root></app-root>
<!--
Script tag which bootstraps the application. Use `?debug` in the URL to select
the debug version of the script.
There are two script sources: `bundle.min.js` and `bundle.min_debug.js`. You can
switch between which bundle the browser loads to experiment with the application.
- `bundle.min.js`: what the site would serve to its users. It has gone
through rollup, build-optimizer, and uglify with tree shaking.
- `bundle.min_debug.js`: what the developer would like to see when debugging
the application. It has also gone through the full pipeline of rollup, build-optimizer,
and uglify; however, special flags were passed to uglify to prevent inlining and
property renaming.
-->
<script>
document.write('<script src="' +
(document.location.search.endsWith('debug') ? '/bundle.min_debug.js' : '/bundle.min.js') +
'"></' + 'script>');
</script>
</body>
</html>

View File

@ -0,0 +1,72 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import {APP_BASE_HREF} from '@angular/common';
import {Component, NgModule, OnInit, ɵNgModuleFactory as NgModuleFactory} from '@angular/core';
import {BrowserModule, platformBrowser} from '@angular/platform-browser';
import {ActivatedRoute, Router, RouterModule, Routes} from '@angular/router';
@Component({
selector: 'app-list',
template: `
<ul>
<li><a routerLink="/item/1" routerLinkActive="active">List Item 1</a></li>
<li><a routerLink="/item/2" routerLinkActive="active">List Item 2</a></li>
<li><a routerLink="/item/3" routerLinkActive="active">List Item 3</a></li>
</ul>
`,
})
class ListComponent {
}
@Component({
selector: 'app-item',
template: `
Item {{id}}
<p><button (click)="viewList()">Back to List</button></p>`,
})
class ItemComponent implements OnInit {
id = -1;
constructor(private activatedRoute: ActivatedRoute, private router: Router) {}
ngOnInit() {
this.activatedRoute.paramMap.subscribe(paramsMap => {
this.id = +paramsMap.get('id')!;
});
}
viewList() {
this.router.navigate(['/list']);
}
}
@Component({
selector: 'app-root',
template: `<router-outlet></router-outlet>`,
})
class RootComponent {
constructor() {}
}
const ROUTES: Routes = [
{path: '', redirectTo: '/list', pathMatch: 'full'}, {path: 'list', component: ListComponent},
{path: 'item/:id', component: ItemComponent}
];
@NgModule({
declarations: [RootComponent, ListComponent, ItemComponent],
imports: [BrowserModule, RouterModule.forRoot(ROUTES)],
providers: [{provide: APP_BASE_HREF, useValue: ''}]
})
class RouteExampleModule {
ngDoBootstrap(app: any) {
app.bootstrap(RootComponent);
}
}
(window as any).waitForApp = platformBrowser().bootstrapModuleFactory(
new NgModuleFactory(RouteExampleModule), {ngZone: 'noop'});

View File

@ -0,0 +1,35 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
import '@angular/compiler';
import * as fs from 'fs';
import * as path from 'path';
const UTF8 = {
encoding: 'utf-8'
};
const PACKAGE = 'angular/packages/core/test/bundling/router';
describe('treeshaking with uglify', () => {
let content: string;
// We use the debug version as otherwise symbols/identifiers would be mangled (and the test would
// always pass)
const contentPath = require.resolve(path.join(PACKAGE, 'bundle.min_debug.js'));
beforeAll(() => {
content = fs.readFileSync(contentPath, UTF8);
});
it('should drop unused TypeScript helpers', () => {
expect(content).not.toContain('__asyncGenerator');
});
it('should not contain rxjs from commonjs distro', () => {
expect(content).not.toContain('commonjsGlobal');
expect(content).not.toContain('createCommonjsModule');
});
});

View File

@ -45,14 +45,7 @@ export const host: ts.server.ServerHost = {
...ts.sys,
readFile(absPath: string, encoding?: string): string |
undefined {
const content = ts.sys.readFile(absPath, encoding);
if (content === undefined) {
return undefined;
}
if (absPath === APP_COMPONENT || absPath === PARSING_CASES || absPath === TEST_TEMPLATE) {
return removeReferenceMarkers(removeLocationMarkers(content));
}
return content;
return ts.sys.readFile(absPath, encoding);
},
watchFile(path: string, callback: ts.FileWatcherCallback): ts.FileWatcher {
return NOOP_FILE_WATCHER;
@ -206,14 +199,3 @@ function replaceOnce(searchText: string, regex: RegExp, replaceText: string): Ov
});
return {position, text};
}
const REF_MARKER = /«(((\w|\-)+)|([^ᐱ]*ᐱ(\w+)ᐱ.[^»]*))»/g;
const LOC_MARKER = /\~\{(\w+(-\w+)*)\}/g;
function removeReferenceMarkers(value: string): string {
return value.replace(REF_MARKER, '');
}
function removeLocationMarkers(value: string): string {
return value.replace(LOC_MARKER, '');
}

View File

@ -208,7 +208,7 @@ export function extractTranslations({
export function getSerializer(
format: string, sourceLocale: string, rootPath: AbsoluteFsPath, useLegacyIds: boolean,
formatOptions: FormatOptions): TranslationSerializer {
formatOptions: FormatOptions = {}): TranslationSerializer {
switch (format) {
case 'xlf':
case 'xlif':

View File

@ -27,7 +27,7 @@ const LEGACY_XLIFF_MESSAGE_LENGTH = 40;
export class Xliff1TranslationSerializer implements TranslationSerializer {
constructor(
private sourceLocale: string, private basePath: AbsoluteFsPath, private useLegacyIds: boolean,
private formatOptions: FormatOptions) {
private formatOptions: FormatOptions = {}) {
validateOptions('Xliff1TranslationSerializer', [['xml:space', ['preserve']]], formatOptions);
}

View File

@ -27,7 +27,7 @@ export class Xliff2TranslationSerializer implements TranslationSerializer {
private currentPlaceholderId = 0;
constructor(
private sourceLocale: string, private basePath: AbsoluteFsPath, private useLegacyIds: boolean,
private formatOptions: FormatOptions) {
private formatOptions: FormatOptions = {}) {
validateOptions('Xliff1TranslationSerializer', [['xml:space', ['preserve']]], formatOptions);
}

View File

@ -7,8 +7,8 @@
*/
import {Injector, NgModuleRef} from '@angular/core';
import {defer, EmptyError, Observable, Observer, of} from 'rxjs';
import {catchError, concatAll, first, map, mergeMap, tap} from 'rxjs/operators';
import {EmptyError, Observable, Observer, of} from 'rxjs';
import {catchError, concatMap, first, map, mergeMap, tap} from 'rxjs/operators';
import {LoadedRouterConfig, Route, Routes} from './config';
import {CanLoadFn} from './interfaces';
@ -149,7 +149,7 @@ class ApplyRedirects {
segments: UrlSegment[], outlet: string,
allowRedirects: boolean): Observable<UrlSegmentGroup> {
return of(...routes).pipe(
map((r: any) => {
concatMap((r: any) => {
const expanded$ = this.expandSegmentAgainstRoute(
ngModule, segmentGroup, routes, r, segments, outlet, allowRedirects);
return expanded$.pipe(catchError((e: any) => {
@ -161,7 +161,7 @@ class ApplyRedirects {
throw e;
}));
}),
concatAll(), first((s: any) => !!s), catchError((e: any, _: any) => {
first((s: any) => !!s), catchError((e: any, _: any) => {
if (e instanceof EmptyError || e.name === 'EmptyError') {
if (this.noLeftoversInUrl(segmentGroup, segments, outlet)) {
return of(new UrlSegmentGroup([], {}));
@ -247,12 +247,11 @@ class ApplyRedirects {
segments: UrlSegment[]): Observable<UrlSegmentGroup> {
if (route.path === '**') {
if (route.loadChildren) {
return defer(
() => this.configLoader.load(ngModule.injector, route)
return this.configLoader.load(ngModule.injector, route)
.pipe(map((cfg: LoadedRouterConfig) => {
route._loadedConfig = cfg;
return new UrlSegmentGroup(segments, {});
})));
}));
}
return of(new UrlSegmentGroup(segments, {}));
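The router change above (dropping `defer` and replacing `map(...)` + `concatAll()` with `concatMap(...)`) hinges on when the projection function runs. A standalone rxjs sketch, where `load` stands in for `expandSegmentAgainstRoute` and the async inner observable stands in for a pending `canLoad` guard:

```ts
import {of, timer} from 'rxjs';
import {concatAll, concatMap, first, map, mapTo} from 'rxjs/operators';

const load = (route: string) => {
  console.log('projecting', route);
  return timer(10).pipe(mapTo(route));  // async, like a canLoad guard that resolves later
};

// map + concatAll: the projection runs for *every* route as soon as the source
// emits, even though first() only ever needs the result for 'a'.
of('a', 'b', 'c').pipe(map(load), concatAll(), first()).subscribe();
// logs: projecting a, projecting b, projecting c

// concatMap: the projection for 'b' and 'c' never runs, because first()
// unsubscribes once the inner observable for 'a' emits.
of('a', 'b', 'c').pipe(concatMap(load), first()).subscribe();
// logs: projecting a
```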

View File

@ -4090,7 +4090,7 @@ describe('Integration', () => {
return of(delayMs).pipe(delay(delayMs), mapTo(true));
}
@NgModule()
@NgModule({imports: [RouterModule.forChild([{path: '', component: BlankCmp}])]})
class LoadedModule {
}
@ -4119,6 +4119,15 @@ describe('Integration', () => {
return false;
}
},
{
provide: 'returnFalseAndNavigate',
useFactory: (router: Router) => () => {
log.push('returnFalseAndNavigate');
router.navigateByUrl('/redirected');
return false;
},
deps: [Router]
},
{
provide: 'returnUrlTree',
useFactory: (router: Router) => () => {
@ -4132,7 +4141,23 @@ describe('Integration', () => {
});
});
it('should wait for higher priority guards to be resolved',
it('should only execute canLoad guards of routes being activated', fakeAsync(() => {
const router = TestBed.inject(Router);
router.resetConfig([
{path: 'lazy', canLoad: ['guard1'], loadChildren: () => of(LoadedModule)},
{path: 'redirected', component: SimpleCmp},
// canLoad should not run for this route because 'lazy' activates first
{path: '', canLoad: ['returnFalseAndNavigate'], loadChildren: () => of(LoadedModule)},
]);
router.navigateByUrl('/lazy');
tick(5);
expect(log.length).toEqual(1);
expect(log).toEqual(['guard1']);
}));
it('should execute canLoad guards',
fakeAsync(inject(
[Router, NgModuleFactoryLoader],
(router: Router, loader: SpyNgModuleFactoryLoader) => {

View File

@ -79,8 +79,12 @@ export function downgradeInjectable(token: any, downgradedModule: string = ''):
validateInjectionKey($injector, downgradedModule, injectorKey, attemptedAction);
try {
const injector: Injector = $injector.get(injectorKey);
return injector.get(token);
} catch (err) {
throw new Error(`Error while ${attemptedAction}: ${err.message || err}`);
}
};
(factory as any)['$inject'] = [$INJECTOR];

View File

@ -48,4 +48,16 @@ describe('downgradeInjectable', () => {
expect(factory(mockNg1Injector)).toEqual('service value');
expect(mockNg2Injector.get).toHaveBeenCalledWith('someToken');
});
it('should mention the injectable\'s name in the error thrown when failing to retrieve injectable',
() => {
const factory = downgradeInjectable('someToken');
expect(factory).toEqual(jasmine.any(Function));
expect((factory as any).$inject).toEqual([$INJECTOR]);
const {mockNg1Injector, mockNg2Injector} = setupMockInjectors();
mockNg2Injector.get.and.throwError('Mock failure');
expect(() => factory(mockNg1Injector))
.toThrowError(/^Error while instantiating injectable 'someToken': Mock failure/);
});
});
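The spec above pins down the new error message produced by `downgradeInjectable`. The underlying pattern is small enough to sketch on its own; `withContext` is a hypothetical helper for illustration, not part of `@angular/upgrade`:

```ts
// Re-throw with context while preserving the original message, mirroring the
// try/catch added around injector.get(token) in downgradeInjectable's factory.
function withContext<T>(action: string, fn: () => T): T {
  try {
    return fn();
  } catch (err) {
    throw new Error(`Error while ${action}: ${(err as Error).message || err}`);
  }
}

try {
  withContext(`instantiating injectable 'someToken'`, () => {
    throw new Error('Mock failure');
  });
} catch (e) {
  console.error((e as Error).message);
  // -> Error while instantiating injectable 'someToken': Mock failure
}
```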

View File

@ -24,6 +24,7 @@ def js_expected_symbol_test(name, src, golden, data = [], **kwargs):
name = name,
data = all_data,
entry_point = entry_point,
tags = kwargs.pop("tags", []) + ["symbol_extractor"],
templated_args = ["$(rootpath %s)" % src, "$(rootpath %s)" % golden],
configuration_env_vars = ["angular_ivy_enabled"],
**kwargs

View File

@ -0,0 +1,68 @@
/**
* @license
* Copyright Google LLC All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
// TODO(josephperrott): migrate golden testing to ng-dev toolset
const {spawnSync} = require('child_process');
const minimist = require('minimist');
const path = require('path');
// Remove all command line flags from the arguments.
const argv = minimist(process.argv.slice(2));
// The command the user would like to run, either 'accept' or 'test'
const USER_COMMAND = argv._[0];
// Bazel targets for testing goldens, discovered by querying for all
// nodejs_test targets tagged with "symbol_extractor".
process.stdout.write('Gathering all symbol extractor targets');
const ALL_TEST_TARGETS =
spawnSync(
'yarn',
[
'-s', 'bazel', 'query', '--output', 'label',
`kind(nodejs_test, ...) intersect attr("tags", "symbol_extractor", ...)`
],
{encoding: 'utf8', shell: true, cwd: path.resolve(__dirname, '../..')})
.stdout.trim()
.split('\n')
.map(line => line.trim());
process.stdout.clearLine();
process.stdout.cursorTo(0);
// Bazel targets for generating goldens
const ALL_ACCEPT_TARGETS = ALL_TEST_TARGETS.map(test => `${test}.accept`);
/** Run the provided bazel command on each provided target individually. */
function runBazelCommandOnTargets(command, targets, present) {
for (const target of targets) {
process.stdout.write(`${present}: ${target}`);
const commandResult =
spawnSync('yarn', ['-s', 'bazel', command, '--config=ivy', target], {encoding: 'utf8'});
process.stdout.clearLine();
process.stdout.cursorTo(0);
if (commandResult.status) {
console.error(`Failed ${command}: ${target}`);
console.group();
console.error(commandResult.stdout || commandResult.stderr);
console.groupEnd();
} else {
console.info(`Successful ${command}: ${target}`);
}
}
}
switch (USER_COMMAND) {
case 'accept':
runBazelCommandOnTargets('run', ALL_ACCEPT_TARGETS, 'Running');
break;
case 'test':
runBazelCommandOnTargets('test', ALL_TEST_TARGETS, 'Testing');
break;
default:
console.warn('Invalid command provided.');
console.warn();
console.warn(`Run this script with either "accept" or "test"`);
break;
}