Compare commits


46 Commits
9.0.7 ... 4.1.3

Author SHA1 Message Date
7edc5e96f3 docs: add changelog for 4.1.3 2017-05-17 15:44:25 -07:00
7b802faa96 release: cut the 4.1.3 release 2017-05-17 15:42:14 -07:00
6f039d7a3c build: do not create the bundles when updating the public API
- 1.7x faster on my machine (2.7 vs 4.6 min)
- should solve the memory issue
2017-05-17 15:12:12 -07:00
0a82f7d69f fix(platform-server): wait for async app initializers to complete before removing server side styles (#16712)
This fixes a flicker when transitioning from a server-rendered page to a client-rendered page in lazy-loaded routes by waiting for the lazy-loaded route to finish loading, assuming initialNavigation is set to 'enabled' for the route.

Fixes #15716
2017-05-17 15:12:08 -07:00
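A hedged sketch of the scenario described above (the route path, lazy module path, and module names are placeholders):

```ts
import { NgModule } from '@angular/core';
import { RouterModule, Routes } from '@angular/router';

// Hypothetical lazy route used only for illustration.
const routes: Routes = [
  { path: 'lazy', loadChildren: './lazy/lazy.module#LazyModule' },
];

@NgModule({
  imports: [
    // With initialNavigation set to 'enabled', the initial navigation (and thus
    // the lazy-route load) participates in app initialization, so the fix above
    // keeps the server-side styles in place until it completes.
    RouterModule.forRoot(routes, { initialNavigation: 'enabled' }),
  ],
  exports: [RouterModule],
})
export class AppRoutingModule {}
```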
b7f85813d5 fix(compiler-cli): import routing module with forRoot (#16438) 2017-05-17 15:10:59 -07:00
a5bdbed306 fix: add typescript 2.3.2 typings test (#16738)
Closes #16663
2017-05-17 15:10:59 -07:00
883ca285de fix(router): Wrap Promise-like instances in native Promises (#16759)
Hybrid apps (a mix of Angular and AngularJS) might return an AngularJS implementation
of Promises that does not play well with change detection. Wrapping them in
native Promises fixes this issue.

This could be the case when a Resolver returns a `$q` promise.
2017-05-17 15:10:58 -07:00
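A minimal sketch of the wrapping idea described above (not the router's actual internals); `maybeThenable` stands in for whatever a resolver returns:

```ts
// Promise.resolve adopts any thenable (e.g. an AngularJS `$q` promise) and
// returns a native Promise, which plays well with zone-based change detection.
function wrapIntoNativePromise<T>(maybeThenable: T | PromiseLike<T>): Promise<T> {
  return Promise.resolve(maybeThenable);
}
```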
afb7540067 fix(upgrade): Prevent renaming of $inject property (#16706)
Use bracket notation to access $inject in downgradeInjectable so that it
survives property renaming. Since the return type is any,
Closure Compiler renames $inject.
2017-05-17 15:10:58 -07:00
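A generic illustration of the Closure-safe pattern described above; the factory and token names are placeholders, not Angular's actual implementation:

```ts
// Placeholder factory, similar in shape to what a downgraded injectable uses.
const factory: any = (injector: any) => injector.get('SomeService');

// Dot access (factory.$inject = [...]) can be renamed by Closure Compiler when
// the value is typed `any`; bracket notation with a string literal is never
// renamed, so AngularJS can still read the annotation at runtime.
factory['$inject'] = ['$injector'];
```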
47df3d681f fix(upgrade): use quotes to prevent Closure Compiler from obfuscating $event. (#16724)
This is critical for AngularJS to get the $event object in the template.
2017-05-17 15:10:57 -07:00
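A generic illustration of the quoting technique described above; the payload is made up. Closure Compiler may rename unquoted property keys but leaves string-literal keys untouched, so AngularJS templates can still reference `$event`:

```ts
// Survives Closure property renaming because the key is a string literal.
const locals = { '$event': { detail: 42 } };

// By contrast, an unquoted key could be renamed (e.g. to `a`) in an optimized build:
// const locals = { $event: { detail: 42 } };
```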
cd7fe31fbc docs: add saved replies (#16726) 2017-05-17 15:10:56 -07:00
4203b1bb1f docs(common): remove h1s from API docs
Only one h1 is allowed per document, and this is provided by the template.
So we cannot have any h1 tags (or `#` markdown shorthand) in any API docs.

Closes #16193
2017-05-17 15:10:56 -07:00
055db5ad6a docs(router): Change CanDeactivate to CanLoad (#16237)
Fix a mistake in the docs: CanDeactivate should be CanLoad.
2017-05-17 15:10:55 -07:00
62a8618536 docs: add changelog for 4.1.2 2017-05-10 15:44:28 -07:00
419fe0ca0d release: cut the 4.1.2 release 2017-05-10 15:42:11 -07:00
e46a65f8fd build: update key for pushing to PACKAGE-builds repos (#16667) 2017-05-09 17:34:35 -07:00
f0e1043a1d docs(*): fix dangling links in API docs (#16632)
* docs(animations): fix links to `Component` animations

* docs(core): fix links to `ReflectiveInjector` methods

The `resolve` and other methods were moved from the
`Injector` to the `ReflectiveInjector`.

* docs(core): fix links to `Renderer`

The local links were assuming that the methods were on the
current document (e.g. `RootRenderer`), but they are actually
on the `Renderer` class.

* docs(router): fix links to methods

* docs(forms): fix links to methods

* docs(core): fix links to methods

* docs(router): fix API page links and an internal link
2017-05-09 17:34:35 -07:00
c1938f5af1 docs(core): remove link to non-existent class in InjectableDecorator (#16653)
Closes #16640
2017-05-09 17:34:34 -07:00
915eae50b4 fix(router): fix redirect to a URL with a param having multiple values (#16376)
fixes #16310

PR Close #16376
2017-05-09 17:34:33 -07:00
68675e625a test(router): simplify redirect tests (#16376) 2017-05-09 17:34:32 -07:00
d0e1688514 fix(compiler): avoid a ...null spread in extraction (#16547)
This code only runs in ES5 mode in the test suite, so this is difficult to test.
However, `updateFromTemplate` is being called with a spread operator, as
`...updateFromTemplate(...)`, and the spread operator fails on `null` values.
This change avoids the problem by always returning a (possibly empty) array.

PR Close #16547
2017-05-09 17:34:31 -07:00
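A small sketch of the failure mode and the fix described above; the helper name is hypothetical:

```ts
// Spreading `null` throws at runtime, so the helper always returns an array
// (possibly empty) instead of `null`.
function updates(source: string | null): string[] {
  return source ? [source] : [];
}

const messages = ['first', ...updates(null)]; // safe: spreads an empty array
```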
ee6705af49 fix(core): detach projected views when a parent view is destroyed (#16592)
This also clarifies via a test that we only update projected views when the view
is created or destroyed, but not when it is attached/detached/moved.

Fixes #15578

PR Close #16592
2017-05-09 17:34:31 -07:00
9218812011 fix(core): projected views should be dirty checked when the declaring component is dirty checked. (#16592)
Previously a projected view was only dirty checked when the
component in which it was inserted was dirty checked.

This fix changes the behavior so that a view is also dirty checked if
the declaring component is dirty checked.

Note: This does not change the order of change detection,
only whether a projected view is dirty checked at all.

Fixes #14321
2017-05-09 17:34:30 -07:00
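A rough illustration of a "projected view" as used above (component and binding names are made up): the template is declared in `DeclaringComp` but inserted inside `HostComp`, and with this fix it is also dirty checked whenever `DeclaringComp` is dirty checked:

```ts
import { Component, ContentChild, TemplateRef } from '@angular/core';

@Component({
  selector: 'host-comp',
  // Inserts whatever template was projected into this component.
  template: `<ng-container *ngTemplateOutlet="content"></ng-container>`,
})
export class HostComp {
  @ContentChild(TemplateRef) content: TemplateRef<any>;
}

@Component({
  selector: 'declaring-comp',
  template: `
    <host-comp>
      <!-- Declared here, rendered inside <host-comp>. -->
      <ng-template>{{ counter }}</ng-template>
    </host-comp>
  `,
})
export class DeclaringComp {
  counter = 0;
}
```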
ec77bf9fb0 docs(core): EventEmitter docs for isAsync defaults (#15780)
- solves #15758
2017-05-09 17:34:29 -07:00
fee5b2ad56 docs(animations): remove duplicate word (#16508) 2017-05-09 17:34:24 -07:00
9c70a3cfb1 fix(http): flatten metadata for @angular/http/testing
@angular/http/testing used to publish a metadata structure which paralleled
the .d.ts structure. This causes ngc to write incorrect imports for this
bundle when compiling providers using MockBackend and other http testing
classes.

This change restructures the @angular/http/testing build a bit, modeling it
after @angular/platform-browser-animations, and produces a FESM structure
that has flat metadata.

Fixes #15521.
2017-05-09 17:33:59 -07:00
ec3b6e9603 fix(http): introduce encodingHint for text() for better ArrayBuffer support
Currently, if a Response has an ArrayBuffer body and text() is called, Angular
attempts to convert the ArrayBuffer to a string. Doing this requires knowing
the encoding of the bytes in the buffer, which is context that we don't have.

Instead, we assume that the buffer is encoded in UTF-16, and attempt to process
it that way. Unfortunately, the approach chosen (interpret the buffer as a Uint16Array and
create a JavaScript string from each entry using String.fromCharCode) is incorrect,
as it does not handle UTF-16 surrogate pairs. What Angular actually implements, then,
is UCS-2 decoding, which is equivalent to UTF-16 with characters restricted to the
base plane.

No standard way of decoding UTF-8 or UTF-16 exists in the browser today. APIs like
TextDecoder are only supported in a few browsers, and although hacks like using the
FileReader API with a Blob to force browsers to do content encoding detection and
decoding exist, they're slow and not compatible with the synchronous text() API.

Thus, this bug is fixed by introducing an encodingHint parameter to text(). The
default value of this parameter is 'legacy', indicating that the existing broken
behavior should be used - this prevents breaking existing apps. The only other
possible value of the hint is 'iso-8859' which interprets each byte of the buffer
with String.fromCharCode. UTF-8 and UTF-16 are not supported - it is up to the
consumer to get the ArrayBuffer and decode it themselves.

The parameter is a hint, as it's not always used (for example, if the conversion
to text doesn't involve an ArrayBuffer source). Additionally, this leaves the door
open for future implementations to perform more sophisticated encoding detection
and ignore the user-provided value if it can be proven to be incorrect.

Fixes #15932.

PR Close #16420
2017-05-09 17:33:59 -07:00
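A hedged usage sketch based on the description above (`Http`/`ResponseContentType` are from the legacy `@angular/http` package of that era, and the URL is a placeholder):

```ts
import { Http, ResponseContentType } from '@angular/http';

export function loadLatin1Text(http: Http, onText: (text: string) => void): void {
  http
    // Ask for the body as an ArrayBuffer.
    .get('/assets/legacy-latin1.txt', { responseType: ResponseContentType.ArrayBuffer })
    .subscribe(res => {
      // 'iso-8859' interprets each byte with String.fromCharCode; omitting the
      // hint keeps the old ('legacy') behavior.
      onText(res.text('iso-8859'));
    });
}
```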
63066f7697 fix(http): honor RequestArgs.search and RequestArgs.params map type
Currently `new Request({search: ...})` is not honored, and
`new Request({params: {'x': 'y'}})` doesn't work either, as this object would have
toString() called on it. This change allows both of these cases to work, as proved by
the 2 new tests.

Fixes #15761

PR Close #16392
2017-05-09 17:33:58 -07:00
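A hedged sketch of the two cases the commit above says now work; the URL and parameter names are placeholders:

```ts
import { Request, RequestMethod } from '@angular/http';

// Query string provided via `search`.
const viaSearch = new Request({
  method: RequestMethod.Get,
  url: '/api/items',
  search: 'x=y',
});

// Query string provided via a plain `params` object map.
const viaParamsMap = new Request({
  method: RequestMethod.Get,
  url: '/api/items',
  params: { x: 'y' },
});
```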
5f6d0f2340 revert: fix: public API golden files (#16414)
This reverts commit dcaa11a88b.
2017-05-04 15:00:09 -07:00
4a0e93b03b revert: test(compiler-cli): add test for missingTranslation parameter
This reverts commit 109229246c.
2017-05-04 14:51:30 -07:00
1f5dce2128 docs: add changelog for 4.1.1 2017-05-04 14:22:43 -07:00
14e7e43ad8 release: cut the 4.1.1 release 2017-05-04 14:18:24 -07:00
54d4b893fd test(compiler-cli): add test for missingTranslation parameter 2017-05-04 14:05:17 -07:00
4670cf51cc test: cleanup rxjs custom build
The latest rxjs release works with closure compiler out of the box.
We no longer need to compile our own.

Also put closure options into a file rather than using a shell script.
2017-05-04 14:05:16 -07:00
dd4e501999 fix(upgrade): initialize all inputs in time for ngOnChanges()
Previously, non-bracketed inputs (e.g. `xyz="foo"`) on downgraded components
were initialized using `attrs.$observe()` (which uses `$evalAsync()` under the
hood), while bracketed inputs (e.g. `[xyz]="'foo'"`) were initialized using
`$watch()`. If the downgraded component was created during a `$digest` (e.g. by
an `ng-if` watcher), the non-bracketed inputs were not initialized in time for
the initial call to `ngOnChanges()` and `ngOnInit()`.

This commit fixes it by using `$watch()` to initialize all inputs. `$observe()`
is still used for subsequent updates on non-bracketed inputs, because it is more
performant.

Fixes #16212
2017-05-04 14:05:16 -07:00
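A loose sketch (with AngularJS-style `scope`/`attrs` objects typed as `any`) of the initialization strategy described above; the helper name is illustrative, not the actual upgrade code:

```ts
function bindAttributeInput(scope: any, attrs: any, name: string, apply: (value: any) => void): void {
  // $watch delivers the first value within the current $digest, so the input is
  // set in time for the initial ngOnChanges()/ngOnInit() call.
  const unwatch = scope.$watch(() => attrs[name], (value: any) => {
    apply(value);
    unwatch();
    // Subsequent updates use $observe, which is cheaper for interpolated attributes.
    attrs.$observe(name, apply);
  });
}
```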
9124994849 build: update concurrently to latest version
The regression in `concurrently` v3.2.0 (which made us roll back to v3.1.0
in #14378) has been fixed in v3.3.0 (see kimmobrunfeldt/concurrently#89).
2017-05-04 14:05:16 -07:00
4fbc61469f docs: fix links in api docs 2017-05-04 14:05:16 -07:00
8a883f24f6 refactor(compiler): simplify AOT tests 2017-05-04 14:05:15 -07:00
07cef367ac fix(core): don’t stop change detection because of errors
- prevents unsubscribing from the zone on error
- prevents unsubscribing from directive `EventEmitter`s on error
- prevents detaching views in dev mode if there is an error
- ensures that `ngOnInit` is only called once (also in prod mode)

Fixes #9531
Fixes #2413
Fixes #15925
2017-05-04 14:05:15 -07:00
c060110695 fix(language-service): remove asserts for non-null expressions (#16422)
Reworked some of the code so asserts are no longer necessary.
Added additional and potentially redundant checks.
Added checks where the null checker found real problems.

PR Close #16422
2017-05-04 14:05:15 -07:00
93ff3166ab docs(common): fix API docs for NgComponentOutlet (#16411)
fixes #16373

PR Close #16411
2017-05-04 14:05:14 -07:00
85a1b54c6e fix(core): don’t set ng-version for dynamically created components (#16394)
Angular uses the `ng-version` attribute to indicate which elements
were used to bootstrap an application. However, after 4.0 we also
added this attribute for all dynamically created components.

Fixes #15880

PR Close #16394
2017-05-04 14:05:14 -07:00
acf83b90bc fix(core): allow to detach OnPush components (#16394)
Fixes #9720
2017-05-04 14:05:14 -07:00
f66e59ebe4 fix(core): allow directives to inject the component’s ChangeDetectorRef. (#16394)
When a directive lives on the same element as a component
(e.g. `<my-comp myDir>`), the directive was not able to get hold
of the `ChangeDetectorRef` of the component on that element. However,
as directives are supposed to decorate components, this is incorrect.

This commit enables this use case.

Closes #12816
2017-05-04 14:05:14 -07:00
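A short sketch of the use case enabled above; the selector and method name are placeholders:

```ts
import { ChangeDetectorRef, Directive } from '@angular/core';

@Directive({ selector: '[myDir]' })
export class MyDirDirective {
  // When the directive sits on a component's host element (e.g. `<my-comp myDir>`),
  // the injected ChangeDetectorRef now refers to that component's change detector.
  constructor(private cdr: ChangeDetectorRef) {}

  markHostForCheck(): void {
    this.cdr.markForCheck();
  }
}
```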
d932e724ab ci(language-service): update ci tests to official 2.3 build (#16415)
PR Close #16415
2017-05-04 14:05:13 -07:00
dcaa11a88b fix: public API golden files (#16414) 2017-05-04 14:05:13 -07:00
427d63a422 fix: strictNullCheck support. (#16389)
Fix #16357

Workaround for https://github.com/Microsoft/TypeScript/issues/10078

Closes #16389

PR Close #16389
2017-05-04 14:05:13 -07:00
9636 changed files with 176969 additions and 986913 deletions


@ -1,97 +0,0 @@
# Bazel does not yet support wildcards or other .gitignore semantics for
# .bazelignore. Two issues for this feature request are outstanding:
# https://github.com/bazelbuild/bazel/issues/7093
# https://github.com/bazelbuild/bazel/issues/8106
.git
node_modules
dist
aio/content
aio/node_modules
aio/tools/examples/shared/node_modules
packages/bazel/node_modules
integration/bazel/bazel-bazel
integration/bazel/bazel-bin
integration/bazel/bazel-out
integration/bazel/bazel-testlogs
integration/bazel-schematics/demo
# All integration test node_modules folders
integration/bazel/node_modules
integration/bazel-schematics/node_modules
integration/cli-hello-world/node_modules
integration/cli-hello-world-ivy-compat/node_modules
integration/cli-hello-world-ivy-i18n/node_modules
integration/cli-hello-world-ivy-minimal/node_modules
integration/cli-hello-world-lazy/node_modules
integration/cli-hello-world-lazy-rollup/node_modules
integration/dynamic-compiler/node_modules
integration/hello_world__closure/node_modules
integration/hello_world__systemjs_umd/node_modules
integration/i18n/node_modules
integration/injectable-def/node_modules
integration/ivy-i18n/node_modules
integration/language_service_plugin/node_modules
integration/ng_elements/node_modules
integration/ng_elements_schematics/node_modules
integration/ng_update/node_modules
integration/ng_update_migrations/node_modules
integration/ngcc/node_modules
integration/platform-server/node_modules
integration/service-worker-schema/node_modules
integration/side-effects/node_modules
integration/terser/node_modules
integration/typings_test_ts36/node_modules
integration/typings_test_ts37/node_modules
# All integration test .yarn_local_cache folders
integration/bazel/.yarn_local_cache
integration/bazel-schematics/.yarn_local_cache
integration/cli-hello-world/.yarn_local_cache
integration/cli-hello-world-ivy-compat/.yarn_local_cache
integration/cli-hello-world-ivy-i18n/.yarn_local_cache
integration/cli-hello-world-ivy-minimal/.yarn_local_cache
integration/cli-hello-world-lazy/.yarn_local_cache
integration/cli-hello-world-lazy-rollup/.yarn_local_cache
integration/dynamic-compiler/.yarn_local_cache
integration/hello_world__closure/.yarn_local_cache
integration/hello_world__systemjs_umd/.yarn_local_cache
integration/i18n/.yarn_local_cache
integration/injectable-def/.yarn_local_cache
integration/ivy-i18n/.yarn_local_cache
integration/language_service_plugin/.yarn_local_cache
integration/ng_elements/.yarn_local_cache
integration/ng_elements_schematics/.yarn_local_cache
integration/ng_update/.yarn_local_cache
integration/ng_update_migrations/.yarn_local_cache
integration/ngcc/.yarn_local_cache
integration/platform-server/.yarn_local_cache
integration/service-worker-schema/.yarn_local_cache
integration/side-effects/.yarn_local_cache
integration/terser/.yarn_local_cache
integration/typings_test_ts36/.yarn_local_cache
integration/typings_test_ts37/.yarn_local_cache
# All integration test NPM_PACKAGE_MANIFEST.json folders
integration/bazel/NPM_PACKAGE_MANIFEST.json
integration/bazel-schematics/NPM_PACKAGE_MANIFEST.json
integration/cli-hello-world/NPM_PACKAGE_MANIFEST.json
integration/cli-hello-world-ivy-compat/NPM_PACKAGE_MANIFEST.json
integration/cli-hello-world-ivy-i18n/NPM_PACKAGE_MANIFEST.json
integration/cli-hello-world-ivy-minimal/NPM_PACKAGE_MANIFEST.json
integration/cli-hello-world-lazy/NPM_PACKAGE_MANIFEST.json
integration/cli-hello-world-lazy-rollup/NPM_PACKAGE_MANIFEST.json
integration/dynamic-compiler/NPM_PACKAGE_MANIFEST.json
integration/hello_world__closure/NPM_PACKAGE_MANIFEST.json
integration/hello_world__systemjs_umd/NPM_PACKAGE_MANIFEST.json
integration/i18n/NPM_PACKAGE_MANIFEST.json
integration/injectable-def/NPM_PACKAGE_MANIFEST.json
integration/ivy-i18n/NPM_PACKAGE_MANIFEST.json
integration/language_service_plugin/NPM_PACKAGE_MANIFEST.json
integration/ng_elements/NPM_PACKAGE_MANIFEST.json
integration/ng_elements_schematics/NPM_PACKAGE_MANIFEST.json
integration/ng_update/NPM_PACKAGE_MANIFEST.json
integration/ng_update_migrations/NPM_PACKAGE_MANIFEST.json
integration/ngcc/NPM_PACKAGE_MANIFEST.json
integration/platform-server/NPM_PACKAGE_MANIFEST.json
integration/service-worker-schema/NPM_PACKAGE_MANIFEST.json
integration/side-effects/NPM_PACKAGE_MANIFEST.json
integration/terser/NPM_PACKAGE_MANIFEST.json
integration/typings_test_ts36/NPM_PACKAGE_MANIFEST.json
integration/typings_test_ts37/NPM_PACKAGE_MANIFEST.json

.bazelrc

@ -1,150 +0,0 @@
# Enable debugging tests with --config=debug
test:debug --test_arg=--node_options=--inspect-brk --test_output=streamed --test_strategy=exclusive --test_timeout=9999 --nocache_test_results
###############################
# Filesystem interactions #
###############################
# Create symlinks in the project:
# - dist/bin for outputs
# - dist/testlogs, dist/genfiles
# - bazel-out
# NB: bazel-out should be excluded from the editor configuration.
# The checked-in /.vscode/settings.json does this for VSCode.
# Other editors may require manual config to ignore this directory.
# In the past, we saw a problem where VSCode traversed a massive tree, opening file handles and
# eventually hitting a surprising failure with auto-discovery of the C++ toolchain on
# macOS High Sierra.
# See https://github.com/bazelbuild/bazel/issues/4603
build --symlink_prefix=dist/
# Turn off legacy external runfiles
build --nolegacy_external_runfiles
run --nolegacy_external_runfiles
test --nolegacy_external_runfiles
# Turn on --incompatible_strict_action_env which was on by default
# in Bazel 0.21.0 but turned off again in 0.22.0. Follow
# https://github.com/bazelbuild/bazel/issues/7026 for more details.
# This flag is needed so that the bazel cache is not invalidated
# when running bazel via `yarn bazel`.
# See https://github.com/angular/angular/issues/27514.
build --incompatible_strict_action_env
run --incompatible_strict_action_env
test --incompatible_strict_action_env
###############################
# Release support #
# Turn on these settings with #
# --config=release #
###############################
# Releases should always be stamped with version control info
# This command assumes node on the path and is a workaround for
# https://github.com/bazelbuild/bazel/issues/4802
build:release --workspace_status_command="node ./tools/bazel_stamp_vars.js"
build:release --stamp
###############################
# Output #
###############################
# A more useful default output mode for bazel query
# Prints eg. "ng_module rule //foo:bar" rather than just "//foo:bar"
query --output=label_kind
# By default, failing tests don't print any output, it goes to the log file
test --test_output=errors
################################
# Settings for CircleCI #
################################
# Bazel flags for CircleCI are in /.circleci/bazel.linux.rc and /.circleci/bazel.windows.rc
##################################
# Settings for integration tests #
##################################
# Trick bazel into treating BUILD files under integration/bazel as being regular files
# This lets us glob() up all the files inside this integration test to make them inputs to tests
# (Note, we cannot use common --deleted_packages because the bazel version command doesn't support it)
build --deleted_packages=integration/bazel,integration/bazel/src,integration/bazel/src/hello-world,integration/bazel/test,integration/bazel/test/e2e
query --deleted_packages=integration/bazel,integration/bazel/src,integration/bazel/src/hello-world,integration/bazel/test,integration/bazel/test/e2e
################################
# Temporary Settings for Ivy #
################################
# To determine if the compiler used should be Ivy instead of ViewEngine, one can use `--config=ivy`
# on any bazel target. This is a temporary flag until codebase is permanently switched to Ivy.
build --define=angular_ivy_enabled=False
build:view-engine --define=angular_ivy_enabled=False
build:ivy --define=angular_ivy_enabled=True
##################################
# Remote Build Execution support #
# Turn on these settings with #
# --config=remote #
##################################
# The following --define=EXECUTOR=remote will be able to be removed
# once https://github.com/bazelbuild/bazel/issues/7254 is fixed
build:remote --define=EXECUTOR=remote
# Set a higher timeout value, just in case.
build:remote --remote_timeout=600
# Increase the default number of jobs by 50% because our build has lots of
# parallelism
build:remote --jobs=150
build:remote --google_default_credentials
# Force remote executions to consider the entire run as linux
build:remote --cpu=k8
build:remote --host_cpu=k8
# Toolchain and platform related flags
build:remote --host_javabase=@rbe_ubuntu1604_angular//java:jdk
build:remote --javabase=@rbe_ubuntu1604_angular//java:jdk
build:remote --host_java_toolchain=@bazel_tools//tools/jdk:toolchain_hostjdk8
build:remote --java_toolchain=@bazel_tools//tools/jdk:toolchain_hostjdk8
build:remote --crosstool_top=@rbe_ubuntu1604_angular//cc:toolchain
build:remote --extra_toolchains=@rbe_ubuntu1604_angular//config:cc-toolchain
build:remote --extra_execution_platforms=//tools:rbe_ubuntu1604-angular
build:remote --host_platform=//tools:rbe_ubuntu1604-angular
build:remote --platforms=//tools:rbe_ubuntu1604-angular
# Remote instance and caching
build:remote --remote_instance_name=projects/internal-200822/instances/default_instance
build:remote --project_id=internal-200822
build:remote --remote_cache=remotebuildexecution.googleapis.com
build:remote --remote_executor=remotebuildexecution.googleapis.com
##################################
# Saucelabs tests settings #
# Turn on these settings with #
# --config=saucelabs #
##################################
# For saucelabs tests we don't want to enable flaky test attempts. Karma has its own integrated
# retry mechanism and we do not want to retry unnecessarily if Karma already tried multiple times.
test:saucelabs --flaky_test_attempts=1
###############################
# NodeJS rules settings
# These settings are required for rules_nodejs
###############################
# Turn on managed directories feature in Bazel
# This allows us to avoid installing a second copy of node_modules
common --experimental_allow_incremental_repository_updates
####################################################
# User bazel configuration
# NOTE: This needs to be the *last* entry in the config.
####################################################
# Load any settings which are specific to the current user. Needs to be *last* statement
# in this config, as the user configuration should be able to overwrite flags from this file.
try-import .bazelrc.user

.bowerrc

@ -0,0 +1,3 @@
{
"directory" : "bower_components"
}


@ -1,19 +0,0 @@
# Encryption
Based on https://github.com/circleci/encrypted-files
In the CircleCI web UI, we have a secret variable called `KEY`
https://circleci.com/gh/angular/angular/edit#env-vars
which is only exposed to non-fork builds
(see "Pass secrets to builds from forked pull requests" under
https://circleci.com/gh/angular/angular/edit#advanced-settings)
We use this as a symmetric AES encryption key to encrypt tokens like
a GitHub token that enables publishing snapshots.
To create the github_token file, we take this approach:
- Find the angular-builds:token in http://valentine
- Go inside the CircleCI default docker image so you use the same version of openssl as we will at runtime: `docker run --rm -it circleci/node:10.12`
- echo "https://[token]:@github.com" > credentials
- openssl aes-256-cbc -e -in credentials -out .circleci/github_token -k $KEY
- If needed, base64-encode the result so you can copy-paste it out of docker: `base64 github_token`


@ -1,15 +0,0 @@
# Settings in this file should be OS agnostic. Use the bazel.<OS>.rc files for OS specific settings.
# Don't be spammy in the logs
build --noshow_progress
# Print all the options that apply to the build.
# This helps us diagnose which options override others
# (e.g. /etc/bazel.bazelrc vs. tools/bazel.rc)
build --announce_rc
# Retry in the event of flakes, eg. https://circleci.com/gh/angular/angular/31309
test --flaky_test_attempts=2
# More details on failures
build --verbose_failures=true


@ -1,21 +0,0 @@
# These options are enabled when running on CI
# We do this by copying this file to /etc/bazel.bazelrc at the start of the build.
# See documentation in /docs/BAZEL.md
# Import config items common to both Linux and Windows setups.
# https://docs.bazel.build/versions/master/guide.html#bazelrc-syntax-and-semantics
try-import %workspace%/.circleci/bazel.common.rc
# Save downloaded repositories in a location that can be cached by CircleCI. This helps us
# speed up the analysis time significantly with Bazel-managed node dependencies on CI.
build --repository_cache=/home/circleci/bazel_repository_cache
# Workaround https://github.com/bazelbuild/bazel/issues/3645
# Bazel doesn't calculate the memory ceiling correctly when running under Docker.
# Limit Bazel to consuming resources that fit in CircleCI "xlarge" class
# https://circleci.com/docs/2.0/configuration-reference/#resource_class
build --local_resources=14336,8.0,1.0
# All builds executed remotely should be done using our RBE configuration.
build:remote --google_default_credentials
build --config=remote


@ -1,17 +0,0 @@
# These options are enabled when running on CI
# We do this by copying this file to $env:ProgramData\bazel.bazelrc at the start of the build.
# See documentation in /docs/BAZEL.md
# Import config items common to both Linux and Windows setups.
# https://docs.bazel.build/versions/master/guide.html#bazelrc-syntax-and-semantics
try-import %workspace%/.circleci/bazel.common.rc
# Save downloaded repositories in a location that can be cached by CircleCI. This helps us
# speed up the analysis time significantly with Bazel-managed node dependencies on CI.
build --repository_cache=C:/Users/circleci/bazel_repository_cache
# All windows jobs run on master and should use http caching
build --remote_http_cache=https://storage.googleapis.com/angular-team-cache
build --remote_accept_cached=true
build --remote_upload_local_results=true
build --google_default_credentials


@ -1,905 +0,0 @@
# Configuration file for https://circleci.com/gh/angular/angular
# Note: YAML anchors allow an object to be re-used, reducing duplication.
# The ampersand declares an alias for an object, then later the `<<: *name`
# syntax dereferences it.
# See http://blog.daemonl.com/2016/02/yaml.html
# To validate changes, use an online parser, eg.
# http://yaml-online-parser.appspot.com/
# CircleCI configuration version
# Version 2.1 allows for extra config reuse features
# https://circleci.com/docs/2.0/reusing-config/#getting-started-with-config-reuse
version: 2.1
# We don't want to include the current branch name in the cache key because that would prevent
# PRs from being able to restore the cache since the branch names are always different for PRs.
# The cache key should only consist of dynamic values that change whenever something in the
# cache changes. For example:
# 1) yarn lock file changes --> cached "node_modules" are different.
# 2) bazel repository definitions change --> cached bazel repositories are different.
# Windows needs its own cache key because binaries in node_modules are different.
# **NOTE 1**: If you change the cache key prefix, also sync the cache_key_fallback to match.
# **NOTE 2**: Keep the static part of the cache key as prefix to enable correct fallbacks.
# See https://circleci.com/docs/2.0/caching/#restoring-cache for how prefixes work in CircleCI.
var_3: &cache_key v4-angular-node-12-{{ checksum "yarn.lock" }}-{{ checksum "WORKSPACE" }}-{{ checksum "packages/bazel/package.bzl" }}-{{ checksum "aio/yarn.lock" }}
var_4: &cache_key_fallback v4-angular-node-12-
var_3_win: &cache_key_win v5-angular-win-node-12-{{ checksum "yarn.lock" }}-{{ checksum "WORKSPACE" }}-{{ checksum "packages/bazel/package.bzl" }}-{{ checksum "aio/yarn.lock" }}
var_4_win: &cache_key_win_fallback v5-angular-win-node-12-
# Cache key for the `components-repo-unit-tests` job. **Note** when updating the SHA in the
# cache keys also update the SHA for the "COMPONENTS_REPO_COMMIT" environment variable.
var_5: &components_repo_unit_tests_cache_key v5-angular-components-598db096e668aa7e9debd56eedfd127b7a55e371
var_6: &components_repo_unit_tests_cache_key_fallback v5-angular-components-
# Workspace initially persisted by the `setup` job, and then enhanced by `build-npm-packages` and
# `build-ivy-npm-packages`.
# https://circleci.com/docs/2.0/workflows/#using-workspaces-to-share-data-among-jobs
# https://circleci.com/blog/deep-diving-into-circleci-workspaces/
var_7: &workspace_location ~/
# Filter to run a job on builds for pull requests only.
var_8: &only_on_pull_requests
filters:
branches:
only:
- /pull\/\d+/
# Filter to skip a job on builds for pull requests.
var_9: &skip_on_pull_requests
filters:
branches:
ignore:
- /pull\/\d+/
# Filter to run a job on builds for the master branch only.
var_10: &only_on_master
filters:
branches:
only:
- master
# Executor Definitions
# https://circleci.com/docs/2.0/reusing-config/#authoring-reusable-executors
# **NOTE 1**: Pin to exact images using an ID (SHA). See https://circleci.com/docs/2.0/circleci-images/#using-a-docker-image-id-to-pin-an-image-to-a-fixed-version.
# (Using the tag is not necessary when pinning by ID, but include it anyway for documentation purposes.)
# **NOTE 2**: If you change the version of the docker images, also change the `cache_key` suffix.
# **NOTE 3**: If you change the version of the `*-browsers` docker image, make sure the
# `--versions.chrome` arg in `integration/bazel-schematics/test.sh` specifies a
# ChromeDriver version that is compatible with the Chrome version in the image.
executors:
default-executor:
parameters:
resource_class:
type: string
default: medium
docker:
- image: circleci/node:12.14.1@sha256:f9de24fc0017059cc42ef7d07db060008af65a98b1f0cdd1ef3339213226bf6d
resource_class: << parameters.resource_class >>
working_directory: ~/ng
windows-executor:
working_directory: ~/ng
resource_class: windows.medium
# CircleCI windows VMs do have the GitBash shell available:
# https://github.com/CircleCI-Public/windows-preview-docs#shells
# But in this specific case we really should not use it because Bazel must not be run from
# GitBash. These issues discuss why:
# https://github.com/bazelbuild/bazel/issues/5751
# https://github.com/bazelbuild/bazel/issues/5724#issuecomment-410194038
# https://github.com/bazelbuild/bazel/issues/6339#issuecomment-441600879
shell: powershell.exe -ExecutionPolicy Bypass
machine:
# Windows preview image that includes the following:
# - Visual Studio 2019 build tools
# - Node 12
# - yarn 1.17
# - Python 3 3.7.4
image: windows-server-2019-vs2019:201908-02
# Command Definitions
# https://circleci.com/docs/2.0/reusing-config/#authoring-reusable-commands
commands:
custom_attach_workspace:
description: Attach workspace at a predefined location
steps:
- attach_workspace:
at: *workspace_location
# Install shared libs used by Chrome that is either provisioned by
# rules_webtesting or by puppeteer.
install_chrome_libs:
description: Install shared Chrome libs
steps:
- run:
name: Install shared Chrome libs
command: |
sudo apt-get update
# Install GTK+ graphical user interface (libgtk-3-0), advanced linux sound architecture (libasound2)
# and network security service libraries (libnss3) & X11 Screen Saver extension library (libxss1)
# which are dependencies of chrome & needed for karma & protractor headless chrome tests.
# This is a very small install which takes around 7s compared to using the full
# circleci/node:x.x.x-browsers image.
sudo apt-get -y install libgtk-3-0 libasound2 libnss3 libxss1
# Install java runtime which is required by some integration tests such as
# //integration:hello_world__closure_test, //integration:i18n_test and
# //integration:ng_elements_test to run the closure compiler
install_java:
description: Install java
steps:
- run:
name: Install java
command: |
sudo apt-get update
# Install java runtime
sudo apt-get install default-jre
# Initializes the CI environment by setting up common environment variables.
init_environment:
description: Initializing environment (setting up variables)
steps:
- run:
name: Set up environment
environment:
CIRCLE_GIT_BASE_REVISION: << pipeline.git.base_revision >>
CIRCLE_GIT_REVISION: << pipeline.git.revision >>
command: ./.circleci/env.sh
- run:
# Configure git as the CircleCI `checkout` command does.
# This is needed because we only checkout on the setup job.
# Add GitHub to known hosts
name: Configure git
command: |
mkdir -p ~/.ssh
echo 'github.com ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAq2A7hRGmdnm9tUDbO9IDSwBK6TbQa+PXYPCPy6rbTrTtw7PHkccKrpp0yVhp5HdEIcKr6pLlVDBfOLX9QUsyCOV0wzfjIJNlGEYsdlLJizHhbn2mUjvSAHQqZETYP81eFzLQNnPHt4EVVUh7VfDESU84KezmD5QlWpXLmvU31/yMf+Se8xhHTvKSCZIFImWwoG6mbUoWf9nzpIoaSjB+weqqUUmpaaasXVal72J+UX2B+2RPW3RcT0eOzQgqlJL3RKrTJvdsjE3JEAvGq3lGHSZXy28G3skua2SmVi/w4yCE6gbODqnTWlg7+wC604ydGXA8VJiS5ap43JXiUFFAaQ==' >> ~/.ssh/known_hosts
git config --global url."ssh://git@github.com".insteadOf "https://github.com" || true
git config --global gc.auto 0 || true
init_saucelabs_environment:
description: Sets up a domain that resolves to the local host.
steps:
- run:
name: Preparing environment for running tests on Saucelabs.
command: |
# For SauceLabs jobs, we set up a domain which resolves to the machine which launched
# the tunnel. We do this because devices are sometimes not able to properly resolve
# `localhost` or `127.0.0.1` through the SauceLabs tunnel. Using a domain that does not
# resolve to anything on SauceLabs VMs ensures that such requests are always resolved
# through the tunnel, and resolve to the actual tunnel host machine (i.e. the CircleCI VM).
# More context can be found in: https://github.com/angular/angular/pull/35171.
setPublicVar SAUCE_LOCALHOST_ALIAS_DOMAIN "angular-ci.local"
setSecretVar SAUCE_ACCESS_KEY $(echo $SAUCE_ACCESS_KEY | rev)
- run:
# Sets up a local domain in the machine's host file that resolves to the local
# host. This domain is helpful in Saucelabs tests where devices are not able to
# properly resolve `localhost` or `127.0.0.1` through the sauce-connect tunnel.
name: Setting up alias domain for local host.
command: echo "127.0.0.1 $SAUCE_LOCALHOST_ALIAS_DOMAIN" | sudo tee -a /etc/hosts
# Normally this would be an individual job instead of a command.
# But startup and setup time for each individual windows job is high enough to discourage
# many small jobs, so instead we use a command for setup unless the gain becomes significant.
setup_win:
description: Setup windows node environment
steps:
# Use the Linux workspace directly, as it already has checkout, rebased and node modules.
- custom_attach_workspace
# Install Bazel pre-requisites that aren't in the preconfigured CircleCI Windows VM.
- run: ./.circleci/windows-env.ps1
- run: node --version
- run: yarn --version
- restore_cache:
keys:
- *cache_key_win
- *cache_key_win_fallback
# Reinstall to get windows binaries.
- run: yarn install --frozen-lockfile --non-interactive
# Install @bazel/bazel globally and use that for the first run.
# Workaround for https://github.com/bazelbuild/rules_nodejs/issues/894
- run: yarn global add @bazel/bazel@$env:BAZEL_VERSION
- run: bazel info
notify_webhook_on_fail:
description: Notify a webhook about failure
parameters:
# `webhook_url_env_var` are secret env vars defined in CircleCI project settings.
# The URLs come from https://angular-team.slack.com/apps/A0F7VRE7N-circleci.
webhook_url_env_var:
type: env_var_name
steps:
- run:
when: on_fail
command: |
notificationJson="{\"text\":\":x: \`$CIRCLE_JOB\` job for $CIRCLE_BRANCH branch failed on build $CIRCLE_BUILD_NUM: $CIRCLE_BUILD_URL :scream:\"}"
curl --request POST --header "Content-Type: application/json" --data "$notificationJson" ${<< parameters.webhook_url_env_var >>}
# Job definitions
# Jobs can include parameters that are passed in the workflow job invocation.
# https://circleci.com/docs/2.0/reusing-config/#authoring-parameterized-jobs
jobs:
setup:
executor: default-executor
steps:
- checkout
- run:
name: Rebase PR on target branch
# After checkout, rebase on top of target branch.
command: >
if [[ -n "${CIRCLE_PR_NUMBER}" ]]; then
# User is required for rebase.
git config user.name "angular-ci"
git config user.email "angular-ci"
# Rebase PR on top of target branch.
node tools/rebase-pr.js angular/angular ${CIRCLE_PR_NUMBER}
else
echo "This build is not over a PR, nothing to do."
fi
# This cache is saved in the build-npm-packages so that Bazel cache is also included.
- restore_cache:
keys:
- *cache_key
- *cache_key_fallback
- init_environment
- run:
name: Running Yarn install
command: yarn install --frozen-lockfile --non-interactive
# Yarn's requests sometimes take more than 10mins to complete.
no_output_timeout: 45m
- run: yarn --cwd aio install --frozen-lockfile --non-interactive
# Make the bazel directories and add a file to them if they don't exist already so that
# persist_to_workspace does not fail.
- run: |
if [ ! -d ~/bazel_repository_cache ]; then
mkdir ~/bazel_repository_cache
touch ~/bazel_repository_cache/MARKER
fi
# Persist any changes at this point to be reused by further jobs.
# **NOTE**: To add new content to the workspace, always persist on the same root.
- persist_to_workspace:
root: *workspace_location
paths:
- ./ng
- ./bazel_repository_cache
lint:
executor: default-executor
steps:
- custom_attach_workspace
- init_environment
- run: 'yarn bazel:format -mode=check ||
(echo "BUILD files not formatted. Please run ''yarn bazel:format''" ; exit 1)'
# Run the skylark linter to check our Bazel rules
- run: 'yarn bazel:lint ||
(echo -e "\n.bzl files have lint errors. Please run ''yarn bazel:lint-fix''"; exit 1)'
- run: yarn lint
- run: yarn ts-circular-deps:check
- run: node tools/pullapprove/verify.js
test:
executor:
name: default-executor
# Now that large integration tests are running locally in parallel (they can't run on RBE yet
# as they require network access for yarn install), this test is running out of memory
# consistently with the xlarge machine.
# TODO: switch back to xlarge once integration tests are running on remote-exec
resource_class: 2xlarge+
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
- install_java
- run:
command: yarn bazel test //... --build_tag_filters=-ivy-only --test_tag_filters=-ivy-only
no_output_timeout: 20m
# Temporary job to test what will happen when we flip the Ivy flag to true
test_ivy_aot:
executor:
name: default-executor
resource_class: xlarge
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
# We need to explicitly specify the --symlink_prefix option because otherwise we would
# not be able to easily find the output bin directory when uploading artifacts for size
# measurements.
- run:
command: yarn test-ivy-aot //... --symlink_prefix=dist/
no_output_timeout: 20m
# Publish bundle artifacts which will be used to calculate the size change. **Note**: Make
# sure that the size plugin from the Angular robot fetches the artifacts from this CircleCI
# job (see .github/angular-robot.yml). Additionally any artifacts need to be stored with the
# following path format: "{projectName}/{context}/{fileName}". This format is necessary
# because otherwise the bot is not able to pick up the artifacts from CircleCI. See:
# https://github.com/angular/github-robot/blob/master/functions/src/plugins/size.ts#L392-L394
- store_artifacts:
path: dist/bin/packages/core/test/bundling/hello_world/bundle.min.js
destination: core/hello_world/bundle
- store_artifacts:
path: dist/bin/packages/core/test/bundling/todo/bundle.min.js
destination: core/todo/bundle
- store_artifacts:
path: dist/bin/packages/core/test/bundling/hello_world/bundle.min.js.br
destination: core/hello_world/bundle.br
- store_artifacts:
path: dist/bin/packages/core/test/bundling/todo/bundle.min.js.br
destination: core/todo/bundle.br
# NOTE: This is currently limited to master builds only. See the `monitoring` configuration.
saucelabs_view_engine:
executor:
name: default-executor
# In order to avoid the bottleneck of having a slow host machine, we acquire a better
# container for this job. This is necessary because we launch a lot of browsers concurrently
# and therefore the tunnel and Karma need to process a lot of file requests and tests.
resource_class: xlarge
steps:
- custom_attach_workspace
- init_environment
- init_saucelabs_environment
- run:
name: Run Bazel tests on Saucelabs with ViewEngine
# See /tools/saucelabs/README.md for more info
command: |
yarn bazel run //tools/saucelabs:sauce_service_setup
TESTS=$(./node_modules/.bin/bazelisk query --output label '(kind(karma_web_test, ...) intersect attr("tags", "saucelabs", ...)) except attr("tags", "ivy-only", ...) except attr("tags", "fixme-saucelabs-ve", ...)')
yarn bazel test --config=saucelabs ${TESTS}
yarn bazel run //tools/saucelabs:sauce_service_stop
no_output_timeout: 40m
- notify_webhook_on_fail:
webhook_url_env_var: SLACK_DEV_INFRA_CI_FAILURES_WEBHOOK_URL
# NOTE: This is currently limited to master builds only. See the `monitoring` configuration.
saucelabs_ivy:
executor:
name: default-executor
# In order to avoid the bottleneck of having a slow host machine, we acquire a better
# container for this job. This is necessary because we launch a lot of browsers concurrently
# and therefore the tunnel and Karma need to process a lot of file requests and tests.
resource_class: xlarge
steps:
- custom_attach_workspace
- init_environment
- init_saucelabs_environment
- run:
name: Run Bazel tests on Saucelabs with Ivy
# See /tools/saucelabs/README.md for more info
command: |
yarn bazel run //tools/saucelabs:sauce_service_setup
TESTS=$(./node_modules/.bin/bazelisk query --output label '(kind(karma_web_test, ...) intersect attr("tags", "saucelabs", ...)) except attr("tags", "no-ivy-aot", ...) except attr("tags", "fixme-saucelabs-ivy", ...)')
yarn bazel test --config=saucelabs --config=ivy ${TESTS}
yarn bazel run //tools/saucelabs:sauce_service_stop
no_output_timeout: 40m
- notify_webhook_on_fail:
webhook_url_env_var: SLACK_DEV_INFRA_CI_FAILURES_WEBHOOK_URL
test_aio:
executor: default-executor
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
# Compile dependencies to ivy
# Running `ngcc` here (instead of implicitly via `ng build`) allows us to take advantage of
# the parallel, async mode speed-up (~20-25s on CI).
- run: yarn --cwd aio ngcc --properties es2015
# Build aio
- run: yarn --cwd aio build --progress=false
# Lint the code
- run: yarn --cwd aio lint
# Run unit tests
- run: yarn --cwd aio test --progress=false --watch=false
# Run e2e tests
- run: yarn --cwd aio e2e --configuration=ci
# Run PWA-score tests
- run: yarn --cwd aio test-pwa-score-localhost $CI_AIO_MIN_PWA_SCORE
# Run accessibility tests
- run: yarn --cwd aio test-a11y-score-localhost
# Check the bundle sizes.
- run: yarn --cwd aio payload-size
# Run unit tests for Firebase redirects
- run: yarn --cwd aio redirects-test
deploy_aio:
executor: default-executor
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
# Deploy angular.io to production (if necessary)
- run: setPublicVar_CI_STABLE_BRANCH
- run: yarn --cwd aio deploy-production
test_aio_local:
parameters:
viewengine:
type: boolean
default: false
executor: default-executor
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
# Build aio (with local Angular packages)
- run: yarn --cwd aio build-local<<# parameters.viewengine >>-with-viewengine<</ parameters.viewengine >>-ci
# Run unit tests
- run: yarn --cwd aio test --progress=false --watch=false
# Run e2e tests
- run: yarn --cwd aio e2e --configuration=ci
# Run PWA-score tests
- run: yarn --cwd aio test-pwa-score-localhost $CI_AIO_MIN_PWA_SCORE
# Check the bundle sizes.
- run: yarn --cwd aio payload-size aio-local<<# parameters.viewengine >>-viewengine<</ parameters.viewengine >>
test_aio_tools:
executor: default-executor
steps:
- custom_attach_workspace
- init_environment
# Install
- run: yarn --cwd aio install --frozen-lockfile --non-interactive
- run: yarn --cwd aio extract-cli-command-docs
# Run tools tests
- run: yarn --cwd aio tools-test
- run: ./aio/aio-builds-setup/scripts/test.sh
test_docs_examples:
parameters:
ivy:
type: boolean
default: false
executor:
name: default-executor
resource_class: xlarge
parallelism: 5
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
# Install aio
- run: yarn --cwd aio install --frozen-lockfile --non-interactive
# Run examples tests. The "CIRCLE_NODE_INDEX" will be set if "parallelism" is enabled.
# Since the parallelism is set to "5", there will be five parallel CircleCI containers,
# with either "0", "1", etc. as the node index. This can be passed to the "--shard" argument.
- run: yarn --cwd aio example-e2e --setup --local <<# parameters.ivy >>--ivy<</ parameters.ivy >> --cliSpecsConcurrency=5 --shard=${CIRCLE_NODE_INDEX}/${CIRCLE_NODE_TOTAL} --retry 2
# This job should only be run on PR builds, where `CI_PULL_REQUEST` is not `false`.
aio_preview:
executor: default-executor
environment:
AIO_SNAPSHOT_ARTIFACT_PATH: &aio_preview_artifact_path 'aio/tmp/snapshot.tgz'
steps:
- custom_attach_workspace
- init_environment
- run: ./aio/scripts/build-artifacts.sh $AIO_SNAPSHOT_ARTIFACT_PATH $CI_PULL_REQUEST $CI_COMMIT
- store_artifacts:
path: *aio_preview_artifact_path
# The `destination` needs to be kept in sync with the value of
# `AIO_ARTIFACT_PATH` in `aio/aio-builds-setup/Dockerfile`
destination: aio/dist/aio-snapshot.tgz
- run: node ./aio/scripts/create-preview $CIRCLE_BUILD_NUM
# This job should only be run on PR builds, where `CI_PULL_REQUEST` is not `false`.
test_aio_preview:
executor: default-executor
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
- run: yarn --cwd aio install --frozen-lockfile --non-interactive
- run:
name: Wait for preview and run tests
command: node aio/scripts/test-preview.js $CI_PULL_REQUEST $CI_COMMIT $CI_AIO_MIN_PWA_SCORE
# The `build-npm-packages` tasks exist for backwards-compatibility with old scripts and
# tests that rely on the pre-Bazel `dist/packages-dist` output structure (build.sh).
# Having multiple jobs that independently build in this manner duplicates some work; we build
# the bazel packages more than once. Even though we have a remote cache, these jobs will
# typically run in parallel so up-to-date outputs will not be available at the time the build
# starts.
# Build the view engine npm packages. No new jobs should depend on this.
build-npm-packages:
executor:
name: default-executor
resource_class: xlarge
steps:
- custom_attach_workspace
- init_environment
- run: node scripts/build/build-packages-dist.js
# Save the npm packages from //packages/... for other workflow jobs to read
- persist_to_workspace:
root: *workspace_location
paths:
- ng/dist/packages-dist
- ng/dist/zone.js-dist
# Save dependencies and bazel repository cache to use on subsequent runs.
- save_cache:
key: *cache_key
paths:
- "node_modules"
- "aio/node_modules"
- "~/bazel_repository_cache"
# Build the ivy npm packages.
build-ivy-npm-packages:
executor:
name: default-executor
resource_class: xlarge
steps:
- custom_attach_workspace
- init_environment
- run: node scripts/build/build-ivy-npm-packages.js
# Save the npm packages from //packages/... for other workflow jobs to read
- persist_to_workspace:
root: *workspace_location
paths:
- ng/dist/packages-dist-ivy-aot
- ng/dist/zone.js-dist-ivy-aot
# This job creates compressed tarballs (`.tgz` files) for all Angular packages and stores them as
# build artifacts. This makes it easy to try out changes from a PR build for testing purposes.
# More info CircleCI build artifacts: https://circleci.com/docs/2.0/artifacts
#
# NOTE: Currently, this job only runs for PR builds. See `publish_snapshot` for non-PR builds.
publish_packages_as_artifacts:
executor: default-executor
environment:
NG_PACKAGES_DIR: &ng_packages_dir 'dist/packages-dist'
NG_PACKAGES_ARCHIVES_DIR: &ng_packages_archives_dir 'dist/packages-dist-archives'
ZONEJS_PACKAGES_DIR: &zonejs_packages_dir 'dist/zone.js-dist'
ZONEJS_PACKAGES_ARCHIVES_DIR: &zonejs_packages_archives_dir 'dist/zone.js-dist-archives'
steps:
- custom_attach_workspace
- init_environment
# Publish `@angular/*` packages.
- run:
name: Create artifacts for @angular/* packages
command: ./scripts/ci/create-package-archives.sh $CI_BRANCH $CI_COMMIT $NG_PACKAGES_DIR $NG_PACKAGES_ARCHIVES_DIR
- store_artifacts:
path: *ng_packages_archives_dir
destination: angular
# Publish `zone.js` package.
- run:
name: Create artifacts for zone.js package
command: ./scripts/ci/create-package-archives.sh $CI_BRANCH $CI_COMMIT $ZONEJS_PACKAGES_DIR $ZONEJS_PACKAGES_ARCHIVES_DIR
- store_artifacts:
path: *zonejs_packages_archives_dir
destination: zone.js
# This job updates the content of repos like github.com/angular/core-builds
# for every green build on angular/angular.
publish_snapshot:
executor: default-executor
steps:
# See below - ideally this job should not trigger for non-upstream builds.
# But since it does, we have to check this condition.
- run:
name: Skip this job for Pull Requests and Fork builds
# Note: Using `CIRCLE_*` env variables (instead of those defined in `env.sh`) so that this
# step can be run before `init_environment`.
command: >
if [[ -n "${CIRCLE_PR_NUMBER}" ]] ||
[[ "$CIRCLE_PROJECT_USERNAME" != "angular" ]] ||
[[ "$CIRCLE_PROJECT_REPONAME" != "angular" ]]; then
circleci step halt
fi
- custom_attach_workspace
- init_environment
# CircleCI has a config setting to force SSH for all github connections
# This is not compatible with our mechanism of using a Personal Access Token
# Clear the global setting
- run: git config --global --unset "url.ssh://git@github.com.insteadof"
- run:
name: Decrypt github credentials
# We need to ensure that the same default digest is used for encoding and decoding with
# openssl. Openssl versions might have different default digests which can cause
# decryption failures based on the installed openssl version. https://stackoverflow.com/a/39641378/4317734
command: 'openssl aes-256-cbc -d -in .circleci/github_token -md md5 -k "${KEY}" -out ~/.git_credentials'
- run: ./scripts/ci/publish-build-artifacts.sh
aio_monitoring_stable:
executor: default-executor
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
- run: setPublicVar_CI_STABLE_BRANCH
- run:
name: Check out `aio/` and yarn from the stable branch
command: |
git fetch origin $CI_STABLE_BRANCH
git checkout --force origin/$CI_STABLE_BRANCH -- aio/ .yarn/ .yarnrc
# Ignore yarn's engines check, because we checked out `aio/package.json` from the stable
# branch and there could be a node version skew, which is acceptable in this monitoring job.
- run: yarn config set ignore-engines true
- run:
name: Run tests against https://angular.io/
command: ./aio/scripts/test-production.sh https://angular.io/ $CI_AIO_MIN_PWA_SCORE
- notify_webhook_on_fail:
webhook_url_env_var: SLACK_CARETAKER_WEBHOOK_URL
- notify_webhook_on_fail:
webhook_url_env_var: SLACK_DEV_INFRA_CI_FAILURES_WEBHOOK_URL
aio_monitoring_next:
executor: default-executor
steps:
- custom_attach_workspace
- init_environment
- install_chrome_libs
- run:
name: Run tests against https://next.angular.io/
command: ./aio/scripts/test-production.sh https://next.angular.io/ $CI_AIO_MIN_PWA_SCORE
- notify_webhook_on_fail:
webhook_url_env_var: SLACK_CARETAKER_WEBHOOK_URL
- notify_webhook_on_fail:
webhook_url_env_var: SLACK_DEV_INFRA_CI_FAILURES_WEBHOOK_URL
legacy-unit-tests-saucelabs:
executor:
name: default-executor
# In order to avoid the bottleneck of having a slow host machine, we acquire a better
# container for this job. This is necessary because we launch a lot of browsers concurrently
# and therefore the tunnel and Karma need to process a lot of file requests and tests.
resource_class: xlarge
steps:
- custom_attach_workspace
- init_environment
- init_saucelabs_environment
- run:
name: Starting Saucelabs tunnel service
command: ./tools/saucelabs/sauce-service.sh run
background: true
- run: yarn tsc -p packages
- run: yarn tsc -p modules
- run: yarn bazel build //packages/zone.js:npm_package
- run:
# Waiting on ready ensures that we don't run tests too early, before Saucelabs is ready.
name: Waiting for Saucelabs tunnel to connect
command: ./tools/saucelabs/sauce-service.sh ready-wait
- run:
name: Running tests on Saucelabs.
command: |
browsers=$(node -e 'console.log(require("./browser-providers.conf").sauceAliases.CI_REQUIRED.join(","))')
yarn karma start ./karma-js.conf.js --single-run --browsers=${browsers}
- run:
name: Stop Saucelabs tunnel service
command: ./tools/saucelabs/sauce-service.sh stop
# Job that runs all unit tests of the `angular/components` repository.
components-repo-unit-tests:
executor:
name: default-executor
resource_class: xlarge
steps:
- custom_attach_workspace
- init_environment
# Restore the cache before cloning the repository because the clone script re-uses
# the restored repository if present. This reduces the number of times the components
# repository needs to be cloned (this is slow and increases based on commits in the repo).
- restore_cache:
keys:
- *components_repo_unit_tests_cache_key
# Whenever the `angular/components` SHA is updated, the cache key will no longer
# match. The fallback cache will still match, and CircleCI will restore the most
# recently cached repository folder. Without the fallback cache, we'd need to download
# the repository from scratch and it would slow down the job. This is because we can't
# clone the repository with reduced `--depth`, but rather need to clone the whole
# repository to be able to support arbitrary SHAs.
- *components_repo_unit_tests_cache_key_fallback
- run:
name: "Fetching angular/components repository"
command: ./scripts/ci/clone_angular_components_repo.sh
- run:
# Run yarn install to fetch the Bazel binaries as used in the components repo.
name: Installing dependencies.
# TODO: remove this once the repo has been updated to use NodeJS v12 and Yarn 1.19.1.
# We temporarily ignore the "engines" check because the Angular components repository requires
# at least NodeJS v12 and Yarn 1.19.1, but the framework repository uses
# older versions.
command: yarn --ignore-engines --cwd ${COMPONENTS_REPO_TMP_DIR} install --frozen-lockfile --non-interactive
- save_cache:
key: *components_repo_unit_tests_cache_key
paths:
# Temporary directory must be kept in sync with the `$COMPONENTS_REPO_TMP_DIR` env
# variable. It needs to be hardcoded here, because env variables interpolation is
# not supported.
- "/tmp/angular-components-repo"
- run:
# Updates the `angular/components` `package.json` file to refer to the release output
# inside the `packages-dist` directory. Note that it's not necessary to perform a yarn
# install as Bazel runs Yarn automatically when needed.
name: Setting up release packages.
command: node scripts/ci/update-deps-to-dist-packages.js ${COMPONENTS_REPO_TMP_DIR}/package.json dist/packages-dist/
- run:
name: "Running `angular/components` unit tests"
command: ./scripts/ci/run_angular_components_unit_tests.sh
test_zonejs:
executor:
name: default-executor
resource_class: xlarge
steps:
- custom_attach_workspace
- init_environment
# Install
- run: yarn --cwd packages/zone.js install --frozen-lockfile --non-interactive
# Run zone.js tools tests
- run: yarn --cwd packages/zone.js promisetest
- run: yarn --cwd packages/zone.js promisefinallytest
- run: yarn bazel build //packages/zone.js:npm_package &&
cp dist/bin/packages/zone.js/npm_package/dist/zone-mix.js ./packages/zone.js/test/extra/ &&
cp dist/bin/packages/zone.js/npm_package/dist/zone-patch-electron.js ./packages/zone.js/test/extra/ &&
yarn --cwd packages/zone.js electrontest
# Windows jobs
# Docs: https://circleci.com/docs/2.0/hello-world-windows/
test_win:
executor: windows-executor
steps:
- setup_win
- run:
# Ran into a command parsing problem where `-browser:chromium-local` was converted to
# `-browser: chromium-local` (a space was added) in https://circleci.com/gh/angular/angular/357511.
# Probably a powershell command parsing thing. There's no problem using a yarn script though.
command: yarn circleci-win-ve
no_output_timeout: 45m
# Save bazel repository cache to use on subsequent runs.
# We don't save node_modules because it's faster to use the linux workspace and reinstall.
- save_cache:
key: *cache_key_win
paths:
- "C:/Users/circleci/bazel_repository_cache"
test_ivy_aot_win:
executor: windows-executor
steps:
- setup_win
- run:
command: yarn circleci-win-ivy
no_output_timeout: 45m
workflows:
version: 2
default_workflow:
jobs:
- setup:
filters:
branches:
ignore: g3
- lint:
requires:
- setup
- test:
requires:
- setup
- test_ivy_aot:
requires:
- setup
- build-npm-packages:
requires:
- setup
- build-ivy-npm-packages:
requires:
- setup
- legacy-unit-tests-saucelabs:
requires:
- setup
- test_aio:
requires:
- setup
- deploy_aio:
requires:
- test_aio
- test_aio_local:
requires:
- build-npm-packages
- test_aio_local:
name: test_aio_local_viewengine
viewengine: true
requires:
- build-npm-packages
- test_aio_tools:
requires:
- build-npm-packages
- test_docs_examples:
requires:
- build-npm-packages
- test_docs_examples:
name: test_docs_examples_ivy
ivy: true
requires:
- build-npm-packages
- aio_preview:
# Only run on PR builds. (There can be no previews for non-PR builds.)
<<: *only_on_pull_requests
requires:
- setup
- test_aio_preview:
requires:
- aio_preview
- publish_packages_as_artifacts:
requires:
- build-npm-packages
- publish_snapshot:
# Note: no filters on this job because we want it to run for all upstream branches
# We'd really like to filter out pull requests here, but not yet available:
# https://discuss.circleci.com/t/workflows-pull-request-filter/14396/4
# Instead, the job just exits immediately at the first step.
requires:
# Only publish if tests and integration tests pass
- test
- test_ivy_aot
# Only publish if `aio`/`docs` tests using the locally built Angular packages pass
- test_aio_local
- test_aio_local_viewengine
- test_docs_examples
- test_docs_examples_ivy
# Get the artifacts to publish from the build-packages-dist job
# since the publishing script expects the legacy outputs layout.
- build-npm-packages
- build-ivy-npm-packages
- legacy-unit-tests-saucelabs
- components-repo-unit-tests:
requires:
- build-npm-packages
- test_zonejs:
requires:
- setup
# Windows Jobs
# These are very slow so we run them on non-PRs only for now.
# TODO: remove the filter when CircleCI makes Windows FS faster.
# The Windows jobs are only run after their non-windows counterparts finish successfully.
# This isn't strictly necessary as there is no artifact dependency, but it helps economize
# CI resources by not attempting the Windows build when we already know it will fail.
- test_win:
<<: *skip_on_pull_requests
requires:
- test
- test_ivy_aot_win:
<<: *skip_on_pull_requests
requires:
- test_ivy_aot
monitoring:
jobs:
- setup
- aio_monitoring_stable:
requires:
- setup
- aio_monitoring_next:
requires:
- setup
- saucelabs_ivy:
# Testing SauceLabs via Bazel currently takes longer than the legacy SauceLabs job, because
# each karma_web_test target provisions and tears down its own browsers, which adds
# a lot of overhead. Running once daily on master only to avoid wasting resources and
# slowing down CI for PRs.
# TODO: Run this job on all branches (including PRs) once karma_web_test targets can
# share provisioned browsers and we can remove the legacy saucelabs job.
requires:
- setup
- saucelabs_view_engine:
# Testing SauceLabs via Bazel currently takes longer than the legacy SauceLabs job, because
# each karma_web_test target provisions and tears down its own browsers, which adds
# a lot of overhead. Running once daily on master only to avoid wasting resources and
# slowing down CI for PRs.
# TODO: Run this job on all branches (including PRs) once karma_web_test targets can
# share provisioned browsers and we can remove the legacy saucelabs job.
requires:
- setup
triggers:
- schedule:
<<: *only_on_master
# Runs monitoring jobs at 10:00AM every day.
cron: "0 10 * * *"

View File

@ -1,73 +0,0 @@
####################################################################################################
# Helpers for defining environment variables for CircleCI.
#
# In CircleCI, each step runs in a new shell. The way to share ENV variables across steps is to
# export them from `$BASH_ENV`, which is automatically sourced at the beginning of every step (for
# the default `bash` shell).
#
# See also https://circleci.com/docs/2.0/env-vars/#using-bash_env-to-set-environment-variables.
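#
# For example, a value exported to `$BASH_ENV` in one step becomes visible in all later steps.
# (The variable name below is purely illustrative and is not used by the CI setup.)
#
#   # In some step:
#   echo "export EXAMPLE_VAR=\"some value\";" >> $BASH_ENV;
#
#   # In any later step (after `$BASH_ENV` has been sourced by the default bash shell):
#   echo "$EXAMPLE_VAR";   # --> some value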
####################################################################################################
# Set and print an environment variable.
#
# Use this function for setting environment variables that are public, i.e. it is OK for them to be
# visible to anyone through the CI logs.
#
# Usage: `setPublicVar <name> <value>`
function setPublicVar() {
setSecretVar $1 "$2";
echo "$1=$2";
}
# Set (without printing) an environment variable.
#
# Use this function for setting environment variables that are secret, i.e. should not be visible to
# everyone through the CI logs.
#
# Usage: `setSecretVar <name> <value>`
function setSecretVar() {
# WARNING: Secrets (e.g. passwords, access tokens) should NOT be printed.
# (Keep original shell options to restore at the end.)
local -r originalShellOptions=$(set +o);
set +x -eu -o pipefail;
echo "export $1=\"${2:-}\";" >> $BASH_ENV;
# Restore original shell options.
eval "$originalShellOptions";
}
# Create a function to set an environment variable, when called.
#
# Use this function to create a setter for public environment variables that require expensive or
# time-consuming computations and may not always be needed. When needed, you can call the generated
# setter to set the environment variable (which will be available through `$BASH_ENV` from that point onwards).
#
# Arguments:
# - `<name>`: The name of the environment variable. The generated setter function will be
# `setPublicVar_<name>`.
# - `<code>`: The code to run to compute the value for the variable. Since this code should be
# executed lazily, it must be properly escaped. For example:
# ```sh
# # DO NOT do this:
# createPublicVarSetter MY_VAR "$(whoami)"; # `whoami` will be evaluated eagerly
#
# DO this instead:
# createPublicVarSetter MY_VAR "\$(whoami)"; # `whoami` will NOT be evaluated eagerly
# ```
#
# Usage: `createPublicVarSetter <name> <code>`
#
# Example:
# ```sh
# createPublicVarSetter MY_VAR 'echo "FOO"';
# echo $MY_VAR; # Not defined
#
# setPublicVar_MY_VAR;
# source $BASH_ENV;
# echo $MY_VAR; # FOO
# ```
function createPublicVarSetter() {
echo "setPublicVar_$1() { setPublicVar $1 \"$2\"; }" >> $BASH_ENV;
}

View File

@ -1,107 +0,0 @@
#!/usr/bin/env bash
# Variables
readonly projectDir=$(realpath "$(dirname ${BASH_SOURCE[0]})/..")
readonly envHelpersPath="$projectDir/.circleci/env-helpers.inc.sh";
# Load helpers and make them available everywhere (through `$BASH_ENV`).
source $envHelpersPath;
echo "source $envHelpersPath;" >> $BASH_ENV;
####################################################################################################
# Define PUBLIC environment variables for CircleCI.
####################################################################################################
# See https://circleci.com/docs/2.0/env-vars/#built-in-environment-variables for more info.
####################################################################################################
setPublicVar PROJECT_ROOT "$projectDir";
setPublicVar CI_AIO_MIN_PWA_SCORE "95";
# This is the branch being built; e.g. `pull/12345` for PR builds.
setPublicVar CI_BRANCH "$CIRCLE_BRANCH";
setPublicVar CI_BUILD_URL "$CIRCLE_BUILD_URL";
setPublicVar CI_COMMIT "$CIRCLE_SHA1";
# `CI_COMMIT_RANGE` is only used on push builds (a.k.a. non-PR, non-scheduled builds and rerun
# workflows of such builds).
setPublicVar CI_COMMIT_RANGE "$CIRCLE_GIT_BASE_REVISION..$CIRCLE_GIT_REVISION";
setPublicVar CI_PULL_REQUEST "${CIRCLE_PR_NUMBER:-false}";
setPublicVar CI_REPO_NAME "$CIRCLE_PROJECT_REPONAME";
setPublicVar CI_REPO_OWNER "$CIRCLE_PROJECT_USERNAME";
####################################################################################################
# Define "lazy" PUBLIC environment variables for CircleCI.
# (I.e. functions to set an environment variable when called.)
####################################################################################################
createPublicVarSetter CI_STABLE_BRANCH "\$(npm info @angular/core dist-tags.latest | sed -r 's/^\\s*([0-9]+\\.[0-9]+)\\.[0-9]+.*$/\\1.x/')";
####################################################################################################
# Define SECRET environment variables for CircleCI.
####################################################################################################
setSecretVar CI_SECRET_AIO_DEPLOY_FIREBASE_TOKEN "$AIO_DEPLOY_TOKEN";
setSecretVar CI_SECRET_PAYLOAD_FIREBASE_TOKEN "$ANGULAR_PAYLOAD_TOKEN";
####################################################################################################
# Define SauceLabs environment variables for CircleCI.
####################################################################################################
setPublicVar SAUCE_USERNAME "angular-framework";
setSecretVar SAUCE_ACCESS_KEY "0c731274ed5f-cbc9-16f4-021a-9835e39f";
# TODO(josephperrott): Remove environment variables once all saucelabs tests are via bazel method.
setPublicVar SAUCE_LOG_FILE /tmp/angular/sauce-connect.log
setPublicVar SAUCE_READY_FILE /tmp/angular/sauce-connect-ready-file.lock
setPublicVar SAUCE_PID_FILE /tmp/angular/sauce-connect-pid-file.lock
setPublicVar SAUCE_TUNNEL_IDENTIFIER "angular-framework-${CIRCLE_BUILD_NUM}-${CIRCLE_NODE_INDEX}"
# Number of seconds we wait for sauceconnect to establish a tunnel instance. We need a connect
# timeout so that we don't occupy CircleCI instances for too long if sauceconnect fails.
setPublicVar SAUCE_READY_FILE_TIMEOUT 120
####################################################################################################
# Define environment variables for the `angular/components` repo unit tests job.
####################################################################################################
# We specifically use a directory within "/tmp" here because we want the cloned repo to be
# completely isolated from angular/angular in order to avoid any bad interactions between
# their separate build setups. **NOTE**: When updating the temporary directory, also update
# the `save_cache` path configuration in `config.yml`
setPublicVar COMPONENTS_REPO_TMP_DIR "/tmp/angular-components-repo"
setPublicVar COMPONENTS_REPO_URL "https://github.com/angular/components.git"
setPublicVar COMPONENTS_REPO_BRANCH "master"
# **NOTE**: When updating the commit SHA, also update the cache key in the CircleCI `config.yml`.
setPublicVar COMPONENTS_REPO_COMMIT "598db096e668aa7e9debd56eedfd127b7a55e371"
####################################################################################################
# Decrypt GCP Credentials and store them as the Google default credentials.
####################################################################################################
mkdir -p "$HOME/.config/gcloud";
openssl aes-256-cbc -d -in "${projectDir}/.circleci/gcp_token" \
-md md5 -k "$CIRCLE_PROJECT_REPONAME" -out "$HOME/.config/gcloud/application_default_credentials.json"
####################################################################################################
# Set bazel configuration for CircleCI runs.
####################################################################################################
cp "${projectDir}/.circleci/bazel.linux.rc" "$HOME/.bazelrc";
####################################################################################################
# Create shell script in /tmp for Bazel actions to access CI envs without
# busting the cache. Used by payload-size.sh script in integration tests.
####################################################################################################
readonly bazelVarEnv="/tmp/bazel-ci-env.sh"
echo "# Setup by /.circle/env.sh" > $bazelVarEnv
echo "export PROJECT_ROOT=\"${PROJECT_ROOT}\";" >> $bazelVarEnv
echo "export CI_BRANCH=\"${CI_BRANCH}\";" >> $bazelVarEnv
echo "export CI_BUILD_URL=\"${CI_BUILD_URL}\";" >> $bazelVarEnv
echo "export CI_COMMIT=\"${CI_COMMIT}\";" >> $bazelVarEnv
echo "export CI_COMMIT_RANGE=\"${CI_COMMIT_RANGE}\";" >> $bazelVarEnv
echo "export CI_PULL_REQUEST=\"${CI_PULL_REQUEST}\";" >> $bazelVarEnv
echo "export CI_REPO_NAME=\"${CI_REPO_NAME}\";" >> $bazelVarEnv
echo "export CI_REPO_OWNER=\"${CI_REPO_OWNER}\";" >> $bazelVarEnv
echo "export CI_SECRET_PAYLOAD_FIREBASE_TOKEN=\"${CI_SECRET_PAYLOAD_FIREBASE_TOKEN}\";" >> $bazelVarEnv
####################################################################################################
####################################################################################################
## Source `$BASH_ENV` to make the variables available immediately. ##
## ***NOTE: This must remain the last action in this script*** ##
####################################################################################################
####################################################################################################
source $BASH_ENV;

Binary file not shown.

Binary file not shown.

View File

@ -1,11 +0,0 @@
#!/bin/sh
# Install bazel remote cache proxy
# This is temporary until the feature is no longer experimental on CircleCI.
# See remote cache documentation in /docs/BAZEL.md
set -u -e
readonly DOWNLOAD_URL="https://5-116431813-gh.circle-artifacts.com/0/pkg/bazel-remote-proxy-$(uname -s)_$(uname -m)"
curl --fail -o ~/bazel-remote-proxy "$DOWNLOAD_URL"
chmod +x ~/bazel-remote-proxy

View File

@ -1,107 +0,0 @@
#!/usr/bin/env node
/**
* Usage (cli):
* ```
* node trigger-webhook <build-number> <job-name> <webhook-url>
* ```
*
* Usage (JS):
* ```js
* require('./trigger-webhook').
* triggerWebhook(buildNumber, jobName, webhookUrl).
* then(...);
* ```
*
* Triggers a notification webhook with CircleCI specific info.
*
* It can be used for notifying external servers and trigger operations based on CircleCI job status
* (e.g. triggering the creation of a preview based on previously stored build artifacts).
*
* The body of the sent payload is of the form:
* ```json
* {
* "payload": {
* "build_num": ${buildNumber}
* "build_parameters": {
* "CIRCLE_JOB": "${jobName}"
* }
* }
* }
* ```
*
* When used from JS, it returns a promise which resolves to an object of the form:
* ```json
* {
* "statucCode": ${statusCode},
* "responseText": "${responseText}"
* }
* ```
*
* NOTE:
* - When used from the cli, the command will exit with an error code if the response's status code
* is outside the [200, 400) range.
* - When used from JS, the returned promise will be resolved, even if the response's status code is
* outside the [200, 400) range. It is up to the caller to decide how this should be handled.
*/
// Imports
const {request} = require('https');
// Exports
module.exports = {
triggerWebhook,
};
// Run
if (require.main === module) {
_main(process.argv.slice(2));
}
// Helpers
function _main(args) {
triggerWebhook(...args).
then(({statusCode, responseText}) => (200 <= statusCode && statusCode < 400) ?
console.log(`Status: ${statusCode}\n${responseText}`) :
Promise.reject(new Error(`Request failed (status: ${statusCode}): ${responseText}`))).
catch(err => {
console.error(err);
process.exit(1);
});
}
function postJson(url, data) {
return new Promise((resolve, reject) => {
const opts = {method: 'post', headers: {'Content-Type': 'application/json'}};
const onResponse = res => {
const statusCode = res.statusCode || -1;
let responseText = '';
res.
on('error', reject).
on('data', d => responseText += d).
on('end', () => resolve({statusCode, responseText}));
};
request(url, opts, onResponse).
on('error', reject).
end(JSON.stringify(data));
});
}
async function triggerWebhook(buildNumber, jobName, webhookUrl) {
if (!buildNumber || !jobName || !webhookUrl || isNaN(buildNumber)) {
throw new Error(
'Missing or invalid arguments.\n' +
'Expected: buildNumber (number), jobName (string), webhookUrl (string)');
}
const data = {
payload: {
build_num: +buildNumber,
build_parameters: {CIRCLE_JOB: jobName},
},
};
return postJson(webhookUrl, data);
}
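For illustration, a command-line invocation of this helper could look like the following sketch. The script path, build number, job name, and webhook URL are all placeholder assumptions, not values taken from this repository.
```sh
# Hypothetical values; substitute the real CircleCI build number, job name, and endpoint.
node trigger-webhook.js 12345 aio_preview https://example.com/circleci-webhook
```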

View File

@ -1,56 +0,0 @@
# Install Bazel pre-reqs on Windows
# https://docs.bazel.build/versions/master/install-windows.html
# https://docs.bazel.build/versions/master/windows.html
# Install MSYS2 and packages
choco install msys2 --version 20180531.0.0 --no-progress --package-parameters "/NoUpdate"
C:\tools\msys64\usr\bin\bash.exe -l -c "pacman --needed --noconfirm -S zip unzip patch diffutils git"
# Add PATH modifications to the Powershell profile. This is the win equivalent of .bash_profile.
# https://docs.microsoft.com/en-us/previous-versions//bb613488(v=vs.85)
new-item -path $profile -itemtype file -force
# Paths for nodejs, npm, yarn, and msys2. Use single quotes to prevent interpolation.
# Add before the original path to use msys2 instead of the installed gitbash.
Add-Content $profile '$Env:path = "${Env:ProgramFiles}\nodejs\;C:\Users\circleci\AppData\Roaming\npm\;${Env:ProgramFiles(x86)}\Yarn\bin\;C:\Users\circleci\AppData\Local\Yarn\bin\;C:\tools\msys64\usr\bin\;" + $Env:path'
# Environment variables for Bazel
Add-Content $profile '$Env:BAZEL_SH = "C:\tools\msys64\usr\bin\bash.exe"'
# Get the bazel version devdep and store it in a global var for use in the circleci job.
$bazelVersion = & ${Env:ProgramFiles}\nodejs\node.exe -e "console.log(require('./package.json').devDependencies['@bazel/bazel'])"
# This is a tricky situation: we want $bazelVersion to be evaluated but not $Env:BAZEL_VERSION.
# Formatting works https://stackoverflow.com/questions/32127583/expand-variable-inside-single-quotes
$bazelVersionGlobalVar = '$Env:BAZEL_VERSION = "{0}"' -f $bazelVersion
Add-Content $profile $bazelVersionGlobalVar
# Remove the CircleCI checkout SSH override, because it breaks cloning repositories through Bazel.
# See https://circleci.com/gh/angular/angular/401454 for an example.
# TODO: is this really needed? Maybe there's a better way. It doesn't happen on Linux or on Codefresh.
git config --global --unset url.ssh://git@github.com.insteadOf
####################################################################################################
# Decrypt GCP Credentials and store them as the Google default credentials.
####################################################################################################
mkdir ${env:APPDATA}\gcloud
openssl aes-256-cbc -d -in .circleci\gcp_token -md md5 -out "$env:APPDATA\gcloud\application_default_credentials.json" -k "$env:CIRCLE_PROJECT_REPONAME"
####################################################################################################
# Set bazel configuration for CircleCI runs.
####################################################################################################
copy .circleci\bazel.windows.rc ${Env:USERPROFILE}\.bazelrc
####################################################################################################
# Install specific version of node.
####################################################################################################
choco install nodejs --version 12.14.1 --no-progress
# These Bazel prereqs aren't needed because the CircleCI image already includes them.
# choco install yarn --version 1.16.0 --no-progress
# choco install vcredist2015 --version 14.0.24215.20170201
# We don't need VS Build Tools for the tested bazel targets.
# If it's needed again, uncomment these lines.
# VS Build Tools are needed for Bazel C++ targets (like com_google_protobuf)
# choco install visualstudio2019buildtools --version 16.1.2.0 --no-progress --package-parameters "--add Microsoft.VisualStudio.Workload.VCTools --add Microsoft.VisualStudio.Component.VC.Tools.x86.x64 --add Microsoft.Component.VC.Runtime.UCRTSDK --add Microsoft.VisualStudio.Component.Windows10SDK.17763"
# Add-Content $profile '$Env:BAZEL_VC = "${Env:ProgramFiles(x86)}\Microsoft Visual Studio\2019\BuildTools\VC\"'
# Python is needed for Bazel Python targets
# choco install python --version 3.5.1 --no-progress

View File

@ -1,31 +0,0 @@
# VSCode Remote Development - Developing inside a Container
This folder contains configuration files that can be used to opt into working on this repository in a [Docker container](https://www.docker.com/resources/what-container) via [VSCode](https://code.visualstudio.com/)'s Remote Development feature (see below).
Info on remote development and developing inside a container with VSCode:
- [VSCode: Remote Development](https://code.visualstudio.com/docs/remote/remote-overview)
- [VSCode: Developing inside a Container](https://code.visualstudio.com/docs/remote/containers)
- [VSCode: Remote Development FAQ](https://code.visualstudio.com/docs/remote/faq)
## Usage
_Prerequisite: [Install Docker](https://docs.docker.com/install) on your local environment._
To get started, read and follow the instructions in [Developing inside a Container](https://code.visualstudio.com/docs/remote/containers). The [.devcontainer/](.) directory contains pre-configured `devcontainer.json` and `Dockerfile` files, which you can use to set up remote development with a docker container.
In a nutshell, you need to (see the example commands after this list):
- Install the [Remote - Containers](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) extension.
- Copy [recommended-Dockerfile](./recommended-Dockerfile) to `Dockerfile` (and optionally tweak to suit your needs).
- Copy [recommended-devcontainer.json](./recommended-devcontainer.json) to `devcontainer.json` (and optionally tweak to suit your needs).
- Open VSCode and bring up the [Command Palette](https://code.visualstudio.com/docs/getstarted/userinterface#_command-palette).
- Type `Remote-Containers: Open Folder in Container` and choose your local clone of [angular/angular](https://github.com/angular/angular).
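As a rough sketch, the copy steps above correspond to commands like these, run from the root of your local clone (tweak the copied files as needed):
```sh
# Create your local (git-ignored) config files from the recommended templates.
cp .devcontainer/recommended-Dockerfile .devcontainer/Dockerfile
cp .devcontainer/recommended-devcontainer.json .devcontainer/devcontainer.json
```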
The `.devcontainer/devcontainer.json` and `.devcontainer/Dockerfile` files are ignored by git, so you can have your own local versions. We may occasionally update the template files ([recommended-devcontainer.json](./recommended-devcontainer.json), [recommended-Dockerfile](./recommended-Dockerfile)), in which case you will need to manually update your local copies (if desired).
## Updating `recommended-devcontainer.json` and `recommended-Dockerfile`
You can update and commit the recommended config files (which people use as a basis for their local configs) if you find that something is broken, out-of-date or can be improved.
Please, keep in mind that any changes you make will potentially be used by many people on different environments. Try to keep these config files cross-platform compatible and free of personal preferences.

View File

@ -1,24 +0,0 @@
# Image metadata and config.
# Ideally, the image version should be what we use on CI.
# See `executors > browsers-executor` in `.circleci/config.yml`.
FROM circleci/node:10-browsers
LABEL name="Angular dev environment" \
description="This image can be used to create a dev environment for building Angular." \
vendor="angular" \
version="1.0"
EXPOSE 4000 4200 4433 5000 8080 9876
# Switch to `root` (CircleCI images use `circleci` as the user).
USER root
# Configure `Node.js`/`npm` and install utilities.
RUN npm config --global set user root
RUN npm install --global yarn@latest # Ideally, the version should be what we use on CI.
# See `commands > overwrite_yarn` in `.circleci/config.yml`.
# Go! (And keep going.)
CMD ["tail", "--follow", "/dev/null"]

View File

@ -1,16 +0,0 @@
// Reference: https://code.visualstudio.com/docs/remote/containers#_devcontainerjson-reference
{
"name": "Angular dev container",
"dockerFile": "Dockerfile",
"appPort": [4000, 4200, 4433, 5000, 8080, 9876],
"postCreateCommand": "yarn install",
"extensions": [
"devondcarew.bazel-code",
"gkalpak.aio-docs-utils",
"ms-vscode.vscode-typescript-tslint-plugin",
"xaver.clang-format",
// The following extensions are useful when working on angular.io (i.e. inside the `aio/` directory).
//"angular.ng-template",
//"dbaeumer.vscode-eslint",
],
}

View File

@ -1,4 +1,4 @@
# https://editorconfig.org
# http://editorconfig.org
root = true

3
.gitattributes vendored
View File

@ -5,8 +5,5 @@
*.js eol=lf
*.ts eol=lf
# API guardian patch must always use LF for tests to work
*.patch eol=lf
# Must keep Windows line ending to be parsed correctly
scripts/windows/packages.txt eol=crlf

View File

@ -1,10 +1,39 @@
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑
<!--
IF YOU DON'T FILL OUT THE FOLLOWING INFORMATION WE MIGHT CLOSE YOUR ISSUE WITHOUT INVESTIGATING
-->
Please help us process issues more efficiently by filing an
issue using one of the following templates:
**I'm submitting a ...** (check one with "x")
```
[ ] bug report => search github for a similar issue or PR before submitting
[ ] feature request
[ ] support request => Please do not submit support request here, instead see https://github.com/angular/angular/blob/master/CONTRIBUTING.md#question
```
https://github.com/angular/angular/issues/new/choose
**Current behavior**
<!-- Describe how the bug manifests. -->
Thank you!
**Expected behavior**
<!-- Describe what the behavior would be without the bug. -->
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑
**Minimal reproduction of the problem with instructions**
<!--
If the current behavior is a bug or you can illustrate your feature request better with an example,
please provide the *STEPS TO REPRODUCE* and if possible a *MINIMAL DEMO* of the problem via
https://plnkr.co or similar (you can use this template as a starting point: http://plnkr.co/edit/tpl:AvJOMERrnz94ekVua0u5).
-->
**What is the motivation / use case for changing the behavior?**
<!-- Describe the motivation or the concrete use case -->
**Please tell us about your environment:**
<!-- Operating system, IDE, package manager, HTTP server, ... -->
* **Angular version:** 2.0.X
<!-- Check whether this is still an issue in the most recent Angular version -->
* **Browser:** [all | Chrome XX | Firefox XX | IE XX | Safari XX | Mobile Chrome XX | Android X.X Web Browser | iOS XX Safari | iOS XX UIWebView | iOS XX WKWebView ]
<!-- All browsers where this could be reproduced -->
* **Language:** [all | TypeScript X.X | ES6/7 | ES5]
* **Node (for AoT issues):** `node --version` =

View File

@ -1,69 +0,0 @@
---
name: "\U0001F41EBug report"
about: Report a bug in the Angular Framework
---
<!--🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅
Oh hi there! 😄
To expedite issue processing please search open and closed issues before submitting a new one.
Existing issues often contain information about workarounds, resolution, or progress updates.
🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅-->
# 🐞 bug report
### Affected Package
<!-- Can you pin-point one or more @angular/* packages as the source of the bug? -->
<!-- ✍edit: --> The issue is caused by package @angular/....
### Is this a regression?
<!-- Did this behavior use to work in the previous version? -->
<!-- ✍️--> Yes, the previous version in which this bug was not present was: ....
### Description
<!-- ✍️--> A clear and concise description of the problem...
## 🔬 Minimal Reproduction
<!--
Please create and share minimal reproduction of the issue starting with this template: https://stackblitz.com/fork/angular-issue-repro2
-->
<!-- ✍️--> https://stackblitz.com/...
<!--
If StackBlitz is not suitable for reproduction of your issue, please create a minimal GitHub repository with the reproduction of the issue.
A good way to make a minimal reproduction is to create a new app via `ng new repro-app` and add the minimum possible code to show the problem.
Share the link to the repo below along with step-by-step instructions to reproduce the problem, as well as expected and actual behavior.
Issues that don't have enough info and can't be reproduced will be closed.
You can read more about issue submission guidelines here: https://github.com/angular/angular/blob/master/CONTRIBUTING.md#-submitting-an-issue
-->
## 🔥 Exception or Error
<pre><code>
<!-- If the issue is accompanied by an exception or an error, please share it below: -->
<!-- ✍️-->
</code></pre>
## 🌍 Your Environment
**Angular Version:**
<pre><code>
<!-- run `ng version` and paste output below -->
<!-- ✍️-->
</code></pre>
**Anything else relevant?**
<!-- ✍Is this a browser specific issue? If so, please specify the browser and version. -->
<!-- ✍Do any of these matter: operating system, IDE, package manager, HTTP server, ...? If so, please mention it below. -->

View File

@ -1,32 +0,0 @@
---
name: "\U0001F680Feature request"
about: Suggest a feature for Angular Framework
---
<!--🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅
Oh hi there! 😄
To expedite issue processing please search open and closed issues before submitting a new one.
Existing issues often contain information about workarounds, resolution, or progress updates.
🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅-->
# 🚀 feature request
### Relevant Package
<!-- Can you pin-point one or more @angular/* packages that are relevant for this feature request? -->
<!-- ✍edit: --> This feature request is for @angular/....
### Description
<!-- ✍️--> A clear and concise description of the problem or missing capability...
### Describe the solution you'd like
<!-- ✍️--> If you have a solution in mind, please describe it.
### Describe alternatives you've considered
<!-- ✍️--> Have you considered any alternative solutions or workarounds?

View File

@ -1,55 +0,0 @@
---
name: "📚 Docs or angular.io issue report"
about: Report an issue in Angular's documentation or angular.io application
---
<!--🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅
Oh hi there! 😄
To expedite issue processing please search open and closed issues before submitting a new one.
Existing issues often contain information about workarounds, resolution, or progress updates.
🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅🔅-->
# 📚 Docs or angular.io bug report
### Description
<!-- ✍edit:--> A clear and concise description of the problem...
## 🔬 Minimal Reproduction
### What's the affected URL?
<!-- ✍edit:--> https://angular.io/...
### Reproduction Steps
<!-- If applicable please list the steps to take to reproduce the issue -->
<!-- ✍edit:-->
### Expected vs Actual Behavior
<!-- If applicable please describe the difference between the expected and actual behavior after following the repro steps. -->
<!-- ✍edit:-->
## 📷Screenshot
<!-- Often a screenshot can help to capture the issue better than a long description. -->
<!-- ✍upload a screenshot:-->
## 🔥 Exception or Error
<pre><code>
<!-- If the issue is accompanied by an exception or an error, please share it below: -->
<!-- ✍️-->
</code></pre>
## 🌍 Your Environment
### Browser info
<!-- ✍Is this a browser specific issue? If so, please specify the device, browser, and version. -->
### Anything else relevant?
<!-- ✍Please provide additional info if necessary. -->

View File

@ -1,11 +0,0 @@
---
name: ⚠️ Security issue disclosure
about: Report a security issue in Angular Framework, Material, or CLI
---
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑
Please read https://angular.io/guide/security#report-issues on how to disclose security related issues.
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑

View File

@ -1,16 +0,0 @@
---
name: "❓Support request"
about: Questions and requests for support
---
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑
Please do not file questions or support requests on the GitHub issues tracker.
You can get your questions answered using other communication channels. Please see:
https://github.com/angular/angular/blob/master/CONTRIBUTING.md#question
Thank you!
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑

View File

@ -1,13 +0,0 @@
---
name: "\U0001F6E0Angular CLI"
about: Issues and feature requests for Angular CLI
---
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑
Please file any Angular CLI issues at: https://github.com/angular/angular-cli/issues/new
For the time being, we keep Angular CLI issues in a separate repository.
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑

View File

@ -1,13 +0,0 @@
---
name: "\U0001F48EAngular Components"
about: Issues and feature requests for Angular Components
---
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑
Please file any Angular Components issues at: https://github.com/angular/components/issues/new
For the time being, we keep Angular Components issues in a separate repository.
🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑🛑

View File

@ -1,43 +1,36 @@
## PR Checklist
Please check if your PR fulfills the following requirements:
**Please check if the PR fulfills these requirements**
- [ ] The commit message follows our guidelines: https://github.com/angular/angular/blob/master/CONTRIBUTING.md#commit
- [ ] Tests for the changes have been added (for bug fixes / features)
- [ ] Docs have been added / updated (for bug fixes / features)
## PR Type
What kind of change does this PR introduce?
**What kind of change does this PR introduce?** (check one with "x")
```
[ ] Bugfix
[ ] Feature
[ ] Code style update (formatting, local variables)
[ ] Refactoring (no functional changes, no api changes)
[ ] Build related changes
[ ] CI related changes
[ ] Other... Please describe:
```
<!-- Please check the one that applies to this PR using "x". -->
- [ ] Bugfix
- [ ] Feature
- [ ] Code style update (formatting, local variables)
- [ ] Refactoring (no functional changes, no api changes)
- [ ] Build related changes
- [ ] CI related changes
- [ ] Documentation content changes
- [ ] angular.io application / infrastructure changes
- [ ] Other... Please describe:
**What is the current behavior?** (You can also link to an open issue here)
## What is the current behavior?
<!-- Please describe the current behavior that you are modifying, or link to a relevant issue. -->
Issue Number: N/A
**What is the new behavior?**
## What is the new behavior?
**Does this PR introduce a breaking change?** (check one with "x")
```
[ ] Yes
[ ] No
```
If this PR contains a breaking change, please describe the impact and migration path for existing applications: ...
## Does this PR introduce a breaking change?
**Other information**:
- [ ] Yes
- [ ] No
<!-- If this PR contains a breaking change, please describe the impact and migration path for existing applications below. -->
## Other information

View File

@ -1,183 +0,0 @@
# Configuration for angular-robot
#options for the size plugin
size:
disabled: false
maxSizeIncrease: 2000
circleCiStatusName: "ci/circleci: test_ivy_aot"
# options for the merge plugin
merge:
# the status will be added to your pull requests
status:
# set to true to disable
disabled: false
# the name of the status
context: "ci/angular: merge status"
# text to show when all checks pass
successText: "All checks passed!"
# text to show when some checks are failing
failureText: "The following checks are failing:"
# the g3 status will be added to your pull requests if they include files that match the patterns
g3Status:
# set to true to disable
disabled: false
# the name of the status
context: "google3"
# text to show when the status is pending, {{PRNumber}} will be replaced by the PR number
pendingDesc: "Googler: run g3sync presubmit {{PRNumber}}"
# text to show when the status is success
successDesc: "Does not affect google3"
# link to use for the details
url: "http://go/angular/g3sync"
# list of patterns to check for the files changed by the PR
# this list must be manually kept in sync with google3/third_party/javascript/angular2/copy.bara.sky
include:
- "LICENSE"
- "modules/benchmarks/**"
- "modules/system.d.ts"
- "packages/**"
# list of patterns to ignore for the files changed by the PR
exclude:
- "packages/*"
- "packages/bazel/*"
- "packages/bazel/src/api-extractor/**"
- "packages/bazel/src/builders/**"
- "packages/bazel/src/ng_package/**"
- "packages/bazel/src/protractor/**"
- "packages/bazel/src/schematics/**"
- "packages/compiler-cli/ngcc/**"
- "packages/docs/**"
- "packages/elements/schematics/**"
- "packages/examples/**"
- "packages/language-service/**"
- "packages/localize/**"
- "packages/private/**"
- "packages/service-worker/**"
- "**/.gitignore"
- "**/.gitkeep"
- "**/yarn.lock"
- "**/package.json"
- "**/third_party/**"
- "**/tsconfig-build.json"
- "**/tsconfig.json"
- "**/rollup.config.js"
- "**/BUILD.bazel"
- "**/*.md"
- "packages/**/integrationtest/**"
- "packages/**/test/**"
- "packages/zone.js/*"
- "packages/zone.js/doc/**"
- "packages/zone.js/example/**"
- "packages/zone.js/scripts/**"
# comment that will be added to a PR when there is a conflict, leave empty or set to false to disable
mergeConflictComment: "Hi @{{PRAuthor}}! This PR has merge conflicts due to recent upstream merges.
\nPlease help to unblock it by resolving these conflicts. Thanks!"
# label to monitor
mergeLabel: "PR action: merge"
# adding any of these labels will also add the merge label
mergeLinkedLabels:
- "PR action: merge-assistance"
# list of checks that will determine if the merge label can be added
checks:
# require that the PR has reviews from all requested reviewers
#
# This enables us to request reviews from both eng and tech writers, or multiple eng folks, and prevents accidental merges.
# Rather than merging PRs with pending reviews, if all approvals are obtained and additional reviews are not needed, any pending reviewers should be removed via GitHub UI (this also leaves an audit trail behind these decisions).
requireReviews: true
# whether the PR shouldn't have a conflict with the base branch
noConflict: true
# list of labels that a PR needs to have, checked with a regexp (e.g. "PR target:" will work for the label "PR target: master")
requiredLabels:
- "PR target: *"
- "cla: yes"
# list of labels that a PR shouldn't have, checked after the required labels with a regexp
forbiddenLabels:
- "PR target: TBD"
- "PR action: cleanup"
- "PR action: review"
- "PR state: blocked"
- "cla: no"
# list of PR statuses that need to be successful
requiredStatuses:
- "ci/circleci: build"
- "ci/circleci: lint"
- "ci/circleci: publish_snapshot"
- "ci/angular: size"
- "cla/google"
- "google3"
- "pullapprove"
# the comment that will be added when the merge label is added despite failing checks, leave empty or set to false to disable
# {{MERGE_LABEL}} will be replaced by the value of the mergeLabel option
# {{PLACEHOLDER}} will be replaced by the list of failing checks
mergeRemovedComment: "I see that you just added the `{{MERGE_LABEL}}` label, but the following checks are still failing:
\n{{PLACEHOLDER}}
\n
\n**If you want your PR to be merged, it has to pass all the CI checks.**
\n
\nIf you can't get the PR to a green state due to flakes or broken master, please try rebasing to master and/or restarting the CI job. If that fails and you believe that the issue is not due to your change, please contact the caretaker and ask for help."
# options for the triage plugin
triage:
# number of the milestone to apply when the issue has not been triaged yet
needsTriageMilestone: 83
# number of the milestone to apply when the issue is triaged
defaultMilestone: 82
# arrays of labels that determine if an issue has been triaged by the caretaker
l1TriageLabels:
-
- "comp: *"
# arrays of labels that determine if an issue has been fully triaged
l2TriageLabels:
-
- "type: bug/fix"
- "severity*"
- "freq*"
- "comp: *"
-
- "type: feature"
- "comp: *"
-
- "type: refactor"
- "comp: *"
-
- "type: RFC / Discussion / question"
- "comp: *"
# options for the triage PR plugin
triagePR:
# set to true to disable
disabled: false
# number of the milestone to apply when the PR has not been triaged yet
needsTriageMilestone: 83
# number of the milestone to apply when the PR is triaged
defaultMilestone: 82
# arrays of labels that determine if a PR has been triaged by the caretaker
l1TriageLabels:
-
- "comp: *"
# arrays of labels that determine if a PR has been fully triaged
l2TriageLabels:
-
- "type: *"
- "effort*"
- "risk*"
- "comp: *"
# options for rerunning CI
rerunCircleCI:
# set to true to disable
disabled: false
# the label which when added triggers a rerun of the default CircleCI workflow
triggerRerunLabel: "PR action: rerun CI at HEAD"

View File

@ -1,15 +0,0 @@
name: Lock closed inactive issues
on:
schedule:
# Run at 16:00 every day
- cron: '0 16 * * *'
jobs:
lock_closed:
if: github.repository == 'angular/angular'
runs-on: ubuntu-latest
steps:
- uses: angular/dev-infra/github-actions/lock-closed@66462f6
with:
lock-bot-key: ${{ secrets.LOCK_BOT_PRIVATE_KEY }}

21
.gitignore vendored
View File

@ -1,29 +1,18 @@
.DS_STORE
/dist/
/bazel-out
/integration/bazel/bazel-*
*.log
node_modules
tools/gulp-tasks/cldr/cldr-data/
bower_components
# Include when developing application packages.
pubspec.lock
.c9
.idea/
.devcontainer/*
!.devcontainer/README.md
!.devcontainer/recommended-devcontainer.json
!.devcontainer/recommended-Dockerfile
.settings/
.vscode/launch.json
.vscode/settings.json
.vscode/tasks.json
*.swo
modules/.settings
.vscode
modules/.vscode
.vimrc
.nvimrc
# Don't check in secret files
*secret.js
@ -37,9 +26,3 @@ yarn-error.log
# rollup-test output
/modules/rollup-test/dist/
# User specific bazel settings
.bazelrc.user
.notes.md
baseline.json

View File

@ -1 +0,0 @@
Michał Gołębiowski-Owczarek <m.goleb@gmail.com>

2
.nvmrc
View File

@ -1 +1 @@
12.14.1
6.9.5

File diff suppressed because it is too large

75
.travis.yml Normal file
View File

@ -0,0 +1,75 @@
language: node_js
sudo: false
node_js:
- '6.9.5'
addons:
# firefox: "38.0"
apt:
sources:
# needed to install g++ that is used by npm's native modules
- ubuntu-toolchain-r-test
packages:
# needed to install g++ that is used by npm's native modules
- g++-4.8
# https://docs.travis-ci.com/user/jwt
jwt:
# SAUCE_ACCESS_KEY<=secret for NGBUILDS_IO_KEY to work around travis-ci/travis-ci#7223, unencrypted value in valentine as NGBUILDS_IO_KEY>
# we alias NGBUILDS_IO_KEY to $SAUCE_ACCESS_KEY in env.sh and set the SAUCE_ACCESS_KEY there
- secure: "L7nrZwkAtFtYrP2DykPXgZvEKjkv0J/TwQ/r2QGxFTaBq4VZn+2Dw0YS7uCxoMqYzDwH0aAOqxoutibVpk8Z/16nE3tNmU5RzltMd6Xmt3qU2f/JDQLMo6PSlBodnjOUsDHJgmtrcbjhqrx/znA237BkNUu6UZRT7mxhXIZpn0U="
branches:
except:
- g3
cache:
yarn: true
directories:
- ./node_modules
- ./.chrome/chromium
- ./aio/node_modules
env:
global:
# GITHUB_TOKEN_ANGULAR=<github token, a personal access token of the angular-builds account, account access in valentine>
# This is needed for the e2e Travis matrix task to publish packages to github for continuous packages delivery.
- secure: "aCdHveZuY8AT4Jr1JoJB4LxZsnGWRe/KseZh1YXYe5UtufFCtTVHvUcLn0j2aLBF0KpdyS+hWf0i4np9jthKu2xPKriefoPgCMpisYeC0MFkwbmv+XlgkUbgkgVZMGiVyX7DCYXVahxIoOUjVMEDCbNiHTIrfEuyq24U3ok2tHc="
# FIREBASE_TOKEN
# This is needed for publishing builds to the "aio-staging" firebase site.
# TODO(i): the token was generated using the iminar@google account, we should switch to a shared/role-base account.
- secure: "MPx3UM77o5IlhT75PKHL0FXoB5tSXDc3vnCXCd1sRy4XUTZ9vjcV6nNuyqEf+SOw659bGbC1FI4mACGx1Q+z7MQDR85b1mcA9uSgHDkh+IR82CnCVdaX9d1RXafdJIArahxfmorbiiPPLyPIKggo7ituRm+2c+iraoCkE/pXxYg="
matrix:
# Order: a slower build first, so that we don't occupy an idle travis worker waiting for others to complete.
- CI_MODE=e2e
- CI_MODE=e2e_2
- CI_MODE=js
- CI_MODE=saucelabs_required
- CI_MODE=browserstack_required
- CI_MODE=saucelabs_optional
- CI_MODE=browserstack_optional
- CI_MODE=docs_test
- CI_MODE=aio
- CI_MODE=aio_e2e
matrix:
fast_finish: true
allow_failures:
- env: "CI_MODE=saucelabs_optional"
- env: "CI_MODE=browserstack_optional"
- env: "CI_MODE=aio_e2e"
before_install:
# source the env.sh script so that the exported variables are available to other scripts later on
- source ./scripts/ci/env.sh print
install:
- ./scripts/ci/install.sh
script:
- ./scripts/ci/build.sh
- ./scripts/ci/test.sh
# deploy is part of 'script' and not 'after_success' so that we fail the build if the deployment fails
- ./scripts/ci/deploy.sh
- ./scripts/ci/angular.sh
# all the scripts under this line will not quickly abort in case ${TRAVIS_TEST_RESULT} is 1 (job failure)
- ./scripts/ci/cleanup.sh
- ./scripts/ci/print-logs.sh

25
.vscode/README.md vendored
View File

@ -1,25 +0,0 @@
# VSCode Configuration
This folder contains opt-in [Workspace Settings](https://code.visualstudio.com/docs/getstarted/settings), [Tasks](https://code.visualstudio.com/docs/editor/tasks), [Launch Configurations](https://code.visualstudio.com/Docs/editor/debugging#_launch-configurations) and [Extension Recommendations](https://code.visualstudio.com/docs/editor/extension-gallery#_workspace-recommended-extensions) that the Angular team recommends using when working on this repository.
## Usage
To use the recommended configurations follow the steps below (example commands are shown after the list):
- install the recommended extensions in `.vscode/extensions.json`
- copy (or link) `.vscode/recommended-settings.json` to `.vscode/settings.json`
- copy (or link) `.vscode/recommended-launch.json` to `.vscode/launch.json`
- copy (or link) `.vscode/recommended-tasks.json` to `.vscode/tasks.json`
- restart the editor
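As a sketch, the copy steps above map to commands like these, run from the repository root (use `ln -s` instead of `cp` if you prefer symlinks):
```sh
# Create your local (git-ignored) VSCode config files from the recommended ones.
cp .vscode/recommended-settings.json .vscode/settings.json
cp .vscode/recommended-launch.json .vscode/launch.json
cp .vscode/recommended-tasks.json .vscode/tasks.json
```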
If you already have your custom workspace settings you should instead manually merge the file contents.
This isn't an automatic process so you will need to repeat it when settings are updated.
To see the recommended extensions select "Extensions: Show Recommended Extensions" in the [Command Palette](https://code.visualstudio.com/docs/getstarted/userinterface#_command-palette).
## Editing `.vscode/recommended-*.json` files
If you wish to add extra configuration items please keep in mind any modifications you make here will be used by many users.
Try to keep these settings/configurations to things that help facilitate the development process and avoid altering the user workflow whenever possible.

View File

@ -1,15 +0,0 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846 to learn about workspace recommendations.
// Extension identifier format: ${publisher}.${name}. Example: vscode.csharp
// List of extensions which should be recommended for users of this workspace.
"recommendations": [
"devondcarew.bazel-code",
"gkalpak.aio-docs-utils",
"ms-vscode.vscode-typescript-tslint-plugin",
"xaver.clang-format",
// The following extensions are useful when working on angular.io (i.e. inside the `aio/` directory).
//"angular.ng-template",
//"dbaeumer.vscode-eslint",
],
}

View File

@ -1,85 +0,0 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "Attach to bazel test ... --config=debug",
"type": "node",
"request": "attach",
"port": 9229,
"address": "localhost",
"restart": false,
"sourceMaps": true,
"localRoot": "${workspaceRoot}",
"remoteRoot": "${workspaceRoot}",
"stopOnEntry": false,
"timeout": 600000,
},
{
"name": "Attach to bazel test ... --config=debug (no source maps)",
"type": "node",
"request": "attach",
"port": 9229,
"address": "localhost",
"restart": false,
"sourceMaps": false,
"localRoot": "${workspaceRoot}",
"remoteRoot": "${workspaceRoot}",
"stopOnEntry": false,
"timeout": 600000,
},
{
"name": "IVY:packages/core/test/acceptance",
"type": "node",
"request": "launch",
"program": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"--config=ivy",
"packages/core/test/acceptance",
"--config=debug"
],
"port": 9229,
"address": "localhost",
"restart": true,
"sourceMaps": true,
"timeout": 600000,
},
{
"name": "IVY:packages/core/test/render3",
"type": "node",
"request": "launch",
"program": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"--config=ivy",
"packages/core/test/render3",
"--config=debug"
],
"port": 9229,
"address": "localhost",
"restart": true,
"sourceMaps": true,
"timeout": 600000,
},
{
"name": "IVY:packages/core/test",
"type": "node",
"request": "launch",
"program": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"--config=ivy",
"packages/core/test",
"--config=debug"
],
"port": 9229,
"address": "localhost",
"restart": true,
"sourceMaps": true,
"timeout": 600000,
},
]
}

View File

@ -1,31 +0,0 @@
{
// Format js and ts files on save with `clang-format.executable`
// If `clang-format.executable` is not being used, these two settings should be removed otherwise it will break existing formatting.
// You can instead run `yarn gulp format` to manually format your code.
"[javascript]": {
"editor.formatOnSave": true,
},
"[typescript]": {
"editor.formatOnSave": true,
},
// Please install https://marketplace.visualstudio.com/items?itemName=xaver.clang-format to take advantage of `clang-format` in VSCode.
// (See https://clang.llvm.org/docs/ClangFormat.html for more info `clang-format`.)
"clang-format.executable": "${workspaceRoot}/node_modules/.bin/clang-format",
// Exclude third party modules and build artifacts from the editor watchers/searches.
"files.watcherExclude": {
"**/.git/objects/**": true,
"**/.git/subtree-cache/**": true,
"**/node_modules/**": true,
"**/bazel-out/**": true,
"**/dist/**": true,
"**/aio/src/generated/**": true,
},
"search.exclude": {
"**/node_modules": true,
"**/bower_components": true,
"**/bazel-out": true,
"**/dist": true,
"**/aio/src/generated": true,
},
"git.ignoreLimitWarning": true,
}

View File

@ -1,113 +0,0 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the tasks.json format
"version": "2.0.0",
"tasks": [
{
"label": "IVY:packages/core/test/...",
"type": "shell",
"command": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"--config=ivy",
"packages/core/test",
"packages/core/test/acceptance",
"packages/core/test/render3",
],
"group": "test",
"presentation": {
"reveal": "always",
"panel": "dedicated",
},
},
{
"label": "VE:packages/core/test/...",
"type": "shell",
"command": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"packages/core/test",
"packages/core/test/acceptance",
"packages/core/test/render3",
],
"group": "test",
"presentation": {
"reveal": "always",
"panel": "dedicated",
},
},
{
"label": "IVY:packages/core/test/acceptance",
"type": "shell",
"command": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"--config=ivy",
"packages/core/test/acceptance",
],
"group": "test",
"presentation": {
"reveal": "always",
"panel": "dedicated",
},
},
{
"label": "VE:packages/core/test/acceptance",
"type": "shell",
"command": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"packages/core/test/acceptance",
],
"group": "test",
"presentation": {
"reveal": "always",
"panel": "dedicated",
},
},
{
"label": "IVY:packages/core/test",
"type": "shell",
"command": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"--config=ivy",
"packages/core/test",
],
"group": "test",
"presentation": {
"reveal": "always",
"panel": "dedicated",
},
},
{
"label": "VE:packages/core/test",
"type": "shell",
"command": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"packages/core/test",
],
"group": "test",
"presentation": {
"reveal": "always",
"panel": "dedicated",
},
},
{
"label": "IVY:packages/core/test/render3",
"type": "shell",
"command": "${workspaceFolder}/node_modules/.bin/bazel",
"args": [
"test",
"--config=ivy",
"packages/core/test/render3",
],
"group": "test",
"presentation": {
"reveal": "always",
"panel": "dedicated",
},
},
],
}

View File

@ -1,13 +0,0 @@
# Yarn Vendoring
We utilize Yarn's `yarn-path` configuration in a shared `.yarnrc` file to ensure that
everyone uses the same version of Yarn. Yarn checks the `.yarnrc` file to
determine if yarn should delegate the command to a vendored version at the
provided path.
## How to update
To update to the latest version of Yarn as our vendored version:
- Run this command
```sh
yarn policies set-version latest
```
- Remove the previously vendored release (see the sketch after this list)
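A sketch of the full update, assuming the currently vendored release is the one referenced by `yarn-path` in `.yarnrc` (the new version number is whatever `latest` resolves to at the time):
```sh
# Downloads the latest Yarn release into .yarn/releases/ and updates `yarn-path` in .yarnrc.
yarn policies set-version latest
# Remove the previously vendored release (file name taken from the old .yarnrc).
git rm .yarn/releases/yarn-1.21.1.js
```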

File diff suppressed because one or more lines are too long

View File

@ -1,5 +0,0 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1
yarn-path ".yarn/releases/yarn-1.21.1.js"

View File

@ -1,50 +0,0 @@
package(default_visibility = ["//visibility:public"])
exports_files([
"LICENSE",
"protractor-perf.conf.js",
"karma-js.conf.js",
"browser-providers.conf.js",
"scripts/ci/track-payload-size.sh",
"scripts/ci/payload-size.sh",
"scripts/ci/payload-size.js",
"package.json",
])
alias(
name = "tsconfig.json",
actual = "//packages:tsconfig-build.json",
)
filegroup(
name = "web_test_bootstrap_scripts",
# do not sort
srcs = [
"@npm//:node_modules/core-js/client/core.js",
"//packages/zone.js/dist:zone.js",
"//packages/zone.js/dist:zone-testing.js",
"//packages/zone.js/dist:task-tracking.js",
"//:test-events.js",
"//:shims_for_IE.js",
# Including systemjs because it defines `__eval`, which produces correct stack traces.
"@npm//:node_modules/systemjs/dist/system.src.js",
"@npm//:node_modules/reflect-metadata/Reflect.js",
],
)
filegroup(
name = "angularjs_scripts",
srcs = [
# We also declare the unminfied AngularJS files since these can be used for
# local debugging (e.g. see: packages/upgrade/test/common/test_helpers.ts)
"@npm//:node_modules/angular/angular.js",
"@npm//:node_modules/angular/angular.min.js",
"@npm//:node_modules/angular-1.5/angular.js",
"@npm//:node_modules/angular-1.5/angular.min.js",
"@npm//:node_modules/angular-1.6/angular.js",
"@npm//:node_modules/angular-1.6/angular.min.js",
"@npm//:node_modules/angular-mocks/angular-mocks.js",
"@npm//:node_modules/angular-mocks-1.5/angular-mocks.js",
"@npm//:node_modules/angular-mocks-1.6/angular-mocks.js",
],
)

File diff suppressed because it is too large

View File

@ -1,12 +0,0 @@
# Contributor Code of Conduct
## Version 0.3b-angular
As contributors and maintainers of the Angular project, we pledge to respect everyone who contributes by posting issues, updating documentation, submitting pull requests, providing feedback in comments, and any other activities.
Communication through any of Angular's channels (GitHub, Gitter, IRC, mailing lists, Google+, Twitter, etc.) must be constructive and never resort to personal attacks, trolling, public or private harassment, insults, or other unprofessional conduct.
We promise to extend courtesy and respect to everyone involved in this project regardless of gender, gender identity, sexual orientation, disability, age, race, ethnicity, religion, or level of experience. We expect anyone contributing to the Angular project to do the same.
If any member of the community violates this code of conduct, the maintainers of the Angular project may take action, removing issues, comments, and PRs or blocking accounts as deemed appropriate.
If you are subject to or witness unacceptable behavior, or have any other concerns, please email us at [conduct@angular.io](mailto:conduct@angular.io).

View File

@ -17,15 +17,15 @@ Help us keep Angular open and inclusive. Please read and follow our [Code of Con
## <a name="question"></a> Got a Question or Problem?
Do not open issues for general support questions as we want to keep GitHub issues for bug reports and feature requests. You've got much better chances of getting your question answered on [Stack Overflow](https://stackoverflow.com/questions/tagged/angular) where the questions should be tagged with tag `angular`.
Please, do not open issues for the general support questions as we want to keep GitHub issues for bug reports and feature requests. You've got much better chances of getting your question answered on [StackOverflow](https://stackoverflow.com/questions/tagged/angular) where the questions should be tagged with tag `angular`.
Stack Overflow is a much better place to ask questions since:
StackOverflow is a much better place to ask questions since:
- there are thousands of people willing to help on Stack Overflow
- there are thousands of people willing to help on StackOverflow
- questions and answers stay available for public viewing so your question / answer might help someone else
- Stack Overflow's voting system assures that the best answers are prominently visible.
- StackOverflow's voting system assures that the best answers are prominently visible.
To save your and our time, we will systematically close all issues that are requests for general support and redirect people to Stack Overflow.
To save your and our time we will be systematically closing all the issues that are requests for general support and redirecting people to StackOverflow.
If you would like to chat about the question in real-time, you can reach out via [our gitter channel][gitter].
@ -51,53 +51,54 @@ and help you to craft the change so that it is successfully accepted into the pr
Before you submit an issue, please search the issue tracker, maybe an issue for your problem already exists and the discussion might inform you of workarounds readily available.
We want to fix all the issues as soon as possible, but before fixing a bug we need to reproduce and confirm it. In order to reproduce bugs, we will systematically ask you to provide a minimal reproduction. Having a minimal reproducible scenario gives us a wealth of important information without going back & forth to you with additional questions.
We want to fix all the issues as soon as possible, but before fixing a bug we need to reproduce and confirm it. In order to reproduce bugs we will systematically ask you to provide a minimal reproduction scenario using http://plnkr.co. Having a live, reproducible scenario gives us wealth of important information without going back & forth to you with additional questions like:
A minimal reproduction allows us to quickly confirm a bug (or point out a coding problem) as well as confirm that we are fixing the right problem.
- version of Angular used
- 3rd-party libraries and their versions
- and most importantly - a use-case that fails
We will be insisting on a minimal reproduction scenario in order to save maintainers time and ultimately be able to fix more bugs. Interestingly, from our experience, users often find coding problems themselves while preparing a minimal reproduction. We understand that sometimes it might be hard to extract essential bits of code from a larger codebase but we really need to isolate the problem before we can fix it.
A minimal reproduce scenario using http://plnkr.co/ allows us to quickly confirm a bug (or point out coding problem) as well as confirm that we are fixing the right problem. If plunker is not a suitable way to demonstrate the problem (for example for issues related to our npm packaging), please create a standalone git repository demonstrating the problem.
Unfortunately, we are not able to investigate / fix bugs without a minimal reproduction, so if we don't hear back from you, we are going to close an issue that doesn't have enough info to be reproduced.
We will be insisting on a minimal reproduce scenario in order to save maintainers time and ultimately be able to fix more bugs. Interestingly, from our experience users often find coding problems themselves while preparing a minimal plunk. We understand that sometimes it might be hard to extract essentials bits of code from a larger code-base but we really need to isolate the problem before we can fix it.
You can file new issues by selecting from our [new issue templates](https://github.com/angular/angular/issues/new/choose) and filling out the issue template.
Unfortunately we are not able to investigate / fix bugs without a minimal reproduction, so if we don't hear back from you we are going to close an issue that don't have enough info to be reproduced.
You can file new issues by filling out our [new issue form](https://github.com/angular/angular/issues/new).
### <a name="submit-pr"></a> Submitting a Pull Request (PR)
Before you submit your Pull Request (PR) consider the following guidelines:
1. Search [GitHub](https://github.com/angular/angular/pulls) for an open or closed PR
* Search [GitHub](https://github.com/angular/angular/pulls) for an open or closed PR
that relates to your submission. You don't want to duplicate effort.
1. Be sure that an issue describes the problem you're fixing, or documents the design for the feature you'd like to add.
Discussing the design up front helps to ensure that we're ready to accept your work.
1. Please sign our [Contributor License Agreement (CLA)](#cla) before sending PRs.
We cannot accept code without this. Make sure you sign with the primary email address of the Git identity that has been granted access to the Angular repository.
1. Fork the angular/angular repo.
1. Make your changes in a new git branch:
* Please sign our [Contributor License Agreement (CLA)](#cla) before sending PRs.
We cannot accept code without this.
* Make your changes in a new git branch:
```shell
git checkout -b my-fix-branch master
```
1. Create your patch, **including appropriate test cases**.
1. Follow our [Coding Rules](#rules).
1. Run the full Angular test suite, as described in the [developer documentation][dev-doc],
* Create your patch, **including appropriate test cases**.
* Follow our [Coding Rules](#rules).
* Run the full Angular test suite, as described in the [developer documentation][dev-doc],
and ensure that all tests pass.
1. Commit your changes using a descriptive commit message that follows our
* Commit your changes using a descriptive commit message that follows our
[commit message conventions](#commit). Adherence to these conventions
is necessary because release notes are automatically generated from these messages.
```shell
git commit -a
```
Note: the optional commit `-a` command line option will automatically "add" and "rm" edited files.
Note: the optional commit `-a` command line option will automatically "add" and "rm" edited files.
1. Push your branch to GitHub:
* Push your branch to GitHub:
```shell
git push origin my-fix-branch
```
1. In GitHub, send a pull request to `angular:master`.
* In GitHub, send a pull request to `angular:master`.
* If we suggest changes then:
* Make the required updates.
* Re-run the Angular test suites to ensure tests are still passing.
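Taken together, the branch, commit, and push steps above reduce to a short shell sequence. This is only a recap of the commands already listed, using the same placeholder branch name `my-fix-branch`:
```shell
# Create a new working branch off master (placeholder name from the steps above)
git checkout -b my-fix-branch master

# ...make your changes and add appropriate test cases...

# Commit; the optional -a flag automatically stages modified and deleted tracked files
git commit -a

# Push the branch to your fork on GitHub, then open a PR against angular:master
git push origin my-fix-branch
```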
@ -168,15 +169,15 @@ format that includes a **type**, a **scope** and a **subject**:
The **header** is mandatory and the **scope** of the header is optional.
Any line of the commit message cannot be longer than 100 characters! This allows the message to be easier
Any line of the commit message cannot be longer 100 characters! This allows the message to be easier
to read on GitHub as well as in various git tools.
The footer should contain a [closing reference to an issue](https://help.github.com/articles/closing-issues-via-commit-messages/) if any.
Footer should contain a [closing reference to an issue](https://help.github.com/articles/closing-issues-via-commit-messages/) if any.
Samples: (even more [samples](https://github.com/angular/angular/commits/master))
```
docs(changelog): update changelog to beta.5
docs(changelog): update change log to beta.5
```
```
fix(release): need to depend on latest rxjs and zone.js
@ -191,62 +192,49 @@ If the commit reverts a previous commit, it should begin with `revert: `, follow
Must be one of the following:
* **build**: Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm)
* **ci**: Changes to our CI configuration files and scripts (example scopes: Circle, BrowserStack, SauceLabs)
* **ci**: Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs)
* **docs**: Documentation only changes
* **feat**: A new feature
* **fix**: A bug fix
* **perf**: A code change that improves performance
* **refactor**: A code change that neither fixes a bug nor adds a feature
* **style**: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
* **style**: Changes that do not affect the meaning of the code (white-space, formatting, missing
semi-colons, etc)
* **test**: Adding missing tests or correcting existing tests
### Scope
The scope should be the name of the npm package affected (as perceived by the person reading the changelog generated from commit messages).
The scope should be the name of the npm package affected (as perceived by person reading changelog generated from commit messages.
The following is the list of supported scopes:
* **animations**
* **bazel**
* **benchpress**
* **common**
* **compiler**
* **compiler-cli**
* **core**
* **elements**
* **forms**
* **http**
* **language-service**
* **localize**
* **platform-browser**
* **platform-browser-dynamic**
* **platform-server**
* **platform-webworker**
* **platform-webworker-dynamic**
* **router**
* **service-worker**
* **upgrade**
* **zone.js**
* **tsc-wrapped**
There are currently a few exceptions to the "use package name" rule:
There is currently few exception to the "use package name" rule:
* **packaging**: used for changes that change the npm package layout in all of our packages, e.g.
public path changes, package.json changes done to all packages, d.ts file/format changes, changes
to bundles, etc.
* **packaging**: used for changes that change the npm package layout in all of our packages, e.g. public path changes, package.json changes done to all packages, d.ts file/format changes, changes to bundles, etc.
* **changelog**: used for updating the release notes in CHANGELOG.md
* **docs-infra**: used for docs-app (angular.io) related changes within the /aio directory of the
repo
* **dev-infra**: used for dev-infra related changes within the directories /scripts, /tools and /dev-infra
* **ngcc**: used for changes to the [Angular Compatibility Compiler](./packages/compiler-cli/ngcc/README.md)
* **ve**: used for changes specific to ViewEngine (legacy compiler/renderer).
* none/empty string: useful for `style`, `test` and `refactor` changes that are done across all
packages (e.g. `style: add missing semicolons`) and for docs changes that are not related to a
specific package (e.g. `docs: fix typo in tutorial`).
* **aio**: used for docs-app (angular.io) related changes within the /aio directory of the repo
* none/empty string: useful for `style`, `test` and `refactor` changes that are done across all packages (e.g. `style: add missing semicolons`)
### Subject
The subject contains a succinct description of the change:
The subject contains succinct description of the change:
* use the imperative, present tense: "change" not "changed" nor "changes"
* don't capitalize the first letter
* don't capitalize first letter
* no dot (.) at the end
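Putting the header, subject, and footer rules together, a complete commit message following this format might look like the sample below; the scope, wording, and issue number are purely illustrative:
```
fix(router): handle empty path segments in redirects

Normalize empty path segments before matching so that redirect URLs
resolve to the intended route.

Fixes #12345
```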
### Body
@ -266,22 +254,10 @@ A detailed explanation can be found in this [document][commit-message-format].
Please sign our Contributor License Agreement (CLA) before sending pull requests. For any code
changes to be accepted, the CLA must be signed. It's a quick process, we promise!
* For individuals, we have a [simple click-through form][individual-cla].
* For corporations, we'll need you to
* For individuals we have a [simple click-through form][individual-cla].
* For corporations we'll need you to
[print, sign and one of scan+email, fax or mail the form][corporate-cla].
<hr>
If you have more than one Git identity, you must make sure that you sign the CLA using the primary email address associated with the ID that has been granted access to the Angular repository. Git identities can be associated with more than one email address, and only one is primary. Here are some links to help you sort out multiple Git identities and email addresses:
* https://help.github.com/articles/setting-your-commit-email-address-in-git/
* https://stackoverflow.com/questions/37245303/what-does-usera-committed-with-userb-13-days-ago-on-github-mean
* https://help.github.com/articles/about-commit-email-addresses/
* https://help.github.com/articles/blocking-command-line-pushes-that-expose-your-personal-email-address/
Note that if you have more than one Git identity, it is important to verify that you are logged in with the same ID with which you signed the CLA, before you commit changes. If not, your PR will fail the CLA check.
<hr>
[angular-group]: https://groups.google.com/forum/#!forum/angular
[coc]: https://github.com/angular/code-of-conduct/blob/master/CODE_OF_CONDUCT.md

View File

@ -1,6 +1,6 @@
The MIT License
Copyright (c) 2010-2020 Google LLC. http://angular.io/license
Copyright (c) 2014-2017 Google, Inc. http://angular.io
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

View File

@ -1,26 +1,30 @@
[![CircleCI](https://circleci.com/gh/angular/angular/tree/master.svg?style=shield)](https://circleci.com/gh/angular/workflows/angular/tree/master)
[![Build Status](https://travis-ci.org/angular/angular.svg?branch=master)](https://travis-ci.org/angular/angular)
[![CircleCI](https://circleci.com/gh/angular/angular/tree/master.svg?style=shield)](https://circleci.com/gh/angular/angular/tree/master)
[![Join the chat at https://gitter.im/angular/angular](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/angular/angular?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![npm version](https://badge.fury.io/js/%40angular%2Fcore.svg)](https://www.npmjs.com/@angular/core)
[![Issue Stats](http://issuestats.com/github/angular/angular/badge/pr?style=flat)](http://issuestats.com/github/angular/angular)
[![Issue Stats](http://issuestats.com/github/angular/angular/badge/issue?style=flat)](http://issuestats.com/github/angular/angular)
[![npm version](https://badge.fury.io/js/%40angular%2Fcore.svg)](https://badge.fury.io/js/%40angular%2Fcore)
[![Sauce Test Status](https://saucelabs.com/browser-matrix/angular2-ci.svg)](https://saucelabs.com/u/angular2-ci)
*Safari (7+), iOS (7+), Edge (14) and IE mobile (11) are tested on [BrowserStack][browserstack].*
# Angular
Angular
=========
Angular is a development platform for building mobile and desktop web applications using Typescript/JavaScript (JS) and other languages.
Angular is a development platform for building mobile and desktop web applications using TypeScript/JavaScript and other languages.
## Quickstart
[Get started in 5 minutes][quickstart].
## Changelog
[Learn about the latest improvements][changelog].
## Want to help?
Want to file a bug, contribute some code, or improve documentation? Excellent! Read up on our
guidelines for [contributing][contributing] and then check out one of our issues in the [hotlist: community-help](https://github.com/angular/angular/labels/hotlist%3A%20community-help).
[contributing]: https://github.com/angular/angular/blob/master/CONTRIBUTING.md
[quickstart]: https://angular.io/start
[changelog]: https://github.com/angular/angular/blob/master/CHANGELOG.md
[ng]: https://angular.io
[browserstack]: https://www.browserstack.com/
[contributing]: http://github.com/angular/angular/blob/master/CONTRIBUTING.md
[quickstart]: https://angular.io/docs/ts/latest/quickstart.html
[ng]: http://angular.io

WORKSPACE (120 lines changed)
View File

@ -1,120 +0,0 @@
workspace(
name = "angular",
managed_directories = {"@npm": ["node_modules"]},
)
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
# Fetch rules_nodejs so we can install our npm dependencies
http_archive(
name = "build_bazel_rules_nodejs",
sha256 = "b6670f9f43faa66e3009488bbd909bc7bc46a5a9661a33f6bc578068d1837f37",
urls = ["https://github.com/bazelbuild/rules_nodejs/releases/download/1.3.0/rules_nodejs-1.3.0.tar.gz"],
)
# Check the bazel version and download npm dependencies
load("@build_bazel_rules_nodejs//:index.bzl", "check_bazel_version", "check_rules_nodejs_version", "node_repositories", "yarn_install")
# Bazel version must be at least the following version because:
# - 0.26.0 managed_directories feature added which is required for nodejs rules 0.30.0
# - 0.27.0 has a fix for managed_directories after `rm -rf node_modules`
# - 2.1.0 feature added to honor .bazelignore in external repositories
check_bazel_version(
message = """
You no longer need to install Bazel on your machine.
Angular has a dependency on the @bazel/bazel package which supplies it.
Try running `yarn bazel` instead.
(If you did run that, check that you've got a fresh `yarn install`)
""",
minimum_bazel_version = "2.1.0",
)
check_rules_nodejs_version(minimum_version_string = "1.3.0")
# Setup the Node.js toolchain
node_repositories(
node_repositories = {
"12.14.1-darwin_amd64": ("node-v12.14.1-darwin-x64.tar.gz", "node-v12.14.1-darwin-x64", "0be10a28737527a1e5e3784d3ad844d742fe8b0718acd701fd48f718fd3af78f"),
"12.14.1-linux_amd64": ("node-v12.14.1-linux-x64.tar.xz", "node-v12.14.1-linux-x64", "07cfcaa0aa9d0fcb6e99725408d9e0b07be03b844701588e3ab5dbc395b98e1b"),
"12.14.1-windows_amd64": ("node-v12.14.1-win-x64.zip", "node-v12.14.1-win-x64", "1f96ccce3ba045ecea3f458e189500adb90b8bc1a34de5d82fc10a5bf66ce7e3"),
},
node_version = "12.14.1",
package_json = ["//:package.json"],
)
load("//integration:angular_integration_test.bzl", "npm_package_archives")
yarn_install(
name = "npm",
manual_build_file_contents = npm_package_archives(),
package_json = "//:package.json",
yarn_lock = "//:yarn.lock",
)
# Install all bazel dependencies of the @npm npm packages
load("@npm//:install_bazel_dependencies.bzl", "install_bazel_dependencies")
install_bazel_dependencies()
# Load angular dependencies
load("//packages/bazel:package.bzl", "rules_angular_dev_dependencies")
rules_angular_dev_dependencies()
# Load protractor dependencies
load("@npm_bazel_protractor//:package.bzl", "npm_bazel_protractor_dependencies")
npm_bazel_protractor_dependencies()
# Load karma dependencies
load("@npm_bazel_karma//:package.bzl", "npm_bazel_karma_dependencies")
npm_bazel_karma_dependencies()
# Setup the rules_webtesting toolchain
load("@io_bazel_rules_webtesting//web:repositories.bzl", "web_test_repositories")
web_test_repositories()
load("//tools/browsers:browser_repositories.bzl", "browser_repositories")
browser_repositories()
# Setup the rules_typescript tooolchain
load("@npm_bazel_typescript//:index.bzl", "ts_setup_workspace")
ts_setup_workspace()
# Setup the rules_sass toolchain
load("@io_bazel_rules_sass//sass:sass_repositories.bzl", "sass_repositories")
sass_repositories()
# Setup the skydoc toolchain
load("@io_bazel_skydoc//skylark:skylark.bzl", "skydoc_repositories")
skydoc_repositories()
load("@bazel_toolchains//rules:environments.bzl", "clang_env")
load("@bazel_toolchains//rules:rbe_repo.bzl", "rbe_autoconfig")
rbe_autoconfig(
name = "rbe_ubuntu1604_angular",
# Need to specify a base container digest in order to ensure that we can use the checked-in
# platform configurations for the "ubuntu16_04" image. Otherwise the autoconfig rule would
# need to pull the image and run it in order determine the toolchain configuration. See:
# https://github.com/bazelbuild/bazel-toolchains/blob/1.1.2/configs/ubuntu16_04_clang/versions.bzl
base_container_digest = "sha256:1ab40405810effefa0b2f45824d6d608634ccddbf06366760c341ef6fbead011",
# Note that if you change the `digest`, you might also need to update the
# `base_container_digest` to make sure marketplace.gcr.io/google/rbe-ubuntu16-04-webtest:<digest>
# and marketplace.gcr.io/google/rbe-ubuntu16-04:<base_container_digest> have
# the same Clang and JDK installed. Clang is needed because of the dependency on
# @com_google_protobuf. Java is needed for the Bazel's test executor Java tool.
digest = "sha256:0b8fa87db4b8e5366717a7164342a029d1348d2feea7ecc4b18c780bc2507059",
env = clang_env(),
registry = "marketplace.gcr.io",
# We can't use the default "ubuntu16_04" RBE image provided by the autoconfig because we need
# a specific Linux kernel that comes with "libx11" in order to run headless browser tests.
repository = "google/rbe-ubuntu16-04-webtest",
)

aio/.angular-cli.json (new file, 65 lines changed)
View File

@ -0,0 +1,65 @@
{
"$schema": "./node_modules/@angular/cli/lib/config/schema.json",
"project": {
"name": "site"
},
"apps": [
{
"root": "src",
"outDir": "dist",
"assets": [
"assets",
"content",
"app/search/search-worker.js",
"favicon.ico",
"pwa-manifest.json"
],
"index": "index.html",
"main": "main.ts",
"polyfills": "polyfills.ts",
"test": "test.ts",
"tsconfig": "tsconfig.app.json",
"testTsconfig": "tsconfig.spec.json",
"prefix": "aio",
"serviceWorker": true,
"styles": [
"styles.scss"
],
"scripts": [
],
"environmentSource": "environments/environment.ts",
"environments": {
"dev": "environments/environment.ts",
"prod": "environments/environment.prod.ts"
}
}
],
"e2e": {
"protractor": {
"config": "./protractor.conf.js"
}
},
"lint": [
{
"project": "src/tsconfig.app.json"
},
{
"project": "src/tsconfig.spec.json"
},
{
"project": "e2e/tsconfig.e2e.json"
}
],
"test": {
"karma": {
"config": "./karma.conf.js"
}
},
"defaults": {
"styleExt": "scss",
"component": {
"inlineStyle": true
}
}
}

aio/.firebaserc (new file, 5 lines changed)
View File

@ -0,0 +1,5 @@
{
"projects": {
"staging": "aio-staging"
}
}

aio/.gitignore (vendored, 5 lines changed)
View File

@ -3,7 +3,7 @@
# compiled output
/dist
/out-tsc
/src/generated
/src/content
/tmp
# dependencies
@ -26,13 +26,10 @@
!.vscode/extensions.json
# misc
/.firebase/
/.sass-cache
/connect.lock
/coverage
/libpeerconnection.log
debug.log
firebase-debug.log
npm-debug.log
testem.log
/typings

View File

@ -4,28 +4,20 @@ Everything in this folder is part of the documentation project. This includes
* the web site for displaying the documentation
* the dgeni configuration for converting source files to rendered files that can be viewed in the web site.
* the tooling for setting up examples for development; and generating live-example and zip files from the examples.
* the tooling for setting up examples for development; and generating plunkers and zip files from the examples.
## Developer tasks
We use [Yarn](https://yarnpkg.com) to manage the dependencies and to run build tasks.
We use `yarn` to manage the dependencies and to run build tasks.
You should run all these tasks from the `angular/aio` folder.
Here are the most important tasks you might need to use:
* `yarn` - install all the dependencies.
* `yarn setup` - install all the dependencies, boilerplate, stackblitz, zips and run dgeni on the docs.
* `yarn setup-local` - same as `setup`, but build the Angular packages from the source code and use these locally built versions (instead of the ones fetched from npm) for aio and docs examples boilerplate.
* `yarn build` - create a production build of the application (after installing dependencies, boilerplate, etc).
* `yarn build-local` - same as `build`, but use `setup-local` instead of `setup`.
* `yarn build-local-with-viewengine` - same as `build-local`, but in addition also turns on `ViewEngine` mode in aio.
(Note: Docs examples run in `ViewEngine` mode by default. To turn on `ivy` mode in examples, see `yarn boilerplate:add` below.)
* `yarn setup` - Install all the dependencies, boilerplate, plunkers, zips and runs dgeni on the docs.
* `yarn start` - run a development web server that watches the files; then builds the doc-viewer and reloads the page, as necessary.
* `yarn serve-and-sync` - run both the `docs-watch` and `start` in the same console.
* `yarn lint` - check that the doc-viewer code follows our style rules.
* `yarn test` - watch all the source files, for the doc-viewer, and run all the unit tests when any change.
* `yarn test --watch=false` - run all the unit tests once.
* `yarn e2e` - run all the e2e tests for the doc-viewer.
* `yarn docs` - generate all the docs from the source files.
@ -34,39 +26,14 @@ Here are the most important tasks you might need to use:
* `yarn docs-test` - run the unit tests for the doc generation code.
* `yarn boilerplate:add` - generate all the boilerplate code for the examples, so that they can be run locally.
* `yarn boilerplate:add:ivy` - same as `boilerplate:add` but also turns on `ivy` mode.
* `yarn boilerplate:remove` - remove all the boilerplate code that was added via `yarn boilerplate:add`.
* `yarn generate-stackblitz` - generate the stackblitz files that are used by the `live-example` tags in the docs.
* `yarn generate-plunkers` - generate the plunker files that are used by the `live-example` tags in the docs.
* `yarn generate-zips` - generate the zip files from the examples. Zip available via the `live-example` tags in the docs.
* `yarn example-e2e` - run all e2e tests for examples. Available options:
- `--setup`: generate boilerplate, force webdriver update & other setup, then run tests.
- `--local`: run e2e tests with the local version of Angular contained in the "dist" folder.
_Requires `--setup` in order to take effect._
- `--ivy`: run e2e tests in `ivy` mode.
- `--filter=foo`: limit e2e tests to those containing the word "foo".
> **Note for Windows users**
>
> Setting up the examples involves creating some [symbolic links](https://en.wikipedia.org/wiki/Symbolic_link) (see [here](./tools/examples/README.md#symlinked-node_modules) for details). On Windows, this requires to either have [Developer Mode enabled](https://blogs.windows.com/windowsdeveloper/2016/12/02/symlinks-windows-10) (supported on Windows 10 or newer) or run the setup commands as administrator.
>
> The affected commands are:
> - `yarn setup` / `yarn setup-*`
> - `yarn build` / `yarn build-*`
> - `yarn boilerplate:add`
> - `yarn example-e2e --setup`
## Using ServiceWorker locally
Running `yarn start` (even when explicitly targeting production mode) does not set up the
ServiceWorker. If you want to test the ServiceWorker locally, you can use `yarn build` and then
serve the files in `dist/` with `yarn http-server dist -p 4200`.
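For reference, that ServiceWorker check amounts to roughly the following two commands (using port 4200, as noted above):
```bash
# Create a production build, which includes the ServiceWorker setup
yarn build

# Serve the built files from dist/ so the ServiceWorker can be exercised locally
yarn http-server dist -p 4200
```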
## Guide to authoring
There are two types of content in the documentation:
There are two types of content in the documentatation:
* **API docs**: descriptions of the modules, classes, interfaces, decorators, etc that make up the Angular platform.
API docs are generated directly from the source code.
@ -78,16 +45,8 @@ The content is written in markdown.
All other content is written using markdown in text files, located in the `angular/aio/content` folder.
More specifically, there are sub-folders that contain particular types of content: guides, tutorial and marketing.
* **Code examples**: code examples need to be testable to ensure their accuracy.
Also, our examples have a specific look and feel and allow the user to copy the source code. For larger
examples they are rendered in a tabbed interface (e.g. template, HTML, and TypeScript on separate
tabs). Additionally, some are live examples, which provide links where the code can be edited, executed, and/or downloaded. For details on working with code examples, please read the [Code snippets](https://angular.io/guide/docs-style-guide#code-snippets), [Source code markup](https://angular.io/guide/docs-style-guide#source-code-markup), and [Live examples](https://angular.io/guide/docs-style-guide#live-examples) pages of the [Authors Style Guide](https://angular.io/guide/docs-style-guide).
We use the [dgeni](https://github.com/angular/dgeni) tool to convert these files into docs that can be viewed in the doc-viewer.
The [Authors Style Guide](https://angular.io/guide/docs-style-guide) prescribes guidelines for
writing guide pages, explains how to use the documentation classes and components, and how to markup sample source code to produce code snippets.
### Generating the complete docs
The main task for generating the docs is `yarn docs`. This will process all the source files (API and other),
@ -97,14 +56,14 @@ extracting the documentation and generating JSON files that can be consumed by t
Full doc generation can take up to one minute. That's too slow for efficient document creation and editing.
You can make small changes in a smart editor that displays formatted markdown:
You can make small changes in a smart editor that displays formatted markdown:
>In VS Code, _Cmd-K, V_ opens markdown preview in side pane; _Cmd-B_ toggles left sidebar
You also want to see those changes displayed properly in the doc viewer
with a quick, edit/view cycle time.
For this purpose, use the `yarn docs-watch` task, which watches for changes to source files and only
re-processes the files necessary to generate the docs that are related to the file that has changed.
re-processes the the files necessary to generate the docs that are related to the file that has changed.
Since this task takes shortcuts, it is much faster (often less than 1 second) but it won't produce full
fidelity content. For example, links to other docs and code examples may not render correctly. This is
most particularly noticed in links to other docs and in the embedded examples, which may not always render
@ -115,7 +74,8 @@ The general setup is as follows:
* Open a terminal, ensure the dependencies are installed; run an initial doc generation; then start the doc-viewer:
```bash
yarn setup
yarn
yarn docs
yarn start
```
@ -125,16 +85,8 @@ yarn start
yarn docs-watch
```
>Alternatively, try the consolidated `serve-and-sync` command that builds, watches and serves in the same terminal window
```bash
yarn serve-and-sync
```
* Open a browser at https://localhost:4200/ and navigate to the document on which you want to work.
You can automatically open the browser by using `yarn start -o` in the first terminal.
You can automatically open the browser by using `yarn start -- -o` in the first terminal.
* Make changes to the page's associated doc or example files. Every time a file is saved, the doc will
be regenerated, the app will rebuild and the page will reload.
* If you get a build error complaining about examples or any other odd behavior, be sure to consult
the [Authors Style Guide](https://angular.io/guide/docs-style-guide).

View File

@ -1,5 +1,5 @@
# Image metadata and config
FROM debian:stretch
FROM debian:jessie
LABEL name="angular.io PR preview" \
description="This image implements the PR preview functionality for angular.io." \
@ -8,62 +8,49 @@ LABEL name="angular.io PR preview" \
VOLUME /aio-secrets
VOLUME /var/www/aio-builds
VOLUME /dockerbuild
EXPOSE 80 443
# Build-time args and env vars
# The AIO_ARTIFACT_PATH path needs to be kept in synch with the value of
# `aio_preview->steps->store_artifacts->destination` property in `.circleci/config.yml`
ARG AIO_ARTIFACT_PATH=aio/dist/aio-snapshot.tgz
ARG TEST_AIO_ARTIFACT_PATH=$AIO_ARTIFACT_PATH
ARG AIO_BUILDS_DIR=/var/www/aio-builds
ARG TEST_AIO_BUILDS_DIR=/tmp/aio-builds
ARG AIO_DOMAIN_NAME=ngbuilds.io
ARG TEST_AIO_DOMAIN_NAME=$AIO_DOMAIN_NAME.localhost
ARG AIO_GITHUB_ORGANIZATION=angular
ARG TEST_AIO_GITHUB_ORGANIZATION=test-org
ARG AIO_GITHUB_REPO=angular
ARG TEST_AIO_GITHUB_REPO=test-repo
ARG AIO_GITHUB_TEAM_SLUGS=aio-auto-previews,aio-contributors
ARG TEST_AIO_GITHUB_TEAM_SLUGS=test-team-1,test-team-2
ARG TEST_AIO_GITHUB_ORGANIZATION=angular
ARG AIO_GITHUB_TEAM_SLUGS=angular-core,aio-contributors
ARG TEST_AIO_GITHUB_TEAM_SLUGS=angular-core,aio-contributors
ARG AIO_NGINX_HOSTNAME=$AIO_DOMAIN_NAME
ARG TEST_AIO_NGINX_HOSTNAME=$TEST_AIO_DOMAIN_NAME
ARG AIO_NGINX_PORT_HTTP=80
ARG TEST_AIO_NGINX_PORT_HTTP=8080
ARG AIO_NGINX_PORT_HTTPS=443
ARG TEST_AIO_NGINX_PORT_HTTPS=4433
ARG AIO_SIGNIFICANT_FILES_PATTERN='^(?:aio|packages)/(?!.*[._]spec\\.[jt]s$)'
ARG TEST_AIO_SIGNIFICANT_FILES_PATTERN=$AIO_SIGNIFICANT_FILES_PATTERN
ARG AIO_TRUSTED_PR_LABEL="aio: preview"
ARG TEST_AIO_TRUSTED_PR_LABEL="aio: preview"
ARG AIO_PREVIEW_SERVER_HOSTNAME=preview.localhost
ARG TEST_AIO_PREVIEW_SERVER_HOSTNAME=preview.localhost
ARG AIO_ARTIFACT_MAX_SIZE=26214400
ARG TEST_AIO_ARTIFACT_MAX_SIZE=200
ARG AIO_PREVIEW_SERVER_PORT=3000
ARG TEST_AIO_PREVIEW_SERVER_PORT=3001
ARG AIO_REPO_SLUG=angular/angular
ARG TEST_AIO_REPO_SLUG=test-repo/test-slug
ARG AIO_UPLOAD_HOSTNAME=upload.localhost
ARG TEST_AIO_UPLOAD_HOSTNAME=upload.localhost
ARG AIO_UPLOAD_MAX_SIZE=20971520
ARG TEST_AIO_UPLOAD_MAX_SIZE=20971520
ARG AIO_UPLOAD_PORT=3000
ARG TEST_AIO_UPLOAD_PORT=3001
ENV AIO_ARTIFACT_PATH=$AIO_ARTIFACT_PATH TEST_AIO_ARTIFACT_PATH=$TEST_AIO_ARTIFACT_PATH \
AIO_BUILDS_DIR=$AIO_BUILDS_DIR TEST_AIO_BUILDS_DIR=$TEST_AIO_BUILDS_DIR \
AIO_DOMAIN_NAME=$AIO_DOMAIN_NAME TEST_AIO_DOMAIN_NAME=$TEST_AIO_DOMAIN_NAME \
AIO_GITHUB_ORGANIZATION=$AIO_GITHUB_ORGANIZATION TEST_AIO_GITHUB_ORGANIZATION=$TEST_AIO_GITHUB_ORGANIZATION \
AIO_GITHUB_REPO=$AIO_GITHUB_REPO TEST_AIO_GITHUB_REPO=$TEST_AIO_GITHUB_REPO \
AIO_GITHUB_TEAM_SLUGS=$AIO_GITHUB_TEAM_SLUGS TEST_AIO_GITHUB_TEAM_SLUGS=$TEST_AIO_GITHUB_TEAM_SLUGS \
AIO_LOCALCERTS_DIR=/etc/ssl/localcerts TEST_AIO_LOCALCERTS_DIR=/etc/ssl/localcerts-test \
AIO_NGINX_HOSTNAME=$AIO_NGINX_HOSTNAME TEST_AIO_NGINX_HOSTNAME=$TEST_AIO_NGINX_HOSTNAME \
AIO_NGINX_LOGS_DIR=/var/log/aio/nginx TEST_AIO_NGINX_LOGS_DIR=/var/log/aio/nginx-test \
AIO_NGINX_PORT_HTTP=$AIO_NGINX_PORT_HTTP TEST_AIO_NGINX_PORT_HTTP=$TEST_AIO_NGINX_PORT_HTTP \
AIO_NGINX_PORT_HTTPS=$AIO_NGINX_PORT_HTTPS TEST_AIO_NGINX_PORT_HTTPS=$TEST_AIO_NGINX_PORT_HTTPS \
AIO_SCRIPTS_JS_DIR=/usr/share/aio-scripts-js \
AIO_SCRIPTS_SH_DIR=/usr/share/aio-scripts-sh \
AIO_SIGNIFICANT_FILES_PATTERN=$AIO_SIGNIFICANT_FILES_PATTERN TEST_AIO_SIGNIFICANT_FILES_PATTERN=$TEST_AIO_SIGNIFICANT_FILES_PATTERN \
AIO_TRUSTED_PR_LABEL=$AIO_TRUSTED_PR_LABEL TEST_AIO_TRUSTED_PR_LABEL=$TEST_AIO_TRUSTED_PR_LABEL \
AIO_PREVIEW_SERVER_HOSTNAME=$AIO_PREVIEW_SERVER_HOSTNAME TEST_AIO_PREVIEW_SERVER_HOSTNAME=$TEST_AIO_PREVIEW_SERVER_HOSTNAME \
AIO_ARTIFACT_MAX_SIZE=$AIO_ARTIFACT_MAX_SIZE TEST_AIO_ARTIFACT_MAX_SIZE=$TEST_AIO_ARTIFACT_MAX_SIZE \
AIO_PREVIEW_SERVER_PORT=$AIO_PREVIEW_SERVER_PORT TEST_AIO_PREVIEW_SERVER_PORT=$TEST_AIO_PREVIEW_SERVER_PORT \
AIO_WWW_USER=www-data \
ENV AIO_BUILDS_DIR=$AIO_BUILDS_DIR TEST_AIO_BUILDS_DIR=$TEST_AIO_BUILDS_DIR \
AIO_DOMAIN_NAME=$AIO_DOMAIN_NAME TEST_AIO_DOMAIN_NAME=$TEST_AIO_DOMAIN_NAME \
AIO_GITHUB_ORGANIZATION=$AIO_GITHUB_ORGANIZATION TEST_AIO_GITHUB_ORGANIZATION=$TEST_AIO_GITHUB_ORGANIZATION \
AIO_GITHUB_TEAM_SLUGS=$AIO_GITHUB_TEAM_SLUGS TEST_AIO_GITHUB_TEAM_SLUGS=$TEST_AIO_GITHUB_TEAM_SLUGS \
AIO_LOCALCERTS_DIR=/etc/ssl/localcerts TEST_AIO_LOCALCERTS_DIR=/etc/ssl/localcerts-test \
AIO_NGINX_HOSTNAME=$AIO_NGINX_HOSTNAME TEST_AIO_NGINX_HOSTNAME=$TEST_AIO_NGINX_HOSTNAME \
AIO_NGINX_LOGS_DIR=/var/log/aio/nginx TEST_AIO_NGINX_LOGS_DIR=/var/log/aio/nginx-test \
AIO_NGINX_PORT_HTTP=$AIO_NGINX_PORT_HTTP TEST_AIO_NGINX_PORT_HTTP=$TEST_AIO_NGINX_PORT_HTTP \
AIO_NGINX_PORT_HTTPS=$AIO_NGINX_PORT_HTTPS TEST_AIO_NGINX_PORT_HTTPS=$TEST_AIO_NGINX_PORT_HTTPS \
AIO_REPO_SLUG=$AIO_REPO_SLUG TEST_AIO_REPO_SLUG=$TEST_AIO_REPO_SLUG \
AIO_SCRIPTS_JS_DIR=/usr/share/aio-scripts-js \
AIO_SCRIPTS_SH_DIR=/usr/share/aio-scripts-sh \
AIO_UPLOAD_HOSTNAME=$AIO_UPLOAD_HOSTNAME TEST_AIO_UPLOAD_HOSTNAME=$TEST_AIO_UPLOAD_HOSTNAME \
AIO_UPLOAD_MAX_SIZE=$AIO_UPLOAD_MAX_SIZE TEST_AIO_UPLOAD_MAX_SIZE=$TEST_AIO_UPLOAD_MAX_SIZE \
AIO_UPLOAD_PORT=$AIO_UPLOAD_PORT TEST_AIO_UPLOAD_PORT=$TEST_AIO_UPLOAD_PORT \
NODE_ENV=production
@ -73,23 +60,23 @@ RUN mkdir /var/log/aio
# Add extra package sources
RUN apt-get update -y && apt-get install -y curl
RUN curl --silent --show-error --location https://deb.nodesource.com/setup_10.x | bash -
RUN curl --silent --show-error --location https://deb.nodesource.com/setup_6.x | bash -
RUN curl --silent --show-error https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add -
RUN echo "deb https://dl.yarnpkg.com/debian/ stable main" | tee /etc/apt/sources.list.d/yarn.list
RUN echo "deb http://ftp.debian.org/debian stretch-backports main" | tee /etc/apt/sources.list.d/backports.list
# Install packages
RUN apt-get update -y && apt-get install -y \
cron=3.0pl1-128+deb9u1 \
dnsmasq=2.76-5+deb9u2 \
nano=2.7.4-1 \
nginx=1.10.3-1+deb9u2 \
nodejs=10.15.3-1nodesource1 \
openssl=1.1.0j-1~deb9u1 \
rsyslog=8.24.0-1 \
yarn=1.15.2-1
RUN yarn global add pm2@3.5.0
chkconfig \
cron \
dnsmasq \
nano \
nginx \
nodejs \
openssl \
rsyslog \
yarn
RUN yarn global add pm2@2
# Set up log rotation
@ -107,9 +94,9 @@ RUN printenv | grep AIO_ >> /etc/environment
# Set up dnsmasq
COPY dnsmasq/dnsmasq.conf /etc/
RUN sed -i "s|{{\$AIO_NGINX_HOSTNAME}}|$AIO_NGINX_HOSTNAME|g" /etc/dnsmasq.conf
RUN sed -i "s|{{\$AIO_PREVIEW_SERVER_HOSTNAME}}|$AIO_PREVIEW_SERVER_HOSTNAME|g" /etc/dnsmasq.conf
RUN sed -i "s|{{\$AIO_UPLOAD_HOSTNAME}}|$AIO_UPLOAD_HOSTNAME|g" /etc/dnsmasq.conf
RUN sed -i "s|{{\$TEST_AIO_NGINX_HOSTNAME}}|$TEST_AIO_NGINX_HOSTNAME|g" /etc/dnsmasq.conf
RUN sed -i "s|{{\$TEST_AIO_PREVIEW_SERVER_HOSTNAME}}|$TEST_AIO_PREVIEW_SERVER_HOSTNAME|g" /etc/dnsmasq.conf
RUN sed -i "s|{{\$TEST_AIO_UPLOAD_HOSTNAME}}|$TEST_AIO_UPLOAD_HOSTNAME|g" /etc/dnsmasq.conf
# Set up SSL/TLS certificates
@ -122,35 +109,36 @@ RUN update-ca-certificates
# Set up nginx (for production and testing)
RUN sed -i -E "s|^user\s+\S+;|user $AIO_WWW_USER;|" /etc/nginx/nginx.conf
RUN rm -f /etc/nginx/conf.d/*
RUN rm -f /etc/nginx/sites-enabled/*
RUN rm /etc/nginx/sites-enabled/*
COPY nginx/aio-builds.conf /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_BUILDS_DIR}}|$AIO_BUILDS_DIR|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_DOMAIN_NAME}}|$AIO_DOMAIN_NAME|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_LOCALCERTS_DIR}}|$AIO_LOCALCERTS_DIR|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_LOGS_DIR}}|$AIO_NGINX_LOGS_DIR|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTP}}|$AIO_NGINX_PORT_HTTP|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTPS}}|$AIO_NGINX_PORT_HTTPS|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_PREVIEW_SERVER_HOSTNAME}}|$AIO_PREVIEW_SERVER_HOSTNAME|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_ARTIFACT_MAX_SIZE}}|$AIO_ARTIFACT_MAX_SIZE|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_PREVIEW_SERVER_PORT}}|$AIO_PREVIEW_SERVER_PORT|g" /etc/nginx/conf.d/aio-builds-prod.conf
COPY nginx/aio-builds.conf /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_BUILDS_DIR}}|$AIO_BUILDS_DIR|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_DOMAIN_NAME}}|$AIO_DOMAIN_NAME|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_LOCALCERTS_DIR}}|$AIO_LOCALCERTS_DIR|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_LOGS_DIR}}|$AIO_NGINX_LOGS_DIR|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTP}}|$AIO_NGINX_PORT_HTTP|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTPS}}|$AIO_NGINX_PORT_HTTPS|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_UPLOAD_HOSTNAME}}|$AIO_UPLOAD_HOSTNAME|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_UPLOAD_MAX_SIZE}}|$AIO_UPLOAD_MAX_SIZE|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_UPLOAD_PORT}}|$AIO_UPLOAD_PORT|g" /etc/nginx/sites-available/aio-builds-prod.conf
RUN ln -s /etc/nginx/sites-available/aio-builds-prod.conf /etc/nginx/sites-enabled/aio-builds-prod.conf
COPY nginx/aio-builds.conf /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_BUILDS_DIR}}|$TEST_AIO_BUILDS_DIR|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_DOMAIN_NAME}}|$TEST_AIO_DOMAIN_NAME|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_LOCALCERTS_DIR}}|$TEST_AIO_LOCALCERTS_DIR|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_LOGS_DIR}}|$TEST_AIO_NGINX_LOGS_DIR|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTP}}|$TEST_AIO_NGINX_PORT_HTTP|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTPS}}|$TEST_AIO_NGINX_PORT_HTTPS|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_PREVIEW_SERVER_HOSTNAME}}|$TEST_AIO_PREVIEW_SERVER_HOSTNAME|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_ARTIFACT_MAX_SIZE}}|$TEST_AIO_ARTIFACT_MAX_SIZE|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_PREVIEW_SERVER_PORT}}|$TEST_AIO_PREVIEW_SERVER_PORT|g" /etc/nginx/conf.d/aio-builds-test.conf
COPY nginx/aio-builds.conf /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_BUILDS_DIR}}|$TEST_AIO_BUILDS_DIR|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_DOMAIN_NAME}}|$TEST_AIO_DOMAIN_NAME|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_LOCALCERTS_DIR}}|$TEST_AIO_LOCALCERTS_DIR|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_LOGS_DIR}}|$TEST_AIO_NGINX_LOGS_DIR|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTP}}|$TEST_AIO_NGINX_PORT_HTTP|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTPS}}|$TEST_AIO_NGINX_PORT_HTTPS|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_UPLOAD_HOSTNAME}}|$TEST_AIO_UPLOAD_HOSTNAME|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_UPLOAD_MAX_SIZE}}|$TEST_AIO_UPLOAD_MAX_SIZE|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_UPLOAD_PORT}}|$TEST_AIO_UPLOAD_PORT|g" /etc/nginx/sites-available/aio-builds-test.conf
RUN ln -s /etc/nginx/sites-available/aio-builds-test.conf /etc/nginx/sites-enabled/aio-builds-test.conf
# Set up pm2
RUN pm2 startup --user root > /dev/null
RUN pm2 startup systemv -u root > /dev/null
RUN chkconfig pm2-root on
# Set up the shell scripts
@ -163,7 +151,7 @@ RUN find $AIO_SCRIPTS_SH_DIR -maxdepth 1 -type f -printf "%P\n" \
# Set up the Node.js scripts
COPY scripts-js/ $AIO_SCRIPTS_JS_DIR/
WORKDIR $AIO_SCRIPTS_JS_DIR/
RUN yarn install --production --frozen-lockfile
RUN yarn install --production
# Set up health check

View File

@ -1,2 +1,2 @@
# Periodically clean up builds that do not correspond to currently open PRs
0 12 * * * /usr/local/bin/aio-clean-up >> /var/log/cron.log 2>&1
0 12 * * * root /usr/local/bin/aio-clean-up >> /var/log/cron.log 2>&1

View File

@ -6,11 +6,11 @@ server=8.8.4.4
# Listen for DHCP and DNS requests only on this address.
listen-address=127.0.0.1
# Force an IP address for these domains.
# Force an IP addres for these domains.
address=/{{$AIO_NGINX_HOSTNAME}}/127.0.0.1
address=/{{$AIO_PREVIEW_SERVER_HOSTNAME}}/127.0.0.1
address=/{{$AIO_UPLOAD_HOSTNAME}}/127.0.0.1
address=/{{$TEST_AIO_NGINX_HOSTNAME}}/127.0.0.1
address=/{{$TEST_AIO_PREVIEW_SERVER_HOSTNAME}}/127.0.0.1
address=/{{$TEST_AIO_UPLOAD_HOSTNAME}}/127.0.0.1
# Run as root (required from inside docker container).
user=root

View File

@ -1,9 +0,0 @@
/var/log/aio/preview-server-*.log {
compress
copytruncate
delaycompress
missingok
monthly
notifempty
rotate 6
}

View File

@ -0,0 +1,9 @@
/var/log/aio/upload-server-*.log {
compress
copytruncate
delaycompress
missingok
monthly
notifempty
rotate 6
}

View File

@ -15,32 +15,21 @@ server {
# Serve PR-preview requests
server {
server_name "~^pr(?<pr>[1-9][0-9]*)-(?<sha>[0-9a-f]{7,40})\.";
server_name "~^pr(?<pr>[1-9][0-9]*)-(?<sha>[0-9a-f]{40})\.";
listen {{$AIO_NGINX_PORT_HTTPS}} ssl http2;
listen [::]:{{$AIO_NGINX_PORT_HTTPS}} ssl http2;
listen {{$AIO_NGINX_PORT_HTTPS}} ssl;
listen [::]:{{$AIO_NGINX_PORT_HTTPS}} ssl;
ssl_certificate {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.crt;
ssl_certificate_key {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.key;
ssl_prefer_server_ciphers on;
ssl_ciphers EECDH+CHACHA20:EECDH+AES128:RSA+AES128:EECDH+AES256:RSA+AES256:EECDH+3DES:RSA+3DES:!MD5;
ssl_certificate {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.crt;
ssl_certificate_key {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.key;
root {{$AIO_BUILDS_DIR}}/$pr/$sha;
disable_symlinks on from=$document_root;
index index.html;
gzip on;
gzip_comp_level 7;
gzip_types *;
access_log {{$AIO_NGINX_LOGS_DIR}}/access.log;
error_log {{$AIO_NGINX_LOGS_DIR}}/error.log;
error_page 404 /404.html;
location "=/404.html" {
internal;
}
location "~/[^/]+\.[^/]+$" {
try_files $uri $uri/ =404;
}
@ -54,13 +43,11 @@ server {
server {
server_name _;
listen {{$AIO_NGINX_PORT_HTTPS}} ssl http2 default_server;
listen [::]:{{$AIO_NGINX_PORT_HTTPS}} ssl http2;
listen {{$AIO_NGINX_PORT_HTTPS}} ssl default_server;
listen [::]:{{$AIO_NGINX_PORT_HTTPS}} ssl;
ssl_certificate {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.crt;
ssl_certificate_key {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.key;
ssl_prefer_server_ciphers on;
ssl_ciphers EECDH+CHACHA20:EECDH+AES128:RSA+AES128:EECDH+AES256:RSA+AES256:EECDH+3DES:RSA+3DES:!MD5;
ssl_certificate {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.crt;
ssl_certificate_key {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.key;
access_log {{$AIO_NGINX_LOGS_DIR}}/access.log;
error_log {{$AIO_NGINX_LOGS_DIR}}/error.log;
@ -71,47 +58,24 @@ server {
return 200 '';
}
# Check PRs previewability
location "~^/can-have-public-preview/\d+/?$" {
if ($request_method != "GET") {
add_header Allow "GET";
# Upload builds
location "~^/create-build/(?<pr>[1-9][0-9]*)/(?<sha>[0-9a-f]{40})/?$" {
if ($request_method != "POST") {
add_header Allow "POST";
return 405;
}
client_body_temp_path /tmp/aio-create-builds;
client_body_buffer_size 128K;
client_max_body_size {{$AIO_UPLOAD_MAX_SIZE}};
client_body_in_file_only on;
proxy_pass_request_headers on;
proxy_set_header X-FILE $request_body_file;
proxy_set_body off;
proxy_redirect off;
proxy_method GET;
proxy_pass http://{{$AIO_PREVIEW_SERVER_HOSTNAME}}:{{$AIO_PREVIEW_SERVER_PORT}}$request_uri;
resolver 127.0.0.1;
}
# Notify about CircleCI builds
location "~^/circle-build/?$" {
if ($request_method != "POST") {
add_header Allow "POST";
return 405;
}
proxy_pass_request_headers on;
proxy_redirect off;
proxy_method POST;
proxy_pass http://{{$AIO_PREVIEW_SERVER_HOSTNAME}}:{{$AIO_PREVIEW_SERVER_PORT}}$request_uri;
resolver 127.0.0.1;
}
# Notify about PR changes
location "~^/pr-updated/?$" {
if ($request_method != "POST") {
add_header Allow "POST";
return 405;
}
proxy_pass_request_headers on;
proxy_redirect off;
proxy_method POST;
proxy_pass http://{{$AIO_PREVIEW_SERVER_HOSTNAME}}:{{$AIO_PREVIEW_SERVER_PORT}}$request_uri;
proxy_pass http://{{$AIO_UPLOAD_HOSTNAME}}:{{$AIO_UPLOAD_PORT}}$request_uri;
resolver 127.0.0.1;
}

View File

@ -2,126 +2,70 @@
import * as fs from 'fs';
import * as path from 'path';
import * as shell from 'shelljs';
import {HIDDEN_DIR_PREFIX} from '../common/constants';
import {GithubApi} from '../common/github-api';
import {GithubPullRequests} from '../common/github-pull-requests';
import {assertNotMissingOrEmpty, getPrInfoFromDownloadPath, Logger} from '../common/utils';
import {assertNotMissingOrEmpty} from '../common/utils';
// Classes
export class BuildCleaner {
private logger = new Logger('BuildCleaner');
// Constructor
constructor(protected buildsDir: string, protected githubOrg: string, protected githubRepo: string,
protected githubToken: string, protected downloadsDir: string, protected artifactPath: string) {
constructor(protected buildsDir: string, protected repoSlug: string, protected githubToken: string) {
assertNotMissingOrEmpty('buildsDir', buildsDir);
assertNotMissingOrEmpty('githubOrg', githubOrg);
assertNotMissingOrEmpty('githubRepo', githubRepo);
assertNotMissingOrEmpty('repoSlug', repoSlug);
assertNotMissingOrEmpty('githubToken', githubToken);
assertNotMissingOrEmpty('downloadsDir', downloadsDir);
assertNotMissingOrEmpty('artifactPath', artifactPath);
}
// Methods - Public
public async cleanUp(): Promise<void> {
try {
this.logger.log('Cleaning up builds and downloads');
const openPrs = await this.getOpenPrNumbers();
this.logger.log(`Open pull requests: ${openPrs.length}`);
await Promise.all([
this.cleanBuilds(openPrs),
this.cleanDownloads(openPrs),
]);
} catch (error) {
this.logger.error('ERROR:', error);
}
public cleanUp(): Promise<void> {
return Promise.all([
this.getExistingBuildNumbers(),
this.getOpenPrNumbers(),
]).then(([existingBuilds, openPrs]) => this.removeUnnecessaryBuilds(existingBuilds, openPrs));
}
public async cleanBuilds(openPrs: number[]): Promise<void> {
const existingBuilds = await this.getExistingBuildNumbers();
await this.removeUnnecessaryBuilds(existingBuilds, openPrs);
}
public async cleanDownloads(openPrs: number[]): Promise<void> {
const existingDownloads = await this.getExistingDownloads();
await this.removeUnnecessaryDownloads(existingDownloads, openPrs);
}
public getExistingBuildNumbers(): Promise<number[]> {
return new Promise<number[]>((resolve, reject) => {
// Methods - Protected
protected getExistingBuildNumbers(): Promise<number[]> {
return new Promise((resolve, reject) => {
fs.readdir(this.buildsDir, (err, files) => {
if (err) {
return reject(err);
}
const buildNumbers = files.
map(name => name.replace(HIDDEN_DIR_PREFIX, '')). // Remove the "hidden dir" prefix
map(Number). // Convert string to number
filter(Boolean); // Ignore NaN (or 0), because they are not builds
map(Number). // Convert string to number
filter(Boolean); // Ignore NaN (or 0), because they are not builds
resolve(buildNumbers);
});
});
}
public async getOpenPrNumbers(): Promise<number[]> {
const api = new GithubApi(this.githubToken);
const githubPullRequests = new GithubPullRequests(api, this.githubOrg, this.githubRepo);
const prs = await githubPullRequests.fetchAll('open');
return prs.map(pr => pr.number);
protected getOpenPrNumbers(): Promise<number[]> {
const githubPullRequests = new GithubPullRequests(this.githubToken, this.repoSlug);
return githubPullRequests.
fetchAll('open').
then(prs => prs.map(pr => pr.number));
}
public removeDir(dir: string): void {
protected removeDir(dir: string) {
try {
if (shell.test('-d', dir)) {
shell.chmod('-R', 'a+w', dir);
shell.rm('-rf', dir);
}
// Undocumented signature (see https://github.com/shelljs/shelljs/pull/663).
(shell as any).chmod('-R', 'a+w', dir);
shell.rm('-rf', dir);
} catch (err) {
this.logger.error(`ERROR: Unable to remove '${dir}' due to:`, err);
console.error(`ERROR: Unable to remove '${dir}' due to:`, err);
}
}
public removeUnnecessaryBuilds(existingBuildNumbers: number[], openPrNumbers: number[]): void {
protected removeUnnecessaryBuilds(existingBuildNumbers: number[], openPrNumbers: number[]) {
const toRemove = existingBuildNumbers.filter(num => !openPrNumbers.includes(num));
this.logger.log(`Existing builds: ${existingBuildNumbers.length}`);
this.logger.log(`Removing ${toRemove.length} build(s): ${toRemove.join(', ')}`);
console.log(`Existing builds: ${existingBuildNumbers.length}`);
console.log(`Open pull requests: ${openPrNumbers.length}`);
console.log(`Removing ${toRemove.length} build(s): ${toRemove.join(', ')}`);
// Try removing public dirs.
toRemove.
map(num => path.join(this.buildsDir, String(num))).
forEach(dir => this.removeDir(dir));
// Try removing hidden dirs.
toRemove.
map(num => path.join(this.buildsDir, HIDDEN_DIR_PREFIX + String(num))).
forEach(dir => this.removeDir(dir));
}
public getExistingDownloads(): Promise<string[]> {
const artifactFile = path.basename(this.artifactPath);
return new Promise<string[]>((resolve, reject) => {
fs.readdir(this.downloadsDir, (err, files) => {
if (err) {
return reject(err);
}
files = files.filter(file => file.endsWith(artifactFile));
resolve(files);
});
});
}
public removeUnnecessaryDownloads(existingDownloads: string[], openPrNumbers: number[]): void {
const toRemove = existingDownloads.filter(filePath => {
const {pr} = getPrInfoFromDownloadPath(filePath);
return !openPrNumbers.includes(pr);
});
this.logger.log(`Existing downloads: ${existingDownloads.length}`);
this.logger.log(`Removing ${toRemove.length} download(s): ${toRemove.join(', ')}`);
toRemove.forEach(filePath => shell.rm(path.join(this.downloadsDir, filePath)));
}
}

View File

@ -1,26 +1,23 @@
// Imports
import {AIO_DOWNLOADS_DIR} from '../common/constants';
import {
AIO_ARTIFACT_PATH,
AIO_BUILDS_DIR,
AIO_GITHUB_ORGANIZATION,
AIO_GITHUB_REPO,
AIO_GITHUB_TOKEN,
} from '../common/env-variables';
import {getEnvVar} from '../common/utils';
import {BuildCleaner} from './build-cleaner';
// Constants
const AIO_BUILDS_DIR = getEnvVar('AIO_BUILDS_DIR');
const AIO_GITHUB_TOKEN = getEnvVar('AIO_GITHUB_TOKEN', true);
const AIO_REPO_SLUG = getEnvVar('AIO_REPO_SLUG');
// Run
_main();
// Functions
function _main(): void {
const buildCleaner = new BuildCleaner(
AIO_BUILDS_DIR,
AIO_GITHUB_ORGANIZATION,
AIO_GITHUB_REPO,
AIO_GITHUB_TOKEN,
AIO_DOWNLOADS_DIR,
AIO_ARTIFACT_PATH);
function _main() {
console.log(`[${new Date()}] - Cleaning up builds...`);
buildCleaner.cleanUp().catch(() => process.exit(1));
const buildCleaner = new BuildCleaner(AIO_BUILDS_DIR, AIO_REPO_SLUG, AIO_GITHUB_TOKEN);
buildCleaner.cleanUp().catch(err => {
console.error('ERROR:', err);
process.exit(1);
});
}

View File

@ -1,90 +0,0 @@
// Imports
import fetch from 'node-fetch';
import {assertNotMissingOrEmpty} from './utils';
// Constants
const CIRCLE_CI_API_URL = 'https://circleci.com/api/v1.1/project/github';
// Interfaces - Types
export interface ArtifactInfo {
path: string;
pretty_path: string;
node_index: number;
url: string;
}
export type ArtifactResponse = ArtifactInfo[];
export interface BuildInfo {
reponame: string;
failed: boolean;
branch: string;
username: string;
build_num: number;
has_artifacts: boolean;
outcome: string; // e.g. 'success'
vcs_revision: string; // HEAD SHA
// there are other fields but they are not used in this code
}
/**
* A Helper that can interact with the CircleCI API.
*/
export class CircleCiApi {
private tokenParam = `circle-token=${this.circleCiToken}`;
/**
* Construct a helper that can interact with the CircleCI REST API.
* @param githubOrg The Github organisation whose repos we want to access in CircleCI (e.g. angular).
* @param githubRepo The Github repo whose builds we want to access in CircleCI (e.g. angular).
* @param circleCiToken The CircleCI API access token (secret).
*/
constructor(
private githubOrg: string,
private githubRepo: string,
private circleCiToken: string,
) {
assertNotMissingOrEmpty('githubOrg', githubOrg);
assertNotMissingOrEmpty('githubRepo', githubRepo);
assertNotMissingOrEmpty('circleCiToken', circleCiToken);
}
/**
* Get the info for a build from the CircleCI API
* @param buildNumber The CircleCI build number that generated the artifact.
* @returns A promise to the info about the build
*/
public async getBuildInfo(buildNumber: number): Promise<BuildInfo> {
try {
const baseUrl = `${CIRCLE_CI_API_URL}/${this.githubOrg}/${this.githubRepo}/${buildNumber}`;
const response = await fetch(`${baseUrl}?${this.tokenParam}`);
if (response.status !== 200) {
throw new Error(`${baseUrl}: ${response.status} - ${response.statusText}`);
}
return response.json();
} catch (error) {
throw new Error(`CircleCI build info request failed (${error.message})`);
}
}
/**
* Query the CircleCI API to get a URL for a specified artifact from a specified build.
* @param artifactPath The path, within the build to the artifact.
* @returns A promise to the URL that can be requested to download the actual build artifact file.
*/
public async getBuildArtifactUrl(buildNumber: number, artifactPath: string): Promise<string> {
const baseUrl = `${CIRCLE_CI_API_URL}/${this.githubOrg}/${this.githubRepo}/${buildNumber}`;
try {
const response = await fetch(`${baseUrl}/artifacts?${this.tokenParam}`);
const artifacts = await response.json() as ArtifactResponse;
const artifact = artifacts.find(item => item.path === artifactPath);
if (!artifact) {
throw new Error(`Missing artifact (${artifactPath}) for CircleCI build: ${buildNumber}`);
}
return artifact.url;
} catch (error) {
throw new Error(`CircleCI artifact URL request failed (${error.message})`);
}
}
}

View File

@ -1,4 +0,0 @@
// Constants
export const AIO_DOWNLOADS_DIR = '/tmp/aio-downloads';
export const HIDDEN_DIR_PREFIX = 'hidden--';
export const SHORT_SHA_LEN = 7;

View File

@ -1,19 +0,0 @@
import {getEnvVar} from './utils';
export const AIO_ARTIFACT_PATH = getEnvVar('AIO_ARTIFACT_PATH');
export const AIO_BUILDS_DIR = getEnvVar('AIO_BUILDS_DIR');
export const AIO_GITHUB_TOKEN = getEnvVar('AIO_GITHUB_TOKEN');
export const AIO_CIRCLE_CI_TOKEN = getEnvVar('AIO_CIRCLE_CI_TOKEN');
export const AIO_DOMAIN_NAME = getEnvVar('AIO_DOMAIN_NAME');
export const AIO_GITHUB_ORGANIZATION = getEnvVar('AIO_GITHUB_ORGANIZATION');
export const AIO_GITHUB_REPO = getEnvVar('AIO_GITHUB_REPO');
export const AIO_GITHUB_TEAM_SLUGS = getEnvVar('AIO_GITHUB_TEAM_SLUGS');
export const AIO_NGINX_HOSTNAME = getEnvVar('AIO_NGINX_HOSTNAME');
export const AIO_NGINX_PORT_HTTP = +getEnvVar('AIO_NGINX_PORT_HTTP');
export const AIO_NGINX_PORT_HTTPS = +getEnvVar('AIO_NGINX_PORT_HTTPS');
export const AIO_SIGNIFICANT_FILES_PATTERN = getEnvVar('AIO_SIGNIFICANT_FILES_PATTERN');
export const AIO_TRUSTED_PR_LABEL = getEnvVar('AIO_TRUSTED_PR_LABEL');
export const AIO_PREVIEW_SERVER_HOSTNAME = getEnvVar('AIO_PREVIEW_SERVER_HOSTNAME');
export const AIO_PREVIEW_SERVER_PORT = +getEnvVar('AIO_PREVIEW_SERVER_PORT');
export const AIO_ARTIFACT_MAX_SIZE = +getEnvVar('AIO_ARTIFACT_MAX_SIZE');
export const AIO_WWW_USER = getEnvVar('AIO_WWW_USER');

View File

@ -28,34 +28,16 @@ export class GithubApi {
}
// Methods - Public
public get<T = any>(pathname: string, params?: RequestParamsOrNull): Promise<T> {
public get<T>(pathname: string, params?: RequestParamsOrNull): Promise<T> {
const path = this.buildPath(pathname, params);
return this.request<T>('get', path);
}
public post<T = any>(pathname: string, params?: RequestParamsOrNull, data?: any): Promise<T> {
public post<T>(pathname: string, params?: RequestParamsOrNull, data?: any): Promise<T> {
const path = this.buildPath(pathname, params);
return this.request<T>('post', path, data);
}
// In GitHub API paginated requests, page numbering is 1-based. (https://developer.github.com/v3/#pagination)
public getPaginated<T>(pathname: string, baseParams: RequestParams = {}, currentPage: number = 1): Promise<T[]> {
const perPage = 100;
const params = {
...baseParams,
page: currentPage,
per_page: perPage,
};
return this.get<T[]>(pathname, params).then(items => {
if (items.length < perPage) {
return items;
}
return this.getPaginated<T>(pathname, baseParams, currentPage + 1).then(moreItems => [...items, ...moreItems]);
});
}
// Methods - Protected
protected buildPath(pathname: string, params?: RequestParamsOrNull): string {
if (params == null) {
@ -68,6 +50,23 @@ export class GithubApi {
return `${pathname}${joiner}${search}`;
}
protected getPaginated<T>(pathname: string, baseParams: RequestParams = {}, currentPage: number = 0): Promise<T[]> {
const perPage = 100;
const params = {
...baseParams,
page: currentPage,
per_page: perPage,
};
return this.get<T[]>(pathname, params).then(items => {
if (items.length < perPage) {
return items;
}
return this.getPaginated(pathname, baseParams, currentPage + 1).then(moreItems => [...items, ...moreItems]);
});
}
protected request<T>(method: string, path: string, data: any = null): Promise<T> {
return new Promise<T>((resolve, reject) => {
const options = {
@ -82,7 +81,7 @@ export class GithubApi {
reject(`Request to '${url}' failed (status: ${statusCode}): ${responseText}`);
};
const onSuccess = (responseText: string) => {
try { resolve(responseText && JSON.parse(responseText)); } catch (err) { reject(err); }
try { resolve(JSON.parse(responseText)); } catch (err) { reject(err); }
};
const onResponse = (res: IncomingMessage) => {
const statusCode = res.statusCode || -1;
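
The `getPaginated()` helper above accumulates results page by page: it requests 100 items at a time and stops as soon as a page comes back short, with GitHub's 1-based page numbering. A minimal standalone sketch of the same accumulation pattern (the `fetchPage` callback is a hypothetical stand-in for `GithubApi.get()`):

// Illustrative only: keep requesting pages of `perPage` items until a short page
// signals that there is nothing left, then concatenate the results.
async function fetchAllPages<T>(
  fetchPage: (page: number, perPage: number) => Promise<T[]>,
  perPage = 100,
  currentPage = 1,  // GitHub pagination is 1-based.
): Promise<T[]> {
  const items = await fetchPage(currentPage, perPage);
  if (items.length < perPage) {
    return items;  // Last (possibly partial) page.
  }
  const moreItems = await fetchAllPages(fetchPage, perPage, currentPage + 1);
  return [...items, ...moreItems];
}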

View File

@ -1,79 +1,44 @@
// Imports
import {assertNotMissingOrEmpty} from '../common/utils';
import {GithubApi} from './github-api';
import {assert, assertNotMissingOrEmpty} from './utils';
export interface PullRequest {
// Interfaces - Types
export interface PullRequest {
number: number;
user: {login: string};
labels: {name: string}[];
}
export interface FileInfo {
sha: string;
filename: string;
}
export type PullRequestState = 'all' | 'closed' | 'open';
/**
* Access pull requests on GitHub.
*/
export class GithubPullRequests {
public repoSlug: string;
/**
* Create an instance of this helper
* @param api An instance of the GitHub API helper.
* @param githubOrg The organisation on GitHub whose repo we will interrogate.
* @param githubRepo The repository on GitHub with whose PRs we will interact.
*/
constructor(private api: GithubApi, githubOrg: string, githubRepo: string) {
assertNotMissingOrEmpty('githubOrg', githubOrg);
assertNotMissingOrEmpty('githubRepo', githubRepo);
this.repoSlug = `${githubOrg}/${githubRepo}`;
// Classes
export class GithubPullRequests extends GithubApi {
// Constructor
constructor(githubToken: string, protected repoSlug: string) {
super(githubToken);
assertNotMissingOrEmpty('repoSlug', repoSlug);
}
/**
* Post a comment on a PR.
* @param pr The number of the PR on which to comment.
* @param body The body of the comment to post.
* @returns A promise that resolves when the comment has been posted.
*/
public addComment(pr: number, body: string): Promise<any> {
assert(pr > 0, `Invalid PR number: ${pr}`);
assert(!!body, `Invalid or empty comment body: ${body}`);
return this.api.post<any>(`/repos/${this.repoSlug}/issues/${pr}/comments`, null, {body});
// Methods - Public
public addComment(pr: number, body: string): Promise<void> {
if (!(pr > 0)) {
throw new Error(`Invalid PR number: ${pr}`);
} else if (!body) {
throw new Error(`Invalid or empty comment body: ${body}`);
}
return this.post<void>(`/repos/${this.repoSlug}/issues/${pr}/comments`, null, {body});
}
/**
* Request information about a PR.
* @param pr The number of the PR for which to request info.
* @returns A promise that resolves with information about the specified PR.
*/
public fetch(pr: number): Promise<PullRequest> {
assert(pr > 0, `Invalid PR number: ${pr}`);
// Using the `/issues/` URL, because the `/pulls/` one does not provide labels.
return this.api.get<PullRequest>(`/repos/${this.repoSlug}/issues/${pr}`);
return this.get<PullRequest>(`/repos/${this.repoSlug}/pulls/${pr}`);
}
/**
* Request information about all PRs that match the given state.
* @param state Only retrieve PRs that have this state.
* @returns A promise that is resolved with information about the requested PRs.
*/
public fetchAll(state: PullRequestState = 'all'): Promise<PullRequest[]> {
console.log(`Fetching ${state} pull requests...`);
const pathname = `/repos/${this.repoSlug}/pulls`;
const params = {state};
return this.api.getPaginated<PullRequest>(pathname, params);
}
/**
* Request a list of files for the given PR.
* @param pr The number of the PR for which to request files.
* @returns A promise that resolves to an array of file information
*/
public fetchFiles(pr: number): Promise<FileInfo[]> {
assert(pr > 0, `Invalid PR number: ${pr}`);
return this.api.getPaginated<FileInfo>(`/repos/${this.repoSlug}/pulls/${pr}/files`);
return this.getPaginated<PullRequest>(pathname, params);
}
}
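
A rough usage sketch against the variant above whose constructor takes a `GithubApi` instance plus org/repo; the token, org/repo values and the comment text are placeholders:

import {GithubApi} from './github-api';
import {GithubPullRequests} from './github-pull-requests';

const api = new GithubApi('<github-token>');
const prs = new GithubPullRequests(api, 'angular', 'angular');

async function inspectPr(prNumber: number): Promise<void> {
  // `fetch()` uses the `/issues/` endpoint, so labels are included in the response.
  const prInfo = await prs.fetch(prNumber);
  const files = await prs.fetchFiles(prNumber);
  console.log(prInfo.user.login, prInfo.labels.map(label => label.name), files.length);
  await prs.addComment(prNumber, 'The preview for this PR is being processed.');
}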

View File

@ -1,72 +1,45 @@
// Imports
import {assertNotMissingOrEmpty} from '../common/utils';
import {GithubApi} from './github-api';
import {assertNotMissingOrEmpty} from './utils';
export interface Team {
// Interfaces - Types
interface Team {
id: number;
slug: string;
}
export interface TeamMembership {
interface TeamMembership {
state: string;
}
export class GithubTeams {
/**
* Create an instance of this helper
* @param api An instance of the Github API helper.
* @param githubOrg The organisation on GitHub whose repo we will interrogate.
*/
constructor(private api: GithubApi, protected githubOrg: string) {
assertNotMissingOrEmpty('githubOrg', githubOrg);
// Classes
export class GithubTeams extends GithubApi {
// Constructor
constructor(githubToken: string, protected organization: string) {
super(githubToken);
assertNotMissingOrEmpty('organization', organization);
}
/**
* Request information about all the organisation's teams in GitHub.
* @returns A promise that is resolved with information about the teams.
*/
// Methods - Public
public fetchAll(): Promise<Team[]> {
return this.api.getPaginated<Team>(`/orgs/${this.githubOrg}/teams`);
return this.getPaginated<Team>(`/orgs/${this.organization}/teams`);
}
/**
* Check whether the specified username is a member of any of the specified teams.
* @param username The username to check for in the teams.
* @param teamIds The IDs of the teams to check for the username.
* @returns a Promise that resolves to `true` if the username is a member of any of the teams.
*/
public async isMemberById(username: string, teamIds: number[]): Promise<boolean> {
public isMemberById(username: string, teamIds: number[]): Promise<boolean> {
const getMembership = (teamId: number) =>
this.get<TeamMembership>(`/teams/${teamId}/memberships/${username}`).
then(membership => membership.state === 'active').
catch(() => false);
const reduceFn = (promise: Promise<boolean>, teamId: number) =>
promise.then(isMember => isMember || getMembership(teamId));
const getMembership = async (teamId: number) => {
try {
const {state} = await this.api.get<TeamMembership>(`/teams/${teamId}/memberships/${username}`);
return state === 'active';
} catch (error) {
return false;
}
};
for (const teamId of teamIds) {
if (await getMembership(teamId)) {
return true;
}
}
return false;
return teamIds.reduce(reduceFn, Promise.resolve(false));
}
/**
* Check whether the given username is a member of the teams specified by the team slugs.
* @param username The username to check for in the teams.
* @param teamSlugs A collection of slugs that represent the teams to check for the username.
* @returns a Promise that resolves to `true` if the username is a member of at least one of the specified teams.
*/
public async isMemberBySlug(username: string, teamSlugs: string[]): Promise<boolean> {
try {
const teams = await this.fetchAll();
const teamIds = teams.filter(team => teamSlugs.includes(team.slug)).map(team => team.id);
return await this.isMemberById(username, teamIds);
} catch (error) {
return false;
}
public isMemberBySlug(username: string, teamSlugs: string[]): Promise<boolean> {
return this.fetchAll().
then(teams => teams.filter(team => teamSlugs.includes(team.slug)).map(team => team.id)).
then(teamIds => this.isMemberById(username, teamIds)).
catch(() => false);
}
}
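
Similarly, a sketch for the `GithubApi`-injecting variant of `GithubTeams`; the organisation and team slugs are placeholders:

import {GithubApi} from './github-api';
import {GithubTeams} from './github-teams';

const teams = new GithubTeams(new GithubApi('<github-token>'), 'some-org');

// Resolves team slugs to IDs via `fetchAll()`, then checks each membership in turn,
// treating API errors as "not a member" rather than failing the whole check.
async function isTrustedAuthor(username: string): Promise<boolean> {
  return teams.isMemberBySlug(username, ['team-a', 'team-b']);
}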

View File

@ -1,14 +1,14 @@
// We can't use `import...from` here, because of the following mess:
// - GitHub project `jasmine/jasmine` is `jasmine-core` on npm and its typings `@types/jasmine`.
// - GitHub project `jasmine/jasmine-npm` is `jasmine` on npm and has no typings.
//
// Using `import...from 'jasmine'` here would import from `@types/jasmine` (which refers to the
// `jasmine-core` module and the `jasmine` module).
import Jasmine = require('jasmine');
import 'source-map-support/register';
export const runTests = (specFiles: string[]) => {
export const runTests = (specFiles: string[], helpers?: string[]) => {
// We can't use `import` here, because of the following mess:
// - GitHub project `jasmine/jasmine` is `jasmine-core` on npm and its typings `@types/jasmine`.
// - GitHub project `jasmine/jasmine-npm` is `jasmine` on npm and has no typings.
//
// Using `import...from 'jasmine'` here would import from `@types/jasmine` (which refers to the
// `jasmine-core` module and the `jasmine` module).
// tslint:disable-next-line: no-var-requires variable-name
const Jasmine = require('jasmine');
const config = {
helpers,
random: true,
spec_files: specFiles,
stopSpecOnExpectationFailure: true,
@ -16,7 +16,7 @@ export const runTests = (specFiles: string[]) => {
process.on('unhandledRejection', (reason: any) => console.log('Unhandled rejection:', reason));
const runner = new Jasmine({});
const runner = new Jasmine();
runner.loadConfig(config);
runner.onComplete((passed: boolean) => process.exit(passed ? 0 : 1));
runner.execute();
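
A minimal invocation of the test runner, assuming the two-argument variant above that also loads helper files; the module path and globs are placeholders:

import {runTests} from './run-tests';

// Load the helpers (e.g. custom matchers) before the specs; Jasmine exits the process
// with 0 or 1 depending on whether all specs passed.
runTests(
  ['dist/**/*.spec.js'],
  ['dist/**/*.helper.js'],
);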

View File

@ -1,98 +1,17 @@
import {basename, resolve as resolvePath} from 'path';
import {SHORT_SHA_LEN} from './constants';
/**
* Shorten a SHA to make it more readable.
* @param sha The SHA to shorten.
*/
export function computeShortSha(sha: string) {
return sha.substr(0, SHORT_SHA_LEN);
}
/**
* Compute the path for a downloaded artifact file.
* @param downloadsDir The directory where artifacts are downloaded
* @param pr The PR associated with this artifact.
* @param sha The SHA associated with the build for this artifact.
* @param artifactPath The path to the artifact on CircleCI.
* @returns The fully resolved location for the specified downloaded artifact.
*/
export function computeArtifactDownloadPath(downloadsDir: string, pr: number, sha: string, artifactPath: string) {
return resolvePath(downloadsDir, `${pr}-${computeShortSha(sha)}-${basename(artifactPath)}`);
}
/**
* Extract the PR number and latest commit SHA from a downloaded file path.
* @param downloadPath the path to the downloaded file.
* @returns An object whose keys are the PR and SHA extracted from the file path.
*/
export function getPrInfoFromDownloadPath(downloadPath: string) {
const file = basename(downloadPath);
const [pr, sha] = file.split('-');
return {pr: +pr, sha};
}
/**
* Assert that a value is true.
* @param value The value to assert.
* @param message The message if the value is not true.
*/
export function assert(value: boolean, message: string) {
if (!value) {
throw new Error(message);
}
}
/**
* Assert that a parameter is not equal to "".
* @param name The name of the parameter.
* @param value The value of the parameter.
*/
// Functions
export const assertNotMissingOrEmpty = (name: string, value: string | null | undefined) => {
assert(!!value, `Missing or empty required parameter '${name}'!`);
if (!value) {
throw new Error(`Missing or empty required parameter '${name}'!`);
}
};
/**
* Get an environment variable.
* @param name The name of the environment variable.
* @param isOptional True if the variable is optional.
* @returns The value of the variable or "" if it is optional and falsy.
* @throws `Error` if the variable is falsy and not optional.
*/
export const getEnvVar = (name: string, isOptional = false): string => {
const value = process.env[name];
if (!isOptional && !value) {
try {
throw new Error(`ERROR: Missing required environment variable '${name}'!`);
} catch (error) {
console.error(error.stack);
process.exit(1);
}
console.error(`ERROR: Missing required environment variable '${name}'!`);
process.exit(1);
}
return value || '';
};
/**
* A basic logger implementation.
* Delegates to `console`, but prepends each message with the current date and the specified scope (i.e. the caller).
*/
export class Logger {
private padding = ' '.repeat(20 - this.scope.length);
/**
* Create a new `Logger` instance for the specified `scope`.
* @param scope The logger's scope (added to all messages).
*/
constructor(private scope: string) {}
public error(...args: any[]) { this.callMethod('error', args); }
public info(...args: any[]) { this.callMethod('info', args); }
public log(...args: any[]) { this.callMethod('log', args); }
public warn(...args: any[]) { this.callMethod('warn', args); }
private callMethod(method: 'error' | 'info' | 'log' | 'warn', args: any[]) {
console[method](`[${new Date()}]`, `${this.scope}:${this.padding}`, ...args);
}
}
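
The two path helpers above are designed to round-trip: the download path encodes the PR number and the 7-character short SHA, and `getPrInfoFromDownloadPath()` reads them back out. For example (the artifact path is illustrative):

import {computeArtifactDownloadPath, getPrInfoFromDownloadPath} from './utils';

const downloadPath = computeArtifactDownloadPath(
  '/tmp/aio-downloads',         // downloadsDir (matches AIO_DOWNLOADS_DIR above)
  12345,                        // PR number
  '1234567890'.repeat(4),       // full 40-character SHA
  'aio/dist/aio-snapshot.tgz',  // artifact path on CircleCI
);
// downloadPath === '/tmp/aio-downloads/12345-1234567-aio-snapshot.tgz'

const {pr, sha} = getPrInfoFromDownloadPath(downloadPath);
// pr === 12345, sha === '1234567'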

View File

@ -1,144 +0,0 @@
// Imports
import * as cp from 'child_process';
import {EventEmitter} from 'events';
import * as fs from 'fs';
import * as path from 'path';
import * as shell from 'shelljs';
import {HIDDEN_DIR_PREFIX} from '../common/constants';
import {assertNotMissingOrEmpty, computeShortSha, Logger} from '../common/utils';
import {ChangedPrVisibilityEvent, CreatedBuildEvent} from './build-events';
import {PreviewServerError} from './preview-error';
// Classes
export class BuildCreator extends EventEmitter {
private logger = new Logger('BuildCreator');
// Constructor
constructor(protected buildsDir: string) {
super();
assertNotMissingOrEmpty('buildsDir', buildsDir);
}
// Methods - Public
public create(pr: number, sha: string, archivePath: string, isPublic: boolean): Promise<void> {
// Use only part of the SHA for more readable URLs.
sha = computeShortSha(sha);
const {newPrDir: prDir} = this.getCandidatePrDirs(pr, isPublic);
const shaDir = path.join(prDir, sha);
let dirToRemoveOnError: string;
return Promise.resolve().
// If the same PR exists with different visibility, update the visibility first.
then(() => this.updatePrVisibility(pr, isPublic)).
then(() => Promise.all([this.exists(prDir), this.exists(shaDir)])).
then(([prDirExisted, shaDirExisted]) => {
if (shaDirExisted) {
const publicOrNot = isPublic ? 'public' : 'non-public';
throw new PreviewServerError(409, `Request to overwrite existing ${publicOrNot} directory: ${shaDir}`);
}
dirToRemoveOnError = prDirExisted ? shaDir : prDir;
return Promise.resolve().
then(() => shell.mkdir('-p', shaDir)).
then(() => this.extractArchive(archivePath, shaDir)).
then(() => this.emit(CreatedBuildEvent.type, new CreatedBuildEvent(+pr, sha, isPublic))).
then(() => undefined);
}).
catch(err => {
if (dirToRemoveOnError) {
shell.rm('-rf', dirToRemoveOnError);
}
if (!(err instanceof PreviewServerError)) {
err = new PreviewServerError(500, `Error while creating preview at: ${shaDir}\n${err}`);
}
throw err;
});
}
public updatePrVisibility(pr: number, makePublic: boolean): Promise<boolean> {
const {oldPrDir: otherVisPrDir, newPrDir: targetVisPrDir} = this.getCandidatePrDirs(pr, makePublic);
return Promise.
all([this.exists(otherVisPrDir), this.exists(targetVisPrDir)]).
then(([otherVisPrDirExisted, targetVisPrDirExisted]) => {
if (!otherVisPrDirExisted) {
// No visibility change: Either the visibility is up-to-date or the PR does not exist.
return false;
} else if (targetVisPrDirExisted) {
// Error: Directories for both visibilities exist.
throw new PreviewServerError(409,
`Request to move '${otherVisPrDir}' to existing directory '${targetVisPrDir}'.`);
}
// Visibility change: Moving `otherVisPrDir` to `targetVisPrDir`.
return Promise.resolve().
then(() => shell.mv(otherVisPrDir, targetVisPrDir)).
then(() => this.listShasByDate(targetVisPrDir)).
then(shas => this.emit(ChangedPrVisibilityEvent.type, new ChangedPrVisibilityEvent(+pr, shas, makePublic))).
then(() => true);
}).
catch(err => {
if (!(err instanceof PreviewServerError)) {
err = new PreviewServerError(500, `Error while making PR ${pr} ${makePublic ? 'public' : 'hidden'}.\n${err}`);
}
throw err;
});
}
// Methods - Protected
protected exists(fileOrDir: string): Promise<boolean> {
return new Promise(resolve => fs.access(fileOrDir, err => resolve(!err)));
}
protected extractArchive(inputFile: string, outputDir: string): Promise<void> {
return new Promise<void>((resolve, reject) => {
const cmd = `tar --extract --gzip --directory "${outputDir}" --file "${inputFile}"`;
cp.exec(cmd, (err, _stdout, stderr) => {
if (err) {
return reject(err);
}
if (stderr) {
this.logger.warn(stderr);
}
try {
shell.chmod('-R', 'a-w', outputDir);
shell.rm('-f', inputFile);
resolve();
} catch (err) {
reject(err);
}
});
});
}
protected getCandidatePrDirs(pr: number, isPublic: boolean): {oldPrDir: string, newPrDir: string} {
const hiddenPrDir = path.join(this.buildsDir, HIDDEN_DIR_PREFIX + pr);
const publicPrDir = path.join(this.buildsDir, `${pr}`);
const oldPrDir = isPublic ? hiddenPrDir : publicPrDir;
const newPrDir = isPublic ? publicPrDir : hiddenPrDir;
return {oldPrDir, newPrDir};
}
protected listShasByDate(inputDir: string): Promise<string[]> {
return Promise.resolve().
then(() => shell.ls('-l', inputDir) as any as Promise<(fs.Stats & {name: string})[]>).
// Keep directories only.
// (Also, convert to standard Array - ShellJS provides custom `sort()` method for sorting file contents.)
then(items => items.filter(item => item.isDirectory())).
// Sort by modification date.
then(items => items.sort((a, b) => a.mtime.getTime() - b.mtime.getTime())).
// Return directory names.
then(items => items.map(item => item.name));
}
}
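
A hypothetical wiring of the `BuildCreator` above (the builds directory and archive path are placeholders): `create()` shortens the SHA, fixes the PR's visibility first if needed, extracts the archive, and emits a `CreatedBuildEvent`, rejecting with a 409 `PreviewServerError` if that build already exists.

import {BuildCreator} from './build-creator';
import {ChangedPrVisibilityEvent, CreatedBuildEvent} from './build-events';

const creator = new BuildCreator('/var/www/aio-builds');

creator.on(CreatedBuildEvent.type, ({pr, sha, isPublic}: CreatedBuildEvent) =>
  console.log(`Created ${isPublic ? 'public' : 'hidden'} preview for PR ${pr} at ${sha}.`));

creator.on(ChangedPrVisibilityEvent.type, ({pr, shas, isPublic}: ChangedPrVisibilityEvent) =>
  console.log(`PR ${pr} is now ${isPublic ? 'public' : 'hidden'} (${shas.length} builds).`));

creator
  .create(123, '1234567890'.repeat(4), '/tmp/aio-downloads/123-1234567-aio-snapshot.tgz', true)
  .catch(err => console.error(err));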

View File

@ -1,16 +0,0 @@
// Classes
export class ChangedPrVisibilityEvent {
// Properties - Public, Static
public static type = 'pr.changedVisibility';
// Constructor
constructor(public pr: number, public shas: string[], public isPublic: boolean) {}
}
export class CreatedBuildEvent {
// Properties - Public, Static
public static type = 'build.created';
// Constructor
constructor(public pr: number, public sha: string, public isPublic: boolean) {}
}

View File

@ -1,83 +0,0 @@
import * as fs from 'fs';
import fetch from 'node-fetch';
import {dirname} from 'path';
import {mkdir} from 'shelljs';
import {promisify} from 'util';
import {CircleCiApi} from '../common/circle-ci-api';
import {assert, assertNotMissingOrEmpty, computeArtifactDownloadPath, Logger} from '../common/utils';
import {PreviewServerError} from './preview-error';
export interface GithubInfo {
org: string;
pr: number;
repo: string;
sha: string;
success: boolean;
}
/**
* A helper that can get information about builds and download build artifacts.
*/
export class BuildRetriever {
private logger = new Logger('BuildRetriever');
constructor(private api: CircleCiApi, private downloadSizeLimit: number, private downloadDir: string) {
assert(downloadSizeLimit > 0, 'Invalid parameter "downloadSizeLimit" should be a number greater than 0.');
assertNotMissingOrEmpty('downloadDir', downloadDir);
}
/**
* Get GitHub information about a build
* @param buildNum The number of the build for which to retrieve the info.
* @returns The GitHub org, repo, PR and latest SHA for the specified build.
*/
public async getGithubInfo(buildNum: number): Promise<GithubInfo> {
const buildInfo = await this.api.getBuildInfo(buildNum);
const githubInfo: GithubInfo = {
org: buildInfo.username,
pr: getPrFromBranch(buildInfo.branch),
repo: buildInfo.reponame,
sha: buildInfo.vcs_revision,
success: !buildInfo.failed,
};
return githubInfo;
}
/**
* Make a request to the given URL for a build artifact and store it locally.
* @param buildNum the number of the CircleCI build whose artifact we want to download.
* @param pr the number of the PR that triggered the CircleCI build.
* @param sha the commit in the PR that triggered the CircleCI build.
* @param artifactPath the path on CircleCI where the artifact was stored.
* @returns A promise to the file path where the downloaded file was stored.
*/
public async downloadBuildArtifact(buildNum: number, pr: number, sha: string, artifactPath: string): Promise<string> {
try {
const outPath = computeArtifactDownloadPath(this.downloadDir, pr, sha, artifactPath);
const downloadExists = await new Promise(resolve => fs.exists(outPath, exists => resolve(exists)));
if (!downloadExists) {
const url = await this.api.getBuildArtifactUrl(buildNum, artifactPath);
const response = await fetch(url, {size: this.downloadSizeLimit});
if (response.status !== 200) {
throw new PreviewServerError(response.status, `Error ${response.status} - ${response.statusText}`);
}
const buffer = await response.buffer();
mkdir('-p', dirname(outPath));
await promisify(fs.writeFile)(outPath, buffer);
}
return outPath;
} catch (error) {
this.logger.warn(error);
const status = (error.type === 'max-size') ? 413 : 500;
throw new PreviewServerError(status, `CircleCI artifact download failed (${error.message || error})`);
}
}
}
function getPrFromBranch(branch: string): number {
// CircleCI only exposes PR numbers via the `branch` field :-(
const match = /^pull\/(\d+)$/.exec(branch);
if (!match) {
throw new Error(`No PR found in branch field: ${branch}`);
}
return +match[1];
}
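
A sketch of how the retriever above is meant to be driven; the token, size limit and download directory are placeholders:

import {CircleCiApi} from '../common/circle-ci-api';
import {BuildRetriever} from './build-retriever';

const circleCiApi = new CircleCiApi('some-org', 'some-repo', '<circle-ci-token>');
const retriever = new BuildRetriever(circleCiApi, 100 * 1024 * 1024, '/tmp/aio-downloads');

async function fetchArtifact(buildNum: number, artifactPath: string): Promise<string> {
  const {pr, sha, success} = await retriever.getGithubInfo(buildNum);
  if (!success) {
    throw new Error(`Build ${buildNum} did not succeed; nothing to download.`);
  }
  // Resolves to the local path of the (possibly already cached) downloaded artifact.
  return retriever.downloadBuildArtifact(buildNum, pr, sha, artifactPath);
}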

View File

@ -1,46 +0,0 @@
import {GithubPullRequests, PullRequest} from '../common/github-pull-requests';
import {GithubTeams} from '../common/github-teams';
import {assertNotMissingOrEmpty} from '../common/utils';
/**
* A helper to verify whether builds are trusted.
*/
export class BuildVerifier {
/**
* Construct a new BuildVerifier instance.
* @param prs A helper to access PR information.
* @param teams A helper to access GitHub team information.
* @param allowedTeamSlugs The teams that are trusted.
* @param trustedPrLabel The GitHub label that indicates that a PR is trusted.
*/
constructor(protected prs: GithubPullRequests, protected teams: GithubTeams,
protected allowedTeamSlugs: string[], protected trustedPrLabel: string) {
assertNotMissingOrEmpty('allowedTeamSlugs', allowedTeamSlugs && allowedTeamSlugs.join(''));
assertNotMissingOrEmpty('trustedPrLabel', trustedPrLabel);
}
/**
* Check whether a PR contains files that are significant to the build.
* @param pr The number of the PR to check
* @param significantFilePattern A regex that selects files that are significant.
*/
public async getSignificantFilesChanged(pr: number, significantFilePattern: RegExp): Promise<boolean> {
const files = await this.prs.fetchFiles(pr);
return files.some(file => significantFilePattern.test(file.filename));
}
/**
* Check whether a PR is trusted.
* @param pr The number of the PR to check.
* @returns true if the PR is trusted.
*/
public async getPrIsTrusted(pr: number): Promise<boolean> {
const prInfo = await this.prs.fetch(pr);
return this.hasLabel(prInfo, this.trustedPrLabel) ||
(await this.teams.isMemberBySlug(prInfo.user.login, this.allowedTeamSlugs));
}
protected hasLabel(prInfo: PullRequest, label: string): boolean {
return prInfo.labels.some(labelObj => labelObj.name === label);
}
}
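
The two checks above combine into the preview server's "can this PR have a public preview?" decision. A simplified sketch (the file pattern is a placeholder; the real one comes from `AIO_SIGNIFICANT_FILES_PATTERN`):

import {BuildVerifier} from './build-verifier';

async function canHavePublicPreview(verifier: BuildVerifier, pr: number): Promise<boolean> {
  const significantFilesRe = /^(?:aio|packages)\//;  // placeholder pattern
  // Public previews require both significant changes and a trusted PR
  // (trusted label, or author in an allowed team).
  return await verifier.getSignificantFilesChanged(pr, significantFilesRe) &&
         await verifier.getPrIsTrusted(pr);
}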

View File

@ -1,41 +0,0 @@
// Imports
import {AIO_DOWNLOADS_DIR} from '../common/constants';
import {
AIO_ARTIFACT_MAX_SIZE,
AIO_ARTIFACT_PATH,
AIO_BUILDS_DIR,
AIO_CIRCLE_CI_TOKEN,
AIO_DOMAIN_NAME,
AIO_GITHUB_ORGANIZATION,
AIO_GITHUB_REPO,
AIO_GITHUB_TEAM_SLUGS,
AIO_GITHUB_TOKEN,
AIO_PREVIEW_SERVER_HOSTNAME,
AIO_PREVIEW_SERVER_PORT,
AIO_SIGNIFICANT_FILES_PATTERN,
AIO_TRUSTED_PR_LABEL,
} from '../common/env-variables';
import {PreviewServerFactory} from './preview-server-factory';
// Run
_main();
// Functions
function _main(): void {
PreviewServerFactory
.create({
buildArtifactPath: AIO_ARTIFACT_PATH,
buildsDir: AIO_BUILDS_DIR,
circleCiToken: AIO_CIRCLE_CI_TOKEN,
domainName: AIO_DOMAIN_NAME,
downloadSizeLimit: AIO_ARTIFACT_MAX_SIZE,
downloadsDir: AIO_DOWNLOADS_DIR,
githubOrg: AIO_GITHUB_ORGANIZATION,
githubRepo: AIO_GITHUB_REPO,
githubTeamSlugs: AIO_GITHUB_TEAM_SLUGS.split(','),
githubToken: AIO_GITHUB_TOKEN,
significantFilesPattern: AIO_SIGNIFICANT_FILES_PATTERN,
trustedPrLabel: AIO_TRUSTED_PR_LABEL,
})
.listen(AIO_PREVIEW_SERVER_PORT, AIO_PREVIEW_SERVER_HOSTNAME);
}

View File

@ -1,8 +0,0 @@
// Classes
export class PreviewServerError extends Error {
// Constructor
constructor(public status: number = 500, message?: string) {
super(message);
Object.setPrototypeOf(this, PreviewServerError.prototype);
}
}

View File

@ -1,213 +0,0 @@
// Imports
import * as bodyParser from 'body-parser';
import * as express from 'express';
import * as http from 'http';
import {AddressInfo} from 'net';
import {CircleCiApi} from '../common/circle-ci-api';
import {GithubApi} from '../common/github-api';
import {GithubPullRequests} from '../common/github-pull-requests';
import {GithubTeams} from '../common/github-teams';
import {assert, assertNotMissingOrEmpty, computeShortSha, Logger} from '../common/utils';
import {BuildCreator} from './build-creator';
import {ChangedPrVisibilityEvent, CreatedBuildEvent} from './build-events';
import {BuildRetriever} from './build-retriever';
import {BuildVerifier} from './build-verifier';
import {respondWithError, throwRequestError} from './utils';
const AIO_PREVIEW_JOB = 'aio_preview';
// Interfaces - Types
export interface PreviewServerConfig {
downloadsDir: string;
downloadSizeLimit: number;
buildArtifactPath: string;
buildsDir: string;
domainName: string;
githubOrg: string;
githubRepo: string;
githubTeamSlugs: string[];
circleCiToken: string;
githubToken: string;
significantFilesPattern: string;
trustedPrLabel: string;
}
const logger = new Logger('PreviewServer');
// Classes
export class PreviewServerFactory {
// Methods - Public
public static create(cfg: PreviewServerConfig): http.Server {
assertNotMissingOrEmpty('domainName', cfg.domainName);
const circleCiApi = new CircleCiApi(cfg.githubOrg, cfg.githubRepo, cfg.circleCiToken);
const githubApi = new GithubApi(cfg.githubToken);
const prs = new GithubPullRequests(githubApi, cfg.githubOrg, cfg.githubRepo);
const teams = new GithubTeams(githubApi, cfg.githubOrg);
const buildRetriever = new BuildRetriever(circleCiApi, cfg.downloadSizeLimit, cfg.downloadsDir);
const buildVerifier = new BuildVerifier(prs, teams, cfg.githubTeamSlugs, cfg.trustedPrLabel);
const buildCreator = PreviewServerFactory.createBuildCreator(prs, cfg.buildsDir, cfg.domainName);
const middleware = PreviewServerFactory.createMiddleware(buildRetriever, buildVerifier, buildCreator, cfg);
const httpServer = http.createServer(middleware as any);
httpServer.on('listening', () => {
const info = httpServer.address() as AddressInfo;
logger.info(`Up and running (and listening on ${info.address}:${info.port})...`);
});
return httpServer;
}
public static createMiddleware(buildRetriever: BuildRetriever, buildVerifier: BuildVerifier,
buildCreator: BuildCreator, cfg: PreviewServerConfig): express.Express {
const middleware = express();
const jsonParser = bodyParser.json();
const significantFilesRe = new RegExp(cfg.significantFilesPattern);
// RESPOND TO IS-ALIVE PING
middleware.get(/^\/health-check\/?$/, (_req, res) => res.sendStatus(200));
// RESPOND TO CAN-HAVE-PUBLIC-PREVIEW CHECK
const canHavePublicPreviewRe = /^\/can-have-public-preview\/(\d+)\/?$/;
middleware.get(canHavePublicPreviewRe, async (req, res) => {
try {
const pr = +canHavePublicPreviewRe.exec(req.url)![1];
if (!await buildVerifier.getSignificantFilesChanged(pr, significantFilesRe)) {
// Cannot have preview: PR did not touch relevant files: `aio/` or `packages/` (except for spec files).
res.send({canHavePublicPreview: false, reason: 'No significant files touched.'});
logger.log(`PR:${pr} - Cannot have a public preview, because it did not touch any significant files.`);
} else if (!await buildVerifier.getPrIsTrusted(pr)) {
// Cannot have preview: PR not automatically verifiable as "trusted".
res.send({canHavePublicPreview: false, reason: 'Not automatically verifiable as "trusted".'});
logger.log(`PR:${pr} - Cannot have a public preview, because not automatically verifiable as "trusted".`);
} else {
// Can have preview.
res.send({canHavePublicPreview: true, reason: null});
logger.log(`PR:${pr} - Can have a public preview.`);
}
} catch (err) {
logger.error('Previewability check error', err);
respondWithError(res, err);
}
});
// CIRCLE_CI BUILD COMPLETE WEBHOOK
middleware.post(/^\/circle-build\/?$/, jsonParser, async (req, res) => {
try {
if (!(
req.is('json') &&
req.body &&
req.body.payload &&
req.body.payload.build_num > 0 &&
req.body.payload.build_parameters &&
req.body.payload.build_parameters.CIRCLE_JOB
)) {
throwRequestError(400, `Incorrect body content. Expected JSON`, req);
}
const job = req.body.payload.build_parameters.CIRCLE_JOB;
const buildNum = req.body.payload.build_num;
logger.log(`Build:${buildNum}, Job:${job} - processing web-hook trigger`);
if (job !== AIO_PREVIEW_JOB) {
res.sendStatus(204);
logger.log(`Build:${buildNum}, Job:${job} -`,
`Skipping preview processing because this is not the "${AIO_PREVIEW_JOB}" job.`);
return;
}
const { pr, sha, org, repo, success } = await buildRetriever.getGithubInfo(buildNum);
if (!success) {
res.sendStatus(204);
logger.log(`PR:${pr}, Build:${buildNum} - Skipping preview processing because this build did not succeed.`);
return;
}
assert(cfg.githubOrg === org,
`Invalid webhook: expected "githubOrg" property to equal "${cfg.githubOrg}" but got "${org}".`);
assert(cfg.githubRepo === repo,
`Invalid webhook: expected "githubRepo" property to equal "${cfg.githubRepo}" but got "${repo}".`);
// Do not deploy unless this PR has touched relevant files: `aio/` or `packages/` (except for spec files)
if (!await buildVerifier.getSignificantFilesChanged(pr, significantFilesRe)) {
res.sendStatus(204);
logger.log(`PR:${pr}, Build:${buildNum} - ` +
`Skipping preview processing because this PR did not touch any significant files.`);
return;
}
const artifactPath = await buildRetriever.downloadBuildArtifact(buildNum, pr, sha, cfg.buildArtifactPath);
const isPublic = await buildVerifier.getPrIsTrusted(pr);
await buildCreator.create(pr, sha, artifactPath, isPublic);
res.sendStatus(isPublic ? 201 : 202);
logger.log(`PR:${pr}, SHA:${computeShortSha(sha)}, Build:${buildNum} - ` +
`Successfully created ${isPublic ? 'public' : 'non-public'} preview.`);
} catch (err) {
logger.error('CircleCI webhook error', err);
respondWithError(res, err);
}
});
// GITHUB PR UPDATED WEBHOOK
middleware.post(/^\/pr-updated\/?$/, jsonParser, async (req, res) => {
const { action, number: prNo }: { action?: string, number?: number } = req.body;
const visMayHaveChanged = !action || (action === 'labeled') || (action === 'unlabeled');
try {
if (!visMayHaveChanged) {
res.sendStatus(200);
} else if (!prNo) {
throwRequestError(400, `Missing or empty 'number' field`, req);
} else {
const isPublic = await buildVerifier.getPrIsTrusted(prNo);
await buildCreator.updatePrVisibility(prNo, isPublic);
res.sendStatus(200);
}
} catch (err) {
logger.error('PR update hook error', err);
respondWithError(res, err);
}
});
// ALL OTHER REQUESTS
middleware.all('*', req => throwRequestError(404, 'Unknown resource', req));
middleware.use((err: any, _req: any, res: express.Response, _next: any) => {
const statusText = http.STATUS_CODES[err.status] || '???';
logger.error(`Preview server error: ${err.status} - ${statusText}:`, err.message);
respondWithError(res, err);
});
return middleware;
}
public static createBuildCreator(prs: GithubPullRequests, buildsDir: string, domainName: string): BuildCreator {
const buildCreator = new BuildCreator(buildsDir);
const postPreviewsComment = (pr: number, shas: string[]) => {
const body = shas.
map(sha => `You can preview ${sha} at https://pr${pr}-${sha}.${domainName}/.`).
join('\n');
return prs.addComment(pr, body);
};
buildCreator.on(CreatedBuildEvent.type, ({pr, sha, isPublic}: CreatedBuildEvent) => {
if (isPublic) {
postPreviewsComment(pr, [sha]);
}
});
buildCreator.on(ChangedPrVisibilityEvent.type, ({pr, shas, isPublic}: ChangedPrVisibilityEvent) => {
if (isPublic && shas.length) {
postPreviewsComment(pr, shas);
}
});
return buildCreator;
}
}
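
For reference, the `/circle-build` endpoint above expects a JSON body of the following shape (values illustrative); anything else is rejected with a 400, and builds for jobs other than `aio_preview` or for failed builds are acknowledged with a 204 and skipped:

const exampleCircleCiWebhookBody = {
  payload: {
    build_num: 12345,
    build_parameters: {
      CIRCLE_JOB: 'aio_preview',
    },
  },
};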

View File

@ -1,29 +0,0 @@
import * as express from 'express';
import {promisify} from 'util';
import {PreviewServerError} from './preview-error';
/**
* Update the response to report that an error has occurred.
* @param res The response to configure as an error.
* @param err The error that needs to be reported.
*/
export async function respondWithError(res: express.Response, err: any): Promise<void> {
if (!(err instanceof PreviewServerError)) {
err = new PreviewServerError(500, String((err && err.message) || err));
}
res.status(err.status);
await promisify(res.end.bind(res))(err.message);
}
/**
* Throw an exception that describes the given error information.
* @param status The HTTP status code to include in the error.
* @param error The error message to include in the error.
* @param req The request that triggered this error.
*/
export function throwRequestError(status: number, error: string, req: express.Request): never {
const message = `${error} in request: ${req.method} ${req.originalUrl}` +
(!req.body ? '' : ` ${JSON.stringify(req.body)}`);
throw new PreviewServerError(status, message);
}
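
A hypothetical route showing how the two helpers are intended to be combined: throw a `PreviewServerError` that captures the request, and translate whatever is caught into an HTTP response (the route path is a placeholder):

import * as express from 'express';
import {respondWithError, throwRequestError} from './utils';

const exampleRouter = express.Router().get('/builds/:pr', async (req, res) => {
  try {
    if (!/^\d+$/.test(req.params.pr)) {
      throwRequestError(400, 'Invalid PR number', req);
    }
    res.sendStatus(200);
  } catch (err) {
    await respondWithError(res, err);
  }
});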

View File

@ -0,0 +1,81 @@
// Imports
import * as cp from 'child_process';
import {EventEmitter} from 'events';
import * as fs from 'fs';
import * as path from 'path';
import * as shell from 'shelljs';
import {assertNotMissingOrEmpty} from '../common/utils';
import {CreatedBuildEvent} from './build-events';
import {UploadError} from './upload-error';
// Classes
export class BuildCreator extends EventEmitter {
// Constructor
constructor(protected buildsDir: string) {
super();
assertNotMissingOrEmpty('buildsDir', buildsDir);
}
// Methods - Public
public create(pr: string, sha: string, archivePath: string): Promise<any> {
const prDir = path.join(this.buildsDir, pr);
const shaDir = path.join(prDir, sha);
let dirToRemoveOnError: string;
return Promise.
all([this.exists(prDir), this.exists(shaDir)]).
then(([prDirExisted, shaDirExisted]) => {
if (shaDirExisted) {
throw new UploadError(409, `Request to overwrite existing directory: ${shaDir}`);
}
dirToRemoveOnError = prDirExisted ? shaDir : prDir;
return Promise.resolve().
then(() => shell.mkdir('-p', shaDir)).
then(() => this.extractArchive(archivePath, shaDir)).
then(() => this.emit(CreatedBuildEvent.type, new CreatedBuildEvent(+pr, sha)));
}).
catch(err => {
if (dirToRemoveOnError) {
shell.rm('-rf', dirToRemoveOnError);
}
if (!(err instanceof UploadError)) {
err = new UploadError(500, `Error while uploading to directory: ${shaDir}\n${err}`);
}
throw err;
});
}
// Methods - Protected
protected exists(fileOrDir: string): Promise<boolean> {
return new Promise(resolve => fs.access(fileOrDir, err => resolve(!err)));
}
protected extractArchive(inputFile: string, outputDir: string): Promise<void> {
return new Promise<void>((resolve, reject) => {
const cmd = `tar --extract --gzip --directory "${outputDir}" --file "${inputFile}"`;
cp.exec(cmd, (err, _stdout, stderr) => {
if (err) {
return reject(err);
}
if (stderr) {
console.warn(stderr);
}
try {
// Undocumented signature (see https://github.com/shelljs/shelljs/pull/663).
(shell as any).chmod('-R', 'a-w', outputDir);
shell.rm('-f', inputFile);
resolve();
} catch (err) {
reject(err);
}
});
});
}
}

View File

@ -0,0 +1,15 @@
// Classes
export class BuildEvent {
// Constructor
constructor(public type: string, public pr: number, public sha: string) {}
}
export class CreatedBuildEvent extends BuildEvent {
// Properties - Public, Static
public static type = 'build.created';
// Constructor
constructor(pr: number, sha: string) {
super(CreatedBuildEvent.type, pr, sha);
}
}

View File

@ -0,0 +1,78 @@
// Imports
import * as jwt from 'jsonwebtoken';
import {GithubPullRequests} from '../common/github-pull-requests';
import {GithubTeams} from '../common/github-teams';
import {assertNotMissingOrEmpty} from '../common/utils';
import {UploadError} from './upload-error';
// Interfaces - Types
interface JwtPayload {
slug: string;
'pull-request': number;
}
// Classes
export class BuildVerifier {
// Properties - Protected
protected githubPullRequests: GithubPullRequests;
protected githubTeams: GithubTeams;
// Constructor
constructor(protected secret: string, githubToken: string, protected repoSlug: string, organization: string,
protected allowedTeamSlugs: string[]) {
assertNotMissingOrEmpty('secret', secret);
assertNotMissingOrEmpty('githubToken', githubToken);
assertNotMissingOrEmpty('repoSlug', repoSlug);
assertNotMissingOrEmpty('organization', organization);
assertNotMissingOrEmpty('allowedTeamSlugs', allowedTeamSlugs && allowedTeamSlugs.join(''));
this.githubPullRequests = new GithubPullRequests(githubToken, repoSlug);
this.githubTeams = new GithubTeams(githubToken, organization);
}
// Methods - Public
public getPrAuthorTeamMembership(pr: number): Promise<{author: string, isMember: boolean}> {
return Promise.resolve().
then(() => this.githubPullRequests.fetch(pr)).
then(prInfo => prInfo.user.login).
then(author => this.githubTeams.isMemberBySlug(author, this.allowedTeamSlugs).
then(isMember => ({author, isMember})));
}
public verify(expectedPr: number, authHeader: string): Promise<void> {
return Promise.resolve().
then(() => this.extractJwtString(authHeader)).
then(jwtString => this.verifyJwt(expectedPr, jwtString)).
then(jwtPayload => this.verifyPr(jwtPayload['pull-request'])).
catch(err => { throw new UploadError(403, `Error while verifying upload for PR ${expectedPr}: ${err}`); });
}
// Methods - Protected
protected extractJwtString(input: string): string {
return input.replace(/^token +/i, '');
}
protected verifyJwt(expectedPr: number, token: string): Promise<JwtPayload> {
return new Promise((resolve, reject) => {
jwt.verify(token, this.secret, {issuer: 'Travis CI, GmbH'}, (err, payload) => {
if (err) {
reject(err.message || err);
} else if (payload.slug !== this.repoSlug) {
reject(`jwt slug invalid. expected: ${this.repoSlug}`);
} else if (payload['pull-request'] !== expectedPr) {
reject(`jwt pull-request invalid. expected: ${expectedPr}`);
} else {
resolve(payload);
}
});
});
}
protected verifyPr(pr: number): Promise<void> {
return this.getPrAuthorTeamMembership(pr).
then(({author, isMember}) => isMember ? Promise.resolve() : Promise.reject(
`User '${author}' is not an active member of any of the following teams: ` +
`${this.allowedTeamSlugs.join(', ')}`,
));
}
}
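
For reference, the JWT passed to `verify()` above (via an `Authorization: token <jwt>` header) must be signed with the shared secret, issued by 'Travis CI, GmbH', and carry a payload of this shape (values illustrative); any mismatch is surfaced as a 403 `UploadError`:

const exampleJwtPayload = {
  'slug': 'some-org/some-repo',  // must equal the configured repoSlug
  'pull-request': 12345,         // must equal the PR number being uploaded
};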

View File

@ -0,0 +1,39 @@
// Imports
import {getEnvVar} from '../common/utils';
import {BuildVerifier} from './build-verifier';
// Run
_main();
// Functions
function _main() {
const secret = 'unused';
const githubToken = getEnvVar('AIO_GITHUB_TOKEN');
const repoSlug = getEnvVar('AIO_REPO_SLUG');
const organization = getEnvVar('AIO_GITHUB_ORGANIZATION');
const allowedTeamSlugs = getEnvVar('AIO_GITHUB_TEAM_SLUGS').split(',');
const pr = +getEnvVar('AIO_PREVERIFY_PR');
const buildVerifier = new BuildVerifier(secret, githubToken, repoSlug, organization, allowedTeamSlugs);
// Exit codes:
// - 0: The PR author is a member.
// - 1: The PR author is not a member.
// - 2: An error occurred.
buildVerifier.getPrAuthorTeamMembership(pr).
then(({author, isMember}) => {
if (isMember) {
process.exit(0);
} else {
const errorMessage = `User '${author}' is not an active member of any of the following teams: ` +
`${allowedTeamSlugs.join(', ')}`;
onError(errorMessage, 1);
}
}).
catch(err => onError(err, 2));
}
function onError(err: string, exitCode: number) {
console.error(err);
process.exit(exitCode || 1);
}

View File

@ -0,0 +1,10 @@
// Imports
import {GithubPullRequests} from '../common/github-pull-requests';
import {BuildVerifier} from './build-verifier';
// Run
// TODO(gkalpak): Add e2e tests to cover these interactions as well.
GithubPullRequests.prototype.addComment = () => Promise.resolve();
BuildVerifier.prototype.verify = () => Promise.resolve();
// tslint:disable-next-line: no-var-requires
require('./index');

View File

@ -0,0 +1,35 @@
// TODO(gkalpak): Find more suitable way to run as `www-data`.
process.setuid('www-data');
// Imports
import {getEnvVar} from '../common/utils';
import {uploadServerFactory} from './upload-server-factory';
// Constants
const AIO_BUILDS_DIR = getEnvVar('AIO_BUILDS_DIR');
const AIO_DOMAIN_NAME = getEnvVar('AIO_DOMAIN_NAME');
const AIO_GITHUB_ORGANIZATION = getEnvVar('AIO_GITHUB_ORGANIZATION');
const AIO_GITHUB_TEAM_SLUGS = getEnvVar('AIO_GITHUB_TEAM_SLUGS');
const AIO_GITHUB_TOKEN = getEnvVar('AIO_GITHUB_TOKEN');
const AIO_PREVIEW_DEPLOYMENT_TOKEN = getEnvVar('AIO_PREVIEW_DEPLOYMENT_TOKEN');
const AIO_REPO_SLUG = getEnvVar('AIO_REPO_SLUG');
const AIO_UPLOAD_HOSTNAME = getEnvVar('AIO_UPLOAD_HOSTNAME');
const AIO_UPLOAD_PORT = +getEnvVar('AIO_UPLOAD_PORT');
// Run
_main();
// Functions
function _main() {
uploadServerFactory.
create({
buildsDir: AIO_BUILDS_DIR,
domainName: AIO_DOMAIN_NAME,
githubOrganization: AIO_GITHUB_ORGANIZATION,
githubTeamSlugs: AIO_GITHUB_TEAM_SLUGS.split(','),
githubToken: AIO_GITHUB_TOKEN,
repoSlug: AIO_REPO_SLUG,
secret: AIO_PREVIEW_DEPLOYMENT_TOKEN,
}).
listen(AIO_UPLOAD_PORT, AIO_UPLOAD_HOSTNAME);
}

View File

@ -0,0 +1,8 @@
// Classes
export class UploadError extends Error {
// Constructor
constructor(public status: number = 500, message?: string) {
super(message);
Object.setPrototypeOf(this, UploadError.prototype);
}
}

View File

@ -0,0 +1,117 @@
// Imports
import * as express from 'express';
import * as http from 'http';
import {GithubPullRequests} from '../common/github-pull-requests';
import {assertNotMissingOrEmpty} from '../common/utils';
import {BuildCreator} from './build-creator';
import {CreatedBuildEvent} from './build-events';
import {BuildVerifier} from './build-verifier';
import {UploadError} from './upload-error';
// Constants
const AUTHORIZATION_HEADER = 'AUTHORIZATION';
const X_FILE_HEADER = 'X-FILE';
// Interfaces - Types
interface UploadServerConfig {
buildsDir: string;
domainName: string;
githubOrganization: string;
githubTeamSlugs: string[];
githubToken: string;
repoSlug: string;
secret: string;
}
// Classes
class UploadServerFactory {
// Methods - Public
public create({
buildsDir,
domainName,
githubOrganization,
githubTeamSlugs,
githubToken,
repoSlug,
secret,
}: UploadServerConfig): http.Server {
assertNotMissingOrEmpty('domainName', domainName);
const buildVerifier = new BuildVerifier(secret, githubToken, repoSlug, githubOrganization, githubTeamSlugs);
const buildCreator = this.createBuildCreator(buildsDir, githubToken, repoSlug, domainName);
const middleware = this.createMiddleware(buildVerifier, buildCreator);
const httpServer = http.createServer(middleware);
httpServer.on('listening', () => {
const info = httpServer.address();
console.info(`Up and running (and listening on ${info.address}:${info.port})...`);
});
return httpServer;
}
// Methods - Protected
protected createBuildCreator(buildsDir: string, githubToken: string, repoSlug: string,
domainName: string): BuildCreator {
const buildCreator = new BuildCreator(buildsDir);
const githubPullRequests = new GithubPullRequests(githubToken, repoSlug);
buildCreator.on(CreatedBuildEvent.type, ({pr, sha}: CreatedBuildEvent) => {
const body = `The angular.io preview for ${sha} is available [here][1].\n\n` +
`[1]: https://pr${pr}-${sha}.${domainName}/`;
githubPullRequests.addComment(pr, body);
});
return buildCreator;
}
protected createMiddleware(buildVerifier: BuildVerifier, buildCreator: BuildCreator): express.Express {
const middleware = express();
middleware.get(/^\/create-build\/([1-9][0-9]*)\/([0-9a-f]{40})\/?$/, (req, res) => {
const pr = req.params[0];
const sha = req.params[1];
const archive = req.header(X_FILE_HEADER);
const authHeader = req.header(AUTHORIZATION_HEADER);
if (!authHeader) {
this.throwRequestError(401, `Missing or empty '${AUTHORIZATION_HEADER}' header`, req);
} else if (!archive) {
this.throwRequestError(400, `Missing or empty '${X_FILE_HEADER}' header`, req);
}
buildVerifier.
verify(+pr, authHeader).
then(() => buildCreator.create(pr, sha, archive)).
then(() => res.sendStatus(201)).
catch(err => this.respondWithError(res, err));
});
middleware.get(/^\/health-check\/?$/, (_req, res) => res.sendStatus(200));
middleware.get('*', req => this.throwRequestError(404, 'Unknown resource', req));
middleware.all('*', req => this.throwRequestError(405, 'Unsupported method', req));
middleware.use((err: any, _req: any, res: express.Response, _next: any) => this.respondWithError(res, err));
return middleware;
}
protected respondWithError(res: express.Response, err: any) {
if (!(err instanceof UploadError)) {
err = new UploadError(500, String((err && err.message) || err));
}
const statusText = http.STATUS_CODES[err.status] || '???';
console.error(`Upload error: ${err.status} - ${statusText}`);
console.error(err.message);
res.status(err.status).end(err.message);
}
protected throwRequestError(status: number, error: string, req: express.Request) {
throw new UploadError(status, `${error} in request: ${req.method} ${req.originalUrl}`);
}
}
// Exports
export const uploadServerFactory = new UploadServerFactory();
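
A hypothetical client request against the `/create-build/<pr>/<sha>` endpoint above: the signed JWT goes in the `AUTHORIZATION` header (prefixed with `token `) and the path of the uploaded archive in the `X-FILE` header. Hostname, port and values are placeholders.

import * as http from 'http';

const req = http.request({
  hostname: 'upload.example.com',
  port: 80,
  method: 'GET',
  path: `/create-build/12345/${'1234567890'.repeat(4)}`,  // PR number + 40-char SHA
  headers: {
    'AUTHORIZATION': 'token <signed-jwt>',
    'X-FILE': '/tmp/uploads/12345-snapshot.tgz',
  },
}, res => console.log(`Upload server responded with ${res.statusCode}.`));
req.end();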

View File

@ -1,37 +0,0 @@
export const enum BuildNums {
BUILD_INFO_ERROR = 1,
BUILD_INFO_404,
BUILD_INFO_BUILD_FAILED,
BUILD_INFO_INVALID_GH_ORG,
BUILD_INFO_INVALID_GH_REPO,
CHANGED_FILES_ERROR,
CHANGED_FILES_404,
CHANGED_FILES_NONE,
BUILD_ARTIFACTS_ERROR,
BUILD_ARTIFACTS_404,
BUILD_ARTIFACTS_EMPTY,
BUILD_ARTIFACTS_MISSING,
DOWNLOAD_ARTIFACT_ERROR,
DOWNLOAD_ARTIFACT_404,
DOWNLOAD_ARTIFACT_TOO_BIG,
TRUST_CHECK_ERROR,
TRUST_CHECK_UNTRUSTED,
TRUST_CHECK_TRUSTED_LABEL,
TRUST_CHECK_ACTIVE_TRUSTED_USER,
TRUST_CHECK_INACTIVE_TRUSTED_USER,
}
export const enum PrNums {
CHANGED_FILES_ERROR = 1,
CHANGED_FILES_404,
CHANGED_FILES_NONE,
TRUST_CHECK_ERROR,
TRUST_CHECK_UNTRUSTED,
TRUST_CHECK_TRUSTED_LABEL,
TRUST_CHECK_ACTIVE_TRUSTED_USER,
TRUST_CHECK_INACTIVE_TRUSTED_USER,
}
export const SHA = '1234567890'.repeat(4);
export const ALT_SHA = 'abcde'.repeat(8);
export const SIMILAR_SHA = SHA.slice(0, -1) + 'A';

View File

@ -1,10 +0,0 @@
declare module 'delete-empty' {
interface Options {
dryRun: boolean;
verbose: boolean;
filter: (filePath: string) => boolean;
}
export default function deleteEmpty(cwd: string, options?: Options): Promise<string[]>;
export default function deleteEmpty(cwd: string, options?: Options, callback?: (err: any, deleted: string[]) => void): void;
export function sync(cwd: string, options?: Options): string[];
}

View File

@ -1,19 +1,23 @@
// Imports
import * as cp from 'child_process';
import * as fs from 'fs';
import * as http from 'http';
import * as path from 'path';
import * as shell from 'shelljs';
import {AIO_DOWNLOADS_DIR, HIDDEN_DIR_PREFIX} from '../common/constants';
import {
AIO_BUILDS_DIR,
AIO_NGINX_PORT_HTTP,
AIO_NGINX_PORT_HTTPS,
AIO_WWW_USER,
} from '../common/env-variables';
import {computeShortSha, Logger} from '../common/utils';
import {getEnvVar} from '../common/utils';
// Constants
const SERVER_USER = 'www-data';
const TEST_AIO_BUILDS_DIR = getEnvVar('TEST_AIO_BUILDS_DIR');
const TEST_AIO_NGINX_HOSTNAME = getEnvVar('TEST_AIO_NGINX_HOSTNAME');
const TEST_AIO_NGINX_PORT_HTTP = +getEnvVar('TEST_AIO_NGINX_PORT_HTTP');
const TEST_AIO_NGINX_PORT_HTTPS = +getEnvVar('TEST_AIO_NGINX_PORT_HTTPS');
const TEST_AIO_UPLOAD_HOSTNAME = getEnvVar('TEST_AIO_UPLOAD_HOSTNAME');
const TEST_AIO_UPLOAD_MAX_SIZE = +getEnvVar('TEST_AIO_UPLOAD_MAX_SIZE');
const TEST_AIO_UPLOAD_PORT = +getEnvVar('TEST_AIO_UPLOAD_PORT');
// Interfaces - Types
export interface CmdResult { success: boolean; err: Error | null; stdout: string; stderr: string; }
export interface CmdResult { success: boolean; err: Error; stdout: string; stderr: string; }
export interface FileSpecs { content?: string; size?: number; }
export type CleanUpFn = () => void;
@ -22,74 +26,79 @@ export type VerifyCmdResultFn = (result: CmdResult) => void;
// Classes
class Helper {
// Properties - Public
public get buildsDir() { return TEST_AIO_BUILDS_DIR; }
public get nginxHostname() { return TEST_AIO_NGINX_HOSTNAME; }
public get nginxPortHttp() { return TEST_AIO_NGINX_PORT_HTTP; }
public get nginxPortHttps() { return TEST_AIO_NGINX_PORT_HTTPS; }
public get serverUser() { return SERVER_USER; }
public get uploadHostname() { return TEST_AIO_UPLOAD_HOSTNAME; }
public get uploadPort() { return TEST_AIO_UPLOAD_PORT; }
public get uploadMaxSize() { return TEST_AIO_UPLOAD_MAX_SIZE; }
// Properties - Protected
protected cleanUpFns: CleanUpFn[] = [];
protected portPerScheme: {[scheme: string]: number} = {
http: AIO_NGINX_PORT_HTTP,
https: AIO_NGINX_PORT_HTTPS,
http: this.nginxPortHttp,
https: this.nginxPortHttps,
};
private logger = new Logger('TestHelper');
// Constructor
constructor() {
shell.mkdir('-p', AIO_BUILDS_DIR);
shell.exec(`chown -R ${AIO_WWW_USER} ${AIO_BUILDS_DIR}`);
shell.mkdir('-p', AIO_DOWNLOADS_DIR);
shell.exec(`chown -R ${AIO_WWW_USER} ${AIO_DOWNLOADS_DIR}`);
shell.mkdir('-p', this.buildsDir);
shell.exec(`chown -R ${this.serverUser} ${this.buildsDir}`);
}
// Methods - Public
public cleanUp(): void {
public cleanUp() {
while (this.cleanUpFns.length) {
// Clean-up fns remove themselves from the list.
this.cleanUpFns[0]();
}
const leftoverDownloads = fs.readdirSync(AIO_DOWNLOADS_DIR);
const leftoverBuilds = fs.readdirSync(AIO_BUILDS_DIR);
if (leftoverDownloads.length) {
this.logger.log(`Downloads directory '${AIO_DOWNLOADS_DIR}' is not empty after clean-up.`, leftoverDownloads);
shell.rm('-rf', `${AIO_DOWNLOADS_DIR}/*`);
}
if (leftoverBuilds.length) {
this.logger.log(`Builds directory '${AIO_BUILDS_DIR}' is not empty after clean-up.`, leftoverBuilds);
shell.rm('-rf', `${AIO_BUILDS_DIR}/*`);
}
if (leftoverBuilds.length || leftoverDownloads.length) {
throw new Error(`Unexpected test files not cleaned up.`);
if (fs.readdirSync(this.buildsDir).length) {
throw new Error(`Directory '${this.buildsDir}' is not empty after clean-up.`);
}
}
public createDummyBuild(pr: number, sha: string, isPublic = true, force = false, legacy = false): CleanUpFn {
const prDir = this.getPrDir(pr, isPublic);
const shaDir = this.getShaDir(prDir, sha, legacy);
public createDummyArchive(pr: string, sha: string, archivePath: string): CleanUpFn {
const inputDir = path.join(this.buildsDir, 'uploaded', pr, sha);
const cmd1 = `tar --create --gzip --directory "${inputDir}" --file "${archivePath}" .`;
const cmd2 = `chown ${this.serverUser} ${archivePath}`;
const cleanUpTemp = this.createDummyBuild(`uploaded/${pr}`, sha, true);
shell.exec(cmd1);
shell.exec(cmd2);
cleanUpTemp();
return this.createCleanUpFn(() => shell.rm('-rf', archivePath));
}
public createDummyBuild(pr: string, sha: string, force = false): CleanUpFn {
const prDir = path.join(this.buildsDir, pr);
const shaDir = path.join(prDir, sha);
const idxPath = path.join(shaDir, 'index.html');
const barPath = path.join(shaDir, 'foo', 'bar.js');
this.writeFile(idxPath, {content: `PR: ${pr} | SHA: ${sha} | File: /index.html`}, force);
this.writeFile(barPath, {content: `PR: ${pr} | SHA: ${sha} | File: /foo/bar.js`}, force);
shell.exec(`chown -R ${AIO_WWW_USER} ${prDir}`);
shell.exec(`chown -R ${this.serverUser} ${prDir}`);
return this.createCleanUpFn(() => shell.rm('-rf', prDir));
}
public getPrDir(pr: number, isPublic: boolean): string {
const prDirName = isPublic ? '' + pr : HIDDEN_DIR_PREFIX + pr;
return path.join(AIO_BUILDS_DIR, prDirName);
public deletePrDir(pr: string) {
const prDir = path.join(this.buildsDir, pr);
if (fs.existsSync(prDir)) {
// Undocumented signature (see https://github.com/shelljs/shelljs/pull/663).
(shell as any).chmod('-R', 'a+w', prDir);
shell.rm('-rf', prDir);
}
}
public getShaDir(prDir: string, sha: string, legacy = false): string {
return path.join(prDir, legacy ? sha : computeShortSha(sha));
}
public readBuildFile(pr: number, sha: string, relFilePath: string, isPublic = true, legacy = false): string {
const shaDir = this.getShaDir(this.getPrDir(pr, isPublic), sha, legacy);
const absFilePath = path.join(shaDir, relFilePath);
public readBuildFile(pr: string, sha: string, relFilePath: string): string {
const absFilePath = path.join(this.buildsDir, pr, sha, relFilePath);
return fs.readFileSync(absFilePath, 'utf8');
}
@ -100,38 +109,46 @@ class Helper {
});
}
public runForAllSupportedSchemes(suiteFactory: TestSuiteFactory): void {
public runForAllSupportedSchemes(suiteFactory: TestSuiteFactory) {
Object.keys(this.portPerScheme).forEach(scheme => suiteFactory(scheme, this.portPerScheme[scheme]));
}
public verifyResponse(status: number, regex: string | RegExp = /^/): VerifyCmdResultFn {
public verifyResponse(status: number | [number, string], regex = /^/): VerifyCmdResultFn {
let statusCode: number;
let statusText: string;
if (Array.isArray(status)) {
statusCode = status[0];
statusText = status[1];
} else {
statusCode = status;
statusText = http.STATUS_CODES[statusCode];
}
return (result: CmdResult) => {
const [headers, body] = result.stdout.
split(/(?:\r?\n){2,}/).
map(s => s.trim()).
slice(-2); // In case of redirect, discard the previous headers.
// Only keep the last two sections (final headers and body).
slice(-2);
if (!result.success) {
this.logger.log('Stdout:', result.stdout);
this.logger.error('Stderr:', result.stderr);
this.logger.error('Error:', result.err);
console.log('Stdout:', result.stdout);
console.log('Stderr:', result.stderr);
console.log('Error:', result.err);
}
expect(result.success).toBe(true);
expect(headers).toMatch(new RegExp(`HTTP/(?:1\\.1|2) ${status} `));
expect(headers).toContain(`${statusCode} ${statusText}`);
expect(body).toMatch(regex);
};
}
public writeBuildFile(pr: number, sha: string, relFilePath: string, content: string, isPublic = true,
legacy = false): void {
const shaDir = this.getShaDir(this.getPrDir(pr, isPublic), sha, legacy);
const absFilePath = path.join(shaDir, relFilePath);
this.writeFile(absFilePath, {content}, true);
public writeBuildFile(pr: string, sha: string, relFilePath: string, content: string): CleanUpFn {
const absFilePath = path.join(this.buildsDir, pr, sha, relFilePath);
return this.writeFile(absFilePath, {content}, true);
}
public writeFile(filePath: string, {content, size}: FileSpecs, force = false): void {
public writeFile(filePath: string, {content, size}: FileSpecs, force = false): CleanUpFn {
if (!force && fs.existsSync(filePath)) {
throw new Error(`Refusing to overwrite existing file '${filePath}'.`);
}
@ -149,11 +166,13 @@ class Helper {
// Create a file with the specified content.
fs.writeFileSync(filePath, content || '');
}
shell.exec(`chown ${AIO_WWW_USER} ${filePath}`);
shell.exec(`chown ${this.serverUser} ${filePath}`);
return this.createCleanUpFn(() => shell.rm('-rf', cleanUpTarget));
}
// Methods - Protected
protected createCleanUpFn(fn: () => void): CleanUpFn {
protected createCleanUpFn(fn: Function): CleanUpFn {
const cleanUpFn = () => {
const idx = this.cleanUpFns.indexOf(cleanUpFn);
if (idx !== -1) {
@ -168,70 +187,5 @@ class Helper {
}
}
interface DefaultCurlOptions {
defaultMethod?: CurlOptions['method'];
defaultOptions?: CurlOptions['options'];
defaultHeaders?: CurlOptions['headers'];
defaultData?: CurlOptions['data'];
defaultExtraPath?: CurlOptions['extraPath'];
}
interface CurlOptions {
method?: string;
options?: string;
headers?: string[];
data?: any;
url?: string;
extraPath?: string;
}
export function makeCurl(baseUrl: string, {
defaultMethod = 'POST',
defaultOptions = '',
defaultHeaders = ['Content-Type: application/json'],
defaultData = {},
defaultExtraPath = '',
}: DefaultCurlOptions = {}) {
return function curl({
method = defaultMethod,
options = defaultOptions,
headers = defaultHeaders,
data = defaultData,
url = baseUrl,
extraPath = defaultExtraPath,
}: CurlOptions) {
const dataString = data ? JSON.stringify(data) : '';
const cmd = `curl -iLX ${method} ` +
`${options} ` +
headers.map(header => `--header "${header}" `).join('') +
`--data '${dataString}' ` +
`${url}${extraPath}`;
return helper.runCmd(cmd);
};
}
export interface PayloadData {
data: {
payload: {
build_num: number,
build_parameters: {
CIRCLE_JOB: string;
};
};
};
}
export function payload(buildNum: number): PayloadData {
return {
data: {
payload: {
build_num: buildNum,
build_parameters: { CIRCLE_JOB: 'aio_preview' },
},
},
};
}
// Exports
export const helper = new Helper();
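
A sketch of how the `makeCurl`/`payload` helpers above are meant to be used from a spec; module paths, hostname, port and the expected status are illustrative, and `helper.runCmd()` is assumed to resolve to a `CmdResult`:

import {BuildNums} from './constants';
import {helper, makeCurl, payload} from './helper';

const curl = makeCurl('http://preview.test:80/circle-build');

describe('circle-build webhook', () => {
  it('should accept a fake CircleCI webhook for a trusted build', async () => {
    // `payload()` supplies the `{payload: {build_num, build_parameters}}` body and
    // `makeCurl()`'s defaults take care of the POST method and JSON content type.
    const result = await curl(payload(BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER));
    helper.verifyResponse(201)(result);
  });
});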

View File

@ -1,7 +0,0 @@
declare module jasmine {
interface Matchers {
toExistAsAFile(remove?: boolean): boolean;
toExistAsABuild(remove?: boolean): boolean;
toExistAsAnArtifact(remove?: boolean): boolean;
}
}

View File

@ -1,88 +0,0 @@
import {sync as deleteEmpty} from 'delete-empty';
import {existsSync, unlinkSync} from 'fs';
import {join} from 'path';
import {AIO_DOWNLOADS_DIR} from '../common/constants';
import {computeShortSha} from '../common/utils';
import {SHA} from './constants';
import {helper} from './helper';
function checkFile(filePath: string, remove: boolean): boolean {
const exists = existsSync(filePath);
if (exists && remove) {
// If the file exists and removal was requested, delete it to prevent leftover-file errors.
unlinkSync(filePath);
}
return exists;
}
function getArtifactPath(prNum: number, sha: string = SHA): string {
return `${AIO_DOWNLOADS_DIR}/${prNum}-${computeShortSha(sha)}-aio-snapshot.tgz`;
}
function checkFiles(prNum: number, isPublic: boolean, sha: string, isLegacy: boolean, remove: boolean) {
const files = ['/index.html', '/foo/bar.js'];
const prPath = helper.getPrDir(prNum, isPublic);
const shaPath = helper.getShaDir(prPath, sha, isLegacy);
const existingFiles: string[] = [];
const missingFiles: string[] = [];
files
.map(file => join(shaPath, file))
.forEach(file => (checkFile(file, remove) ? existingFiles : missingFiles).push(file));
deleteEmpty(prPath);
return { existingFiles, missingFiles };
}
class ToExistAsAFile implements jasmine.CustomMatcher {
public compare(actual: string, remove = true): jasmine.CustomMatcherResult {
const pass = checkFile(actual, remove);
return {
message: `Expected file at "${actual}" ${pass ? 'not' : ''} to exist`,
pass,
};
}
}
class ToExistAsAnArtifact implements jasmine.CustomMatcher {
public compare(actual: {prNum: number, sha?: string}, remove = true): jasmine.CustomMatcherResult {
const { prNum, sha = SHA } = actual;
const filePath = getArtifactPath(prNum, sha);
const pass = checkFile(filePath, remove);
return {
message: `Expected artifact "PR:${prNum}, SHA:${sha}, FILE:${filePath}" ${pass ? 'not' : '\b'} to exist`,
pass,
};
}
}
class ToExistAsABuild implements jasmine.CustomMatcher {
public compare(actual: {prNum: number, isPublic?: boolean, sha?: string, isLegacy?: boolean}, remove = true):
jasmine.CustomMatcherResult {
const {prNum, isPublic = true, sha = SHA, isLegacy = false} = actual;
const {missingFiles} = checkFiles(prNum, isPublic, sha, isLegacy, remove);
return {
message: `Expected files for build "PR:${prNum}, SHA:${sha}" to exist:\n` +
missingFiles.map(file => ` - ${file}`).join('\n'),
pass: missingFiles.length === 0,
};
}
public negativeCompare(actual: {prNum: number, isPublic?: boolean, sha?: string, isLegacy?: boolean}):
jasmine.CustomMatcherResult {
const {prNum, isPublic = true, sha = SHA, isLegacy = false} = actual;
const { existingFiles } = checkFiles(prNum, isPublic, sha, isLegacy, false);
return {
message: `Expected files for build "PR:${prNum}, SHA:${sha}" not to exist:\n` +
existingFiles.map(file => ` - ${file}`).join('\n'),
pass: existingFiles.length === 0,
};
}
}
export const customMatchers = {
toExistAsABuild: () => new ToExistAsABuild(),
toExistAsAFile: () => new ToExistAsAFile(),
toExistAsAnArtifact: () => new ToExistAsAnArtifact(),
};
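// In the spec files below, these matchers are registered per spec and double as cleanup
// (placeholder PR/SHA values shown here), e.g.:
//   beforeEach(() => jasmine.addMatchers(customMatchers));
//   ...
//   expect({ prNum: 42, sha: '1'.repeat(40) }).toExistAsABuild();       // assert and remove the files
//   expect({ prNum: 42, isPublic: false }).not.toExistAsABuild(false);  // assert only, keep the files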

View File

@ -1,171 +0,0 @@
/* tslint:disable:max-line-length */
import * as nock from 'nock';
import * as tar from 'tar-stream';
import {gzipSync} from 'zlib';
import {getEnvVar, Logger} from '../common/utils';
import {BuildNums, PrNums, SHA} from './constants';
// We use the `nock` library to fake responses to REST requests during testing.
// This is necessary because the test preview-server runs as a separate node process from
// the test harness, so we do not have direct access to its code (e.g. for mocking).
// (See also 'lib/verify-setup/start-test-preview-server.ts'.)
// Each potential request to an external API (e.g. GitHub or CircleCI) is mocked below and
// returns a suitable response. This is quite complicated to set up, since the response
// from, say, CircleCI affects what request is made to, say, GitHub.
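// For example, a single persistent stub takes roughly this shape (hypothetical endpoint,
// shown only to illustrate the pattern used for the real interceptors below):
//   nock('https://api.example.test')
//     .persist()                         // keep answering for the whole test run
//     .get('/builds/42')
//     .reply(200, {failed: false});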
const logger = new Logger('mock-external-apis');
const log = (...args: any[]) => {
// Filter out non-matching URL checks
if (!/^matching.+: false$/.test(args[0])) {
logger.log(...args);
}
};
const AIO_CIRCLE_CI_TOKEN = getEnvVar('AIO_CIRCLE_CI_TOKEN');
const AIO_GITHUB_TOKEN = getEnvVar('AIO_GITHUB_TOKEN');
const AIO_ARTIFACT_PATH = getEnvVar('AIO_ARTIFACT_PATH');
const AIO_GITHUB_ORGANIZATION = getEnvVar('AIO_GITHUB_ORGANIZATION');
const AIO_GITHUB_REPO = getEnvVar('AIO_GITHUB_REPO');
const AIO_TRUSTED_PR_LABEL = getEnvVar('AIO_TRUSTED_PR_LABEL');
const AIO_GITHUB_TEAM_SLUGS = getEnvVar('AIO_GITHUB_TEAM_SLUGS').split(',');
const ACTIVE_TRUSTED_USER = 'active-trusted-user';
const INACTIVE_TRUSTED_USER = 'inactive-trusted-user';
const UNTRUSTED_USER = 'untrusted-user';
const BASIC_BUILD_INFO = {
branch: `pull/${PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER}`,
failed: false,
reponame: AIO_GITHUB_REPO,
username: AIO_GITHUB_ORGANIZATION,
vcs_revision: SHA,
};
const ISSUE_INFO_TRUSTED_LABEL = { labels: [{ name: AIO_TRUSTED_PR_LABEL }], user: { login: UNTRUSTED_USER } };
const ISSUE_INFO_ACTIVE_TRUSTED_USER = { labels: [], user: { login: ACTIVE_TRUSTED_USER } };
const ISSUE_INFO_INACTIVE_TRUSTED_USER = { labels: [], user: { login: INACTIVE_TRUSTED_USER } };
const ISSUE_INFO_UNTRUSTED = { labels: [], user: { login: UNTRUSTED_USER } };
const ACTIVE_STATE = { state: 'active' };
const INACTIVE_STATE = { state: 'inactive' };
const TEST_TEAM_INFO = AIO_GITHUB_TEAM_SLUGS.map((slug, index) => ({ slug, id: index }));
const CIRCLE_CI_API_HOST = 'https://circleci.com';
const CIRCLE_CI_TOKEN_PARAM = `circle-token=${AIO_CIRCLE_CI_TOKEN}`;
const ARTIFACT_1 = { path: 'artifact-1', url: `${CIRCLE_CI_API_HOST}/artifacts/artifact-1`, _urlPath: '/artifacts/artifact-1' };
const ARTIFACT_2 = { path: 'artifact-2', url: `${CIRCLE_CI_API_HOST}/artifacts/artifact-2`, _urlPath: '/artifacts/artifact-2' };
const ARTIFACT_3 = { path: 'artifact-3', url: `${CIRCLE_CI_API_HOST}/artifacts/artifact-3`, _urlPath: '/artifacts/artifact-3' };
const ARTIFACT_ERROR = { path: AIO_ARTIFACT_PATH, url: `${CIRCLE_CI_API_HOST}/artifacts/error`, _urlPath: '/artifacts/error' };
const ARTIFACT_404 = { path: AIO_ARTIFACT_PATH, url: `${CIRCLE_CI_API_HOST}/artifacts/404`, _urlPath: '/artifacts/404' };
const ARTIFACT_VALID_TRUSTED_USER = { path: AIO_ARTIFACT_PATH, url: `${CIRCLE_CI_API_HOST}/artifacts/valid/user`, _urlPath: '/artifacts/valid/user' };
const ARTIFACT_VALID_TRUSTED_LABEL = { path: AIO_ARTIFACT_PATH, url: `${CIRCLE_CI_API_HOST}/artifacts/valid/label`, _urlPath: '/artifacts/valid/label' };
const ARTIFACT_VALID_UNTRUSTED = { path: AIO_ARTIFACT_PATH, url: `${CIRCLE_CI_API_HOST}/artifacts/valid/untrusted`, _urlPath: '/artifacts/valid/untrusted' };
const CIRCLE_CI_BUILD_INFO_URL = `/api/v1.1/project/github/${AIO_GITHUB_ORGANIZATION}/${AIO_GITHUB_REPO}`;
const buildInfoUrl = (buildNum: number) => `${CIRCLE_CI_BUILD_INFO_URL}/${buildNum}?${CIRCLE_CI_TOKEN_PARAM}`;
const buildArtifactsUrl = (buildNum: number) => `${CIRCLE_CI_BUILD_INFO_URL}/${buildNum}/artifacts?${CIRCLE_CI_TOKEN_PARAM}`;
const buildInfo = (prNum: number) => ({ ...BASIC_BUILD_INFO, branch: `pull/${prNum}` });
const GITHUB_API_HOST = 'https://api.github.com';
const GITHUB_ISSUES_URL = `/repos/${AIO_GITHUB_ORGANIZATION}/${AIO_GITHUB_REPO}/issues`;
const GITHUB_PULLS_URL = `/repos/${AIO_GITHUB_ORGANIZATION}/${AIO_GITHUB_REPO}/pulls`;
const GITHUB_TEAMS_URL = `/orgs/${AIO_GITHUB_ORGANIZATION}/teams`;
const getIssueUrl = (prNum: number) => `${GITHUB_ISSUES_URL}/${prNum}`;
const getFilesUrl = (prNum: number, pageNum = 1) => `${GITHUB_PULLS_URL}/${prNum}/files?page=${pageNum}&per_page=100`;
const getCommentUrl = (prNum: number) => `${getIssueUrl(prNum)}/comments`;
const getTeamMembershipUrl = (teamId: number, username: string) => `/teams/${teamId}/memberships/${username}`;
const createArchive = (buildNum: number, prNum: number, sha: string) => {
logger.log('createArchive', buildNum, prNum, sha);
const pack = tar.pack();
pack.entry({name: 'index.html'}, `BUILD: ${buildNum} | PR: ${prNum} | SHA: ${sha} | File: /index.html`);
pack.entry({name: 'foo/bar.js'}, `BUILD: ${buildNum} | PR: ${prNum} | SHA: ${sha} | File: /foo/bar.js`);
pack.finalize();
const zip = gzipSync(pack.read());
return zip;
};
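// The gzipped tarball produced above is what the artifact-download interceptors below reply
// with, so the integration specs can later assert on the `BUILD | PR | SHA | File` markers
// inside the extracted `index.html` and `foo/bar.js`.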
// Create request scopes
const circleCiApi = nock(CIRCLE_CI_API_HOST).log(log).persist();
const githubApi = nock(GITHUB_API_HOST).log(log).persist().matchHeader('Authorization', `token ${AIO_GITHUB_TOKEN}`);
//////////////////////////////
// GENERAL responses
githubApi.get(GITHUB_TEAMS_URL + '?page=1&per_page=100').reply(200, TEST_TEAM_INFO);
githubApi.post(getCommentUrl(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER)).reply(200);
// BUILD_INFO errors
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_INFO_ERROR)).replyWithError('BUILD_INFO_ERROR');
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_INFO_404)).reply(404, 'BUILD_INFO_404');
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_INFO_BUILD_FAILED)).reply(200, { ...BASIC_BUILD_INFO, failed: true });
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_INFO_INVALID_GH_ORG)).reply(200, { ...BASIC_BUILD_INFO, username: 'bad' });
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_INFO_INVALID_GH_REPO)).reply(200, { ...BASIC_BUILD_INFO, reponame: 'bad' });
// CHANGED FILE errors
circleCiApi.get(buildInfoUrl(BuildNums.CHANGED_FILES_ERROR)).reply(200, buildInfo(PrNums.CHANGED_FILES_ERROR));
githubApi.get(getFilesUrl(PrNums.CHANGED_FILES_ERROR)).replyWithError('CHANGED_FILES_ERROR');
circleCiApi.get(buildInfoUrl(BuildNums.CHANGED_FILES_404)).reply(200, buildInfo(PrNums.CHANGED_FILES_404));
githubApi.get(getFilesUrl(PrNums.CHANGED_FILES_404)).reply(404, 'CHANGED_FILES_404');
circleCiApi.get(buildInfoUrl(BuildNums.CHANGED_FILES_NONE)).reply(200, buildInfo(PrNums.CHANGED_FILES_NONE));
githubApi.get(getFilesUrl(PrNums.CHANGED_FILES_NONE)).reply(200, []);
// ARTIFACT URL errors
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_ARTIFACTS_ERROR)).reply(200, buildInfo(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER));
circleCiApi.get(buildArtifactsUrl(BuildNums.BUILD_ARTIFACTS_ERROR)).replyWithError('BUILD_ARTIFACTS_ERROR');
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_ARTIFACTS_404)).reply(200, buildInfo(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER));
circleCiApi.get(buildArtifactsUrl(BuildNums.BUILD_ARTIFACTS_404)).reply(404, 'BUILD_ARTIFACTS_ERROR');
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_ARTIFACTS_EMPTY)).reply(200, buildInfo(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER));
circleCiApi.get(buildArtifactsUrl(BuildNums.BUILD_ARTIFACTS_EMPTY)).reply(200, []);
circleCiApi.get(buildInfoUrl(BuildNums.BUILD_ARTIFACTS_MISSING)).reply(200, buildInfo(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER));
circleCiApi.get(buildArtifactsUrl(BuildNums.BUILD_ARTIFACTS_MISSING)).reply(200, [ARTIFACT_1, ARTIFACT_2, ARTIFACT_3]);
// ARTIFACT DOWNLOAD errors
circleCiApi.get(buildInfoUrl(BuildNums.DOWNLOAD_ARTIFACT_ERROR)).reply(200, buildInfo(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER));
circleCiApi.get(buildArtifactsUrl(BuildNums.DOWNLOAD_ARTIFACT_ERROR)).reply(200, [ARTIFACT_ERROR]);
circleCiApi.get(ARTIFACT_ERROR._urlPath).replyWithError(ARTIFACT_ERROR._urlPath);
circleCiApi.get(buildInfoUrl(BuildNums.DOWNLOAD_ARTIFACT_404)).reply(200, buildInfo(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER));
circleCiApi.get(buildArtifactsUrl(BuildNums.DOWNLOAD_ARTIFACT_404)).reply(200, [ARTIFACT_404]);
circleCiApi.get(ARTIFACT_ERROR._urlPath).reply(404, ARTIFACT_ERROR._urlPath);
// TRUST CHECK errors
circleCiApi.get(buildInfoUrl(BuildNums.TRUST_CHECK_ERROR)).reply(200, buildInfo(PrNums.TRUST_CHECK_ERROR));
githubApi.get(getFilesUrl(PrNums.TRUST_CHECK_ERROR)).reply(200, [{ filename: 'aio/a' }]);
circleCiApi.get(buildArtifactsUrl(BuildNums.TRUST_CHECK_ERROR)).reply(200, [ARTIFACT_VALID_TRUSTED_USER]);
githubApi.get(getIssueUrl(PrNums.TRUST_CHECK_ERROR)).replyWithError('TRUST_CHECK_ERROR');
// ACTIVE TRUSTED USER response
circleCiApi.get(buildInfoUrl(BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER)).reply(200, BASIC_BUILD_INFO);
githubApi.get(getFilesUrl(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER)).reply(200, [{ filename: 'aio/a' }]);
circleCiApi.get(buildArtifactsUrl(BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER)).reply(200, [ARTIFACT_VALID_TRUSTED_USER]);
circleCiApi.get(ARTIFACT_VALID_TRUSTED_USER._urlPath).reply(200, createArchive(BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, SHA));
githubApi.get(getIssueUrl(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER)).reply(200, ISSUE_INFO_ACTIVE_TRUSTED_USER);
githubApi.get(getTeamMembershipUrl(0, ACTIVE_TRUSTED_USER)).reply(200, ACTIVE_STATE);
// TRUSTED LABEL response
circleCiApi.get(buildInfoUrl(BuildNums.TRUST_CHECK_TRUSTED_LABEL)).reply(200, BASIC_BUILD_INFO);
githubApi.get(getFilesUrl(PrNums.TRUST_CHECK_TRUSTED_LABEL)).reply(200, [{ filename: 'aio/a' }]);
circleCiApi.get(buildArtifactsUrl(BuildNums.TRUST_CHECK_TRUSTED_LABEL)).reply(200, [ARTIFACT_VALID_TRUSTED_LABEL]);
circleCiApi.get(ARTIFACT_VALID_TRUSTED_LABEL._urlPath).reply(200, createArchive(BuildNums.TRUST_CHECK_TRUSTED_LABEL, PrNums.TRUST_CHECK_TRUSTED_LABEL, SHA));
githubApi.get(getIssueUrl(PrNums.TRUST_CHECK_TRUSTED_LABEL)).reply(200, ISSUE_INFO_TRUSTED_LABEL);
githubApi.get(getTeamMembershipUrl(0, ACTIVE_TRUSTED_USER)).reply(200, ACTIVE_STATE);
// INACTIVE TRUSTED USER response
circleCiApi.get(buildInfoUrl(BuildNums.TRUST_CHECK_INACTIVE_TRUSTED_USER)).reply(200, BASIC_BUILD_INFO);
githubApi.get(getFilesUrl(PrNums.TRUST_CHECK_INACTIVE_TRUSTED_USER)).reply(200, [{ filename: 'aio/a' }]);
circleCiApi.get(buildArtifactsUrl(BuildNums.TRUST_CHECK_INACTIVE_TRUSTED_USER)).reply(200, [ARTIFACT_VALID_TRUSTED_USER]);
githubApi.get(getIssueUrl(PrNums.TRUST_CHECK_INACTIVE_TRUSTED_USER)).reply(200, ISSUE_INFO_INACTIVE_TRUSTED_USER);
githubApi.get(getTeamMembershipUrl(0, INACTIVE_TRUSTED_USER)).reply(200, INACTIVE_STATE);
// UNTRUSTED response
circleCiApi.get(buildInfoUrl(BuildNums.TRUST_CHECK_UNTRUSTED)).reply(200, buildInfo(PrNums.TRUST_CHECK_UNTRUSTED));
githubApi.get(getFilesUrl(PrNums.TRUST_CHECK_UNTRUSTED)).reply(200, [{ filename: 'aio/a' }]);
circleCiApi.get(buildArtifactsUrl(BuildNums.TRUST_CHECK_UNTRUSTED)).reply(200, [ARTIFACT_VALID_UNTRUSTED]);
circleCiApi.get(ARTIFACT_VALID_UNTRUSTED._urlPath).reply(200, createArchive(BuildNums.TRUST_CHECK_UNTRUSTED, PrNums.TRUST_CHECK_UNTRUSTED, SHA));
githubApi.get(getIssueUrl(PrNums.TRUST_CHECK_UNTRUSTED)).reply(200, ISSUE_INFO_UNTRUSTED);
githubApi.get(getTeamMembershipUrl(0, UNTRUSTED_USER)).reply(404);

View File

@ -1,23 +1,17 @@
// Imports
import * as path from 'path';
import {rm} from 'shelljs';
import {AIO_BUILDS_DIR, AIO_NGINX_HOSTNAME, AIO_NGINX_PORT_HTTP, AIO_NGINX_PORT_HTTPS} from '../common/env-variables';
import {computeShortSha} from '../common/utils';
import {PrNums} from './constants';
import {helper as h} from './helper';
import {customMatchers} from './jasmine-custom-matchers';
// Tests
describe(`nginx`, () => {
beforeEach(() => jasmine.DEFAULT_TIMEOUT_INTERVAL = 5000);
beforeEach(() => jasmine.addMatchers(customMatchers));
beforeEach(() => jasmine.DEFAULT_TIMEOUT_INTERVAL = 10000);
afterEach(() => h.cleanUp());
it('should redirect HTTP to HTTPS', done => {
const httpHost = `${AIO_NGINX_HOSTNAME}:${AIO_NGINX_PORT_HTTP}`;
const httpsHost = `${AIO_NGINX_HOSTNAME}:${AIO_NGINX_PORT_HTTPS}`;
const httpHost = `${h.nginxHostname}:${h.nginxPortHttp}`;
const httpsHost = `${h.nginxHostname}:${h.nginxPortHttps}`;
const urlMap = {
[`http://${httpHost}/`]: `https://${httpsHost}/`,
[`http://${httpHost}/foo`]: `https://${httpsHost}/foo`,
@ -37,193 +31,111 @@ describe(`nginx`, () => {
});
h.runForAllSupportedSchemes((scheme, port) => describe(`(on ${scheme.toUpperCase()})`, () => {
const hostname = AIO_NGINX_HOSTNAME;
h.runForAllSupportedSchemes((scheme, port) => describe(`nginx (on ${scheme.toUpperCase()})`, () => {
const hostname = h.nginxHostname;
const host = `${hostname}:${port}`;
const pr = 9;
const pr = '9';
const sha9 = '9'.repeat(40);
const sha0 = '0'.repeat(40);
const shortSha9 = computeShortSha(sha9);
const shortSha0 = computeShortSha(sha0);
describe(`pr<pr>-<sha>.${host}/*`, () => {
describe('(for public builds)', () => {
beforeEach(() => {
h.createDummyBuild(pr, sha9);
h.createDummyBuild(pr, sha0);
});
afterEach(() => {
expect({ prNum: pr, sha: sha9 }).toExistAsABuild();
expect({ prNum: pr, sha: sha0 }).toExistAsABuild();
});
it('should return /index.html', done => {
const origin = `${scheme}://pr${pr}-${shortSha9}.${host}`;
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
Promise.all([
h.runCmd(`curl -iL ${origin}/index.html`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}/`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}`).then(h.verifyResponse(200, bodyRegex)),
]).then(done);
});
it('should return /index.html (for legacy builds)', async () => {
const origin = `${scheme}://pr${pr}-${sha9}.${host}`;
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
h.createDummyBuild(pr, sha9, true, false, true);
await Promise.all([
h.runCmd(`curl -iL ${origin}/index.html`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}/`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}`).then(h.verifyResponse(200, bodyRegex)),
]);
expect({ prNum: pr, sha: sha9, isLegacy: true }).toExistAsABuild();
});
it('should return /foo/bar.js', done => {
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /foo/bar\\.js$`);
h.runCmd(`curl -iL ${scheme}://pr${pr}-${shortSha9}.${host}/foo/bar.js`).
then(h.verifyResponse(200, bodyRegex)).
then(done);
});
it('should return /foo/bar.js (for legacy builds)', async () => {
const origin = `${scheme}://pr${pr}-${sha9}.${host}`;
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /foo/bar\\.js$`);
h.createDummyBuild(pr, sha9, true, false, true);
await h.runCmd(`curl -iL ${origin}/foo/bar.js`).then(h.verifyResponse(200, bodyRegex));
expect({ prNum: pr, sha: sha9, isLegacy: true }).toExistAsABuild();
});
it('should respond with 403 for directories', done => {
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}-${shortSha9}.${host}/foo/`).then(h.verifyResponse(403)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${shortSha9}.${host}/foo`).then(h.verifyResponse(403)),
]).then(done);
});
it('should respond with 404 for unknown paths to files', done => {
h.runCmd(`curl -iL ${scheme}://pr${pr}-${shortSha9}.${host}/foo/baz.css`).
then(h.verifyResponse(404)).
then(done);
});
it('should rewrite to \'index.html\' for unknown paths that don\'t look like files', done => {
const origin = `${scheme}://pr${pr}-${shortSha9}.${host}`;
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
Promise.all([
h.runCmd(`curl -iL ${origin}/foo/baz`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}/foo/baz/`).then(h.verifyResponse(200, bodyRegex)),
]).then(done);
});
it('should respond with 404 for unknown PRs/SHAs', done => {
const otherPr = 54321;
const otherShortSha = computeShortSha('8'.repeat(40));
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}9-${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${otherPr}-${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${shortSha9}9.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${otherShortSha}.${host}`).then(h.verifyResponse(404)),
]).then(done);
});
it('should respond with 404 if the subdomain format is wrong', done => {
Promise.all([
h.runCmd(`curl -iL ${scheme}://xpr${pr}-${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://prx${pr}-${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://xx${pr}-${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://p${pr}-${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://r${pr}-${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${pr}-${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}_${shortSha9}.${host}`).then(h.verifyResponse(404)),
]).then(done);
});
it('should reject PRs with leading zeros', done => {
h.runCmd(`curl -iL ${scheme}://pr0${pr}-${shortSha9}.${host}`).
then(h.verifyResponse(404)).
then(done);
});
it('should accept SHAs with leading zeros (but not trim the zeros)', done => {
const bodyRegex9 = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
const bodyRegex0 = new RegExp(`^PR: ${pr} | SHA: ${sha0} | File: /index\\.html$`);
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}-0${shortSha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${shortSha9}.${host}`).then(h.verifyResponse(200, bodyRegex9)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${shortSha0}.${host}`).then(h.verifyResponse(200, bodyRegex0)),
]).then(done);
});
beforeEach(() => {
h.createDummyBuild(pr, sha9);
h.createDummyBuild(pr, sha0);
});
describe('(for hidden builds)', () => {
it('should return /index.html', done => {
const origin = `${scheme}://pr${pr}-${sha9}.${host}`;
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
it('should respond with 404 for any file or directory', async () => {
const origin = `${scheme}://pr${pr}-${shortSha9}.${host}`;
const assert404 = h.verifyResponse(404);
h.createDummyBuild(pr, sha9, false);
await Promise.all([
h.runCmd(`curl -iL ${origin}/index.html`).then(assert404),
h.runCmd(`curl -iL ${origin}/`).then(assert404),
h.runCmd(`curl -iL ${origin}`).then(assert404),
h.runCmd(`curl -iL ${origin}/foo/bar.js`).then(assert404),
h.runCmd(`curl -iL ${origin}/foo/`).then(assert404),
h.runCmd(`curl -iL ${origin}/foo`).then(assert404),
]);
expect({ prNum: pr, sha: sha9, isPublic: false }).toExistAsABuild();
});
Promise.all([
h.runCmd(`curl -iL ${origin}/index.html`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}/`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}`).then(h.verifyResponse(200, bodyRegex)),
]).then(done);
});
it('should respond with 404 for any file or directory (for legacy builds)', async () => {
const origin = `${scheme}://pr${pr}-${sha9}.${host}`;
const assert404 = h.verifyResponse(404);
it('should return /foo/bar.js', done => {
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /foo/bar\\.js$`);
h.createDummyBuild(pr, sha9, false, false, true);
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/bar.js`).
then(h.verifyResponse(200, bodyRegex)).
then(done);
});
await Promise.all([
h.runCmd(`curl -iL ${origin}/index.html`).then(assert404),
h.runCmd(`curl -iL ${origin}/`).then(assert404),
h.runCmd(`curl -iL ${origin}`).then(assert404),
h.runCmd(`curl -iL ${origin}/foo/bar.js`).then(assert404),
h.runCmd(`curl -iL ${origin}/foo/`).then(assert404),
h.runCmd(`curl -iL ${origin}/foo`).then(assert404),
]);
expect({ prNum: pr, sha: sha9, isPublic: false, isLegacy: true }).toExistAsABuild();
});
it('should respond with 403 for directories', done => {
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/`).then(h.verifyResponse(403)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo`).then(h.verifyResponse(403)),
]).then(done);
});
it('should respond with 404 for unknown paths to files', done => {
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/baz.css`).
then(h.verifyResponse(404)).
then(done);
});
it('should rewrite to \'index.html\' for unknown paths that don\'t look like files', done => {
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/baz`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/baz/`).then(h.verifyResponse(200, bodyRegex)),
]).then(done);
});
it('should respond with 404 for unknown PRs/SHAs', done => {
const otherPr = 54321;
const otherSha = '8'.repeat(40);
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}9-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${otherPr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}9.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${otherSha}.${host}`).then(h.verifyResponse(404)),
]).then(done);
});
it('should respond with 404 if the subdomain format is wrong', done => {
Promise.all([
h.runCmd(`curl -iL ${scheme}://xpr${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://prx${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://xx${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://p${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://r${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}_${sha9}.${host}`).then(h.verifyResponse(404)),
]).then(done);
});
it('should reject PRs with leading zeros', done => {
h.runCmd(`curl -iL ${scheme}://pr0${pr}-${sha9}.${host}`).
then(h.verifyResponse(404)).
then(done);
});
it('should accept SHAs with leading zeros (but not trim the zeros)', done => {
const bodyRegex9 = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
const bodyRegex0 = new RegExp(`^PR: ${pr} | SHA: ${sha0} | File: /index\\.html$`);
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}-0${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}`).then(h.verifyResponse(200, bodyRegex9)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha0}.${host}`).then(h.verifyResponse(200, bodyRegex0)),
]).then(done);
});
});
@ -253,59 +165,45 @@ describe(`nginx`, () => {
});
describe(`${host}/can-have-public-preview`, () => {
const baseUrl = `${scheme}://${host}/can-have-public-preview`;
it('should disallow non-GET requests', async () => {
await Promise.all([
h.runCmd(`curl -iLX POST ${baseUrl}/42`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX PUT ${baseUrl}/42`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX PATCH ${baseUrl}/42`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX DELETE ${baseUrl}/42`).then(h.verifyResponse(405)),
]);
});
it('should pass requests through to the preview server', async () => {
await h.runCmd(`curl -iLX GET ${baseUrl}/${PrNums.CHANGED_FILES_ERROR}`).
then(h.verifyResponse(500, /CHANGED_FILES_ERROR/));
});
it('should respond with 404 for unknown paths', async () => {
const cmdPrefix = `curl -iLX GET ${baseUrl}`;
await Promise.all([
h.runCmd(`${cmdPrefix}/foo/42`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}-foo/42`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}nfoo/42`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/42/foo`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/f00`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/`).then(h.verifyResponse(404)),
]);
});
});
describe(`${host}/circle-build`, () => {
describe(`${host}/create-build/<pr>/<sha>`, () => {
it('should disallow non-POST requests', done => {
const url = `${scheme}://${host}/circle-build`;
const url = `${scheme}://${host}/create-build/${pr}/${sha9}`;
Promise.all([
h.runCmd(`curl -iLX GET ${url}`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX PUT ${url}`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX PATCH ${url}`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX DELETE ${url}`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX GET ${url}`).then(h.verifyResponse([405, 'Not Allowed'])),
h.runCmd(`curl -iLX PUT ${url}`).then(h.verifyResponse([405, 'Not Allowed'])),
h.runCmd(`curl -iLX PATCH ${url}`).then(h.verifyResponse([405, 'Not Allowed'])),
h.runCmd(`curl -iLX DELETE ${url}`).then(h.verifyResponse([405, 'Not Allowed'])),
]).then(done);
});
it('should pass requests through to the preview server', done => {
h.runCmd(`curl -iLX POST ${scheme}://${host}/circle-build`).
then(h.verifyResponse(400, /Incorrect body content. Expected JSON/)).
it(`should reject files larger than ${h.uploadMaxSize}B (according to header)`, done => {
const headers = `--header "Content-Length: ${1.5 * h.uploadMaxSize}"`;
const url = `${scheme}://${host}/create-build/${pr}/${sha9}`;
h.runCmd(`curl -iLX POST ${headers} ${url}`).
then(h.verifyResponse([413, 'Request Entity Too Large'])).
then(done);
});
it(`should reject files larger than ${h.uploadMaxSize}B (without header)`, done => {
const filePath = path.join(h.buildsDir, 'snapshot.tar.gz');
const url = `${scheme}://${host}/create-build/${pr}/${sha9}`;
h.writeFile(filePath, {size: 1.5 * h.uploadMaxSize});
h.runCmd(`curl -iLX POST --data-binary "@${filePath}" ${url}`).
then(h.verifyResponse([413, 'Request Entity Too Large'])).
then(done);
});
it('should pass requests through to the upload server', done => {
h.runCmd(`curl -iLX POST ${scheme}://${host}/create-build/${pr}/${sha9}`).
then(h.verifyResponse(401, /Missing or empty 'AUTHORIZATION' header/)).
then(done);
});
@ -314,55 +212,32 @@ describe(`nginx`, () => {
const cmdPrefix = `curl -iLX POST ${scheme}://${host}`;
Promise.all([
h.runCmd(`${cmdPrefix}/foo/circle-build/`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foo-circle-build/`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/fooncircle-build/`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/circle-build/foo/`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/circle-build-foo/`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/circle-buildnfoo/`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/circle-build/pr`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/circle-build/42`).then(h.verifyResponse(404)),
]).then(done);
});
});
describe(`${host}/pr-updated`, () => {
const url = `${scheme}://${host}/pr-updated`;
it('should disallow non-POST requests', done => {
Promise.all([
h.runCmd(`curl -iLX GET ${url}`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX PUT ${url}`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX PATCH ${url}`).then(h.verifyResponse(405)),
h.runCmd(`curl -iLX DELETE ${url}`).then(h.verifyResponse(405)),
h.runCmd(`${cmdPrefix}/foo/create-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foo-create-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/fooncreate-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/foo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build-foo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-buildnfoo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/pr${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/${pr}/${sha9}42`).then(h.verifyResponse(404)),
]).then(done);
});
it('should pass requests through to the preview server', done => {
const cmdPrefix = `curl -iLX POST --header "Content-Type: application/json"`;
const cmd1 = `${cmdPrefix} ${url}`;
Promise.all([
h.runCmd(cmd1).then(h.verifyResponse(400, /Missing or empty 'number' field/)),
]).then(done);
it('should reject PRs with leading zeros', done => {
h.runCmd(`curl -iLX POST ${scheme}://${host}/create-build/0${pr}/${sha9}`).
then(h.verifyResponse(404)).
then(done);
});
it('should respond with 404 for unknown paths', done => {
const cmdPrefix = `curl -iLX POST ${scheme}://${host}`;
it('should accept SHAs with leading zeros (but not trim the zeros)', done => {
const cmdPrefix = `curl -iLX POST ${scheme}://${host}/create-build/${pr}`;
const bodyRegex = /Missing or empty 'AUTHORIZATION' header/;
Promise.all([
h.runCmd(`${cmdPrefix}/foo/pr-updated`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foo-pr-updated`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foonpr-updated`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/pr-updated/foo`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/pr-updated-foo`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/pr-updatednfoo`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/0${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/${sha0}`).then(h.verifyResponse(401, bodyRegex)),
]).then(done);
});
@ -371,15 +246,13 @@ describe(`nginx`, () => {
describe(`${host}/*`, () => {
beforeEach(() => {
it('should respond with 404 for unknown URLs (even if the resource exists)', done => {
['index.html', 'foo.js', 'foo/index.html'].forEach(relFilePath => {
const absFilePath = path.join(AIO_BUILDS_DIR, relFilePath);
return h.writeFile(absFilePath, {content: `File: /${relFilePath}`});
const absFilePath = path.join(h.buildsDir, relFilePath);
h.writeFile(absFilePath, {content: `File: /${relFilePath}`});
});
});
it('should respond with 404 for unknown URLs (even if the resource exists)', async () => {
await Promise.all([
Promise.all([
h.runCmd(`curl -iL ${scheme}://${host}/index.html`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}`).then(h.verifyResponse(404)),
@ -388,14 +261,7 @@ describe(`nginx`, () => {
h.runCmd(`curl -iL ${scheme}://foo.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/foo.js`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/foo/index.html`).then(h.verifyResponse(404)),
]);
});
afterEach(() => {
['index.html', 'foo.js', 'foo/index.html', 'foo'].forEach(relFilePath => {
const absFilePath = path.join(AIO_BUILDS_DIR, relFilePath);
rm('-r', absFilePath);
});
]).then(done);
});
});

View File

@ -1,569 +0,0 @@
// Imports
import * as fs from 'fs';
import {join} from 'path';
import {AIO_PREVIEW_SERVER_HOSTNAME, AIO_PREVIEW_SERVER_PORT, AIO_WWW_USER} from '../common/env-variables';
import {computeShortSha} from '../common/utils';
import {ALT_SHA, BuildNums, PrNums, SHA, SIMILAR_SHA} from './constants';
import {helper as h, makeCurl, payload} from './helper';
import {customMatchers} from './jasmine-custom-matchers';
// Tests
describe('preview-server', () => {
const hostname = AIO_PREVIEW_SERVER_HOSTNAME;
const port = AIO_PREVIEW_SERVER_PORT;
const host = `http://${hostname}:${port}`;
beforeEach(() => jasmine.DEFAULT_TIMEOUT_INTERVAL = 5000);
beforeEach(() => jasmine.addMatchers(customMatchers));
afterEach(() => h.cleanUp());
describe(`${host}/can-have-public-preview`, () => {
const curl = makeCurl(`${host}/can-have-public-preview`, {
defaultData: null,
defaultExtraPath: `/${PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER}`,
defaultHeaders: [],
defaultMethod: 'GET',
});
it('should disallow non-GET requests', async () => {
const bodyRegex = /^Unknown resource in request/;
await Promise.all([
curl({method: 'POST'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'PUT'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'PATCH'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'DELETE'}).then(h.verifyResponse(404, bodyRegex)),
]);
});
it('should respond with 404 for unknown paths', async () => {
const bodyRegex = /^Unknown resource in request/;
await Promise.all([
curl({extraPath: `/foo/${PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER}`}).then(h.verifyResponse(404, bodyRegex)),
curl({extraPath: `-foo/${PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER}`}).then(h.verifyResponse(404, bodyRegex)),
curl({extraPath: `nfoo/${PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER}`}).then(h.verifyResponse(404, bodyRegex)),
curl({extraPath: `/${PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER}/foo`}).then(h.verifyResponse(404, bodyRegex)),
curl({extraPath: '/f00'}).then(h.verifyResponse(404, bodyRegex)),
curl({extraPath: '/'}).then(h.verifyResponse(404, bodyRegex)),
]);
});
it('should respond with 500 if checking for significant file changes fails', async () => {
await Promise.all([
curl({extraPath: `/${PrNums.CHANGED_FILES_404}`}).then(h.verifyResponse(500, /CHANGED_FILES_404/)),
curl({extraPath: `/${PrNums.CHANGED_FILES_ERROR}`}).then(h.verifyResponse(500, /CHANGED_FILES_ERROR/)),
]);
});
it('should respond with 200 (false) if no significant files were touched', async () => {
const expectedResponse = JSON.stringify({
canHavePublicPreview: false,
reason: 'No significant files touched.',
});
await curl({extraPath: `/${PrNums.CHANGED_FILES_NONE}`}).then(h.verifyResponse(200, expectedResponse));
});
it('should respond with 500 if checking "trusted" status fails', async () => {
await curl({extraPath: `/${PrNums.TRUST_CHECK_ERROR}`}).then(h.verifyResponse(500, 'TRUST_CHECK_ERROR'));
});
it('should respond with 200 (false) if the PR is not automatically verifiable as "trusted"', async () => {
const expectedResponse = JSON.stringify({
canHavePublicPreview: false,
reason: 'Not automatically verifiable as \\"trusted\\".',
});
await Promise.all([
curl({extraPath: `/${PrNums.TRUST_CHECK_INACTIVE_TRUSTED_USER}`}).then(h.verifyResponse(200, expectedResponse)),
curl({extraPath: `/${PrNums.TRUST_CHECK_UNTRUSTED}`}).then(h.verifyResponse(200, expectedResponse)),
]);
});
it('should respond with 200 (true) if the PR can have a public preview', async () => {
const expectedResponse = JSON.stringify({
canHavePublicPreview: true,
reason: null,
});
await Promise.all([
curl({extraPath: `/${PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER}`}).then(h.verifyResponse(200, expectedResponse)),
curl({extraPath: `/${PrNums.TRUST_CHECK_TRUSTED_LABEL}`}).then(h.verifyResponse(200, expectedResponse)),
]);
});
});
describe(`${host}/circle-build`, () => {
const curl = makeCurl(`${host}/circle-build`);
it('should disallow non-POST requests', async () => {
const bodyRegex = /^Unknown resource/;
await Promise.all([
curl({method: 'GET'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'PUT'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'PATCH'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'DELETE'}).then(h.verifyResponse(404, bodyRegex)),
]);
});
it('should respond with 404 for unknown paths', async () => {
await Promise.all([
curl({url: `${host}/foo/circle-build`}).then(h.verifyResponse(404)),
curl({url: `${host}/foo-circle-build`}).then(h.verifyResponse(404)),
curl({url: `${host}/fooncircle-build`}).then(h.verifyResponse(404)),
curl({url: `${host}/circle-build/foo`}).then(h.verifyResponse(404)),
curl({url: `${host}/circle-build-foo`}).then(h.verifyResponse(404)),
curl({url: `${host}/circle-buildnfoo`}).then(h.verifyResponse(404)),
curl({url: `${host}/circle-build/pr`}).then(h.verifyResponse(404)),
curl({url: `${host}/circle-build42`}).then(h.verifyResponse(404)),
]);
});
it('should respond with 400 if the body is not valid', async () => {
await Promise.all([
curl({ data: '' }).then(h.verifyResponse(400)),
curl({ data: {} }).then(h.verifyResponse(400)),
curl({ data: { payload: {} } }).then(h.verifyResponse(400)),
curl({ data: { payload: { build_num: 1 } } }).then(h.verifyResponse(400)),
curl({ data: { payload: { build_num: 1, build_parameters: {} } } }).then(h.verifyResponse(400)),
curl(payload(0)).then(h.verifyResponse(400)),
curl(payload(-1)).then(h.verifyResponse(400)),
]);
});
it('should respond with 500 if the CircleCI API request errors', async () => {
await curl(payload(BuildNums.BUILD_INFO_ERROR)).then(h.verifyResponse(500));
await curl(payload(BuildNums.BUILD_INFO_404)).then(h.verifyResponse(500));
});
it('should respond with 204 if the build on CircleCI failed', async () => {
await curl(payload(BuildNums.BUILD_INFO_BUILD_FAILED)).then(h.verifyResponse(204));
});
it('should respond with 500 if the github org from CircleCI does not match what is configured', async () => {
await curl(payload(BuildNums.BUILD_INFO_INVALID_GH_ORG)).then(h.verifyResponse(500));
});
it('should respond with 500 if the github repo from CircleCI does not match what is configured', async () => {
await curl(payload(BuildNums.BUILD_INFO_INVALID_GH_REPO)).then(h.verifyResponse(500));
});
it('should respond with 500 if the github files API errors', async () => {
await curl(payload(BuildNums.CHANGED_FILES_ERROR)).then(h.verifyResponse(500));
await curl(payload(BuildNums.CHANGED_FILES_404)).then(h.verifyResponse(500));
});
it('should respond with 204 if no significant files are changed by the PR', async () => {
await curl(payload(BuildNums.CHANGED_FILES_NONE)).then(h.verifyResponse(204));
});
it('should respond with 500 if the CircleCI artifact API fails', async () => {
await curl(payload(BuildNums.BUILD_ARTIFACTS_ERROR)).then(h.verifyResponse(500));
await curl(payload(BuildNums.BUILD_ARTIFACTS_404)).then(h.verifyResponse(500));
await curl(payload(BuildNums.BUILD_ARTIFACTS_EMPTY)).then(h.verifyResponse(500));
await curl(payload(BuildNums.BUILD_ARTIFACTS_MISSING)).then(h.verifyResponse(500));
});
it('should respond with 500 if fetching the artifact errors', async () => {
await curl(payload(BuildNums.DOWNLOAD_ARTIFACT_ERROR)).then(h.verifyResponse(500));
await curl(payload(BuildNums.DOWNLOAD_ARTIFACT_404)).then(h.verifyResponse(500));
});
it('should respond with 500 if the GH trusted API fails', async () => {
await curl(payload(BuildNums.TRUST_CHECK_ERROR)).then(h.verifyResponse(500));
expect({ prNum: PrNums.TRUST_CHECK_ERROR }).toExistAsAnArtifact();
});
it('should respond with 201 if a new public build is created', async () => {
await curl(payload(BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER))
.then(h.verifyResponse(201));
expect({ prNum: PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER }).toExistAsABuild();
});
it('should respond with 202 if a new private build is created', async () => {
await curl(payload(BuildNums.TRUST_CHECK_UNTRUSTED)).then(h.verifyResponse(202));
expect({ prNum: PrNums.TRUST_CHECK_UNTRUSTED, isPublic: false }).toExistAsABuild();
});
[true].forEach(isPublic => {
const build = isPublic ? BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER : BuildNums.TRUST_CHECK_UNTRUSTED;
const prNum = isPublic ? PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER : PrNums.TRUST_CHECK_UNTRUSTED;
const label = isPublic ? 'public' : 'non-public';
const overwriteRe = RegExp(`^Request to overwrite existing ${label} directory`);
const statusCode = isPublic ? 201 : 202;
describe(`for ${label} builds`, () => {
it('should extract the contents of the build artifact', async () => {
await curl(payload(build))
.then(h.verifyResponse(statusCode));
expect(h.readBuildFile(prNum, SHA, 'index.html', isPublic))
.toContain(`PR: ${prNum} | SHA: ${SHA} | File: /index.html`);
expect(h.readBuildFile(prNum, SHA, 'foo/bar.js', isPublic))
.toContain(`PR: ${prNum} | SHA: ${SHA} | File: /foo/bar.js`);
expect({ prNum, isPublic }).toExistAsABuild();
});
it(`should create files/directories owned by '${AIO_WWW_USER}'`, async () => {
await curl(payload(build))
.then(h.verifyResponse(statusCode));
const shaDir = h.getShaDir(h.getPrDir(prNum, isPublic), SHA);
const { stdout: allFiles } = await h.runCmd(`find ${shaDir}`);
const { stdout: userFiles } = await h.runCmd(`find ${shaDir} -user ${AIO_WWW_USER}`);
expect(userFiles).toBe(allFiles);
expect(userFiles).toContain(shaDir);
expect(userFiles).toContain(join(shaDir, 'index.html'));
expect(userFiles).toContain(join(shaDir, 'foo', 'bar.js'));
expect({ prNum, isPublic }).toExistAsABuild();
});
it('should delete the build artifact file', async () => {
await curl(payload(build))
.then(h.verifyResponse(statusCode));
expect({ prNum, SHA }).not.toExistAsAnArtifact();
expect({ prNum, isPublic }).toExistAsABuild();
});
it('should make the build directory non-writable', async () => {
await curl(payload(build))
.then(h.verifyResponse(statusCode));
// See https://github.com/nodejs/node-v0.x-archive/issues/3045#issuecomment-4862588.
const isNotWritable = (fileOrDir: string) => {
const mode = fs.statSync(fileOrDir).mode;
// tslint:disable-next-line: no-bitwise
return !(mode & parseInt('222', 8));
};
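// (0o222 covers the write bits for owner/group/other, so e.g. a file with mode 0o444
// yields 0o444 & 0o222 === 0 and counts as non-writable, while 0o644 & 0o222 !== 0.)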
const shaDir = h.getShaDir(h.getPrDir(prNum, isPublic), SHA);
expect(isNotWritable(shaDir)).toBe(true);
expect(isNotWritable(join(shaDir, 'index.html'))).toBe(true);
expect(isNotWritable(join(shaDir, 'foo', 'bar.js'))).toBe(true);
expect({ prNum, isPublic }).toExistAsABuild();
});
it('should ignore a legacy 40-chars long build directory (even if it starts with the same chars)',
async () => {
// It is possible that 40-char-long build directories exist, if they were deployed
// before the shorter build directory names were implemented. In that case, we don't want
// the second (shorter) name to be considered the same as the old one (even if both
// originate from the same SHA).
h.createDummyBuild(prNum, SHA, isPublic, false, true);
h.writeBuildFile(prNum, SHA, 'index.html', 'My content', isPublic, true);
expect(h.readBuildFile(prNum, SHA, 'index.html', isPublic, true)).toBe('My content');
await curl(payload(build))
.then(h.verifyResponse(statusCode));
expect(h.readBuildFile(prNum, SHA, 'index.html', isPublic, false)).toContain('index.html');
expect(h.readBuildFile(prNum, SHA, 'index.html', isPublic, true)).toBe('My content');
expect({ prNum, isPublic, sha: SHA, isLegacy: false }).toExistAsABuild();
expect({ prNum, isPublic, sha: SHA, isLegacy: true }).toExistAsABuild();
});
it(`should not overwrite existing builds`, async () => {
// setup a build already in place
h.createDummyBuild(prNum, SHA, isPublic);
// distinguish this build from the downloaded one
h.writeBuildFile(prNum, SHA, 'index.html', 'My content', isPublic);
await curl(payload(build)).then(h.verifyResponse(409, overwriteRe));
expect(h.readBuildFile(prNum, SHA, 'index.html', isPublic)).toBe('My content');
expect({ prNum, isPublic }).toExistAsABuild();
expect({ prNum }).toExistAsAnArtifact();
});
it(`should not overwrite existing builds (even if the SHA is different)`, async () => {
// Since only the first few characters of the SHA are used, it is possible for two different
// SHAs to correspond to the same directory. In that case, we don't want the second SHA to
// overwrite the first.
expect(SIMILAR_SHA).not.toEqual(SHA);
expect(computeShortSha(SIMILAR_SHA)).toEqual(computeShortSha(SHA));
h.createDummyBuild(prNum, SIMILAR_SHA, isPublic);
expect(h.readBuildFile(prNum, SIMILAR_SHA, 'index.html', isPublic)).toContain('index.html');
h.writeBuildFile(prNum, SIMILAR_SHA, 'index.html', 'My content', isPublic);
expect(h.readBuildFile(prNum, SIMILAR_SHA, 'index.html', isPublic)).toBe('My content');
await curl(payload(build)).then(h.verifyResponse(409, overwriteRe));
expect(h.readBuildFile(prNum, SIMILAR_SHA, 'index.html', isPublic)).toBe('My content');
expect({ prNum, isPublic, sha: SIMILAR_SHA }).toExistAsABuild();
expect({ prNum, sha: SIMILAR_SHA }).toExistAsAnArtifact();
});
it('should only delete the SHA directory on error (for existing PR)', async () => {
h.createDummyBuild(prNum, ALT_SHA, isPublic);
await curl(payload(BuildNums.TRUST_CHECK_ERROR)).then(h.verifyResponse(500));
expect({ prNum: PrNums.TRUST_CHECK_ERROR }).toExistAsAnArtifact();
expect({ prNum, isPublic, sha: SHA }).not.toExistAsABuild();
expect({ prNum, isPublic, sha: ALT_SHA }).toExistAsABuild();
});
describe('when the PR\'s visibility has changed', () => {
it('should update the PR\'s visibility', async () => {
h.createDummyBuild(prNum, ALT_SHA, !isPublic);
await curl(payload(build)).then(h.verifyResponse(statusCode));
expect({ prNum, isPublic }).toExistAsABuild();
expect({ prNum, isPublic, sha: ALT_SHA }).toExistAsABuild();
});
it('should not overwrite existing builds (but keep the updated visibility)', async () => {
h.createDummyBuild(prNum, SHA, !isPublic);
await curl(payload(build)).then(h.verifyResponse(409));
expect({ prNum, isPublic }).toExistAsABuild();
expect({ prNum, isPublic: !isPublic }).not.toExistAsABuild();
// Since the request errored, the downloaded artifact was not cleaned up - perhaps it should be?
expect({ prNum }).toExistAsAnArtifact();
});
it('should reject the request if it fails to update the PR\'s visibility', async () => {
// One way to cause an error is to have both a public and a hidden directory for the same PR.
h.createDummyBuild(prNum, ALT_SHA, isPublic);
h.createDummyBuild(prNum, ALT_SHA, !isPublic);
const errorRegex = new RegExp(`^Request to move '${h.getPrDir(prNum, !isPublic)}' ` +
`to existing directory '${h.getPrDir(prNum, isPublic)}'.`);
await curl(payload(build)).then(h.verifyResponse(409, errorRegex));
expect({ prNum, isPublic }).not.toExistAsABuild();
// The bad folders should have been deleted
expect({ prNum, sha: ALT_SHA, isPublic }).toExistAsABuild();
expect({ prNum, sha: ALT_SHA, isPublic: !isPublic }).toExistAsABuild();
// Since the request errored, the downloaded artifact was not cleaned up - perhaps it should be?
expect({ prNum }).toExistAsAnArtifact();
});
});
});
});
});
describe(`${host}/health-check`, () => {
it('should respond with 200', done => {
Promise.all([
h.runCmd(`curl -iL ${host}/health-check`).then(h.verifyResponse(200)),
h.runCmd(`curl -iL ${host}/health-check/`).then(h.verifyResponse(200)),
]).then(done);
});
it('should respond with 404 if the path does not match exactly', done => {
Promise.all([
h.runCmd(`curl -iL ${host}/health-check/foo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${host}/health-check-foo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${host}/health-checknfoo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${host}/foo/health-check`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${host}/foo-health-check`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${host}/foonhealth-check`).then(h.verifyResponse(404)),
]).then(done);
});
});
describe(`${host}/pr-updated`, () => {
const curl = makeCurl(`${host}/pr-updated`);
it('should disallow non-POST requests', async () => {
const bodyRegex = /^Unknown resource in request/;
await Promise.all([
curl({method: 'GET'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'PUT'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'PATCH'}).then(h.verifyResponse(404, bodyRegex)),
curl({method: 'DELETE'}).then(h.verifyResponse(404, bodyRegex)),
]);
});
it('should respond with 400 for requests without a payload', async () => {
const bodyRegex = /^Missing or empty 'number' field in request/;
await Promise.all([
curl({ data: '' }).then(h.verifyResponse(400, bodyRegex)),
curl({ data: {} }).then(h.verifyResponse(400, bodyRegex)),
]);
});
it('should respond with 400 for requests without a \'number\' field', async () => {
const bodyRegex = /^Missing or empty 'number' field in request/;
await Promise.all([
curl({ data: {} }).then(h.verifyResponse(400, bodyRegex)),
curl({ data: { number: null} }).then(h.verifyResponse(400, bodyRegex)),
]);
});
it('should reject requests for which checking the PR visibility fails', async () => {
await curl({ data: { number: PrNums.TRUST_CHECK_ERROR } }).then(h.verifyResponse(500, /TRUST_CHECK_ERROR/));
});
it('should respond with 404 for unknown paths', done => {
const mockPayload = JSON.stringify({number: 1}); // MockExternalApiFlags.TRUST_CHECK_ACTIVE_TRUSTED_USER });
const cmdPrefix = `curl -iLX POST --data "${mockPayload}" ${host}`;
Promise.all([
h.runCmd(`${cmdPrefix}/foo/pr-updated`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foo-pr-updated`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foonpr-updated`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/pr-updated/foo`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/pr-updated-foo`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/pr-updatednfoo`).then(h.verifyResponse(404)),
]).then(done);
});
it('should do nothing if PR\'s visibility is already up-to-date', async () => {
const publicPr = PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const hiddenPr = PrNums.TRUST_CHECK_UNTRUSTED;
const checkVisibilities = (remove: boolean) => {
// Public build is already public.
expect({ prNum: publicPr, isPublic: false }).not.toExistAsABuild(remove);
expect({ prNum: publicPr, isPublic: true }).toExistAsABuild(remove);
// Hidden build is already hidden.
expect({ prNum: hiddenPr, isPublic: false }).toExistAsABuild(remove);
expect({ prNum: hiddenPr, isPublic: true }).not.toExistAsABuild(remove);
};
h.createDummyBuild(publicPr, SHA, true);
h.createDummyBuild(hiddenPr, SHA, false);
checkVisibilities(false);
await Promise.all([
curl({ data: {number: +publicPr, action: 'foo' } }).then(h.verifyResponse(200)),
curl({ data: {number: +hiddenPr, action: 'foo' } }).then(h.verifyResponse(200)),
]);
// Visibilities should not have changed, because the specified action could not have triggered a change.
checkVisibilities(true);
});
it('should do nothing if \'action\' implies no visibility change', async () => {
const publicPr = PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const hiddenPr = PrNums.TRUST_CHECK_UNTRUSTED;
const checkVisibilities = (remove: boolean) => {
// Public build is hidden atm.
expect({ prNum: publicPr, isPublic: false }).toExistAsABuild(remove);
expect({ prNum: publicPr, isPublic: true }).not.toExistAsABuild(remove);
// Hidden build is public atm.
expect({ prNum: hiddenPr, isPublic: false }).not.toExistAsABuild(remove);
expect({ prNum: hiddenPr, isPublic: true }).toExistAsABuild(remove);
};
h.createDummyBuild(publicPr, SHA, false);
h.createDummyBuild(hiddenPr, SHA, true);
checkVisibilities(false);
await Promise.all([
curl({ data: {number: +publicPr, action: 'foo' } }).then(h.verifyResponse(200)),
curl({ data: {number: +hiddenPr, action: 'foo' } }).then(h.verifyResponse(200)),
]);
// Visibilities should not have changed, because the specified action could not have triggered a change.
checkVisibilities(true);
});
describe('when the visibility has changed', () => {
const publicPr = PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const hiddenPr = PrNums.TRUST_CHECK_UNTRUSTED;
beforeEach(() => {
// Create initial PR builds with opposite visibilities as the ones that will be reported:
// - The now public PR was previously hidden.
// - The now hidden PR was previously public.
h.createDummyBuild(publicPr, SHA, false);
h.createDummyBuild(hiddenPr, SHA, true);
expect({ prNum: publicPr, isPublic: false }).toExistAsABuild(false);
expect({ prNum: publicPr, isPublic: true }).not.toExistAsABuild(false);
expect({ prNum: hiddenPr, isPublic: false }).not.toExistAsABuild(false);
expect({ prNum: hiddenPr, isPublic: true }).toExistAsABuild(false);
});
afterEach(() => {
// Expect PRs' visibility to have been updated:
// - The public PR should be actually public (previously it was hidden).
// - The hidden PR should be actually hidden (previously it was public).
expect({ prNum: publicPr, isPublic: false }).not.toExistAsABuild();
expect({ prNum: publicPr, isPublic: true }).toExistAsABuild();
expect({ prNum: hiddenPr, isPublic: false }).toExistAsABuild();
expect({ prNum: hiddenPr, isPublic: true }).not.toExistAsABuild();
});
it('should update the PR\'s visibility (action: undefined)', async () => {
await Promise.all([
curl({ data: {number: +publicPr } }).then(h.verifyResponse(200)),
curl({ data: {number: +hiddenPr } }).then(h.verifyResponse(200)),
]);
});
it('should update the PR\'s visibility (action: labeled)', async () => {
await Promise.all([
curl({ data: {number: +publicPr, action: 'labeled' } }).then(h.verifyResponse(200)),
curl({ data: {number: +hiddenPr, action: 'labeled' } }).then(h.verifyResponse(200)),
]);
});
it('should update the PR\'s visibility (action: unlabeled)', async () => {
await Promise.all([
curl({ data: {number: +publicPr, action: 'unlabeled' } }).then(h.verifyResponse(200)),
curl({ data: {number: +hiddenPr, action: 'unlabeled' } }).then(h.verifyResponse(200)),
]);
});
});
});
describe(`${host}/*`, () => {
it('should respond with 404 for requests to unknown URLs', done => {
const bodyRegex = /^Unknown resource/;
Promise.all([
h.runCmd(`curl -iL ${host}/index.html`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iL ${host}/`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iL ${host}`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iLX PUT ${host}`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iLX POST ${host}`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iLX PATCH ${host}`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iLX DELETE ${host}`).then(h.verifyResponse(404, bodyRegex)),
]).then(done);
});
});
});

View File

@ -1,269 +1,84 @@
// Imports
import {AIO_NGINX_HOSTNAME} from '../common/env-variables';
import {computeShortSha} from '../common/utils';
import {ALT_SHA, BuildNums, PrNums, SHA} from './constants';
import {helper as h, makeCurl, payload} from './helper';
import {customMatchers} from './jasmine-custom-matchers';
import * as path from 'path';
import {helper as h} from './helper';
// Tests
h.runForAllSupportedSchemes((scheme, port) => describe(`integration (on ${scheme.toUpperCase()})`, () => {
const hostname = AIO_NGINX_HOSTNAME;
const hostname = h.nginxHostname;
const host = `${hostname}:${port}`;
const curlPrUpdated = makeCurl(`${scheme}://${host}/pr-updated`);
const pr9 = '9';
const sha9 = '9'.repeat(40);
const sha0 = '0'.repeat(40);
const archivePath = path.join(h.buildsDir, 'snapshot.tar.gz');
const getFile = (pr: number, sha: string, file: string) =>
h.runCmd(`curl -iL ${scheme}://pr${pr}-${computeShortSha(sha)}.${host}/${file}`);
const prUpdated = (prNum: number, action?: string) => curlPrUpdated({ data: { number: prNum, action } });
const circleBuild = makeCurl(`${scheme}://${host}/circle-build`);
beforeEach(() => {
jasmine.DEFAULT_TIMEOUT_INTERVAL = 5000;
jasmine.addMatchers(customMatchers);
});
afterEach(() => h.cleanUp());
describe('for a new/non-existing PR', () => {
it('should be able to create and serve a public preview', async () => {
const BUILD = BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const PR = PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const regexPrefix = `^BUILD: ${BUILD} \\| PR: ${PR} \\| SHA: ${SHA} \\| File:`;
const idxContentRegex = new RegExp(`${regexPrefix} \\/index\\.html$`);
const barContentRegex = new RegExp(`${regexPrefix} \\/foo\\/bar\\.js$`);
await circleBuild(payload(BUILD)).then(h.verifyResponse(201));
await Promise.all([
getFile(PR, SHA, 'index.html').then(h.verifyResponse(200, idxContentRegex)),
getFile(PR, SHA, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex)),
]);
expect({ prNum: PR }).toExistAsABuild();
expect({ prNum: PR, isPublic: false }).not.toExistAsABuild();
});
it('should be able to create but not serve a hidden preview', async () => {
const BUILD = BuildNums.TRUST_CHECK_UNTRUSTED;
const PR = PrNums.TRUST_CHECK_UNTRUSTED;
await circleBuild(payload(BUILD)).then(h.verifyResponse(202));
await Promise.all([
getFile(PR, SHA, 'index.html').then(h.verifyResponse(404)),
getFile(PR, SHA, 'foo/bar.js').then(h.verifyResponse(404)),
]);
expect({ prNum: PR }).not.toExistAsABuild();
expect({ prNum: PR, isPublic: false }).toExistAsABuild();
});
it('should reject if verification fails', async () => {
const BUILD = BuildNums.TRUST_CHECK_ERROR;
const PR = PrNums.TRUST_CHECK_ERROR;
await circleBuild(payload(BUILD)).then(h.verifyResponse(500));
expect({ prNum: PR }).toExistAsAnArtifact();
expect({ prNum: PR }).not.toExistAsABuild();
expect({ prNum: PR, isPublic: false }).not.toExistAsABuild();
});
it('should be able to notify that a PR has been updated (and do nothing)', async () => {
await prUpdated(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER).then(h.verifyResponse(200));
// The PR should still not exist.
expect({ prNum: PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, isPublic: false }).not.toExistAsABuild();
expect({ prNum: PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, isPublic: true }).not.toExistAsABuild();
});
const getFile = (pr: string, sha: string, file: string) =>
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha}.${host}/${file}`);
const uploadBuild = (pr: string, sha: string, archive: string) => {
const curlPost = 'curl -iLX POST --header "Authorization: Token FOO"';
return h.runCmd(`${curlPost} --data-binary "@${archive}" ${scheme}://${host}/create-build/${pr}/${sha}`);
};
beforeEach(() => jasmine.DEFAULT_TIMEOUT_INTERVAL = 10000);
afterEach(() => {
h.deletePrDir(pr9);
h.cleanUp();
});
describe('for an existing PR', () => {
it('should be able to upload and serve a build for a new PR', done => {
const regexPrefix9 = `^PR: uploaded\\/${pr9} \\| SHA: ${sha9} \\| File:`;
const idxContentRegex9 = new RegExp(`${regexPrefix9} \\/index\\.html$`);
const barContentRegex9 = new RegExp(`${regexPrefix9} \\/foo\\/bar\\.js$`);
it('should be able to create and serve a public preview', async () => {
const BUILD = BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const PR = PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
h.createDummyArchive(pr9, sha9, archivePath);
const regexPrefix1 = `^PR: ${PR} \\| SHA: ${ALT_SHA} \\| File:`;
const idxContentRegex1 = new RegExp(`${regexPrefix1} \\/index\\.html$`);
const barContentRegex1 = new RegExp(`${regexPrefix1} \\/foo\\/bar\\.js$`);
const regexPrefix2 = `^BUILD: ${BUILD} \\| PR: ${PR} \\| SHA: ${SHA} \\| File:`;
const idxContentRegex2 = new RegExp(`${regexPrefix2} \\/index\\.html$`);
const barContentRegex2 = new RegExp(`${regexPrefix2} \\/foo\\/bar\\.js$`);
h.createDummyBuild(PR, ALT_SHA);
await circleBuild(payload(BUILD)).then(h.verifyResponse(201));
await Promise.all([
getFile(PR, ALT_SHA, 'index.html').then(h.verifyResponse(200, idxContentRegex1)),
getFile(PR, ALT_SHA, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex1)),
getFile(PR, SHA, 'index.html').then(h.verifyResponse(200, idxContentRegex2)),
getFile(PR, SHA, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex2)),
]);
expect({ prNum: PR, sha: SHA }).toExistAsABuild();
expect({ prNum: PR, sha: ALT_SHA }).toExistAsABuild();
});
uploadBuild(pr9, sha9, archivePath).
then(() => Promise.all([
getFile(pr9, sha9, 'index.html').then(h.verifyResponse(200, idxContentRegex9)),
getFile(pr9, sha9, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex9)),
])).
then(done);
});
it('should be able to create but not serve a hidden preview', async () => {
const BUILD = BuildNums.TRUST_CHECK_UNTRUSTED;
const PR = PrNums.TRUST_CHECK_UNTRUSTED;
it('should be able to upload and serve a build for an existing PR', done => {
const regexPrefix0 = `^PR: ${pr9} \\| SHA: ${sha0} \\| File:`;
const idxContentRegex0 = new RegExp(`${regexPrefix0} \\/index\\.html$`);
const barContentRegex0 = new RegExp(`${regexPrefix0} \\/foo\\/bar\\.js$`);
h.createDummyBuild(PR, ALT_SHA, false);
await circleBuild(payload(BUILD)).then(h.verifyResponse(202));
const regexPrefix9 = `^PR: uploaded\\/${pr9} \\| SHA: ${sha9} \\| File:`;
const idxContentRegex9 = new RegExp(`${regexPrefix9} \\/index\\.html$`);
const barContentRegex9 = new RegExp(`${regexPrefix9} \\/foo\\/bar\\.js$`);
await Promise.all([
getFile(PR, ALT_SHA, 'index.html').then(h.verifyResponse(404)),
getFile(PR, ALT_SHA, 'foo/bar.js').then(h.verifyResponse(404)),
getFile(PR, SHA, 'index.html').then(h.verifyResponse(404)),
getFile(PR, SHA, 'foo/bar.js').then(h.verifyResponse(404)),
]);
h.createDummyBuild(pr9, sha0);
h.createDummyArchive(pr9, sha9, archivePath);
expect({ prNum: PR, sha: SHA }).not.toExistAsABuild();
expect({ prNum: PR, sha: SHA, isPublic: false }).toExistAsABuild();
expect({ prNum: PR, sha: ALT_SHA }).not.toExistAsABuild();
expect({ prNum: PR, sha: ALT_SHA, isPublic: false }).toExistAsABuild();
});
uploadBuild(pr9, sha9, archivePath).
then(() => Promise.all([
getFile(pr9, sha0, 'index.html').then(h.verifyResponse(200, idxContentRegex0)),
getFile(pr9, sha0, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex0)),
getFile(pr9, sha9, 'index.html').then(h.verifyResponse(200, idxContentRegex9)),
getFile(pr9, sha9, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex9)),
])).
then(done);
});
it('should reject if verification fails', async () => {
const BUILD = BuildNums.TRUST_CHECK_ERROR;
const PR = PrNums.TRUST_CHECK_ERROR;
it('should not be able to overwrite a build', done => {
const regexPrefix9 = `^PR: ${pr9} \\| SHA: ${sha9} \\| File:`;
const idxContentRegex9 = new RegExp(`${regexPrefix9} \\/index\\.html$`);
const barContentRegex9 = new RegExp(`${regexPrefix9} \\/foo\\/bar\\.js$`);
h.createDummyBuild(PR, ALT_SHA, false);
await circleBuild(payload(BUILD)).then(h.verifyResponse(500));
expect({ prNum: PR }).toExistAsAnArtifact();
expect({ prNum: PR }).not.toExistAsABuild();
expect({ prNum: PR, isPublic: false }).not.toExistAsABuild();
expect({ prNum: PR, sha: ALT_SHA, isPublic: false }).toExistAsABuild();
});
it('should not be able to overwrite an existing public preview', async () => {
const BUILD = BuildNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const PR = PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const regexPrefix = `^PR: ${PR} \\| SHA: ${SHA} \\| File:`;
const idxContentRegex = new RegExp(`${regexPrefix} \\/index\\.html$`);
const barContentRegex = new RegExp(`${regexPrefix} \\/foo\\/bar\\.js$`);
h.createDummyBuild(PR, SHA);
await circleBuild(payload(BUILD)).then(h.verifyResponse(409));
await Promise.all([
getFile(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, SHA, 'index.html').then(h.verifyResponse(200, idxContentRegex)),
getFile(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, SHA, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex)),
]);
expect({ prNum: PR }).toExistAsAnArtifact();
expect({ prNum: PR }).toExistAsABuild();
});
it('should not be able to overwrite an existing hidden preview', async () => {
const BUILD = BuildNums.TRUST_CHECK_UNTRUSTED;
const PR = PrNums.TRUST_CHECK_UNTRUSTED;
h.createDummyBuild(PR, SHA, false);
await circleBuild(payload(BUILD)).then(h.verifyResponse(409));
expect({ prNum: PR }).toExistAsAnArtifact();
expect({ prNum: PR, isPublic: false }).toExistAsABuild();
});
it('should be able to request re-checking visibility (if outdated)', async () => {
const publicPr = PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const hiddenPr = PrNums.TRUST_CHECK_UNTRUSTED;
h.createDummyBuild(publicPr, SHA, false);
h.createDummyBuild(hiddenPr, SHA, true);
// PR visibilities are outdated (i.e. the opposite of what they should be).
expect({ prNum: publicPr, sha: SHA, isPublic: false }).toExistAsABuild(false);
expect({ prNum: publicPr, sha: SHA, isPublic: true }).not.toExistAsABuild(false);
expect({ prNum: hiddenPr, sha: SHA, isPublic: false }).not.toExistAsABuild(false);
expect({ prNum: hiddenPr, sha: SHA, isPublic: true }).toExistAsABuild(false);
await Promise.all([
prUpdated(publicPr).then(h.verifyResponse(200)),
prUpdated(hiddenPr).then(h.verifyResponse(200)),
]);
// PR visibilities should have been updated.
expect({ prNum: publicPr, isPublic: false }).not.toExistAsABuild();
expect({ prNum: publicPr, isPublic: true }).toExistAsABuild();
expect({ prNum: hiddenPr, isPublic: false }).toExistAsABuild();
expect({ prNum: hiddenPr, isPublic: true }).not.toExistAsABuild();
});
it('should be able to request re-checking visibility (if up-to-date)', async () => {
const publicPr = PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER;
const hiddenPr = PrNums.TRUST_CHECK_UNTRUSTED;
h.createDummyBuild(publicPr, SHA, true);
h.createDummyBuild(hiddenPr, SHA, false);
// PR visibilities are already up-to-date.
expect({ prNum: publicPr, sha: SHA, isPublic: false }).not.toExistAsABuild(false);
expect({ prNum: publicPr, sha: SHA, isPublic: true }).toExistAsABuild(false);
expect({ prNum: hiddenPr, sha: SHA, isPublic: false }).toExistAsABuild(false);
expect({ prNum: hiddenPr, sha: SHA, isPublic: true }).not.toExistAsABuild(false);
await Promise.all([
prUpdated(publicPr).then(h.verifyResponse(200)),
prUpdated(hiddenPr).then(h.verifyResponse(200)),
]);
// PR visibilities are still up-to-date.
expect({ prNum: publicPr, isPublic: true }).toExistAsABuild();
expect({ prNum: publicPr, isPublic: false }).not.toExistAsABuild();
expect({ prNum: hiddenPr, isPublic: true }).not.toExistAsABuild();
expect({ prNum: hiddenPr, isPublic: false }).toExistAsABuild();
});
it('should reject a request if re-checking visibility fails', async () => {
const errorPr = PrNums.TRUST_CHECK_ERROR;
h.createDummyBuild(errorPr, SHA, true);
expect({ prNum: errorPr, isPublic: false }).not.toExistAsABuild(false);
expect({ prNum: errorPr, isPublic: true }).toExistAsABuild(false);
await prUpdated(errorPr).then(h.verifyResponse(500, /TRUST_CHECK_ERROR/));
// PR visibility should not have been updated.
expect({ prNum: errorPr, isPublic: false }).not.toExistAsABuild();
expect({ prNum: errorPr, isPublic: true }).toExistAsABuild();
});
it('should reject a request if updating visibility fails', async () => {
// One way to cause an error is to have both a public and a hidden directory for the same PR.
h.createDummyBuild(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, SHA, false);
h.createDummyBuild(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, SHA, true);
const hiddenPrDir = h.getPrDir(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, false);
const publicPrDir = h.getPrDir(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, true);
const bodyRegex = new RegExp(`Request to move '${hiddenPrDir}' to existing directory '${publicPrDir}'`);
expect({ prNum: PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, isPublic: false }).toExistAsABuild(false);
expect({ prNum: PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, isPublic: true }).toExistAsABuild(false);
await prUpdated(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER).then(h.verifyResponse(409, bodyRegex));
// PR visibility should not have been updated.
expect({ prNum: PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, isPublic: false }).toExistAsABuild();
expect({ prNum: PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, isPublic: true }).toExistAsABuild();
});
h.createDummyBuild(pr9, sha9);
h.createDummyArchive(pr9, sha9, archivePath);
uploadBuild(pr9, sha9, archivePath).
then(h.verifyResponse(409)).
then(() => Promise.all([
getFile(pr9, sha9, 'index.html').then(h.verifyResponse(200, idxContentRegex9)),
getFile(pr9, sha9, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex9)),
])).
then(done);
});
}));


@@ -1,2 +0,0 @@
import '../preview-server';
import './mock-external-apis';


@@ -1,30 +0,0 @@
declare module 'tar-stream' {
import {Readable, Writable} from 'stream';
export interface Pack extends Readable {
entry(header: Header, callback?: (err?: any) => {}): Writable;
entry(header: Header, contents: string, callback?: (err?: any) => {}): Writable;
entry(header: Header, buffer: Buffer, callback?: (err?: any) => {}): Writable;
entry(header: Header, buffer: string|Buffer, callback?: (err?: any) => {}): Writable;
finalize();
destroy(err: any);
}
export interface Header {
name: string;
mode?: number;
uid?: number;
gid?: number;
size?: number;
mtime?: Date;
type?: type;
linkname?: string;
uname?: string;
gname?: string;
devmajor?: number;
devminor?: number;
}
export function pack(): Pack;
}


@@ -0,0 +1,266 @@
// Imports
import * as fs from 'fs';
import * as path from 'path';
import {CmdResult, helper as h} from './helper';
// Tests
describe('upload-server (on HTTP)', () => {
const hostname = h.uploadHostname;
const port = h.uploadPort;
const host = `${hostname}:${port}`;
const pr = '9';
const sha9 = '9'.repeat(40);
const sha0 = '0'.repeat(40);
beforeEach(() => jasmine.DEFAULT_TIMEOUT_INTERVAL = 10000);
afterEach(() => h.cleanUp());
describe(`${host}/create-build/<pr>/<sha>`, () => {
const authorizationHeader = `--header "Authorization: Token FOO"`;
const xFileHeader = `--header "X-File: ${h.buildsDir}/snapshot.tar.gz"`;
const curl = `curl -iL ${authorizationHeader} ${xFileHeader}`;
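// Every request below sends an 'Authorization' token and an 'X-File' header pointing at the archive on disk.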
it('should disallow non-GET requests', done => {
const url = `http://${host}/create-build/${pr}/${sha9}`;
const bodyRegex = /^Unsupported method/;
Promise.all([
h.runCmd(`curl -iLX PUT ${url}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX POST ${url}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX PATCH ${url}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX DELETE ${url}`).then(h.verifyResponse(405, bodyRegex)),
]).then(done);
});
it('should reject requests without an \'AUTHORIZATION\' header', done => {
const headers1 = '';
const headers2 = '--header "AUTHORIZATION: "';
const url = `http://${host}/create-build/${pr}/${sha9}`;
const bodyRegex = /^Missing or empty 'AUTHORIZATION' header/;
Promise.all([
h.runCmd(`curl -iL ${headers1} ${url}`).then(h.verifyResponse(401, bodyRegex)),
h.runCmd(`curl -iL ${headers2} ${url}`).then(h.verifyResponse(401, bodyRegex)),
]).then(done);
});
it('should reject requests without an \'X-FILE\' header', done => {
const headers1 = authorizationHeader;
const headers2 = `${authorizationHeader} --header "X-FILE: "`;
const url = `http://${host}/create-build/${pr}/${sha9}`;
const bodyRegex = /^Missing or empty 'X-FILE' header/;
Promise.all([
h.runCmd(`curl -iL ${headers1} ${url}`).then(h.verifyResponse(400, bodyRegex)),
h.runCmd(`curl -iL ${headers2} ${url}`).then(h.verifyResponse(400, bodyRegex)),
]).then(done);
});
it('should respond with 404 for unknown paths', done => {
const cmdPrefix = `${curl} http://${host}`;
Promise.all([
h.runCmd(`${cmdPrefix}/foo/create-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foo-create-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/fooncreate-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/foo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build-foo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-buildnfoo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/pr${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/${pr}/${sha9}42`).then(h.verifyResponse(404)),
]).then(done);
});
it('should reject PRs with leading zeros', done => {
h.runCmd(`${curl} http://${host}/create-build/0${pr}/${sha9}`).
then(h.verifyResponse(404)).
then(done);
});
it('should accept SHAs with leading zeros (but not trim the zeros)', done => {
Promise.all([
h.runCmd(`${curl} http://${host}/create-build/${pr}/0${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`).then(h.verifyResponse(500)),
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha0}`).then(h.verifyResponse(500)),
]).then(done);
});
it('should not overwrite existing builds', done => {
h.createDummyBuild(pr, sha9);
expect(h.readBuildFile(pr, sha9, 'index.html')).toContain('index.html');
h.writeBuildFile(pr, sha9, 'index.html', 'My content');
expect(h.readBuildFile(pr, sha9, 'index.html')).toBe('My content');
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`).
then(h.verifyResponse(409, /^Request to overwrite existing directory/)).
then(() => expect(h.readBuildFile(pr, sha9, 'index.html')).toBe('My content')).
then(done);
});
it('should delete the PR directory on error (for new PR)', done => {
const prDir = path.join(h.buildsDir, pr);
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`).
then(h.verifyResponse(500)).
then(() => expect(fs.existsSync(prDir)).toBe(false)).
then(done);
});
it('should only delete the SHA directory on error (for existing PR)', done => {
const prDir = path.join(h.buildsDir, pr);
const shaDir = path.join(prDir, sha9);
h.createDummyBuild(pr, sha0);
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`).
then(h.verifyResponse(500)).
then(() => {
expect(fs.existsSync(shaDir)).toBe(false);
expect(fs.existsSync(prDir)).toBe(true);
}).
then(done);
});
describe('on successful upload', () => {
const archivePath = path.join(h.buildsDir, 'snapshot.tar.gz');
let uploadPromise: Promise<CmdResult>;
beforeEach(() => {
h.createDummyArchive(pr, sha9, archivePath);
uploadPromise = h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`);
});
afterEach(() => h.deletePrDir(pr));
it('should respond with 201', done => {
uploadPromise.then(h.verifyResponse(201)).then(done);
});
it('should extract the contents of the uploaded file', done => {
uploadPromise.
then(() => {
expect(h.readBuildFile(pr, sha9, 'index.html')).toContain(`uploaded/${pr}`);
expect(h.readBuildFile(pr, sha9, 'foo/bar.js')).toContain(`uploaded/${pr}`);
}).
then(done);
});
it(`should create files/directories owned by '${h.serverUser}'`, done => {
const shaDir = path.join(h.buildsDir, pr, sha9);
const idxPath = path.join(shaDir, 'index.html');
const barPath = path.join(shaDir, 'foo', 'bar.js');
uploadPromise.
then(() => Promise.all([
h.runCmd(`find ${shaDir}`),
h.runCmd(`find ${shaDir} -user ${h.serverUser}`),
])).
then(([{stdout: allFiles}, {stdout: userFiles}]) => {
expect(userFiles).toBe(allFiles);
expect(userFiles).toContain(shaDir);
expect(userFiles).toContain(idxPath);
expect(userFiles).toContain(barPath);
}).
then(done);
});
it('should delete the uploaded file', done => {
expect(fs.existsSync(archivePath)).toBe(true);
uploadPromise.
then(() => expect(fs.existsSync(archivePath)).toBe(false)).
then(done);
});
it('should make the build directory non-writable', done => {
const shaDir = path.join(h.buildsDir, pr, sha9);
const idxPath = path.join(shaDir, 'index.html');
const barPath = path.join(shaDir, 'foo', 'bar.js');
// See https://github.com/nodejs/node-v0.x-archive/issues/3045#issuecomment-4862588.
const isNotWritable = (fileOrDir: string) => {
const mode = fs.statSync(fileOrDir).mode;
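// 0o222 selects the write bits for user, group, and other; if none of them are set, the path is read-only.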
// tslint:disable-next-line: no-bitwise
return !(mode & parseInt('222', 8));
};
uploadPromise.
then(() => {
expect(isNotWritable(shaDir)).toBe(true);
expect(isNotWritable(idxPath)).toBe(true);
expect(isNotWritable(barPath)).toBe(true);
}).
then(done);
});
});
});
describe(`${host}/health-check`, () => {
it('should respond with 200', done => {
Promise.all([
h.runCmd(`curl -iL http://${host}/health-check`).then(h.verifyResponse(200)),
h.runCmd(`curl -iL http://${host}/health-check/`).then(h.verifyResponse(200)),
]).then(done);
});
it('should respond with 404 if the path does not match exactly', done => {
Promise.all([
h.runCmd(`curl -iL http://${host}/health-check/foo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/health-check-foo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/health-checknfoo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/foo/health-check`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/foo-health-check`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/foonhealth-check`).then(h.verifyResponse(404)),
]).then(done);
});
});
describe(`${host}/*`, () => {
it('should respond with 404 for GET requests to unknown URLs', done => {
const bodyRegex = /^Unknown resource/;
Promise.all([
h.runCmd(`curl -iL http://${host}/index.html`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iL http://${host}/`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iL http://${host}`).then(h.verifyResponse(404, bodyRegex)),
]).then(done);
});
it('should respond with 405 for non-GET requests to any URL', done => {
const bodyRegex = /^Unsupported method/;
Promise.all([
h.runCmd(`curl -iLX PUT http://${host}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX POST http://${host}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX PATCH http://${host}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX DELETE http://${host}`).then(h.verifyResponse(405, bodyRegex)),
]).then(done);
});
});
});


@@ -6,50 +6,38 @@
"author": "Angular",
"license": "MIT",
"scripts": {
"prebuild": "yarn clean-dist",
"build": "yarn ~~build",
"prebuild-watch": "yarn prebuild",
"build-watch": "yarn ~~build-watch",
"clean-dist": "node --eval \"require('shelljs').rm('-rf', 'dist')\"",
"predev": "yarn build || true",
"dev": "run-p ~~build-watch ~~test-watch",
"prebuild": "yarn run clean",
"build": "tsc",
"build-watch": "yarn run tsc -- --watch",
"clean": "node --eval \"require('shelljs').rm('-rf', 'dist')\"",
"dev": "concurrently --kill-others --raw --success first \"yarn run build-watch\" \"yarn run test-watch\"",
"lint": "tslint --project tsconfig.json",
"pretest": "yarn build",
"test": "yarn ~~test-only",
"pretest-watch": "yarn pretest",
"test-watch": "yarn ~~test-watch",
"~~build": "tsc",
"~~build-watch": "yarn ~~build --watch",
"pre~~test-only": "yarn lint",
"pre~~test-only": "yarn run lint",
"~~test-only": "node dist/test",
"~~test-watch": "nodemon --delay 1 --exec \"yarn ~~test-only\" --watch dist"
"pretest": "yarn run build",
"test": "yarn run ~~test-only",
"pretest-watch": "yarn run build",
"test-watch": "nodemon --exec \"yarn run ~~test-only\" --watch dist"
},
"dependencies": {
"body-parser": "^1.18.3",
"delete-empty": "^2.0.0",
"express": "^4.16.3",
"jasmine": "^3.2.0",
"nock": "^9.6.1",
"node-fetch": "^2.2.0",
"shelljs": "^0.8.2",
"source-map-support": "^0.5.9",
"tar-stream": "^1.6.1",
"tslib": "^1.10.0"
"express": "^4.14.1",
"jasmine": "^2.5.3",
"jsonwebtoken": "^7.3.0",
"shelljs": "^0.7.6"
},
"devDependencies": {
"@types/body-parser": "^1.17.0",
"@types/express": "^4.16.0",
"@types/jasmine": "^2.8.8",
"@types/nock": "^9.3.0",
"@types/node": "^10.9.2",
"@types/node-fetch": "^2.1.2",
"@types/shelljs": "^0.8.0",
"@types/supertest": "^2.0.5",
"nodemon": "^1.18.3",
"npm-run-all": "^4.1.5",
"supertest": "^3.1.0",
"tslint": "^5.11.0",
"tslint-jasmine-noSkipOrFocus": "^1.0.9",
"typescript": "^3.0.1"
"@types/express": "^4.0.35",
"@types/jasmine": "^2.5.43",
"@types/jsonwebtoken": "^7.2.0",
"@types/node": "^7.0.5",
"@types/shelljs": "^0.7.0",
"@types/supertest": "^2.0.0",
"concurrently": "^3.3.0",
"eslint": "^3.15.0",
"eslint-plugin-jasmine": "^2.2.0",
"nodemon": "^1.11.0",
"supertest": "^3.0.0",
"tslint": "^4.4.2",
"typescript": "^2.1.6"
}
}

Some files were not shown because too many files have changed in this diff.