Compare commits


262 Commits

Author SHA1 Message Date
6b57928d35 docs: add changelog for 2.4.10 2017-03-16 19:26:58 -07:00
47471ee49e release: cut the 2.4.10 release 2017-03-16 19:22:09 -07:00
b0ae464695 ci: changing deployment token 2017-03-16 16:11:17 -07:00
34403cda60 fix(router): do not finish bootstrap until all the routes are resolved (#15121)
Cherry-pick of 5df998d086 onto 2.4.x branch.

DEPRECATION:

Use `RouterModule.forRoot(routes, {initialNavigation: 'enabled'})` instead of
`RouterModule.forRoot(routes, {initialNavigation: true})`.

Before doing this, move the initialization logic affecting the router
from the bootstrapped component to the bootstrapped module.

Similarly, use `RouterModule.forRoot(routes, {initialNavigation: 'disabled'})`
instead of `RouterModule.forRoot(routes, {initialNavigation: false})`.

Deprecated options: 'legacy_enabled', `true` (same as 'legacy_enabled'),
'legacy_disabled', `false` (same as 'legacy_disabled').

The "Router Initial Navigation" design document covers this change.
Read more here:
https://docs.google.com/document/d/1Hlw1fPaVs-PCj5KPeJRKhrQGAvFOxdvTlwAcnZosu5A/edit?usp=sharing
2017-03-16 15:39:30 -07:00
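A minimal migration sketch for the deprecation described in the entry above, assuming a hypothetical `AppRoutingModule` and an illustrative empty `routes` array; the `initialNavigation` option values come from the deprecation note, everything else is made up for illustration.

```ts
import {NgModule} from '@angular/core';
import {RouterModule, Routes} from '@angular/router';

// Hypothetical route table, shown empty for brevity.
const routes: Routes = [];

@NgModule({
  // Before (deprecated): RouterModule.forRoot(routes, {initialNavigation: true})
  // After: use the string form introduced by this change.
  imports: [RouterModule.forRoot(routes, {initialNavigation: 'enabled'})],
  exports: [RouterModule],
})
export class AppRoutingModule {}
```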
e5c9bbcbdd fix(compiler): fix decoding surrogate pairs (#15154) 2017-03-16 15:37:25 -07:00
80fe41a88e release: cut v2.4.9 2017-03-01 23:11:26 -08:00
af652a7c8b docs: add release notes for 2.4.9 2017-03-01 23:11:08 -08:00
de36f8a3b9 revert: fix(router): do not finish bootstrap until all the routes are resolved (#14327)
This reverts commit 541de26f7e because it introduced a regression.

Closes #14681, #14588
2017-02-27 14:37:33 -08:00
b658fa9ea0 fix(http): Make ResponseOptionsArgs an interface
closes #13708
2017-02-20 17:34:30 -08:00
2a123463ac fix(router): improve robustness (#14602)
sync a 4.x change from https://github.com/angular/angular/pull/14155
2017-02-20 16:59:28 -08:00
4f93ac8762 release: cut v2.4.8 2017-02-18 13:55:23 -08:00
37ec5b9c1a docs: add changelog for 2.4.8 2017-02-18 13:54:34 -08:00
612950bdb2 test: pin down @types/* dependencies in typings test (#14569)
This is needed because the latest versions are no longer compatible with typescript 1.8 which results in build errors:
https://travis-ci.org/angular/angular/jobs/202752040#L3863
2017-02-17 14:35:56 -08:00
3804ad1d23 revert: build: first pass of de-duplicating tsconfig.json content (#14369)
This reverts commit 1c112ae66e.

First failed build (one commit after the change being reverted): https://travis-ci.org/angular/angular/jobs/202414385#L2511
2017-02-17 12:15:49 -08:00
1b1f228525 revert: build: update jasmine to 2.4 (#14362)
This reverts commit d6a8b0b686.

Jasmine 2.4 requires new typings, which require typescript 2.

See CI failure https://travis-ci.org/angular/angular/jobs/202412220#L3859
2017-02-17 11:57:17 -08:00
19e9094275 docs(router): fix broken link (#14431)
Closes #14430
2017-02-16 15:13:43 -08:00
8ff3ab0e6d docs(router): fix guards API docs (#14528) 2017-02-16 15:13:30 -08:00
9e8d740a96 ci: fix getLatestLabel (#14535) 2017-02-16 15:12:32 -08:00
3a3a100b27 build: update .pullapprove (#14506)
* build: update .pullapprove

Add tbosch/vicb/chuckjaz to more projects.

* Update .pullapprove.yml
2017-02-16 15:02:42 -08:00
f0575e014c fix(compiler): REVERT allow absolute style urls (#14365)
This reverts commit 6b9aa2ca3d.
2017-02-16 14:51:01 -08:00
ea7737ee11 fix(forms): getRawValue should correctly work with nested FormGroups/Arrays (#12964)
Closed #12963

PR Close #12964
2017-02-16 14:50:49 -08:00
fadaf1e01a fix(platform-browser): should only add styles with native encapsulation in shadow DOM (#14313)
Closes #7887

PR Close #14313
2017-02-16 14:50:11 -08:00
c716532ff2 test(forms): test undefined as argument to forms (#13720)
PR Close #13720
2017-02-16 13:49:33 -08:00
1c112ae66e build: first pass of de-duplicating tsconfig.json content (#14369)
PR Close #14369
2017-02-16 13:47:10 -08:00
193a0ae4a0 fix(compiler): allow absolute style urls (#14365)
Closes #4974

PR Close #14365
2017-02-16 13:46:42 -08:00
9ceb5d1afe fix(http): REVERT: remove dots from jsonp callback name (#13219)
This reverts commit 9e5617e41e.
2017-02-16 13:45:48 -08:00
d6a8b0b686 build: update jasmine to 2.4 (#14362)
PR Close #14362
2017-02-16 13:44:47 -08:00
3e216dd4ad ci: find latest tag when deeper than the git clone depth (#14231)
Since we have a shallow clone of the repository, it might be the case that the
latest tag (which we need for publishing the build artifacts) might not be in
the current history.

This commit incrementally deepens the clone until it finds a tag (or reaches a
max depth).

PR Close #14231
2017-02-16 13:43:15 -08:00
7b0aba4655 ci: bump node version to 6.9.5 and npm to 3.10.7
this is required by @angular/cli
2017-02-16 13:42:25 -08:00
7c87c52c38 fix(upgrade): pass correct values to ngOnChanges for interpolation bindings (#14400)
Previously, the `previousValue` and `currentValue` arguments passed to the
`SimpleChange` constructor were swapped for interpolation bindings.

This commit also refactors the code, so that interpolation bindings and property
bindings share the same implementation, and fixes some broken tests (that hide
failures by allowing the `$exceptionHandler` to swallow thrown exceptions).

This is the same as #14301, but for the 2.4.x branch.
2017-02-10 12:53:26 -08:00
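A hedged sketch of what the corrected ordering looks like from the component side; `GreetingComponent` and its `name` input are hypothetical, only the `SimpleChange` semantics come from the commit above.

```ts
import {Component, Input, OnChanges, SimpleChanges} from '@angular/core';

@Component({selector: 'greeting', template: '{{name}}'})
export class GreetingComponent implements OnChanges {
  @Input() name: string;

  ngOnChanges(changes: SimpleChanges) {
    const change = changes['name'];
    if (change && !change.isFirstChange()) {
      // After the fix, previousValue/currentValue are no longer swapped for
      // interpolation bindings such as name="{{someScopeValue}}".
      console.log(`name: ${change.previousValue} -> ${change.currentValue}`);
    }
  }
}
```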
541de26f7e fix(router): do not finish bootstrap until all the routes are resolved (#14327)
Fixes #12162
2017-02-09 11:59:08 -08:00
74cb575219 fix(upgrade): correctly project content on downgraded components with structural directives (#14274)
Previously, downgraded component adapters were compiling and projecting the
contents of the template element instead of the link-element. This didn't make
any difference in most cases (as the elements are the same), but broke with
structural directives (e.g. `ngRepeat`), which compile the original (template)
element once and then create and link clones of it.

This commit fixes it by always compiling and projecting the contents of the
correct (link) element.

Fixes #14260
2017-02-09 10:07:11 -08:00
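A hedged sketch of the scenario the fix above targets, using the dynamic `UpgradeAdapter` API of that era; `Ng2CardComponent`, `Ng2Module`, and the AngularJS `app` module are made-up names.

```ts
import * as angular from 'angular';
import {Component, NgModule, forwardRef} from '@angular/core';
import {BrowserModule} from '@angular/platform-browser';
import {UpgradeAdapter} from '@angular/upgrade';

// Angular 2 component that projects content; used below inside ng-repeat.
@Component({
  selector: 'ng2-card',
  template: '<div class="card"><ng-content></ng-content></div>',
})
export class Ng2CardComponent {}

@NgModule({imports: [BrowserModule], declarations: [Ng2CardComponent]})
export class Ng2Module {}

const adapter = new UpgradeAdapter(forwardRef(() => Ng2Module));

// Hypothetical AngularJS module; after the fix the downgraded directive can be
// used as <ng2-card ng-repeat="item in items">{{item}}</ng2-card> without
// losing its projected content.
angular.module('app', [])
    .directive('ng2Card', adapter.downgradeNg2Component(Ng2CardComponent));

adapter.bootstrap(document.body, ['app']);
```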
e90661aaee docs(changelog): add changelog for 2.4.7 2017-02-08 20:28:11 -08:00
73034c75bd chore(release): cut the 2.4.7 release 2017-02-08 20:26:10 -08:00
059085b943 docs(compiler): incorrect method reference (#14314)
PR Close #14314
2017-02-08 20:23:36 -08:00
8ab89153e0 docs(core): fix typo (#14299)
Replace `than` with `then`.

PR Close #14299
2017-02-08 20:23:36 -08:00
82923d8314 docs(router): fix typos (#14213)
PR Close #14213
2017-02-08 20:23:36 -08:00
6921b3d21d docs(zone): fix whitespace around backtick code
If there is no leading empty line then the markdown renderers get confused.
2017-02-08 20:23:36 -08:00
c0ef8b25a6 ci: add petebacondarwin to the angular.io pullapprove group (#14268)
PR Close #14268
2017-02-08 20:08:20 -08:00
e845f3b226 ci: add 'public-api' pullapprove group (#14268) 2017-02-08 20:08:20 -08:00
9779f397b7 test(compiler): add integration like tests to compiler unit tests (#14157)
Closes PR #14157

PR Close #14157
2017-02-08 20:08:20 -08:00
5bb47db887 fix(upgrade): allow non-element selectors for downgraded components (#14291)
This affects the dynamic version of `upgrade` and makes it more consistent with
the static version, while removing an artificial limitation.

(This commit backports the fix from 9aafdc7 to 2.4.x.)
2017-02-07 10:02:11 -08:00
343ee8a3a2 docs(changelog): add changelog for 2.4.6 2017-02-02 20:04:17 -08:00
223b5eb367 chore(release): cut the 2.4.6 release 2017-02-02 19:18:08 -08:00
7e639aac15 fix: ngModel should use rxjs/symbol/observable to detect observable (#14236)
PR closes #14236
2017-02-02 19:18:08 -08:00
83dafd3054 ci: increase git fetch depth to 150 2017-02-02 19:18:08 -08:00
e641636624 fix(common): DatePipe parses input string if it's not a valid date in browser (#13895)
Closes #12334
Closes #13874

PR Close #13895
2017-02-02 19:18:08 -08:00
c409860a9f refactor(common): remove isDate from facade (#13895) 2017-02-02 19:18:07 -08:00
0101aa31d6 ci: fix .pullapprove.yaml's file conditions (#14214)
According to [the docs](http://docs.pullapprove.com/groups/conditions/), the correct keywords are `include`/`exclude`, without the trailing `s`.
2017-02-02 19:18:07 -08:00
a5b4af0fdd fix(language-service): do not crash when Angular cannot be located (#14123)
Fixes #14122

PR Close #14123
2017-02-02 19:18:07 -08:00
d9420311ca docs(forms): fix FormArray description (#14094)
Closes #14075

PR Close #14094
2017-02-02 18:50:40 -08:00
774e1db87c fix(forms): Verify functions passed into async validators returns Observable or Promise (#14053) 2017-02-02 18:50:40 -08:00
109f0d16ef fix(common): introduce isObservable method (#14067)
Closes #8848

PR Close #14067
2017-02-02 18:50:40 -08:00
71567d1eee fix(common): add PopStateEvent interface (#13400)
Closes #13378

PR Close #13400
2017-02-02 18:50:40 -08:00
bb71acc172 build: fix red travis: fetch more github history (#14193) 2017-02-02 16:02:55 -08:00
e98d6f0912 ci: fix compiler-cli paths (#14177) 2017-02-02 16:02:55 -08:00
1dbebb184f ci: fix pullapprove groups and conditions (#14167)
- restrict root to be just root
- add fallback users to all groups
- fix indentation
- change order of users so that primary reviewers are first, followed by alpha-sorted secondaries, followed by fallback reviewers
2017-02-02 16:02:55 -08:00
8882b86b54 fix(common): add interface PipeTransform to Async pipe (#14049)
PR Close #14049
2017-02-02 16:02:54 -08:00
0965636735 fix(router): fix CanActivateChild guard provided in a lazy loaded module (#13989)
Closes #12275

PR Close #13989
2017-02-02 16:02:54 -08:00
4d2901d480 fix(router): fix navigation from the root component ngOnInit hook (#13932)
Closes #13795

PR Close #13932
2017-02-02 16:02:54 -08:00
a047124e1a fix(router): fix CanActivate redirect to the root on initial load (#13929)
Closes #13530

PR Close #13929
2017-02-02 16:02:54 -08:00
09e2d20e22 fix(forms): select shows blank line when nothing is selected in IE/Edge (#13903)
Closes #10010

PR Close #13903
2017-02-02 16:02:54 -08:00
e3bdf82c0d docs(developer): add description of npm-run to run locally installed npm scripts (#13765)
PR Close #13765
2017-02-02 16:02:54 -08:00
0614289608 fix(platform-browser): remove style nodes on destroy (#13744)
Closes #11746

PR Close #13744
2017-02-02 16:02:53 -08:00
7c344a4e49 refactor(platform-browser): polishing (#13744) 2017-02-02 16:02:53 -08:00
250dbc4bc8 fix(core): add bootstrapped modules into platform modules list (#13740)
Closes #12015

PR Close #13740
2017-02-02 16:02:53 -08:00
70bbdf55da fix(testing): async/fakeAsync/inject/withModule helpers should pass through context to callback functions (#13718)
Make sure that the context (`this`) passed to functions generated by test helpers is passed through to the callback functions. This enables usage of Jasmine's variable sharing system to prevent accidental memory leaks during test runs.
2017-02-02 16:02:53 -08:00
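A hedged sketch of the pattern the fix above enables, assuming a hypothetical `CounterService`; state is shared through Jasmine's `this` rather than closure variables.

```ts
import {Injectable} from '@angular/core';
import {TestBed, fakeAsync, inject, tick} from '@angular/core/testing';

@Injectable()
class CounterService {
  value = 0;
  incrementSoon() { setTimeout(() => this.value++, 0); }
}

describe('context passthrough', () => {
  beforeEach(function(this: any) {
    // Shared via Jasmine's `this` instead of closure variables, which avoids
    // accidentally retaining state across the whole spec file.
    this.initial = 41;
    TestBed.configureTestingModule({providers: [CounterService]});
  });

  it('sees `this` inside fakeAsync/inject callbacks',
     fakeAsync(inject([CounterService], function(this: any, service: CounterService) {
       service.value = this.initial;
       service.incrementSoon();
       tick();
       expect(service.value).toBe(42);
     })));
});
```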
41b8d95fa7 fix(core): ViewContainerRef.indexOf doesn't throw error when empty (#13220)
PR Close #13220
2017-02-02 16:02:53 -08:00
1eece5046d fix(http): remove dots from jsonp callback name (#13219)
PR Close #13219
2017-02-02 16:02:52 -08:00
1ef3eeecbd docs: update COMITTER.md with info about pullapprove.com 2017-02-02 16:02:52 -08:00
94500e0fad ci: configure pullapprove to cover the whole repository 2017-02-02 16:02:52 -08:00
dd53606f69 docs(public_api): change description (#13583)
* doc(public_api): change description

Benchpress has been moved to angular/angular in modules/@angular/benchpress

* docs(public_api): change description

'Here' meant 'other projects', like angular-cli and Angular Material. As we know, the benchpress project has been moved into angular/angular under modules/@angular/benchpress, so it should no longer be described as 'other projects'.
2017-02-02 16:02:52 -08:00
6c8b5dda87 style(docs): update copyright years (#13736) 2017-02-02 16:02:52 -08:00
458ccc1aff refactor(core): simplify ReflectiveInjector by removing code for Dart implementation (#14126)
ReflectiveInjector previously used two strategies for resolving dependencies. These
were needed to support the Dart implementation, but are no longer required. As a result of this
PR, there is no longer a 20-dependency limit and the generated code is smaller.

PR Close #14126
2017-02-02 16:01:04 -08:00
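The change above is internal, but a quick usage sketch shows the API whose resolution strategy was simplified; `Engine` and `Car` are hypothetical classes.

```ts
import {Injectable, ReflectiveInjector} from '@angular/core';

@Injectable()
class Engine {}

@Injectable()
class Car {
  constructor(public engine: Engine) {}
}

// Resolution now uses a single strategy regardless of how many dependencies
// a provider has (the old 20-dependency limit is gone).
const injector = ReflectiveInjector.resolveAndCreate([Engine, Car]);
const car = injector.get(Car);
console.log(car.engine instanceof Engine);  // true
```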
07cfd8c432 docs: remove obsolete bundles/overview.md file (#14132) 2017-02-02 16:01:03 -08:00
23bd0fbfc1 docs(http): vanilla links do not need link tags (#14097) 2017-02-02 16:32:23 -06:00
3d1e536143 docs(router): remove invalid jsdoc tags (#14097)
The `@selector` tags are not valid.
Dgeni should be able to extract this information
from the directive annotation metadata.
2017-02-02 16:31:49 -06:00
c827097610 ci: add pullapprove config for angular.io 2017-02-02 16:31:49 -06:00
8d4aa82c04 fix(i18n): parse ICU messages while normalizing templates (#14153)
Fixes:
- Inject the i18n specific HtmlParser into the directive normalizer,
- Parse ICU messages while normalizing templates,
- Normalize (visit) the content of ICU messages.

🎄🎁🎅
2017-01-31 21:00:32 -08:00
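A hedged sketch of the kind of ICU expression that is now parsed while templates are normalized; `MinutesAgoComponent` is hypothetical.

```ts
import {Component} from '@angular/core';

@Component({
  selector: 'minutes-ago',
  // The ICU expression below is parsed during template normalization, so its
  // content participates in i18n extraction and translation.
  template: `
    <p i18n>
      Updated {minutes, plural,
        =0 {just now}
        =1 {one minute ago}
        other {{{minutes}} minutes ago}
      }
    </p>
  `,
})
export class MinutesAgoComponent {
  minutes = 0;
}
```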
14e97516cb refactor(abstract): Use abstract keyword where possible to decrease file size. (#14112)
PR Close: #14112
2017-01-27 12:52:12 -08:00
bc47a8cc74 refactor(compiler): add ability to get the context around a ParseLocation (#14113) 2017-01-27 12:55:54 -06:00
32cc6759ef fix(common): DatePipe doesn't throw for NaN (#14117)
Fixes #14103

PR Close #14117
2017-01-27 12:55:28 -06:00
d5f1419afe refactor(size): Use abstract keyword where possible to decrease file size. (#14112) 2017-01-27 12:55:20 -06:00
117fa79c7c fix(upgrade): detect async downgrade component changes (#14039)
This commit effectively reverts 7e0f02f but for `upgrade/static`
as it was an invalid fix for #6385, that created a more significant
bug, which was that changes were not always being detected.

Angular 1 digests should be run inside the ngZone to ensure
that async changes are detected.

We don't know how to fix #6385 without breaking change detection
at this stage. That issue is triggered by async operations, such as
`setTimeout`, being triggered inside scope watcher functions.

One could argue that watcher functions should be pure and not do
work such as triggering async operations. It is possible that the
original use case could be supported by moving the debounce
logic into the watch listener function, which is only called if the
watched value actually changes.

See #13812

PR Close #14039
2017-01-27 12:53:48 -06:00
777ba46837 refactor(compiler): improve error messages in aot compiler (#14017)
Previously, the AOT compiler printed stack traces when it failed to resolve.
New behavior: the AOT compiler outputs the error message.
Example: https://gist.github.com/bowenni/a7fe81d916e8cd4a06b0e133436f40fb

PR Close #14017
2017-01-27 12:53:43 -06:00
f3d55068a8 fix(compiler): allow empty translations for attributes (#14085)
fixes #13897
2017-01-27 12:53:13 -06:00
7ed39ebaaf docs(changelog): add changelog for 2.4.5 2017-01-25 13:48:29 -08:00
091f0a5aaa chore(release): cut the 2.4.5 release 2017-01-25 13:48:21 -08:00
315606e02c style(compiler): run format 2017-01-25 13:21:04 -08:00
5ea373d184 docs(core): add docs for AnimationStyles and AnimationKeyframe (#14107) 2017-01-25 11:51:02 -08:00
6e36bb7b20 docs(compiler): add comment to warn about regexp changes (#14106)
ref #14082
2017-01-25 11:50:55 -08:00
3b2fb23805 fix(upgrade/static): ensure upgraded injector is initialized early enough (#14065)
This change ensures that the upgraded AngularJS injector is initialized
before the application run blocks are executed.

Closes #13811
2017-01-25 11:49:59 -08:00
bd2eecb4de fix(compiler): fix regexp to support firefox 31 (#14082)
fixes #14029
closes #13900
2017-01-25 11:44:09 -08:00
3d351a4f5f fixup: remove message.id check from this branch 2017-01-25 11:43:16 -08:00
5492fada21 fix(compiler): [i18n] XMB/XTB placeholder names can contain only A-Z, 0-9, _n
There are restrictions on the character set that can be used for xmb and xtb
placeholder names.

However, because changing the placeholder names would change the message IDs, it
is not possible to add those restrictions to the names used internally. We therefore
have to map internal names to public names when generating an xmb file, and back
when translating using an xtb file.

Note for implementors of `Serializer`:
- When writing a file, the implementor should take care of converting the
internal names to public names while visiting the message nodes - this is
required because the original nodes are needed to compute the message ID.
- When reading a file, the implementor does not need to take care of the mapping
back to internal names as this is handled in the `I18nToHtmlVisitor` used by the
`TranslationBundle`.

fixes b/34339636
2017-01-25 10:35:03 -08:00
fd4f9acbcf fix(core): export animation classes required for Renderer impl (#14002)
Closes #14001
2017-01-25 10:32:16 -08:00
48528a86e1 docs(common): fix a typo on the DatePipe API docs (#14060) 2017-01-25 10:32:08 -08:00
80364def27 ci: bump node and npm versions in circle.yaml to match travis 2017-01-25 10:31:50 -08:00
1803beb4d5 Fixed documentation reference to canActivate in canDeactivate (#14018)
Simple update to code sample which references canActivate: ['canDeactivateTeam'].
2017-01-25 10:31:42 -08:00
3bcba8a570 chore(docs): add missing comments (#14003)
This is a load-bearing change to avoid duplicate licenses in closure-compiled bundles.
See https://github.com/angular/tsickle/issues/332
2017-01-25 10:30:46 -08:00
84542d8ae7 docs(changelog): add changelog for 2.4.4 2017-01-18 18:35:54 -06:00
17cb3ec565 chore(release): cut the 2.4.4 release 2017-01-18 18:32:57 -06:00
015878afe6 fix(http): don't create a blob out of ArrayBuffer when type is application/octet-stream (#13992)
Closes #13973
2017-01-18 18:28:37 -06:00
2af58622c1 fix(router): enable loadChildren with function in aot (#13909)
Closes #11075
2017-01-18 18:28:02 -06:00
7ffd10541d refactor(core): remove an unused import in application_ref (#13901) 2017-01-18 18:27:52 -06:00
481b099d82 docs(CHANGELOG): added reference to closed issue in CHANGELOG for informational purposes (#13985) 2017-01-18 18:27:25 -06:00
49c4b0fa92 fix(router): routerLinkActive should not throw when not initialized (#13273)
Fixes #13270

PR Close #13273
2017-01-18 18:27:14 -06:00
b8b6b1d27a refactor(router): clean up RouterLinkActive (#13273)
PR Close #13273
2017-01-18 18:27:03 -06:00
892b5ba950 chore(tsc-wrapped): update tsickle to latest (#13471) 2017-01-18 18:26:37 -06:00
bd15110c7d feat(security): allow calc and gradient functions. (#13943)
PR Close #13943

Also includes support for # color notation in function arguments (common
in gradient functions).
2017-01-18 18:25:45 -06:00
2250082fd7 fix(upgrade): detect async downgrade component changes (#13812)
This commit effectively reverts 7e0f02f96e
as it was an invalid fix for #6385, that created a more significant
bug, which was that changes were not always being detected.

Angular 1 digests should be run inside the ngZone to ensure
that async changes are detected.

We don't know how to fix #6385 without breaking change detection
at this stage. That issue is triggered by async operations, such as
`setTimeout`, being triggered inside scope watcher functions.

One could argue that watcher functions should be pure and not do
work such as triggering async operations. It is possible that the
original use case could be supported by moving the debounce
logic into the watch listener function, which is only called if the
watched value actually changes.

Closes #10660, #12318, #12034

PR Close #13812
2017-01-18 18:21:29 -06:00
87316c52db test(upgrade): reorganise test layout (#13812) 2017-01-18 18:21:24 -06:00
606b76d9bb chore(compiler-cli): Move calculateEmitPath into CompilerHost (#13904)
This is so that it can be overridden in an environment-specific CompilerHost (like within Google) to customize the output paths.

PR Close #13904
2017-01-18 18:21:09 -06:00
3d0b1b8184 fix(common): support numeric value as discrete cases for NgPlural (#13876)
PR Close #13876
2017-01-18 18:20:56 -06:00
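A hedged sketch of `NgPlural` with discrete cases bound to a plain number, which is what the fix above concerns; `MessageCountComponent` is hypothetical and the exact case syntax should be checked against the 2.4.x docs.

```ts
import {Component} from '@angular/core';

@Component({
  selector: 'message-count',
  template: `
    <span [ngPlural]="count">
      <template ngPluralCase="=0">no messages</template>
      <template ngPluralCase="=1">one message</template>
      <template ngPluralCase="other">{{count}} messages</template>
    </span>
  `,
})
export class MessageCountComponent {
  count = 1;  // a plain number; the discrete "=N" cases match numeric values
}
```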
261fd16780 fix(animations): fix internal jscompiler issue and AOT quoting (#13798)
CL #143630929
PR Close #13798
2017-01-18 18:20:47 -06:00
104cc42f6d docs(http): Spelling Fix #13867 2017-01-18 18:20:30 -06:00
a7d28044c5 docs(changelog): add changelog for 2.4.3 2017-01-11 13:38:23 -08:00
055bea2969 chore(release): cut v2.4.3 2017-01-11 13:38:16 -08:00
dad0d21b89 chore(owners): configure pullapprove.com 2017-01-11 11:35:31 -08:00
313683f6f3 fix(compiler-cli): avoid handling functions in loadChildren as lazy load routes paths
The change prevents the compiler CLI internal API from misidentifying the following case as lazy loading:

```
import { NonLazyLoadedModule } from './non-lazy-loaded/non-lazy-loaded.module';

export function getNonLazyLoadedModule() { return NonLazyLoadedModule; }

export const routes = [
{ path: '/some-path', loadChildren: getNonLazyLoadedModule }
];
```

The output of the check is later passed to `RouteDef.fromString()`, so it only makes sense for it to be a string.

Fixes angular/angular-cli#3204
2017-01-11 11:35:23 -08:00
338be6d6a5 refactor(common): remove some facade usages 2017-01-11 11:34:03 -08:00
4b56f79328 refactor(test): <template>/<ng-container>/*-directives
- remove outer `<div>` in tests,
- use `<ng-container>` instead of `<template>` where possible,
- use *... instead of template (tag or attr) where possible.

Fixes #13816
2017-01-11 11:33:30 -08:00
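A hedged sketch of the three equivalent structural forms the refactoring above switches between; `StructuralFormsComponent` is hypothetical.

```ts
import {Component} from '@angular/core';

@Component({
  selector: 'structural-forms',
  template: `
    <!-- shorthand "*" syntax -->
    <p *ngIf="show">shown</p>

    <!-- <ng-container> form: grouping without adding a DOM element -->
    <ng-container *ngIf="show">
      <p>shown</p>
    </ng-container>

    <!-- explicit <template> form (the most verbose equivalent in 2.x) -->
    <template [ngIf]="show">
      <p>shown</p>
    </template>
  `,
})
export class StructuralFormsComponent {
  show = true;
}
```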
d7f2a3c71b fix(i18n): translate attributes inside elements marked for translation 2017-01-10 17:15:42 -08:00
1c929ae244 docs(NgPlural): fix API docs
Fixes #13786
2017-01-10 16:51:52 -08:00
83d0ff6d13 refactor(Compiler): misc cleanup 2017-01-10 16:50:20 -08:00
d43e5dd44d chore(benchmarks): change var to let 2017-01-09 16:08:33 -08:00
61ba223c1a fix(router): throw an error when navigate to null/undefined path
Closes #10560
Fixes #13384
2017-01-09 18:55:31 -05:00
6164eb25f3 fix(compiler-cli): add support for more than 2 levels of nested lazy routes
This change adds Compiler CLI support for any level of nesting for lazy routes.

For example `{app-root}/lazy-loaded-module-1/lazy-loaded-module-2/lazy-loaded-module-3`

Where `lazy-loaded-module-3` is lazy loaded from `lazy-loaded-module-2`,
and `lazy-loaded-module-2` is lazy loaded from module `lazy-loaded-module-1`,
and `lazy-loaded-module-1` is lazy loaded from `AppModule`

Fixes angular/angular-cli#3663
2017-01-09 18:06:26 -05:00
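A hedged sketch of a three-level lazy route configuration using the string `loadChildren` syntax of that era; all module paths and names are made up.

```ts
import {Routes} from '@angular/router';

// AppModule level: lazy-loaded-module-1 is loaded lazily from the root.
export const appRoutes: Routes = [
  {path: 'level1', loadChildren: './lazy1/lazy1.module#Lazy1Module'},
];

// Inside Lazy1Module's routing: a second lazy level.
export const lazy1Routes: Routes = [
  {path: 'level2', loadChildren: './lazy2/lazy2.module#Lazy2Module'},
];

// Inside Lazy2Module's routing: a third lazy level, which ngc can now discover.
export const lazy2Routes: Routes = [
  {path: 'level3', loadChildren: './lazy3/lazy3.module#Lazy3Module'},
];
```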
5e9d3dba3a fix(compiler): avoid evaluating arguments to unknown decorators
Fixes #13605
2017-01-09 18:06:11 -05:00
16922655ca fix(Router): fix checking for object intersection 2017-01-09 18:06:03 -05:00
7dc12b93fe fix(Compiler): fix template binding parsing (*directive="-...")
fixes #13800
2017-01-09 16:50:14 -05:00
1c82b58185 fix(router): RouterLink mirrors input target as attribute
Closes #13837
2017-01-09 16:50:06 -05:00
d6c414c08f fix: correctly show error when karma fails to load 2017-01-09 16:49:01 -05:00
d25d1730c7 chore(tsc-wrapped): bump version number to 0.5.1 2017-01-06 12:46:07 -08:00
03b35d2e8f docs(changelog): add release notes for 2.4.2 2017-01-06 12:40:35 -08:00
722543739e chore(release): cut the 2.4.2 release 2017-01-06 12:37:50 -08:00
56b4296a09 fix(language-service): support TypeScript 2.1 (#13655)
@angular/language-service now supports using TypeScript 2.1 as the
TypeScript host. TypeScript 2.1 is now also partially supported
in `ngc` but is not recommended as Tsickle does not yet support 2.1.
2017-01-06 11:00:12 -08:00
f1cde4339b fix(core): animations no longer silently exits if the element is not a part of the DOM (#13763) 2017-01-06 11:00:12 -08:00
b245b920a6 fix(core): animations should blend in all previously transitioned styles into next animation if interrupted (#13148) 2017-01-06 11:00:12 -08:00
f47a71689c refactor: remove unused imports 2017-01-06 11:00:11 -08:00
6be55cc214 fix(Common): allow null/undefined values for NgForTrackBy
Reverts a breaking change introduced in 2.4.1 by #13420
fixes #13641
2017-01-06 11:00:11 -08:00
504199cf5a docs(common): add an example how to bind multiple classes based on a single parameter (#13779)
Closes #13778
2017-01-06 11:00:11 -08:00
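A hedged sketch of binding several classes off a single parameter with `ngClass`, in the spirit of the docs example referenced above; `StatusBadgeComponent` and its class names are hypothetical.

```ts
import {Component, Input} from '@angular/core';

@Component({
  selector: 'status-badge',
  // Several classes are derived from the single `status` input.
  template: `
    <span [ngClass]="{
      'badge': true,
      'badge-ok': status === 'ok',
      'badge-warn': status === 'warn',
      'badge-error': status === 'error'
    }">{{status}}</span>
  `,
})
export class StatusBadgeComponent {
  @Input() status: 'ok' | 'warn' | 'error' = 'ok';
}
```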
17c5fa9293 fix(forms): Validators.required properly validate arrays (#13362)
Closes #12274
2017-01-06 11:00:11 -08:00
5f49c3ed23 fix(common): do not override locale provided on bootstrap (#13654)
Closes #13607
2017-01-06 11:00:11 -08:00
ebba63057f docs(developer): add linting section and correct command to verify API changes 2017-01-06 11:00:11 -08:00
5058461af7 feat(benchmarks): add detectChanges test for ng2 tree benchmark 2017-01-06 11:00:11 -08:00
21f5f05893 fix(core): Remove reference to "Angular 2" in dev mode warning (#13751) 2017-01-06 11:00:11 -08:00
f2ee81fa7a Typo (#13698) 2017-01-06 11:00:11 -08:00
ae1029da35 docs(Http): fix and extend samples for testing/MockBackend (#13689)
Fix outdated samples for MockBackend and MockBackend.connections. Also extend the central MockBackend sample to make getting started easier.
2017-01-06 11:00:10 -08:00
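A hedged sketch of a `MockBackend` setup of the kind those docs samples cover, using the `@angular/http` testing API of that era; the `/api/hero` URL and response body are made up.

```ts
import {TestBed, inject} from '@angular/core/testing';
import {BaseRequestOptions, Http, Response, ResponseOptions} from '@angular/http';
import {MockBackend, MockConnection} from '@angular/http/testing';

describe('Http with MockBackend', () => {
  beforeEach(() => {
    TestBed.configureTestingModule({
      providers: [
        MockBackend,
        BaseRequestOptions,
        {
          provide: Http,
          useFactory: (backend: MockBackend, options: BaseRequestOptions) =>
              new Http(backend, options),
          deps: [MockBackend, BaseRequestOptions],
        },
      ],
    });
  });

  it('returns a mocked response',
     inject([Http, MockBackend], (http: Http, backend: MockBackend) => {
       // Every outgoing connection is answered with a canned response.
       backend.connections.subscribe((connection: MockConnection) => {
         connection.mockRespond(
             new Response(new ResponseOptions({body: {hero: 'Windstorm'}})));
       });

       http.get('/api/hero').subscribe(res => {
         expect(res.json().hero).toBe('Windstorm');
       });
     }));
});
```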
230e33f3f1 fix(compiler): don’t throw when using ANALYZE_FOR_ENTRY_COMPONENTS with user classes (#13679)
Fixed #13565
2017-01-05 21:21:33 -08:00
ec0ca01224 docs(Core): fix API docs for ContentChild and ViewChildren (#13656)
Move the documentations of the ContentChild and ViewChildren decorators
so that they appear correctly on angular.io.

Closes #13625
2017-01-05 21:21:22 -08:00
1cd73c7a79 fix(compiler): query <template> elements before their children. (#13677)
Fixes #13118
Closes #13167
2017-01-05 21:20:56 -08:00
9f6a647908 fix(router): update route snapshot before emit new values (#13558)
Closes #12912
2017-01-05 21:20:45 -08:00
29ffdfdffe fix(Compiler): allow "." in attribute selectors (#13653)
fixes #13645
2017-01-05 21:14:54 -08:00
5754ecc3e1 fix(router): fix lazy loaded module with wildcard route (#13649)
Closes #12955
2017-01-05 21:14:42 -08:00
dab15c79dd chore(tslint): update tslint to 4.x (#13603) 2017-01-05 21:14:28 -08:00
21942a88f0 fix(upgrade): fix/improve support for lifecycle hooks (#13020)
With the exception of `$onChanges()`, all lifecycle hooks in ng1 are called on
the controller, regardless if it is the binding destination or not (i.e.
regardless of the value of `bindToController`).

This change makes `upgrade` mimic that behavior when calling lifecycle hooks.

Additionally, calling the `$onInit()` hook has been moved before calling the
linking functions, which also mimics the ng1 behavior.
2017-01-05 21:14:14 -08:00
018865ee6b fix(router): routerLink support of null/undefined (#13380)
Closes #6971
2017-01-05 21:11:56 -08:00
f7234378b6 fix(common): add link to trackBy docs (#13634) 2017-01-05 21:11:34 -08:00
5f47583c94 fixed minor typo (#13626) 2017-01-05 21:06:13 -08:00
0e7f9f0bff fix(testing): improve misleading error message when don't call compileComponents (#13543)
Closes #11301
2017-01-05 21:05:39 -08:00
28a92b2bcd docs(changelog): add changelog for 2.4.1 2016-12-21 14:26:13 -08:00
48be539824 chore(release): cut the 2.4.1 release 2016-12-21 14:22:42 -08:00
d788c679b6 fix(animations): always recover from a failed animation step (#13604) 2016-12-21 14:17:45 -08:00
a38f14b39c fix(router): should reset location if a navigation by location is successful (#13545)
Closes #13491
2016-12-21 14:17:25 -08:00
6a5e46cedd fix(animations): always quote string map key values in AOT code (#13602) 2016-12-21 09:49:03 -08:00
6316e5df71 fix(compiler): ignore @import in comments (#13368)
* refactor(compiler): clean up style url resolver
* fix(compiler): ignore @import in css comments

Closes #12196
2016-12-21 09:49:03 -08:00
90fca7c879 Include bower instructions in DEVELOPER.md (#13591) 2016-12-21 09:49:03 -08:00
d871ae2dc6 refactor(platform-browser): resolver merge conflict for tslint (#13601) 2016-12-21 09:49:03 -08:00
44e84d87f9 fix(common): throw an error if trackBy is not a function (#13420)
* fix(common): throw an error if trackBy is not a function

Closes #13388

* refactor(platform-browser): disable no-console rule in DomAdapter
2016-12-21 09:49:03 -08:00
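A hedged sketch of a valid `trackBy` function, i.e. the shape the check above now enforces; `ItemListComponent` and `Item` are hypothetical.

```ts
import {Component} from '@angular/core';

interface Item {
  id: number;
  label: string;
}

@Component({
  selector: 'item-list',
  template: `
    <ul>
      <!-- trackBy must reference a function; passing anything else now throws -->
      <li *ngFor="let item of items; trackBy: trackById">{{item.label}}</li>
    </ul>
  `,
})
export class ItemListComponent {
  items: Item[] = [{id: 1, label: 'one'}, {id: 2, label: 'two'}];

  trackById(index: number, item: Item): number {
    return item.id;
  }
}
```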
b9e979e0a5 fix(core): improve error message when component factory cannot be found (#13541)
Closes #12678
2016-12-21 09:49:03 -08:00
cb2aa41782 build: fix publish-build-artifacts branch detection (#13599) 2016-12-21 09:49:03 -08:00
189a7e3750 build: publish build artifacts to branches (#13529)
Fix #13126
2016-12-21 09:49:03 -08:00
6efdf84d3e docs(changelog): add changelog for 2.4.0 2016-12-19 17:42:17 -08:00
e61bfc8b24 chore(release): cut the 2.4.0 release 2016-12-19 17:19:25 -08:00
070f9d0644 refactor: formatting fixes 2016-12-19 17:18:33 -08:00
8d5da1e57a feat: update to rxjs@5.0.1 and unpin the rxjs peerDeps via ^5.0.1 (#13572)
Now that rxjs is stable and the rxjs team follows semver, we can update and unpin the dependency safely.

From now on the Angular application/library developers are in charge of controlling the rxjs version as long as it's newer than 5.0.1.

closes #13561
closes #13478
closes #13572
2016-12-19 17:09:41 -08:00
b6406191c7 build(npm): update angular version in shrinkwrap files 2016-12-19 17:09:41 -08:00
124face441 refactor(compiler-cli): support extracting the mesage bundle without writing a file (#13580) 2016-12-19 17:09:41 -08:00
de4ace77fe feat(compiler-cli): private i18n API for the CLI (#13536)
Also change the Extractor API to align with the Codegen API (internal APIs)
2016-12-19 17:09:41 -08:00
debb0c9798 fix(compiler-cli): produce metadata for .d.ts files without metadata (#13526)
Fixes #13307
Fixes #13473
Fixes #13521
2016-12-19 17:09:40 -08:00
9b87bb6d7f fix(compiler): do not lex }} when interpolation is disabled (#13531)
* doc(compiler): fix the ICU expander API docs

* test(compiler): add lexer and parser specs

* fix(compiler): do not lex `}}` when interpolation is disabled

fix #13525
2016-12-19 17:09:40 -08:00
71e88a8c3c refactor(core): fix typo (#13515)
Closes #13512
2016-12-19 17:09:40 -08:00
c26c24c544 fix(upgrade): fix registerForNg1Tests (#13522)
Fix an issue in `registerForNg1Tests`, where it passes a `null` as
`ng1Injector` to `_bootstrapDone`. This causes a "TypeError: Cannot
read property 'get' of null" to be thrown from `_bootstrapDone`.
2016-12-19 17:09:40 -08:00
3f178410c3 fix(i18n): add a default example to xmb placeholders (#13507)
Otherwise the TC would not be able to load the message
2016-12-19 17:09:40 -08:00
b36f4bc00d fix(animations): allow players to be destroyed before initialized (#13346)
Closes #13293
Closes #13346
2016-12-19 17:09:40 -08:00
355c537883 refactor(compiler): format update (#13506) 2016-12-19 17:09:39 -08:00
f277303ca3 refactor(compiler): don't print stack trace on template parse errors (#13390) 2016-12-19 17:09:39 -08:00
50afbe094f fix(build): use bash string comparison operator (#13502) 2016-12-19 17:09:39 -08:00
15ea758d01 feature(DEVELOPER.md): add easy way to publish personal snapshot builds (#13469) 2016-12-19 17:09:39 -08:00
1f0f429f2a refactor(compiler): store metadata of top level symbols also in summaries (#13289)
This allows a build using summaries to not need .metadata.json files at all
any more.

Part of #12787
2016-12-19 17:09:39 -08:00
dbb364e23a docs(changelog): minor updates to 2.3.1 changelog 2016-12-14 21:57:56 -08:00
540b1197a6 fix(form): fix merge errors 2016-12-14 18:22:03 -08:00
d30cc8461b docs(changelog): add changelog for 2.3.1 2016-12-14 18:14:34 -08:00
f27954e62c build: bump angular to 2.3.1 & tsc-wrapped to 0.5.0 2016-12-14 18:11:35 -08:00
69b52eb2b3 fix(compiler): fix merge error in compiler_host 2016-12-14 18:08:54 -08:00
b9b557cdb0 fix(compiler): update to metadata version 3 (#13464)
This change retracts support for metadata version 2.

The collector used to produce version 2 metadata was incomplete
and can cause the AOT compiler to fail to resolve symbols or
produce other spurious errors.

All libraries compiled and published with 2.3.0 ngc will need
to be recompiled and updated with this change.
2016-12-14 18:08:48 -08:00
a72a002a8d refactor: format & lint 2016-12-14 18:08:43 -08:00
a0437f8c9d chore(animations/aot): always export NoOpAnimationDriver (#13480) 2016-12-14 18:08:36 -08:00
1c279b3264 fix(compiler): fix simplify a reference without a name
closes #13470
2016-12-14 18:08:32 -08:00
cd03c77364 fix(tsc-wrapped): generate metadata for exports without module specifier
fixes #13327
2016-12-14 18:08:29 -08:00
f6ef7d6e5a fix(compiler): propagate exports when upgrading metadata to v2 2016-12-14 18:08:25 -08:00
6aeaca3fb4 fix(compiler): resolver should merge host bindings and listeners (#13474)
fixes #13327
2016-12-14 18:07:41 -08:00
af62050729 docs(upgrade): fix UpgradeAdapter examples
closes #12675
2016-12-14 18:02:26 -08:00
cb69656b56 docs(upgrade/upgrade_adapter): fix up references to AngularJS and Angular 2 2016-12-14 18:02:14 -08:00
2fc0560988 feat(upgrade): enable Angular 1 unit testing of upgrade module
- New method `UpgradeAdapter.registerForNg1Tests(modules)` declares the
  Angular 1 upgrade module and provides it to the `angular.mock.module()`
  helper.
  This prevents the need to bootstrap the entire hybrid for every test.

Closes #5462, #12675
2016-12-14 18:02:05 -08:00
86c50983d7 fix(upgrade): fix downgrade content projection and injector inheritance
- Full support for content projection in downgraded Angular 2
  components. In particular, this enables multi-slot projection and
  other features on <ng-content>.
- Correctly wire up hierarchical injectors for downgraded Angular 2
  components: downgraded components inherit the injector of the first
  other downgraded Angular 2 component they find up the DOM tree.

Closes #6629, #7727, #8729, #9643, #9649, #12675
2016-12-14 17:56:20 -08:00
21976446e0 refactor(upgrade/upgrade_adapter): use Deferred helper
Making Angular 1's `$compile` asynchronous by chaining injector promises
in linking functions can cause flickering views in applications.
2016-12-14 17:56:16 -08:00
998ce9ad7e refactor(upgrade/util): remove unused stringify() method 2016-12-14 17:56:11 -08:00
111523677c refactor(compiler/template_parser): export createElementCssSelector
This is needed in `ngUpgrade`.
2016-12-14 17:56:06 -08:00
2d74a224d0 refactor(upgrade): add missing Angular 1 type info 2016-12-14 17:55:56 -08:00
4d6ac9d414 fix(core): detectChanges() doesn't work on detached instance
Closes #13426
Closes #13472
2016-12-14 17:55:38 -08:00
6557bc34f6 fix(animations): throw errors and normalize offset beyond the range of [0,1]
Closes #13348
Closes #13440
2016-12-14 17:55:34 -08:00
e2622add07 perf(animations): always run the animation queue outside of zones
Related #12732
Closes #13440
2016-12-14 17:55:27 -08:00
ecfad467a1 fix(compiler): emit quoted object literal keys if the source is quoted
feat(tsc-wrapped): record when to quote an object literal key

Collecting quoted literals is off by default as it introduces
a breaking change in the .metadata.json file. A follow-up commit
will address this.

Fixes #13249
Closes #13356
2016-12-14 17:55:22 -08:00
5918133784 Revert "fix(compiler): xmb <ph> tags should not self close (#13413)"
This reverts commit 4b3d135193.
closes #13463
2016-12-14 17:55:18 -08:00
700bce9ec1 Revert "test(i18n): fix a typo in the reference xmb (#13441)"
This reverts commit a8d237581d.
2016-12-14 17:55:14 -08:00
a64a35a8c1 refactor(facade): don't expect super() to return a new Error object in BaseError (#12600)
Related to #12575
2016-12-14 17:55:10 -08:00
b3dcff0cc1 fix(forms): ensure select[multiple] retains selections
If you bound an array to select[multiple] via ngModel and subsequently
changed the options to select from, the UI would drop any selections
made since by the user. This was due to
SelectMultipleControlValueAccessor not keeping a reference to the new
model arrays it generated when users interacted with the select control.
Update code to keep the reference.

Closes #12527
Closes #12654
2016-12-14 17:55:02 -08:00
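A hedged sketch of the template-driven `select[multiple]` binding the fix above concerns; `ToppingPickerComponent` and its data are hypothetical, and FormsModule is assumed to be imported by the enclosing module.

```ts
import {Component} from '@angular/core';

interface Topping {
  name: string;
}

@Component({
  selector: 'topping-picker',
  template: `
    <!-- the bound array keeps user selections even if 'toppings' is replaced -->
    <select multiple [(ngModel)]="selected">
      <option *ngFor="let t of toppings" [ngValue]="t">{{t.name}}</option>
    </select>
  `,
})
export class ToppingPickerComponent {
  toppings: Topping[] = [{name: 'cheese'}, {name: 'olives'}, {name: 'ham'}];
  selected: Topping[] = [];
}
```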
124267c87a fix(forms): introduce checkbox required validator
Closes #11459
Closes #13364
2016-12-14 17:52:53 -08:00
547bfa92ef fix(forms): clear selected options when model is not an array (#12519)
When an invalid model value (e.g. an empty string) was preset, ngModel on
select[multiple] would throw an error, which is inconsistent with how it
works on other user input elements. Setting the model value to null or
undefined would also have no effect on what was already selected in the
UI. Fix this by clearing selected options when model set to null,
undefined or a type other than Array.

Closes #11926
2016-12-14 17:52:02 -08:00
d40bbf4d5c fix(core): properly destroy embedded Views attached to ApplicationRef (#13459)
Fixes #13062
2016-12-14 17:51:56 -08:00
94b7031fe9 refactor: format & lint 2016-12-14 17:51:48 -08:00
df0bf1dd74 chore(internal API): introduce an internal API for ngtools. (#13415) 2016-12-14 17:51:40 -08:00
c8a9b70890 fix(compiler): generated CSS files suffixed with ngstyle. (#13353)
Mirrors factories, which end in `ngfactory`.

Closes #13141.
2016-12-14 17:50:29 -08:00
efa2d80df8 fix(compiler): make sure provider values with name property don’t break.
Fixes #13394
Closes #13445
2016-12-14 17:50:22 -08:00
a58e5efd09 test(i18n): fix a typo in the reference xmb (#13441) 2016-12-14 17:50:12 -08:00
86cf0ef892 refactor: remove intl from facades (#13404)
The existing intl.ts file is not a facade but
rather a set of utils used by i18n-related pipes only.
As such, it is moved back to the common module so those utils
are not used accidentally from other places.
2016-12-14 17:50:02 -08:00
5c568fab86 test(upgrade): fix failing test in browsers which do not support RAF
closes #13399
2016-12-14 17:49:52 -08:00
566104504c ci(browser providers): update browsers in SL and BS (#13431) 2016-12-14 17:49:37 -08:00
307d305b2d fix(compiler): narrow the span reported for invalid pipes
fixes #13326
closes #13411
2016-12-14 17:49:05 -08:00
0a7364feea fix(language-service): correctly type undefined
fixes #13412
closes #13414
2016-12-14 17:40:58 -08:00
4544b1d7a6 fix(compiler): xmb <ph> tags should not self close (#13413) 2016-12-14 17:39:51 -08:00
9e0e6b59d1 docs(core): update OnDestroy description (#13369)
Closes #11228
2016-12-14 17:39:45 -08:00
14dd2b367a fix(language-service): treat string unions as strings (#13406)
Fixes #13403
2016-12-14 17:39:36 -08:00
91eb8914dd build: update the package list of the symlinks scripts for Windows (#13408) 2016-12-14 17:39:30 -08:00
77823d721f refactor: format and lint code 2016-12-14 17:38:31 -08:00
2afe2d107f docs(Location): updating Location docs and adding example
closes #11500
2016-12-14 17:38:22 -08:00
17f40fb75f chore: Add @types/systemjs 2016-12-14 17:37:43 -08:00
98936fdf16 chore: convert hash_location_strategy example to a tested spec 2016-12-14 17:37:33 -08:00
7383e4a801 fix(forms): fix Validators.min/maxLength with FormArray (#13095)
Fixes #13089
2016-12-14 17:37:18 -08:00
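A hedged sketch of `Validators.maxLength` applied directly to a `FormArray`, which the fix above makes count the array's controls; the error shape shown in the comment is an assumption.

```ts
import {FormArray, FormControl, Validators} from '@angular/forms';

// With the fix, maxLength counts the FormArray's controls, so this array of
// three controls fails a maxLength(2) validator.
const aliases = new FormArray(
    [new FormControl('a'), new FormControl('b'), new FormControl('c')],
    Validators.maxLength(2));

console.log(aliases.valid);   // false
console.log(aliases.errors);  // e.g. {maxlength: {requiredLength: 2, actualLength: 3}}
```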
65c9b5b6aa fix(http): create a copy of headers when merge options (#13365)
Closes #11980
2016-12-14 17:36:48 -08:00
5fab8710cb fix(dom_adapter): remove logError from logGroup (#12925) 2016-12-14 17:35:12 -08:00
f106a18b96 fix(http): check response body text against undefined (#13017) 2016-12-14 17:35:05 -08:00
8db184d349 fix(compiler): support dotted property binding
fixes angular/flex-layout#34
2016-12-14 17:31:08 -08:00
c18eb298eb test(Selector): add a test for dotted attribute names 2016-12-14 17:31:00 -08:00
3f4aa59cfa refactor(Compiler): cleanup 2016-12-14 17:30:50 -08:00
79728b4c41 fix(compiler): fix PR 13322 (#13331) 2016-12-14 17:30:40 -08:00
413167ab1b style: clang-format the code 2016-12-14 17:26:52 -08:00
203cc7e1f1 fix: Better instructions on running examples and their tests 2016-12-14 17:23:11 -08:00
b0cd514709 fix: Better error when directive not listed in NgModule.declarations 2016-12-14 17:22:58 -08:00
392c9ac214 fix(selector): SelectorMatcher match elements with :not selector (#12977) 2016-12-14 17:21:34 -08:00
a26e054857 fix(animations): always cleanup players after they have finished internally (#13334)
Closes #13333
Closes #13334
2016-12-14 17:21:23 -08:00
c0b001a6af refactor(router): misc refactoring (#13330) 2016-12-14 17:21:09 -08:00
c8c1f22f9c refactor(router): simplify regexp
closes #11373
closes #13329
2016-12-14 17:20:47 -08:00
e4d5a5f003 fix(router): add support for query params with multiple values
closes #11373
2016-12-14 17:20:22 -08:00
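A hedged sketch of writing and reading a multi-valued query parameter, which the fix above adds support for; `ResultsLinkComponent`, the `/results` path, and the `tag` parameter are made up.

```ts
import {Component} from '@angular/core';
import {ActivatedRoute, Router} from '@angular/router';

@Component({selector: 'results-link', template: ''})
export class ResultsLinkComponent {
  constructor(private router: Router, private route: ActivatedRoute) {}

  search() {
    // Serializes to ?tag=angular&tag=router
    this.router.navigate(['/results'], {queryParams: {tag: ['angular', 'router']}});
  }

  readTags() {
    // A repeated param comes back as an array on the params object.
    this.route.queryParams.subscribe(params => {
      const tags: string | string[] = params['tag'];
      console.log(Array.isArray(tags) ? tags : [tags]);
    });
  }
}
```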
03d9de33a1 Revert "fix(compiler): fix transpiled ES5 code (#13322)"
This reverts commit 4398056146.
2016-12-14 17:20:14 -08:00
a8a80cf523 doc: update triage owners for language service and router (#13325) 2016-12-14 17:19:50 -08:00
6c1d7908d5 fix(compiler): fix transpiled ES5 code (#13322)
fixes #13301

The inner class would transpile to a nested function declaration which is not
allowed in ES5.

See http://eslint.org/docs/rules/no-inner-declarations
2016-12-14 17:18:12 -08:00
9aab6d24eb build(tslint): enable no-inner-declarations (#13316) 2016-12-14 17:18:02 -08:00
5ee8155e4e fix(router): Use T type in Resolve interface (#13242) 2016-12-14 17:17:45 -08:00
21de0f239d docs(changelog): fix a typo (#13298) 2016-12-14 17:16:49 -08:00
4393 changed files with 92229 additions and 263343 deletions


@@ -1,6 +0,0 @@
# Disable sandboxing because it's too slow.
# https://github.com/bazelbuild/bazel/issues/2424
build --spawn_strategy=standalone
# Performance: avoid stat'ing input files
build --watchfs


@@ -1,20 +0,0 @@
version: 2
jobs:
build:
working_directory: ~/ng
docker:
- image: alexeagle/ngcontainer
steps:
- checkout
- restore_cache:
key: angular-{{ .Branch }}-{{ checksum "npm-shrinkwrap.json" }}
- run: npm install
- run: npm run postinstall
- run: ./node_modules/.bin/gulp lint
# Build twice, workaround for
# https://github.com/bazelbuild/bazel/issues/3114
- run: bazel build ... || bazel build ...
- save_cache:
key: angular-{{ .Branch }}-{{ checksum "npm-shrinkwrap.json" }}
paths:
- "node_modules"

.gitignore

@@ -1,7 +1,6 @@
.DS_STORE
/dist/
bazel-*
node_modules
bower_components
@@ -18,9 +17,8 @@ modules/.vscode
# Don't check in secret files
*secret.js
# Ignore npm/yarn debug log
# Ignore npm debug log
npm-debug.log
yarn-error.log
# build-analytics
.build-analytics


@@ -10,7 +10,6 @@
# chuckjaz - Chuck Jazdzewski
# gkalpak - George Kalpakas
# IgorMinar - Igor Minar
# jasonaden - Jason Aden
# kara - Kara Erickson
# matsko - Matias Niemelä
# mhevery - Misko Hevery
@@ -20,8 +19,6 @@
# tbosch - Tobias Bosch
# vicb - Victor Berchet
# vikerman - Vikram Subramanian
# wardbell - Ward Bell
# tinayuangao - Tina Gao
version: 2
@@ -39,10 +36,9 @@ groups:
include:
- "*"
exclude:
- "aio/*"
- "angular.io/*"
- "integration/*"
- "modules/*"
- "packages/*"
- "tools/*"
users:
- IgorMinar
@@ -68,12 +64,9 @@ groups:
exclude:
- "tools/@angular/tsc-wrapped/*"
- "tools/public_api_guard/*"
- "aio/*"
users:
- IgorMinar #primary
- alexeagle
- jasonaden
- mhevery #fallback
- mhevery
integration:
conditions:
@@ -90,27 +83,27 @@ groups:
core:
conditions:
files:
- "packages/core/*"
- "modules/@angular/core/*"
users:
- tbosch #primary
- mhevery
- vicb
- IgorMinar #fallback
animations:
compiler/animations:
conditions:
files:
- "packages/animation/*"
- "packages/platform-browser/animations/*"
- "modules/@angular/compiler/src/animation/*"
users:
- matsko #primary
- mhevery #fallback
- tbosch
- IgorMinar #fallback
- mhevery #fallback
compiler/i18n:
conditions:
files:
- "packages/compiler/src/i18n/*"
- "modules/@angular/compiler/src/i18n/*"
users:
- vicb #primary
- tbosch
@@ -120,7 +113,7 @@ groups:
compiler:
conditions:
files:
- "packages/compiler/*"
- "modules/@angular/compiler/*"
users:
- tbosch #primary
- vicb
@@ -132,11 +125,10 @@ groups:
conditions:
files:
- "tools/@angular/tsc-wrapped/*"
- "packages/compiler-cli/*"
- "modules/@angular/compiler-cli/*"
users:
- alexeagle
- chuckjaz
- vicb
- tbosch
- IgorMinar #fallback
- mhevery #fallback
@@ -144,7 +136,7 @@ groups:
common:
conditions:
files:
- "packages/common/*"
- "modules/@angular/common/*"
users:
- pkozlowski-opensource #primary
- vicb
@@ -154,17 +146,17 @@ groups:
forms:
conditions:
files:
- "packages/forms/*"
- "modules/@angular/forms/*"
users:
- kara #primary
- tinayuangao #secondary
# needs secondary
- IgorMinar #fallback
- mhevery #fallback
http:
conditions:
files:
- "packages/http/*"
- "modules/@angular/http/*"
users:
- vikerman #primary
- alxhub
@@ -174,28 +166,27 @@ groups:
language-service:
conditions:
files:
- "packages/language-service/*"
- "modules/@angular/language-service/*"
users:
- chuckjaz #primary
- tbosch #secondary
- vicb
# needs secondary
- IgorMinar #fallback
- mhevery #fallback
router:
conditions:
files:
- "packages/router/*"
- "modules/@angular/router/*"
users:
- jasonaden
- vicb
- vicb #primary
# needs secondary
- IgorMinar #fallback
- mhevery #fallback
upgrade:
conditions:
files:
- "packages/upgrade/*"
- "modules/@angular/upgrade/*"
users:
- petebacondarwin #primary
- gkalpak
@@ -205,7 +196,7 @@ groups:
platform-browser:
conditions:
files:
- "packages/platform-browser/*"
- "modules/@angular/platform-browser/*"
users:
- tbosch #primary
- vicb #secondary
@@ -215,7 +206,7 @@ groups:
platform-server:
conditions:
files:
- "packages/platform-server/*"
- "modules/@angular/platform-server/*"
users:
- vikerman #primary
- alxhub
@@ -227,7 +218,7 @@ groups:
platform-webworker:
conditions:
files:
- "packages/platform-webworker/*"
- "modules/@angular/platform-webworker/*"
users:
- vicb #primary
- tbosch #secondary
@@ -239,7 +230,7 @@ groups:
benchpress:
conditions:
files:
- "packages/benchpress/*"
- "modules/@angular/benchpress/*"
users:
- tbosch #primary
# needs secondary
@@ -249,10 +240,9 @@ groups:
angular.io:
conditions:
files:
- "aio/*"
- "angular.io/*"
users:
- IgorMinar #primary
- petebacondarwin #secondary
- gkalpak
- wardbell
- IgorMinar
- robwormald
- petebacondarwin
- mhevery #fallback


@@ -10,66 +10,42 @@ addons:
# needed to install g++ that is used by npms's native modules
- ubuntu-toolchain-r-test
packages:
# needed to install g++ that is used by npms's native modules
- g++-4.8
# https://docs.travis-ci.com/user/jwt
jwt:
# SAUCE_ACCESS_KEY<=secret for NGBUILDS_IO_KEY to work around travis-ci/travis-ci#7223, unencrypted value in valentine as NGBUILDS_IO_KEY>
# we alias NGBUILDS_IO_KEY to $SAUCE_ACCESS_KEY in env.sh and set the SAUCE_ACCESS_KEY there
- secure: "L7nrZwkAtFtYrP2DykPXgZvEKjkv0J/TwQ/r2QGxFTaBq4VZn+2Dw0YS7uCxoMqYzDwH0aAOqxoutibVpk8Z/16nE3tNmU5RzltMd6Xmt3qU2f/JDQLMo6PSlBodnjOUsDHJgmtrcbjhqrx/znA237BkNUu6UZRT7mxhXIZpn0U="
branches:
except:
- g3
- g3_v2_0
cache:
yarn: true
directories:
- ./node_modules
- ./.chrome/chromium
- ./aio/node_modules
env:
global:
# GITHUB_TOKEN_ANGULAR=<github token, a personal access token of the angular-builds account, account access in valentine>
# This is needed for the e2e Travis matrix task to publish packages to github for continuous packages delivery.
- secure: "aCdHveZuY8AT4Jr1JoJB4LxZsnGWRe/KseZh1YXYe5UtufFCtTVHvUcLn0j2aLBF0KpdyS+hWf0i4np9jthKu2xPKriefoPgCMpisYeC0MFkwbmv+XlgkUbgkgVZMGiVyX7DCYXVahxIoOUjVMEDCbNiHTIrfEuyq24U3ok2tHc="
# FIREBASE_TOKEN
# This is needed for publishing builds to the "aio-staging" and "angular-io" firebase projects.
# This token was generated using the aio-deploy@angular.io account using `firebase login:ci` and password from valentine
- secure: "L5CyQmpwWtoR4Qi4xlWQh/cL1M6ZeJL4W4QAr4HdKFMgYt9h+Whqkymyh2NxwmCbPvWa7yUd+OiLQUDCY7L2VIg16hTwoe2CgYDyQA0BEwLzxtRrJXl93TfwMlrUx5JSIzAccD6D4sjtz8kSFMomK2Nls33xOXOukwyhVMjd0Cg="
- secure: "rNqXoy2gqjbF5tBXlRBy+oiYntO3BtzcxZuEtlLMzNaTNzC4dyMOFub0GkzIPWwOzkARoEU9Kv+bC97fDVbCBUKeyzzEqxqddUKhzRxeaYjsefJ6XeTvBvDxwo7wDwyxZSuWdBeGAe4eARVHm7ypsd+AlvqxtzjyS27TK2BzdL4="
matrix:
# Order: a slower build first, so that we don't occupy an idle travis worker waiting for others to complete.
- CI_MODE=e2e
- CI_MODE=e2e_2
- CI_MODE=js
- CI_MODE=saucelabs_required
- CI_MODE=browserstack_required
- CI_MODE=saucelabs_optional
- CI_MODE=browserstack_optional
- CI_MODE=docs_test
- CI_MODE=aio
- CI_MODE=aio_e2e
- CI_MODE=js
- CI_MODE=e2e
- CI_MODE=saucelabs_required
- CI_MODE=browserstack_required
- CI_MODE=saucelabs_optional
- CI_MODE=browserstack_optional
matrix:
fast_finish: true
allow_failures:
- env: "CI_MODE=saucelabs_optional"
- env: "CI_MODE=browserstack_optional"
- env: "CI_MODE=aio_e2e"
before_install:
# source the env.sh script so that the exported variables are available to other scripts later on
- source ./scripts/ci/env.sh print
install:
- ./scripts/ci/install.sh
- ./scripts/ci-lite/install.sh
script:
- ./scripts/ci/build.sh
- ./scripts/ci/test.sh
# deploy is part of 'script' and not 'after_success' so that we fail the build if the deployment fails
- ./scripts/ci/deploy.sh
- ./scripts/ci/angular.sh
# all the scripts under this line will not quickly abort in case ${TRAVIS_TEST_RESULT} is 1 (job failure)
- ./scripts/ci/cleanup.sh
- ./scripts/ci/print-logs.sh
- ./scripts/ci-lite/build.sh && ./scripts/ci-lite/test.sh
after_script:
- ./scripts/ci-lite/cleanup.sh

BUILD

@@ -1,18 +0,0 @@
package(default_visibility = ["//visibility:public"])
exports_files(["tsconfig.json"])
# This rule belongs in node_modules/BUILD
# It's here as a workaround for
# https://github.com/bazelbuild/bazel/issues/374#issuecomment-296217940
filegroup(
name = "node_modules",
srcs = glob([
# Performance workaround: list individual files
# This won't scale in the general case.
# TODO(alexeagle): figure out what to do
"node_modules/typescript/lib/**",
"node_modules/zone.js/**/*.d.ts",
"node_modules/rxjs/**/*.d.ts",
"node_modules/@types/**/*.d.ts",
]),
)

File diff suppressed because it is too large.


@@ -1,4 +1,4 @@
# Pushing changes into the Angular tree
# Pushing changes into the Angular 2 tree
Please see [Using git with Angular repositories](https://docs.google.com/document/d/1h8nijFSaa1jG_UE8v4WP7glh5qOUXnYtAtJh_gwOQHI/edit)
for details about how we maintain a linear commit history, and the rules for committing.


@@ -147,7 +147,7 @@ To ensure consistency throughout the source code, keep these rules in mind as yo
* All public API methods **must be documented**. (Details TBC).
* We follow [Google's JavaScript Style Guide][js-style-guide], but wrap all code at
**100 characters**. An automated formatter is available, see
[DEVELOPER.md](docs/DEVELOPER.md#clang-format).
[DEVELOPER.md](DEVELOPER.md#clang-format).
## <a name="commit"></a> Commit Message Guidelines
@@ -191,44 +191,21 @@ If the commit reverts a previous commit, it should begin with `revert: `, follow
### Type
Must be one of the following:
* **build**: Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm)
* **ci**: Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs)
* **docs**: Documentation only changes
* **feat**: A new feature
* **fix**: A bug fix
* **perf**: A code change that improves performance
* **refactor**: A code change that neither fixes a bug nor adds a feature
* **docs**: Documentation only changes
* **style**: Changes that do not affect the meaning of the code (white-space, formatting, missing
semi-colons, etc)
* **refactor**: A code change that neither fixes a bug nor adds a feature
* **perf**: A code change that improves performance
* **test**: Adding missing tests or correcting existing tests
* **build**: Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm)
* **ci**: Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs)
* **chore**: Other changes that don't modify `src` or `test` files
### Scope
The scope should be the name of the npm package affected (as perceived by the person reading the changelog generated from commit messages).
The following is the list of supported scopes:
* **common**
* **compiler**
* **compiler-cli**
* **core**
* **forms**
* **http**
* **language-service**
* **platform-browser**
* **platform-browser-dynamic**
* **platform-server**
* **platform-webworker**
* **platform-webworker-dynamic**
* **router**
* **upgrade**
* **tsc-wrapped**
There are currently a few exceptions to the "use package name" rule:
* **packaging**: used for changes that change the npm package layout in all of our packages, e.g. public path changes, package.json changes done to all packages, d.ts file/format changes, changes to bundles, etc.
* **changelog**: used for updating the release notes in CHANGELOG.md
* **aio**: used for docs-app (angular.io) related changes within the /aio directory of the repo
* none/empty string: useful for `style`, `test` and `refactor` changes that are done across all packages (e.g. `style: add missing semicolons`)
The scope could be anything specifying place of the commit change. For example
`Compiler`, `ElementInjector`, etc.
### Subject
The subject contains succinct description of the change:
@@ -263,7 +240,7 @@ changes to be accepted, the CLA must be signed. It's a quick process, we promise
[coc]: https://github.com/angular/code-of-conduct/blob/master/CODE_OF_CONDUCT.md
[commit-message-format]: https://docs.google.com/document/d/1QrDFcIiPjSLDn3EL15IJygNPiHORgU1_OOAqWjiDU5Y/edit#
[corporate-cla]: http://code.google.com/legal/corporate-cla-v1.0.html
[dev-doc]: https://github.com/angular/angular/blob/master/docs/DEVELOPER.md
[dev-doc]: https://github.com/angular/angular/blob/master/DEVELOPER.md
[github]: https://github.com/angular/angular
[gitter]: https://gitter.im/angular/angular
[individual-cla]: http://code.google.com/legal/individual-cla-v1.0.html


@@ -161,13 +161,7 @@ You can check that your code is properly formatted and adheres to coding style b
$ gulp lint
```
## Publishing snapshot builds
When the `master` branch successfully builds on Travis, it automatically publishes build artifacts
to repositories in the Angular org, eg. the `@angular/core` package is published to
http://github.com/angular/core-builds.
The ES2015 version of Angular is published to a different branch in these repos, for example
http://github.com/angular/core-builds#master-es2015
## Publishing your own personal snapshot build
You may find that your un-merged change needs some validation from external participants.
Rather than requiring them to pull your Pull Request and build Angular locally, you can


@@ -1,7 +1,7 @@
Naming Conventions in Angular
Naming Conventions in Angular2
---
In general Angular should follow TypeScript naming conventions.
In general Angular2 should follow TypeScript naming conventions.
See: https://github.com/Microsoft/TypeScript/wiki/Coding-guidelines


@@ -5,13 +5,16 @@
[![Issue Stats](http://issuestats.com/github/angular/angular/badge/issue?style=flat)](http://issuestats.com/github/angular/angular)
[![npm version](https://badge.fury.io/js/%40angular%2Fcore.svg)](https://badge.fury.io/js/%40angular%2Fcore)
[![Sauce Test Status](https://saucelabs.com/browser-matrix/angular2-ci.svg)](https://saucelabs.com/u/angular2-ci)
[![Sauce Test Status](https://saucelabs.com/browser-matrix/angular2-ci.svg)](https://saucelabs.com/u/angular2-ci)
*Safari (7+), iOS (7+), Edge (14) and IE mobile (11) are tested on [BrowserStack][browserstack].*
Angular
=========
Angular is a development platform for building mobile and desktop web applications using Typescript/JavaScript (JS) and other languages.
Angular is a development platform for building mobile and desktop web applications. This is the
repository for [Angular 2][ng2] Typescript/JavaScript (JS).
Angular2 for [Dart][dart] can be found at [dart-lang/angular2][ng2dart].
## Quickstart
@@ -26,5 +29,9 @@ guidelines for [contributing][contributing] and then check out one of our issues
[browserstack]: https://www.browserstack.com/
[contributing]: http://github.com/angular/angular/blob/master/CONTRIBUTING.md
[dart]: http://www.dartlang.org
[quickstart]: https://angular.io/docs/ts/latest/quickstart.html
[ng]: http://angular.io
[ng2]: http://angular.io
[ngDart]: http://angulardart.org
[ngJS]: http://angularjs.org
[ng2dart]: https://github.com/dart-lang/angular2


@@ -19,12 +19,6 @@
If the problem still exists please open a new issue and provide a plunker reproducing the problem and describing the difference between the expected and current behavior. You can use this plunker template: http://plnkr.co/edit/tpl:AvJOMERrnz94ekVua0u5?p=catalogue
```
## Angular: Plunker Needed (v1)
```
I'm sorry but reported issues require a plunker reproducing the problem.
If this issue persists, please create a plunker using this template and describe the difference between the expected and current behavior and create a new issue: http://plnkr.co/edit/tpl:AvJOMERrnz94ekVua0u5?p=catalogue
```
## Angular: Duplicate (v1)
```
@@ -44,10 +38,6 @@ If the problem still persists, please file a new issue and ensure you provide al
I'm sorry but this issue is not caused by Angular. Please contact the author(s) of project <PROJECT NAME> or file issue on their issue tracker.
```
## Angular: Behaving as Expected (v1)
```
It appears this behaves as expected. If you still feel there is an issue, please provide further details in a new issue.
```
## Angular: Non-reproducible (v1)
```

View File

@ -1,4 +1,4 @@
# Developer Tools for Angular
# Developer Tools for Angular 2
Here you will find a collection of tools and tips for keeping your application
performing well and containing fewer bugs.

View File

@ -1,4 +1,4 @@
# Triage Process and Github Labels for Angular
# Triage Process and Github Labels for Angular 2
This document describes how the Angular team uses labels and milestones
to triage issues on GitHub. The basic idea of the process is that
@ -71,7 +71,7 @@ issues within the component will be resolved.
Several owners have adopted the issue categorization based on
[user pain](http://www.lostgarden.com/2008/05/improving-bug-triage-with-user-pain.html)
used by AngularJS. In this system every issue is assigned frequency and
used by Angular 1. In this system every issue is assigned frequency and
severity based on which the total user pain score is calculated.
Following is the definition of various frequency and severity levels:

View File

@ -1,12 +0,0 @@
load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")
git_repository(
name = "io_bazel_rules_typescript",
remote = "https://github.com/bazelbuild/rules_typescript.git",
tag = "0.0.3",
)
load("@io_bazel_rules_typescript//:defs.bzl", "node_repositories", "yarn_install")
node_repositories()
yarn_install(package_json = "//:package.json")

View File

@ -1,65 +0,0 @@
{
"$schema": "./node_modules/@angular/cli/lib/config/schema.json",
"project": {
"name": "site"
},
"apps": [
{
"root": "src",
"outDir": "dist",
"assets": [
"assets",
"generated",
"app/search/search-worker.js",
"favicon.ico",
"pwa-manifest.json",
"google385281288605d160.html"
],
"index": "index.html",
"main": "main.ts",
"polyfills": "polyfills.ts",
"test": "test.ts",
"tsconfig": "tsconfig.app.json",
"testTsconfig": "tsconfig.spec.json",
"prefix": "aio",
"serviceWorker": false,
"styles": [
"styles.scss"
],
"scripts": [
],
"environmentSource": "environments/environment.ts",
"environments": {
"dev": "environments/environment.ts",
"prod": "environments/environment.prod.ts"
}
}
],
"e2e": {
"protractor": {
"config": "./protractor.conf.js"
}
},
"lint": [
{
"project": "src/tsconfig.app.json"
},
{
"project": "src/tsconfig.spec.json"
},
{
"project": "e2e/tsconfig.e2e.json"
}
],
"test": {
"karma": {
"config": "./karma.conf.js"
}
},
"defaults": {
"styleExt": "scss",
"component": {
"inlineStyle": true
}
}
}

View File

@ -1,5 +0,0 @@
{
"projects": {
"staging": "aio-staging"
}
}

aio/.gitignore
View File

@ -1,45 +0,0 @@
# See http://help.github.com/ignore-files/ for more about ignoring files.
# compiled output
/dist
/out-tsc
/src/generated
/tmp
# dependencies
/node_modules
# IDEs and editors
/.idea
.project
.classpath
.c9/
*.launch
.settings/
*.sublime-workspace
# IDE - VSCode
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
# misc
/.sass-cache
/connect.lock
/coverage
/libpeerconnection.log
npm-debug.log
testem.log
/typings
yarn-error.log
# e2e
/e2e/*.js
/e2e/*.map
protractor-results*.txt
# System Files
.DS_Store
Thumbs.db

View File

@ -1,105 +0,0 @@
# Angular documentation project (https://angular.io)
Everything in this folder is part of the documentation project. This includes
* the web site for displaying the documentation
* the dgeni configuration for converting source files to rendered files that can be viewed in the web site.
* the tooling for setting up examples for development, and for generating plunkers and zip files from the examples.
## Developer tasks
We use `yarn` to manage the dependencies and to run build tasks.
You should run all these tasks from the `angular/aio` folder.
Here are the most important tasks you might need to use (a typical first run is sketched after this list):
* `yarn` - install all the dependencies.
* `yarn setup` - install all the dependencies, boilerplate, plunkers and zips, and run dgeni on the docs.
* `yarn start` - run a development web server that watches the files, rebuilds the doc-viewer and reloads the page as necessary.
* `yarn lint` - check that the doc-viewer code follows our style rules.
* `yarn test` - watch all the doc-viewer source files and run all the unit tests whenever any of them change.
* `yarn e2e` - run all the e2e tests for the doc-viewer.
* `yarn docs` - generate all the docs from the source files.
* `yarn docs-watch` - watch the Angular source and the docs files and run a short-circuited doc-gen for the docs that changed.
* `yarn docs-lint` - check that the doc gen code follows our style rules.
* `yarn docs-test` - run the unit tests for the doc generation code.
* `yarn boilerplate:add` - generate all the boilerplate code for the examples, so that they can be run locally.
* `yarn boilerplate:remove` - remove all the boilerplate code that was added via `yarn boilerplate:add`.
* `yarn generate-plunkers` - generate the plunker files that are used by the `live-example` tags in the docs.
* `yarn generate-zips` - generate the zip files from the examples. Zips are available via the `live-example` tags in the docs.
* `yarn build-ie-polyfills` - generate a JS file of polyfills that can be loaded in Internet Explorer.
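A typical first run (as referenced above) might look like this; it is only a sketch that combines the tasks described in the list, so adjust it to your own workflow:
```bash
# from the angular/aio folder
yarn setup   # install dependencies and boilerplate, generate plunkers/zips, run dgeni on the docs
yarn start   # serve the doc-viewer, watching files and reloading the page as necessary
```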
## Using ServiceWorker locally
Since abb36e3cb, running `yarn start -- --prod` no longer sets up the ServiceWorker, because that
would require manually running `yarn sw-manifest` and `yarn sw-copy` (which is not possible while
webpack is serving the files from memory).
If you want to test ServiceWorker locally, you can use `yarn build` and serve the files in `dist/`
with `yarn http-server -- dist -p 4200`.
For more details see #16745.
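For example, a minimal local run based on the commands above (port 4200 is simply the usual dev-server port):
```bash
yarn build                         # production build (includes the ServiceWorker files, per the note above)
yarn http-server -- dist -p 4200   # serve the static output from dist/
```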
## Guide to authoring
There are two types of content in the documentation:
* **API docs**: descriptions of the modules, classes, interfaces, decorators, etc that make up the Angular platform.
API docs are generated directly from the source code.
The source code is contained in TypeScript files, located in the `angular/packages` folder.
Each API item may have a preceding comment, which contains JSDoc style tags and content.
The content is written in markdown.
* **Other content**: guides, tutorials, and other marketing material.
All other content is written using markdown in text files, located in the `angular/aio/content` folder.
More specifically, there are sub-folders that contain particular types of content: guides, tutorials and marketing.
We use the [dgeni](https://github.com/angular/dgeni) tool to convert these files into docs that can be viewed in the doc-viewer.
### Generating the complete docs
The main task for generating the docs is `yarn docs`. This will process all the source files (API and other),
extracting the documentation and generating JSON files that can be consumed by the doc-viewer.
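For example (assuming the dependencies have already been installed with `yarn`):
```bash
yarn docs   # process the API sources and guides, emitting JSON for the doc-viewer
```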
### Partial doc generation for editors
Full doc generation can take up to one minute. That's too slow for efficient document creation and editing.
You can make small changes in a smart editor that displays formatted markdown:
>In VS Code, _Cmd-K, V_ opens markdown preview in side pane; _Cmd-B_ toggles left sidebar
You will also want to see those changes rendered properly in the doc viewer, with a quick edit/view cycle.
For this purpose, use the `yarn docs-watch` task, which watches for changes to source files and only
re-processes the files necessary to generate the docs related to the file that changed.
Since this task takes shortcuts, it is much faster (often less than 1 second) but it does not produce
full-fidelity content. In particular, links to other docs and embedded code examples may not always
render correctly.
The general setup is as follows:
* Open a terminal, ensure the dependencies are installed; run an initial doc generation; then start the doc-viewer:
```bash
yarn
yarn docs
yarn start
```
* Open a second terminal and start watching the docs
```bash
yarn docs-watch
```
* Open a browser at http://localhost:4200/ and navigate to the document on which you want to work.
You can automatically open the browser by using `yarn start -- -o` in the first terminal.
* Make changes to the page's associated doc or example files. Every time a file is saved, the doc will
be regenerated, the app will rebuild and the page will reload.

View File

@ -1,3 +0,0 @@
scripts-js/lib
scripts-js/node_modules
scripts-js/**/test

View File

@ -1,165 +0,0 @@
# Image metadata and config
FROM debian:jessie
LABEL name="angular.io PR preview" \
description="This image implements the PR preview functionality for angular.io." \
vendor="Angular" \
version="1.0"
VOLUME /aio-secrets
VOLUME /var/www/aio-builds
EXPOSE 80 443
# Build-time args and env vars
ARG AIO_BUILDS_DIR=/var/www/aio-builds
ARG TEST_AIO_BUILDS_DIR=/tmp/aio-builds
ARG AIO_DOMAIN_NAME=ngbuilds.io
ARG TEST_AIO_DOMAIN_NAME=$AIO_DOMAIN_NAME.localhost
ARG AIO_GITHUB_ORGANIZATION=angular
ARG TEST_AIO_GITHUB_ORGANIZATION=angular
ARG AIO_GITHUB_TEAM_SLUGS=angular-core,aio-contributors
ARG TEST_AIO_GITHUB_TEAM_SLUGS=angular-core,aio-contributors
ARG AIO_NGINX_HOSTNAME=$AIO_DOMAIN_NAME
ARG TEST_AIO_NGINX_HOSTNAME=$TEST_AIO_DOMAIN_NAME
ARG AIO_NGINX_PORT_HTTP=80
ARG TEST_AIO_NGINX_PORT_HTTP=8080
ARG AIO_NGINX_PORT_HTTPS=443
ARG TEST_AIO_NGINX_PORT_HTTPS=4433
ARG AIO_REPO_SLUG=angular/angular
ARG TEST_AIO_REPO_SLUG=test-repo/test-slug
ARG AIO_UPLOAD_HOSTNAME=upload.localhost
ARG TEST_AIO_UPLOAD_HOSTNAME=upload.localhost
ARG AIO_UPLOAD_MAX_SIZE=20971520
ARG TEST_AIO_UPLOAD_MAX_SIZE=20971520
ARG AIO_UPLOAD_PORT=3000
ARG TEST_AIO_UPLOAD_PORT=3001
ENV AIO_BUILDS_DIR=$AIO_BUILDS_DIR TEST_AIO_BUILDS_DIR=$TEST_AIO_BUILDS_DIR \
AIO_DOMAIN_NAME=$AIO_DOMAIN_NAME TEST_AIO_DOMAIN_NAME=$TEST_AIO_DOMAIN_NAME \
AIO_GITHUB_ORGANIZATION=$AIO_GITHUB_ORGANIZATION TEST_AIO_GITHUB_ORGANIZATION=$TEST_AIO_GITHUB_ORGANIZATION \
AIO_GITHUB_TEAM_SLUGS=$AIO_GITHUB_TEAM_SLUGS TEST_AIO_GITHUB_TEAM_SLUGS=$TEST_AIO_GITHUB_TEAM_SLUGS \
AIO_LOCALCERTS_DIR=/etc/ssl/localcerts TEST_AIO_LOCALCERTS_DIR=/etc/ssl/localcerts-test \
AIO_NGINX_HOSTNAME=$AIO_NGINX_HOSTNAME TEST_AIO_NGINX_HOSTNAME=$TEST_AIO_NGINX_HOSTNAME \
AIO_NGINX_LOGS_DIR=/var/log/aio/nginx TEST_AIO_NGINX_LOGS_DIR=/var/log/aio/nginx-test \
AIO_NGINX_PORT_HTTP=$AIO_NGINX_PORT_HTTP TEST_AIO_NGINX_PORT_HTTP=$TEST_AIO_NGINX_PORT_HTTP \
AIO_NGINX_PORT_HTTPS=$AIO_NGINX_PORT_HTTPS TEST_AIO_NGINX_PORT_HTTPS=$TEST_AIO_NGINX_PORT_HTTPS \
AIO_REPO_SLUG=$AIO_REPO_SLUG TEST_AIO_REPO_SLUG=$TEST_AIO_REPO_SLUG \
AIO_SCRIPTS_JS_DIR=/usr/share/aio-scripts-js \
AIO_SCRIPTS_SH_DIR=/usr/share/aio-scripts-sh \
AIO_UPLOAD_HOSTNAME=$AIO_UPLOAD_HOSTNAME TEST_AIO_UPLOAD_HOSTNAME=$TEST_AIO_UPLOAD_HOSTNAME \
AIO_UPLOAD_MAX_SIZE=$AIO_UPLOAD_MAX_SIZE TEST_AIO_UPLOAD_MAX_SIZE=$TEST_AIO_UPLOAD_MAX_SIZE \
AIO_UPLOAD_PORT=$AIO_UPLOAD_PORT TEST_AIO_UPLOAD_PORT=$TEST_AIO_UPLOAD_PORT \
AIO_WWW_USER=www-data \
NODE_ENV=production
# Create directory for logs
RUN mkdir /var/log/aio
# Add extra package sources
RUN apt-get update -y && apt-get install -y curl
RUN curl --silent --show-error --location https://deb.nodesource.com/setup_6.x | bash -
RUN curl --silent --show-error https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add -
RUN echo "deb https://dl.yarnpkg.com/debian/ stable main" | tee /etc/apt/sources.list.d/yarn.list
RUN echo "deb http://ftp.debian.org/debian jessie-backports main" | tee /etc/apt/sources.list.d/backports.list
# Install packages
RUN apt-get update -y && apt-get install -y \
chkconfig \
cron \
dnsmasq \
nano \
nodejs \
openssl \
rsyslog \
yarn
RUN apt-get install -t jessie-backports -y nginx
RUN yarn global add pm2@2
# Set up log rotation
COPY logrotate/* /etc/logrotate.d/
RUN chmod 0644 /etc/logrotate.d/*
# Set up cronjobs
COPY cronjobs/aio-builds-cleanup /etc/cron.d/
RUN chmod 0744 /etc/cron.d/aio-builds-cleanup
RUN crontab /etc/cron.d/aio-builds-cleanup
RUN printenv | grep AIO_ >> /etc/environment
# Set up dnsmasq
COPY dnsmasq/dnsmasq.conf /etc/
RUN sed -i "s|{{\$AIO_NGINX_HOSTNAME}}|$AIO_NGINX_HOSTNAME|g" /etc/dnsmasq.conf
RUN sed -i "s|{{\$AIO_UPLOAD_HOSTNAME}}|$AIO_UPLOAD_HOSTNAME|g" /etc/dnsmasq.conf
RUN sed -i "s|{{\$TEST_AIO_NGINX_HOSTNAME}}|$TEST_AIO_NGINX_HOSTNAME|g" /etc/dnsmasq.conf
RUN sed -i "s|{{\$TEST_AIO_UPLOAD_HOSTNAME}}|$TEST_AIO_UPLOAD_HOSTNAME|g" /etc/dnsmasq.conf
# Set up SSL/TLS certificates
COPY nginx/create-selfsigned-cert.sh /tmp/
RUN chmod a+x /tmp/create-selfsigned-cert.sh
RUN /tmp/create-selfsigned-cert.sh "selfcert-prod" "$AIO_NGINX_HOSTNAME" "$AIO_LOCALCERTS_DIR"
RUN /tmp/create-selfsigned-cert.sh "selfcert-test" "$TEST_AIO_NGINX_HOSTNAME" "$TEST_AIO_LOCALCERTS_DIR"
RUN rm /tmp/create-selfsigned-cert.sh
RUN update-ca-certificates
# Set up nginx (for production and testing)
RUN sed -i -E "s|^user\s+\S+;|user $AIO_WWW_USER;|" /etc/nginx/nginx.conf
RUN rm -f /etc/nginx/conf.d/*
RUN rm -f /etc/nginx/sites-enabled/*
COPY nginx/aio-builds.conf /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_BUILDS_DIR}}|$AIO_BUILDS_DIR|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_DOMAIN_NAME}}|$AIO_DOMAIN_NAME|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_LOCALCERTS_DIR}}|$AIO_LOCALCERTS_DIR|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_LOGS_DIR}}|$AIO_NGINX_LOGS_DIR|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTP}}|$AIO_NGINX_PORT_HTTP|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTPS}}|$AIO_NGINX_PORT_HTTPS|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_UPLOAD_HOSTNAME}}|$AIO_UPLOAD_HOSTNAME|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_UPLOAD_MAX_SIZE}}|$AIO_UPLOAD_MAX_SIZE|g" /etc/nginx/conf.d/aio-builds-prod.conf
RUN sed -i "s|{{\$AIO_UPLOAD_PORT}}|$AIO_UPLOAD_PORT|g" /etc/nginx/conf.d/aio-builds-prod.conf
COPY nginx/aio-builds.conf /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_BUILDS_DIR}}|$TEST_AIO_BUILDS_DIR|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_DOMAIN_NAME}}|$TEST_AIO_DOMAIN_NAME|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_LOCALCERTS_DIR}}|$TEST_AIO_LOCALCERTS_DIR|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_LOGS_DIR}}|$TEST_AIO_NGINX_LOGS_DIR|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTP}}|$TEST_AIO_NGINX_PORT_HTTP|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_NGINX_PORT_HTTPS}}|$TEST_AIO_NGINX_PORT_HTTPS|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_UPLOAD_HOSTNAME}}|$TEST_AIO_UPLOAD_HOSTNAME|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_UPLOAD_MAX_SIZE}}|$TEST_AIO_UPLOAD_MAX_SIZE|g" /etc/nginx/conf.d/aio-builds-test.conf
RUN sed -i "s|{{\$AIO_UPLOAD_PORT}}|$TEST_AIO_UPLOAD_PORT|g" /etc/nginx/conf.d/aio-builds-test.conf
# Set up pm2
RUN pm2 startup systemv -u root > /dev/null
RUN chkconfig pm2-root on
# Set up the shell scripts
COPY scripts-sh/ $AIO_SCRIPTS_SH_DIR/
RUN chmod a+x $AIO_SCRIPTS_SH_DIR/*
RUN find $AIO_SCRIPTS_SH_DIR -maxdepth 1 -type f -printf "%P\n" \
| while read file; do ln -s $AIO_SCRIPTS_SH_DIR/$file /usr/local/bin/aio-${file%.*}; done
# Set up the Node.js scripts
COPY scripts-js/ $AIO_SCRIPTS_JS_DIR/
WORKDIR $AIO_SCRIPTS_JS_DIR/
RUN yarn install --production
# Set up health check
HEALTHCHECK --interval=5m CMD /usr/local/bin/aio-health-check
# Go!
WORKDIR /
CMD aio-init && tail -f /dev/null

View File

@ -1,2 +0,0 @@
# Periodically clean up builds that do not correspond to currently open PRs
0 12 * * * root /usr/local/bin/aio-clean-up >> /var/log/cron.log 2>&1

View File

@ -1,16 +0,0 @@
# Do not read /etc/resolv.conf. Get servers from this file instead.
no-resolv
server=8.8.8.8
server=8.8.4.4
# Listen for DHCP and DNS requests only on this address.
listen-address=127.0.0.1
# Force an IP address for these domains.
address=/{{$AIO_NGINX_HOSTNAME}}/127.0.0.1
address=/{{$AIO_UPLOAD_HOSTNAME}}/127.0.0.1
address=/{{$TEST_AIO_NGINX_HOSTNAME}}/127.0.0.1
address=/{{$TEST_AIO_UPLOAD_HOSTNAME}}/127.0.0.1
# Run as root (required from inside docker container).
user=root

View File

@ -1,9 +0,0 @@
/var/log/aio/clean-up.log /var/log/aio/init.log /var/log/aio/verify-setup.log {
compress
create
delaycompress
missingok
monthly
notifempty
rotate 6
}

View File

@ -1,13 +0,0 @@
/var/log/aio/nginx/*.log /var/log/aio/nginx-test/*.log {
compress
create
delaycompress
missingok
monthly
notifempty
rotate 6
sharedscripts
postrotate
service nginx rotate >/dev/null 2>&1
endscript
}

View File

@ -1,9 +0,0 @@
/var/log/aio/upload-server-*.log {
compress
copytruncate
delaycompress
missingok
monthly
notifempty
rotate 6
}

View File

@ -1,95 +0,0 @@
# Redirect all HTTP traffic to HTTPS
server {
server_name _;
listen {{$AIO_NGINX_PORT_HTTP}} default_server;
listen [::]:{{$AIO_NGINX_PORT_HTTP}};
access_log {{$AIO_NGINX_LOGS_DIR}}/access.log;
error_log {{$AIO_NGINX_LOGS_DIR}}/error.log;
# Ideally we want 308 (permanent + keep original method),
# but it is relatively new and not supported by some clients (e.g. cURL).
return 307 https://$host:{{$AIO_NGINX_PORT_HTTPS}}$request_uri;
}
# Serve PR-preview requests
server {
server_name "~^pr(?<pr>[1-9][0-9]*)-(?<sha>[0-9a-f]{40})\.";
listen {{$AIO_NGINX_PORT_HTTPS}} ssl http2;
listen [::]:{{$AIO_NGINX_PORT_HTTPS}} ssl http2;
ssl_certificate {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.crt;
ssl_certificate_key {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.key;
ssl_prefer_server_ciphers on;
ssl_ciphers EECDH+CHACHA20:EECDH+AES128:RSA+AES128:EECDH+AES256:RSA+AES256:EECDH+3DES:RSA+3DES:!MD5;
root {{$AIO_BUILDS_DIR}}/$pr/$sha;
disable_symlinks on from=$document_root;
index index.html;
gzip on;
gzip_comp_level 7;
gzip_types *;
access_log {{$AIO_NGINX_LOGS_DIR}}/access.log;
error_log {{$AIO_NGINX_LOGS_DIR}}/error.log;
location "~/[^/]+\.[^/]+$" {
try_files $uri $uri/ =404;
}
location / {
try_files $uri $uri/ /index.html =404;
}
}
# Handle all other requests
server {
server_name _;
listen {{$AIO_NGINX_PORT_HTTPS}} ssl http2 default_server;
listen [::]:{{$AIO_NGINX_PORT_HTTPS}} ssl http2;
ssl_certificate {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.crt;
ssl_certificate_key {{$AIO_LOCALCERTS_DIR}}/{{$AIO_DOMAIN_NAME}}.key;
ssl_prefer_server_ciphers on;
ssl_ciphers EECDH+CHACHA20:EECDH+AES128:RSA+AES128:EECDH+AES256:RSA+AES256:EECDH+3DES:RSA+3DES:!MD5;
access_log {{$AIO_NGINX_LOGS_DIR}}/access.log;
error_log {{$AIO_NGINX_LOGS_DIR}}/error.log;
# Health check
location "~^/health-check/?$" {
add_header Content-Type text/plain;
return 200 '';
}
# Upload builds
location "~^/create-build/(?<pr>[1-9][0-9]*)/(?<sha>[0-9a-f]{40})/?$" {
if ($request_method != "POST") {
add_header Allow "POST";
return 405;
}
client_body_temp_path /tmp/aio-create-builds;
client_body_buffer_size 128K;
client_max_body_size {{$AIO_UPLOAD_MAX_SIZE}};
client_body_in_file_only on;
proxy_pass_request_headers on;
proxy_set_header X-FILE $request_body_file;
proxy_set_body off;
proxy_redirect off;
proxy_method GET;
proxy_pass http://{{$AIO_UPLOAD_HOSTNAME}}:{{$AIO_UPLOAD_PORT}}$request_uri;
resolver 127.0.0.1;
}
# Everything else
location / {
return 404;
}
}

View File

@ -1,20 +0,0 @@
#!/bin/bash
set -eu -o pipefail
# Variables
confFile=/tmp/$1.conf
domainName=$2
outDir=$3
# Create certificate
cp /etc/ssl/openssl.cnf "$confFile"
echo "[subjectAltName]" >> "$confFile"
echo "subjectAltName = DNS:$domainName, DNS:*.$domainName" >> "$confFile"
mkdir -p $outDir
openssl req -days 365 -newkey rsa:2048 -nodes -sha256 -x509 \
-config "$confFile" -extensions subjectAltName -subj "/CN=$domainName" \
-out "$outDir/$domainName.crt" -keyout "$outDir/$domainName.key"
chmod -R 400 "$outDir"
cp "$outDir/$domainName.crt" /usr/local/share/ca-certificates

View File

@ -1,2 +0,0 @@
/dist
/node_modules

View File

@ -1,71 +0,0 @@
// Imports
import * as fs from 'fs';
import * as path from 'path';
import * as shell from 'shelljs';
import {GithubPullRequests} from '../common/github-pull-requests';
import {assertNotMissingOrEmpty} from '../common/utils';
// Classes
export class BuildCleaner {
// Constructor
constructor(protected buildsDir: string, protected repoSlug: string, protected githubToken: string) {
assertNotMissingOrEmpty('buildsDir', buildsDir);
assertNotMissingOrEmpty('repoSlug', repoSlug);
assertNotMissingOrEmpty('githubToken', githubToken);
}
// Methods - Public
public cleanUp(): Promise<void> {
return Promise.all([
this.getExistingBuildNumbers(),
this.getOpenPrNumbers(),
]).then(([existingBuilds, openPrs]) => this.removeUnnecessaryBuilds(existingBuilds, openPrs));
}
// Methods - Protected
protected getExistingBuildNumbers(): Promise<number[]> {
return new Promise((resolve, reject) => {
fs.readdir(this.buildsDir, (err, files) => {
if (err) {
return reject(err);
}
const buildNumbers = files.
map(Number). // Convert string to number
filter(Boolean); // Ignore NaN (or 0), because they are not builds
resolve(buildNumbers);
});
});
}
protected getOpenPrNumbers(): Promise<number[]> {
const githubPullRequests = new GithubPullRequests(this.githubToken, this.repoSlug);
return githubPullRequests.
fetchAll('open').
then(prs => prs.map(pr => pr.number));
}
protected removeDir(dir: string) {
try {
// Undocumented signature (see https://github.com/shelljs/shelljs/pull/663).
(shell as any).chmod('-R', 'a+w', dir);
shell.rm('-rf', dir);
} catch (err) {
console.error(`ERROR: Unable to remove '${dir}' due to:`, err);
}
}
protected removeUnnecessaryBuilds(existingBuildNumbers: number[], openPrNumbers: number[]) {
const toRemove = existingBuildNumbers.filter(num => !openPrNumbers.includes(num));
console.log(`Existing builds: ${existingBuildNumbers.length}`);
console.log(`Open pull requests: ${openPrNumbers.length}`);
console.log(`Removing ${toRemove.length} build(s): ${toRemove.join(', ')}`);
toRemove.
map(num => path.join(this.buildsDir, String(num))).
forEach(dir => this.removeDir(dir));
}
}

View File

@ -1,23 +0,0 @@
// Imports
import {getEnvVar} from '../common/utils';
import {BuildCleaner} from './build-cleaner';
// Constants
const AIO_BUILDS_DIR = getEnvVar('AIO_BUILDS_DIR');
const AIO_GITHUB_TOKEN = getEnvVar('AIO_GITHUB_TOKEN', true);
const AIO_REPO_SLUG = getEnvVar('AIO_REPO_SLUG');
// Run
_main();
// Functions
function _main() {
console.log(`[${new Date()}] - Cleaning up builds...`);
const buildCleaner = new BuildCleaner(AIO_BUILDS_DIR, AIO_REPO_SLUG, AIO_GITHUB_TOKEN);
buildCleaner.cleanUp().catch(err => {
console.error('ERROR:', err);
process.exit(1);
});
}

View File

@ -1,110 +0,0 @@
// Imports
import {IncomingMessage} from 'http';
import * as https from 'https';
import {assertNotMissingOrEmpty} from './utils';
// Constants
const GITHUB_HOSTNAME = 'api.github.com';
// Interfaces - Types
interface RequestParams {
[key: string]: string | number;
}
type RequestParamsOrNull = RequestParams | null;
// Classes
export class GithubApi {
protected requestHeaders: {[key: string]: string};
// Constructor
constructor(githubToken: string) {
assertNotMissingOrEmpty('githubToken', githubToken);
this.requestHeaders = {
'Authorization': `token ${githubToken}`,
'User-Agent': `Node/${process.versions.node}`,
};
}
// Methods - Public
public get<T>(pathname: string, params?: RequestParamsOrNull): Promise<T> {
const path = this.buildPath(pathname, params);
return this.request<T>('get', path);
}
public post<T>(pathname: string, params?: RequestParamsOrNull, data?: any): Promise<T> {
const path = this.buildPath(pathname, params);
return this.request<T>('post', path, data);
}
// Methods - Protected
protected buildPath(pathname: string, params?: RequestParamsOrNull): string {
if (params == null) {
return pathname;
}
const search = (params === null) ? '' : this.serializeSearchParams(params);
const joiner = search && '?';
return `${pathname}${joiner}${search}`;
}
protected getPaginated<T>(pathname: string, baseParams: RequestParams = {}, currentPage: number = 0): Promise<T[]> {
const perPage = 100;
const params = {
...baseParams,
page: currentPage,
per_page: perPage,
};
return this.get<T[]>(pathname, params).then(items => {
if (items.length < perPage) {
return items;
}
return this.getPaginated(pathname, baseParams, currentPage + 1).then(moreItems => [...items, ...moreItems]);
});
}
protected request<T>(method: string, path: string, data: any = null): Promise<T> {
return new Promise<T>((resolve, reject) => {
const options = {
headers: {...this.requestHeaders},
host: GITHUB_HOSTNAME,
method,
path,
};
const onError = (statusCode: number, responseText: string) => {
const url = `https://${GITHUB_HOSTNAME}${path}`;
reject(`Request to '${url}' failed (status: ${statusCode}): ${responseText}`);
};
const onSuccess = (responseText: string) => {
try { resolve(JSON.parse(responseText)); } catch (err) { reject(err); }
};
const onResponse = (res: IncomingMessage) => {
const statusCode = res.statusCode || -1;
const isSuccess = (200 <= statusCode) && (statusCode < 400);
let responseText = '';
res.
on('data', d => responseText += d).
on('end', () => isSuccess ? onSuccess(responseText) : onError(statusCode, responseText)).
on('error', reject);
};
https.
request(options, onResponse).
on('error', reject).
end(data && JSON.stringify(data));
});
}
protected serializeSearchParams(params: RequestParams): string {
return Object.keys(params).
filter(key => params[key] != null).
map(key => `${key}=${encodeURIComponent(String(params[key]))}`).
join('&');
}
}

View File

@ -1,44 +0,0 @@
// Imports
import {assertNotMissingOrEmpty} from '../common/utils';
import {GithubApi} from './github-api';
// Interfaces - Types
export interface PullRequest {
number: number;
user: {login: string};
}
export type PullRequestState = 'all' | 'closed' | 'open';
// Classes
export class GithubPullRequests extends GithubApi {
// Constructor
constructor(githubToken: string, protected repoSlug: string) {
super(githubToken);
assertNotMissingOrEmpty('repoSlug', repoSlug);
}
// Methods - Public
public addComment(pr: number, body: string): Promise<void> {
if (!(pr > 0)) {
throw new Error(`Invalid PR number: ${pr}`);
} else if (!body) {
throw new Error(`Invalid or empty comment body: ${body}`);
}
return this.post<void>(`/repos/${this.repoSlug}/issues/${pr}/comments`, null, {body});
}
public fetch(pr: number): Promise<PullRequest> {
return this.get<PullRequest>(`/repos/${this.repoSlug}/pulls/${pr}`);
}
public fetchAll(state: PullRequestState = 'all'): Promise<PullRequest[]> {
console.log(`Fetching ${state} pull requests...`);
const pathname = `/repos/${this.repoSlug}/pulls`;
const params = {state};
return this.getPaginated<PullRequest>(pathname, params);
}
}

View File

@ -1,45 +0,0 @@
// Imports
import {assertNotMissingOrEmpty} from '../common/utils';
import {GithubApi} from './github-api';
// Interfaces - Types
interface Team {
id: number;
slug: string;
}
interface TeamMembership {
state: string;
}
// Classes
export class GithubTeams extends GithubApi {
// Constructor
constructor(githubToken: string, protected organization: string) {
super(githubToken);
assertNotMissingOrEmpty('organization', organization);
}
// Methods - Public
public fetchAll(): Promise<Team[]> {
return this.getPaginated<Team>(`/orgs/${this.organization}/teams`);
}
public isMemberById(username: string, teamIds: number[]): Promise<boolean> {
const getMembership = (teamId: number) =>
this.get<TeamMembership>(`/teams/${teamId}/memberships/${username}`).
then(membership => membership.state === 'active').
catch(() => false);
const reduceFn = (promise: Promise<boolean>, teamId: number) =>
promise.then(isMember => isMember || getMembership(teamId));
return teamIds.reduce(reduceFn, Promise.resolve(false));
}
public isMemberBySlug(username: string, teamSlugs: string[]): Promise<boolean> {
return this.fetchAll().
then(teams => teams.filter(team => teamSlugs.includes(team.slug)).map(team => team.id)).
then(teamIds => this.isMemberById(username, teamIds)).
catch(() => false);
}
}

View File

@ -1,23 +0,0 @@
export const runTests = (specFiles: string[], helpers?: string[]) => {
// We can't use `import` here, because of the following mess:
// - GitHub project `jasmine/jasmine` is `jasmine-core` on npm and its typings `@types/jasmine`.
// - GitHub project `jasmine/jasmine-npm` is `jasmine` on npm and has no typings.
//
// Using `import...from 'jasmine'` here would import from `@types/jasmine` (which refers to the
// `jasmine-core` module and the `jasmine` module).
// tslint:disable-next-line: no-var-requires variable-name
const Jasmine = require('jasmine');
const config = {
helpers,
random: true,
spec_files: specFiles,
stopSpecOnExpectationFailure: true,
};
process.on('unhandledRejection', (reason: any) => console.log('Unhandled rejection:', reason));
const runner = new Jasmine();
runner.loadConfig(config);
runner.onComplete((passed: boolean) => process.exit(passed ? 0 : 1));
runner.execute();
};

View File

@ -1,17 +0,0 @@
// Functions
export const assertNotMissingOrEmpty = (name: string, value: string | null | undefined) => {
if (!value) {
throw new Error(`Missing or empty required parameter '${name}'!`);
}
};
export const getEnvVar = (name: string, isOptional = false): string => {
const value = process.env[name];
if (!isOptional && !value) {
console.error(`ERROR: Missing required environment variable '${name}'!`);
process.exit(1);
}
return value || '';
};

View File

@ -1,81 +0,0 @@
// Imports
import * as cp from 'child_process';
import {EventEmitter} from 'events';
import * as fs from 'fs';
import * as path from 'path';
import * as shell from 'shelljs';
import {assertNotMissingOrEmpty} from '../common/utils';
import {CreatedBuildEvent} from './build-events';
import {UploadError} from './upload-error';
// Classes
export class BuildCreator extends EventEmitter {
// Constructor
constructor(protected buildsDir: string) {
super();
assertNotMissingOrEmpty('buildsDir', buildsDir);
}
// Methods - Public
public create(pr: string, sha: string, archivePath: string): Promise<any> {
const prDir = path.join(this.buildsDir, pr);
const shaDir = path.join(prDir, sha);
let dirToRemoveOnError: string;
return Promise.
all([this.exists(prDir), this.exists(shaDir)]).
then(([prDirExisted, shaDirExisted]) => {
if (shaDirExisted) {
throw new UploadError(409, `Request to overwrite existing directory: ${shaDir}`);
}
dirToRemoveOnError = prDirExisted ? shaDir : prDir;
return Promise.resolve().
then(() => shell.mkdir('-p', shaDir)).
then(() => this.extractArchive(archivePath, shaDir)).
then(() => this.emit(CreatedBuildEvent.type, new CreatedBuildEvent(+pr, sha)));
}).
catch(err => {
if (dirToRemoveOnError) {
shell.rm('-rf', dirToRemoveOnError);
}
if (!(err instanceof UploadError)) {
err = new UploadError(500, `Error while uploading to directory: ${shaDir}\n${err}`);
}
throw err;
});
}
// Methods - Protected
protected exists(fileOrDir: string): Promise<boolean> {
return new Promise(resolve => fs.access(fileOrDir, err => resolve(!err)));
}
protected extractArchive(inputFile: string, outputDir: string): Promise<void> {
return new Promise<void>((resolve, reject) => {
const cmd = `tar --extract --gzip --directory "${outputDir}" --file "${inputFile}"`;
cp.exec(cmd, (err, _stdout, stderr) => {
if (err) {
return reject(err);
}
if (stderr) {
console.warn(stderr);
}
try {
// Undocumented signature (see https://github.com/shelljs/shelljs/pull/663).
(shell as any).chmod('-R', 'a-w', outputDir);
shell.rm('-f', inputFile);
resolve();
} catch (err) {
reject(err);
}
});
});
}
}

View File

@ -1,15 +0,0 @@
// Classes
export class BuildEvent {
// Constructor
constructor(public type: string, public pr: number, public sha: string) {}
}
export class CreatedBuildEvent extends BuildEvent {
// Properties - Public, Static
public static type = 'build.created';
// Constructor
constructor(pr: number, sha: string) {
super(CreatedBuildEvent.type, pr, sha);
}
}

View File

@ -1,78 +0,0 @@
// Imports
import * as jwt from 'jsonwebtoken';
import {GithubPullRequests} from '../common/github-pull-requests';
import {GithubTeams} from '../common/github-teams';
import {assertNotMissingOrEmpty} from '../common/utils';
import {UploadError} from './upload-error';
// Interfaces - Types
interface JwtPayload {
slug: string;
'pull-request': number;
}
// Classes
export class BuildVerifier {
// Properties - Protected
protected githubPullRequests: GithubPullRequests;
protected githubTeams: GithubTeams;
// Constructor
constructor(protected secret: string, githubToken: string, protected repoSlug: string, organization: string,
protected allowedTeamSlugs: string[]) {
assertNotMissingOrEmpty('secret', secret);
assertNotMissingOrEmpty('githubToken', githubToken);
assertNotMissingOrEmpty('repoSlug', repoSlug);
assertNotMissingOrEmpty('organization', organization);
assertNotMissingOrEmpty('allowedTeamSlugs', allowedTeamSlugs && allowedTeamSlugs.join(''));
this.githubPullRequests = new GithubPullRequests(githubToken, repoSlug);
this.githubTeams = new GithubTeams(githubToken, organization);
}
// Methods - Public
public getPrAuthorTeamMembership(pr: number): Promise<{author: string, isMember: boolean}> {
return Promise.resolve().
then(() => this.githubPullRequests.fetch(pr)).
then(prInfo => prInfo.user.login).
then(author => this.githubTeams.isMemberBySlug(author, this.allowedTeamSlugs).
then(isMember => ({author, isMember})));
}
public verify(expectedPr: number, authHeader: string): Promise<void> {
return Promise.resolve().
then(() => this.extractJwtString(authHeader)).
then(jwtString => this.verifyJwt(expectedPr, jwtString)).
then(jwtPayload => this.verifyPr(jwtPayload['pull-request'])).
catch(err => { throw new UploadError(403, `Error while verifying upload for PR ${expectedPr}: ${err}`); });
}
// Methods - Protected
protected extractJwtString(input: string): string {
return input.replace(/^token +/i, '');
}
protected verifyJwt(expectedPr: number, token: string): Promise<JwtPayload> {
return new Promise((resolve, reject) => {
jwt.verify(token, this.secret, {issuer: 'Travis CI, GmbH'}, (err, payload) => {
if (err) {
reject(err.message || err);
} else if (payload.slug !== this.repoSlug) {
reject(`jwt slug invalid. expected: ${this.repoSlug}`);
} else if (payload['pull-request'] !== expectedPr) {
reject(`jwt pull-request invalid. expected: ${expectedPr}`);
} else {
resolve(payload);
}
});
});
}
protected verifyPr(pr: number): Promise<void> {
return this.getPrAuthorTeamMembership(pr).
then(({author, isMember}) => isMember ? Promise.resolve() : Promise.reject(
`User '${author}' is not an active member of any of the following teams: ` +
`${this.allowedTeamSlugs.join(', ')}`,
));
}
}

View File

@ -1,39 +0,0 @@
// Imports
import {getEnvVar} from '../common/utils';
import {BuildVerifier} from './build-verifier';
// Run
_main();
// Functions
function _main() {
const secret = 'unused';
const githubToken = getEnvVar('AIO_GITHUB_TOKEN');
const repoSlug = getEnvVar('AIO_REPO_SLUG');
const organization = getEnvVar('AIO_GITHUB_ORGANIZATION');
const allowedTeamSlugs = getEnvVar('AIO_GITHUB_TEAM_SLUGS').split(',');
const pr = +getEnvVar('AIO_PREVERIFY_PR');
const buildVerifier = new BuildVerifier(secret, githubToken, repoSlug, organization, allowedTeamSlugs);
// Exit codes:
// - 0: The PR author is a member.
// - 1: An error occurred.
// - 2: The PR author is not a member.
buildVerifier.getPrAuthorTeamMembership(pr).
then(({author, isMember}) => {
if (isMember) {
process.exit(0);
} else {
const errorMessage = `User '${author}' is not an active member of any of the following teams: ` +
`${allowedTeamSlugs.join(', ')}`;
onError(errorMessage, 2);
}
}).
catch(err => onError(err, 1));
}
function onError(err: string, exitCode: number) {
console.error(err);
process.exit(exitCode || 1);
}

View File

@ -1,10 +0,0 @@
// Imports
import {GithubPullRequests} from '../common/github-pull-requests';
import {BuildVerifier} from './build-verifier';
// Run
// TODO(gkalpak): Add e2e tests to cover these interactions as well.
GithubPullRequests.prototype.addComment = () => Promise.resolve();
BuildVerifier.prototype.verify = () => Promise.resolve();
// tslint:disable-next-line: no-var-requires
require('./index');

View File

@ -1,34 +0,0 @@
// Imports
import {getEnvVar} from '../common/utils';
import {uploadServerFactory} from './upload-server-factory';
// Constants
const AIO_BUILDS_DIR = getEnvVar('AIO_BUILDS_DIR');
const AIO_DOMAIN_NAME = getEnvVar('AIO_DOMAIN_NAME');
const AIO_GITHUB_ORGANIZATION = getEnvVar('AIO_GITHUB_ORGANIZATION');
const AIO_GITHUB_TEAM_SLUGS = getEnvVar('AIO_GITHUB_TEAM_SLUGS');
const AIO_GITHUB_TOKEN = getEnvVar('AIO_GITHUB_TOKEN');
const AIO_PREVIEW_DEPLOYMENT_TOKEN = getEnvVar('AIO_PREVIEW_DEPLOYMENT_TOKEN');
const AIO_REPO_SLUG = getEnvVar('AIO_REPO_SLUG');
const AIO_UPLOAD_HOSTNAME = getEnvVar('AIO_UPLOAD_HOSTNAME');
const AIO_UPLOAD_PORT = +getEnvVar('AIO_UPLOAD_PORT');
const AIO_WWW_USER = getEnvVar('AIO_WWW_USER');
// Run
process.setuid(AIO_WWW_USER); // TODO(gkalpak): Find more suitable way to run as `www-data`.
_main();
// Functions
function _main() {
uploadServerFactory.
create({
buildsDir: AIO_BUILDS_DIR,
domainName: AIO_DOMAIN_NAME,
githubOrganization: AIO_GITHUB_ORGANIZATION,
githubTeamSlugs: AIO_GITHUB_TEAM_SLUGS.split(','),
githubToken: AIO_GITHUB_TOKEN,
repoSlug: AIO_REPO_SLUG,
secret: AIO_PREVIEW_DEPLOYMENT_TOKEN,
}).
listen(AIO_UPLOAD_PORT, AIO_UPLOAD_HOSTNAME);
}

View File

@ -1,8 +0,0 @@
// Classes
export class UploadError extends Error {
// Constructor
constructor(public status: number = 500, message?: string) {
super(message);
Object.setPrototypeOf(this, UploadError.prototype);
}
}

View File

@ -1,117 +0,0 @@
// Imports
import * as express from 'express';
import * as http from 'http';
import {GithubPullRequests} from '../common/github-pull-requests';
import {assertNotMissingOrEmpty} from '../common/utils';
import {BuildCreator} from './build-creator';
import {CreatedBuildEvent} from './build-events';
import {BuildVerifier} from './build-verifier';
import {UploadError} from './upload-error';
// Constants
const AUTHORIZATION_HEADER = 'AUTHORIZATION';
const X_FILE_HEADER = 'X-FILE';
// Interfaces - Types
interface UploadServerConfig {
buildsDir: string;
domainName: string;
githubOrganization: string;
githubTeamSlugs: string[];
githubToken: string;
repoSlug: string;
secret: string;
}
// Classes
class UploadServerFactory {
// Methods - Public
public create({
buildsDir,
domainName,
githubOrganization,
githubTeamSlugs,
githubToken,
repoSlug,
secret,
}: UploadServerConfig): http.Server {
assertNotMissingOrEmpty('domainName', domainName);
const buildVerifier = new BuildVerifier(secret, githubToken, repoSlug, githubOrganization, githubTeamSlugs);
const buildCreator = this.createBuildCreator(buildsDir, githubToken, repoSlug, domainName);
const middleware = this.createMiddleware(buildVerifier, buildCreator);
const httpServer = http.createServer(middleware);
httpServer.on('listening', () => {
const info = httpServer.address();
console.info(`Up and running (and listening on ${info.address}:${info.port})...`);
});
return httpServer;
}
// Methods - Protected
protected createBuildCreator(buildsDir: string, githubToken: string, repoSlug: string,
domainName: string): BuildCreator {
const buildCreator = new BuildCreator(buildsDir);
const githubPullRequests = new GithubPullRequests(githubToken, repoSlug);
buildCreator.on(CreatedBuildEvent.type, ({pr, sha}: CreatedBuildEvent) => {
const body = `The angular.io preview for ${sha} is available [here][1].\n\n` +
`[1]: https://pr${pr}-${sha}.${domainName}/`;
githubPullRequests.addComment(pr, body);
});
return buildCreator;
}
protected createMiddleware(buildVerifier: BuildVerifier, buildCreator: BuildCreator): express.Express {
const middleware = express();
middleware.get(/^\/create-build\/([1-9][0-9]*)\/([0-9a-f]{40})\/?$/, (req, res) => {
const pr = req.params[0];
const sha = req.params[1];
const archive = req.header(X_FILE_HEADER);
const authHeader = req.header(AUTHORIZATION_HEADER);
if (!authHeader) {
this.throwRequestError(401, `Missing or empty '${AUTHORIZATION_HEADER}' header`, req);
} else if (!archive) {
this.throwRequestError(400, `Missing or empty '${X_FILE_HEADER}' header`, req);
}
buildVerifier.
verify(+pr, authHeader).
then(() => buildCreator.create(pr, sha, archive)).
then(() => res.sendStatus(201)).
catch(err => this.respondWithError(res, err));
});
middleware.get(/^\/health-check\/?$/, (_req, res) => res.sendStatus(200));
middleware.get('*', req => this.throwRequestError(404, 'Unknown resource', req));
middleware.all('*', req => this.throwRequestError(405, 'Unsupported method', req));
middleware.use((err: any, _req: any, res: express.Response, _next: any) => this.respondWithError(res, err));
return middleware;
}
protected respondWithError(res: express.Response, err: any) {
if (!(err instanceof UploadError)) {
err = new UploadError(500, String((err && err.message) || err));
}
const statusText = http.STATUS_CODES[err.status] || '???';
console.error(`Upload error: ${err.status} - ${statusText}`);
console.error(err.message);
res.status(err.status).end(err.message);
}
protected throwRequestError(status: number, error: string, req: express.Request) {
throw new UploadError(status, `${error} in request: ${req.method} ${req.originalUrl}`);
}
}
// Exports
export const uploadServerFactory = new UploadServerFactory();

View File

@ -1,191 +0,0 @@
// Imports
import * as cp from 'child_process';
import * as fs from 'fs';
import * as http from 'http';
import * as path from 'path';
import * as shell from 'shelljs';
import {getEnvVar} from '../common/utils';
// Constants
const TEST_AIO_BUILDS_DIR = getEnvVar('TEST_AIO_BUILDS_DIR');
const TEST_AIO_NGINX_HOSTNAME = getEnvVar('TEST_AIO_NGINX_HOSTNAME');
const TEST_AIO_NGINX_PORT_HTTP = +getEnvVar('TEST_AIO_NGINX_PORT_HTTP');
const TEST_AIO_NGINX_PORT_HTTPS = +getEnvVar('TEST_AIO_NGINX_PORT_HTTPS');
const TEST_AIO_UPLOAD_HOSTNAME = getEnvVar('TEST_AIO_UPLOAD_HOSTNAME');
const TEST_AIO_UPLOAD_MAX_SIZE = +getEnvVar('TEST_AIO_UPLOAD_MAX_SIZE');
const TEST_AIO_UPLOAD_PORT = +getEnvVar('TEST_AIO_UPLOAD_PORT');
const WWW_USER = getEnvVar('AIO_WWW_USER');
// Interfaces - Types
export interface CmdResult { success: boolean; err: Error; stdout: string; stderr: string; }
export interface FileSpecs { content?: string; size?: number; }
export type CleanUpFn = () => void;
export type TestSuiteFactory = (scheme: string, port: number) => void;
export type VerifyCmdResultFn = (result: CmdResult) => void;
// Classes
class Helper {
// Properties - Public
public get buildsDir() { return TEST_AIO_BUILDS_DIR; }
public get nginxHostname() { return TEST_AIO_NGINX_HOSTNAME; }
public get nginxPortHttp() { return TEST_AIO_NGINX_PORT_HTTP; }
public get nginxPortHttps() { return TEST_AIO_NGINX_PORT_HTTPS; }
public get wwwUser() { return WWW_USER; }
public get uploadHostname() { return TEST_AIO_UPLOAD_HOSTNAME; }
public get uploadPort() { return TEST_AIO_UPLOAD_PORT; }
public get uploadMaxSize() { return TEST_AIO_UPLOAD_MAX_SIZE; }
// Properties - Protected
protected cleanUpFns: CleanUpFn[] = [];
protected portPerScheme: {[scheme: string]: number} = {
http: this.nginxPortHttp,
https: this.nginxPortHttps,
};
// Constructor
constructor() {
shell.mkdir('-p', this.buildsDir);
shell.exec(`chown -R ${this.wwwUser} ${this.buildsDir}`);
}
// Methods - Public
public cleanUp() {
while (this.cleanUpFns.length) {
// Clean-up fns remove themselves from the list.
this.cleanUpFns[0]();
}
if (fs.readdirSync(this.buildsDir).length) {
throw new Error(`Directory '${this.buildsDir}' is not empty after clean-up.`);
}
}
public createDummyArchive(pr: string, sha: string, archivePath: string): CleanUpFn {
const inputDir = path.join(this.buildsDir, 'uploaded', pr, sha);
const cmd1 = `tar --create --gzip --directory "${inputDir}" --file "${archivePath}" .`;
const cmd2 = `chown ${this.wwwUser} ${archivePath}`;
const cleanUpTemp = this.createDummyBuild(`uploaded/${pr}`, sha, true);
shell.exec(cmd1);
shell.exec(cmd2);
cleanUpTemp();
return this.createCleanUpFn(() => shell.rm('-rf', archivePath));
}
public createDummyBuild(pr: string, sha: string, force = false): CleanUpFn {
const prDir = path.join(this.buildsDir, pr);
const shaDir = path.join(prDir, sha);
const idxPath = path.join(shaDir, 'index.html');
const barPath = path.join(shaDir, 'foo', 'bar.js');
this.writeFile(idxPath, {content: `PR: ${pr} | SHA: ${sha} | File: /index.html`}, force);
this.writeFile(barPath, {content: `PR: ${pr} | SHA: ${sha} | File: /foo/bar.js`}, force);
shell.exec(`chown -R ${this.wwwUser} ${prDir}`);
return this.createCleanUpFn(() => shell.rm('-rf', prDir));
}
public deletePrDir(pr: string) {
const prDir = path.join(this.buildsDir, pr);
if (fs.existsSync(prDir)) {
// Undocumented signature (see https://github.com/shelljs/shelljs/pull/663).
(shell as any).chmod('-R', 'a+w', prDir);
shell.rm('-rf', prDir);
}
}
public readBuildFile(pr: string, sha: string, relFilePath: string): string {
const absFilePath = path.join(this.buildsDir, pr, sha, relFilePath);
return fs.readFileSync(absFilePath, 'utf8');
}
public runCmd(cmd: string, opts: cp.ExecFileOptions = {}): Promise<CmdResult> {
return new Promise(resolve => {
const proc = cp.exec(cmd, opts, (err, stdout, stderr) => resolve({success: !err, err, stdout, stderr}));
this.createCleanUpFn(() => proc.kill());
});
}
public runForAllSupportedSchemes(suiteFactory: TestSuiteFactory) {
Object.keys(this.portPerScheme).forEach(scheme => suiteFactory(scheme, this.portPerScheme[scheme]));
}
public verifyResponse(status: number | [number, string], regex = /^/): VerifyCmdResultFn {
let statusCode: number;
let statusText: string;
if (Array.isArray(status)) {
statusCode = status[0];
statusText = status[1];
} else {
statusCode = status;
statusText = http.STATUS_CODES[statusCode];
}
return (result: CmdResult) => {
const [headers, body] = result.stdout.
split(/(?:\r?\n){2,}/).
map(s => s.trim()).
slice(-2);
if (!result.success) {
console.log('Stdout:', result.stdout);
console.log('Stderr:', result.stderr);
console.log('Error:', result.err);
}
expect(result.success).toBe(true);
expect(headers).toContain(`${statusCode} ${statusText}`);
expect(body).toMatch(regex);
};
}
public writeBuildFile(pr: string, sha: string, relFilePath: string, content: string): CleanUpFn {
const absFilePath = path.join(this.buildsDir, pr, sha, relFilePath);
return this.writeFile(absFilePath, {content}, true);
}
public writeFile(filePath: string, {content, size}: FileSpecs, force = false): CleanUpFn {
if (!force && fs.existsSync(filePath)) {
throw new Error(`Refusing to overwrite existing file '${filePath}'.`);
}
let cleanUpTarget = filePath;
while (!fs.existsSync(path.dirname(cleanUpTarget))) {
cleanUpTarget = path.dirname(cleanUpTarget);
}
shell.mkdir('-p', path.dirname(filePath));
if (size) {
// Create a file of the specified size.
cp.execSync(`fallocate -l ${size} ${filePath}`);
} else {
// Create a file with the specified content.
fs.writeFileSync(filePath, content || '');
}
shell.exec(`chown ${this.wwwUser} ${filePath}`);
return this.createCleanUpFn(() => shell.rm('-rf', cleanUpTarget));
}
// Methods - Protected
protected createCleanUpFn(fn: Function): CleanUpFn {
const cleanUpFn = () => {
const idx = this.cleanUpFns.indexOf(cleanUpFn);
if (idx !== -1) {
this.cleanUpFns.splice(idx, 1);
fn();
}
};
this.cleanUpFns.push(cleanUpFn);
return cleanUpFn;
}
}
// Exports
export const helper = new Helper();

View File

@ -1,6 +0,0 @@
// Imports
import {runTests} from '../common/run-tests';
// Run
const specFiles = [`${__dirname}/**/*.e2e.js`];
runTests(specFiles);

View File

@ -1,271 +0,0 @@
// Imports
import * as path from 'path';
import {helper as h} from './helper';
// Tests
describe(`nginx`, () => {
beforeEach(() => jasmine.DEFAULT_TIMEOUT_INTERVAL = 10000);
afterEach(() => h.cleanUp());
it('should redirect HTTP to HTTPS', done => {
const httpHost = `${h.nginxHostname}:${h.nginxPortHttp}`;
const httpsHost = `${h.nginxHostname}:${h.nginxPortHttps}`;
const urlMap = {
[`http://${httpHost}/`]: `https://${httpsHost}/`,
[`http://${httpHost}/foo`]: `https://${httpsHost}/foo`,
[`http://foo.${httpHost}/`]: `https://foo.${httpsHost}/`,
};
const verifyRedirection = (httpUrl: string) => h.runCmd(`curl -i ${httpUrl}`).then(result => {
h.verifyResponse(307)(result);
const headers = result.stdout.split(/(?:\r?\n){2,}/)[0];
expect(headers).toContain(`Location: ${urlMap[httpUrl]}`);
});
Promise.
all(Object.keys(urlMap).map(verifyRedirection)).
then(done);
});
h.runForAllSupportedSchemes((scheme, port) => describe(`nginx (on ${scheme.toUpperCase()})`, () => {
const hostname = h.nginxHostname;
const host = `${hostname}:${port}`;
const pr = '9';
const sha9 = '9'.repeat(40);
const sha0 = '0'.repeat(40);
describe(`pr<pr>-<sha>.${host}/*`, () => {
beforeEach(() => {
h.createDummyBuild(pr, sha9);
h.createDummyBuild(pr, sha0);
});
it('should return /index.html', done => {
const origin = `${scheme}://pr${pr}-${sha9}.${host}`;
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
Promise.all([
h.runCmd(`curl -iL ${origin}/index.html`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}/`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${origin}`).then(h.verifyResponse(200, bodyRegex)),
]).then(done);
});
it('should return /foo/bar.js', done => {
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /foo/bar\\.js$`);
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/bar.js`).
then(h.verifyResponse(200, bodyRegex)).
then(done);
});
it('should respond with 403 for directories', done => {
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/`).then(h.verifyResponse(403)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo`).then(h.verifyResponse(403)),
]).then(done);
});
it('should respond with 404 for unknown paths to files', done => {
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/baz.css`).
then(h.verifyResponse(404)).
then(done);
});
it('should rewrite to \'index.html\' for unknown paths that don\'t look like files', done => {
const bodyRegex = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/baz`).then(h.verifyResponse(200, bodyRegex)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}/foo/baz/`).then(h.verifyResponse(200, bodyRegex)),
]).then(done);
});
it('should respond with 404 for unknown PRs/SHAs', done => {
const otherPr = 54321;
const otherSha = '8'.repeat(40);
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}9-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${otherPr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}9.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${otherSha}.${host}`).then(h.verifyResponse(404)),
]).then(done);
});
it('should respond with 404 if the subdomain format is wrong', done => {
Promise.all([
h.runCmd(`curl -iL ${scheme}://xpr${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://prx${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://xx${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://p${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://r${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${pr}-${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}_${sha9}.${host}`).then(h.verifyResponse(404)),
]).then(done);
});
it('should reject PRs with leading zeros', done => {
h.runCmd(`curl -iL ${scheme}://pr0${pr}-${sha9}.${host}`).
then(h.verifyResponse(404)).
then(done);
});
it('should accept SHAs with leading zeros (but not trim the zeros)', done => {
const bodyRegex9 = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);
const bodyRegex0 = new RegExp(`^PR: ${pr} | SHA: ${sha0} | File: /index\\.html$`);
Promise.all([
h.runCmd(`curl -iL ${scheme}://pr${pr}-0${sha9}.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha9}.${host}`).then(h.verifyResponse(200, bodyRegex9)),
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha0}.${host}`).then(h.verifyResponse(200, bodyRegex0)),
]).then(done);
});
});
describe(`${host}/health-check`, () => {
it('should respond with 200', done => {
Promise.all([
h.runCmd(`curl -iL ${scheme}://${host}/health-check`).then(h.verifyResponse(200)),
h.runCmd(`curl -iL ${scheme}://${host}/health-check/`).then(h.verifyResponse(200)),
]).then(done);
});
it('should respond with 404 if the path does not match exactly', done => {
Promise.all([
h.runCmd(`curl -iL ${scheme}://${host}/health-check/foo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/health-check-foo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/health-checknfoo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/foo/health-check`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/foo-health-check`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/foonhealth-check`).then(h.verifyResponse(404)),
]).then(done);
});
});
describe(`${host}/create-build/<pr>/<sha>`, () => {
it('should disallow non-POST requests', done => {
const url = `${scheme}://${host}/create-build/${pr}/${sha9}`;
Promise.all([
h.runCmd(`curl -iLX GET ${url}`).then(h.verifyResponse([405, 'Not Allowed'])),
h.runCmd(`curl -iLX PUT ${url}`).then(h.verifyResponse([405, 'Not Allowed'])),
h.runCmd(`curl -iLX PATCH ${url}`).then(h.verifyResponse([405, 'Not Allowed'])),
h.runCmd(`curl -iLX DELETE ${url}`).then(h.verifyResponse([405, 'Not Allowed'])),
]).then(done);
});
it(`should reject files larger than ${h.uploadMaxSize}B (according to header)`, done => {
const headers = `--header "Content-Length: ${1.5 * h.uploadMaxSize}"`;
const url = `${scheme}://${host}/create-build/${pr}/${sha9}`;
h.runCmd(`curl -iLX POST ${headers} ${url}`).
then(h.verifyResponse([413, 'Request Entity Too Large'])).
then(done);
});
it(`should reject files larger than ${h.uploadMaxSize}B (without header)`, done => {
const filePath = path.join(h.buildsDir, 'snapshot.tar.gz');
const url = `${scheme}://${host}/create-build/${pr}/${sha9}`;
h.writeFile(filePath, {size: 1.5 * h.uploadMaxSize});
h.runCmd(`curl -iLX POST --data-binary "@${filePath}" ${url}`).
then(h.verifyResponse([413, 'Request Entity Too Large'])).
then(done);
});
it('should pass requests through to the upload server', done => {
h.runCmd(`curl -iLX POST ${scheme}://${host}/create-build/${pr}/${sha9}`).
then(h.verifyResponse(401, /Missing or empty 'AUTHORIZATION' header/)).
then(done);
});
it('should respond with 404 for unknown paths', done => {
const cmdPrefix = `curl -iLX POST ${scheme}://${host}`;
Promise.all([
h.runCmd(`${cmdPrefix}/foo/create-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foo-create-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/fooncreate-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/foo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build-foo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-buildnfoo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/pr${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/${pr}/${sha9}42`).then(h.verifyResponse(404)),
]).then(done);
});
it('should reject PRs with leading zeros', done => {
h.runCmd(`curl -iLX POST ${scheme}://${host}/create-build/0${pr}/${sha9}`).
then(h.verifyResponse(404)).
then(done);
});
it('should accept SHAs with leading zeros (but not trim the zeros)', done => {
const cmdPrefix = `curl -iLX POST ${scheme}://${host}/create-build/${pr}`;
const bodyRegex = /Missing or empty 'AUTHORIZATION' header/;
Promise.all([
h.runCmd(`${cmdPrefix}/0${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/${sha0}`).then(h.verifyResponse(401, bodyRegex)),
]).then(done);
});
});
describe(`${host}/*`, () => {
it('should respond with 404 for unknown URLs (even if the resource exists)', done => {
['index.html', 'foo.js', 'foo/index.html'].forEach(relFilePath => {
const absFilePath = path.join(h.buildsDir, relFilePath);
h.writeFile(absFilePath, {content: `File: /${relFilePath}`});
});
Promise.all([
h.runCmd(`curl -iL ${scheme}://${host}/index.html`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://foo.${host}/index.html`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://foo.${host}/`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://foo.${host}`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/foo.js`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL ${scheme}://${host}/foo/index.html`).then(h.verifyResponse(404)),
]).then(done);
});
});
}));
});
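
The specs above lean entirely on the `runCmd()` and `verifyResponse()` utilities from the shared `helper` module, which is not shown in this diff. A minimal TypeScript sketch of the shape those calls imply; the `CmdResult` fields beyond `stdout`, and the exact status-matching logic, are assumptions rather than the real helper:

// Sketch only: API inferred from how the specs above call the helper.
import {exec} from 'child_process';

export interface CmdResult { success: boolean; stdout: string; stderr: string; }

// Runs a shell command and resolves with its output (assumed to never reject).
export const runCmd = (cmd: string): Promise<CmdResult> =>
  new Promise(resolve => exec(cmd, (err, stdout, stderr) => resolve({success: !err, stdout, stderr})));

// Returns a `.then(...)` callback that asserts on the `curl -i` output: `status` is either a
// code or a `[code, statusText]` tuple, and `bodyRegex` optionally checks the response body.
export const verifyResponse = (status: number | [number, string], bodyRegex?: RegExp) => {
  const [code, text] = Array.isArray(status) ? status : [status, ''] as [number, string];
  return (result: CmdResult) => {
    expect(result.stdout).toContain(text ? `${code} ${text}` : `${code}`);
    if (bodyRegex) {
      expect(result.stdout).toMatch(bodyRegex);
    }
  };
};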


@@ -1,84 +0,0 @@
// Imports
import * as path from 'path';
import {helper as h} from './helper';
// Tests
h.runForAllSupportedSchemes((scheme, port) => describe(`integration (on ${scheme.toUpperCase()})`, () => {
const hostname = h.nginxHostname;
const host = `${hostname}:${port}`;
const pr9 = '9';
const sha9 = '9'.repeat(40);
const sha0 = '0'.repeat(40);
const archivePath = path.join(h.buildsDir, 'snapshot.tar.gz');
const getFile = (pr: string, sha: string, file: string) =>
h.runCmd(`curl -iL ${scheme}://pr${pr}-${sha}.${host}/${file}`);
const uploadBuild = (pr: string, sha: string, archive: string) => {
const curlPost = 'curl -iLX POST --header "Authorization: Token FOO"';
return h.runCmd(`${curlPost} --data-binary "@${archive}" ${scheme}://${host}/create-build/${pr}/${sha}`);
};
beforeEach(() => jasmine.DEFAULT_TIMEOUT_INTERVAL = 10000);
afterEach(() => {
h.deletePrDir(pr9);
h.cleanUp();
});
it('should be able to upload and serve a build for a new PR', done => {
const regexPrefix9 = `^PR: uploaded\\/${pr9} \\| SHA: ${sha9} \\| File:`;
const idxContentRegex9 = new RegExp(`${regexPrefix9} \\/index\\.html$`);
const barContentRegex9 = new RegExp(`${regexPrefix9} \\/foo\\/bar\\.js$`);
h.createDummyArchive(pr9, sha9, archivePath);
uploadBuild(pr9, sha9, archivePath).
then(() => Promise.all([
getFile(pr9, sha9, 'index.html').then(h.verifyResponse(200, idxContentRegex9)),
getFile(pr9, sha9, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex9)),
])).
then(done);
});
it('should be able to upload and serve a build for an existing PR', done => {
const regexPrefix0 = `^PR: ${pr9} \\| SHA: ${sha0} \\| File:`;
const idxContentRegex0 = new RegExp(`${regexPrefix0} \\/index\\.html$`);
const barContentRegex0 = new RegExp(`${regexPrefix0} \\/foo\\/bar\\.js$`);
const regexPrefix9 = `^PR: uploaded\\/${pr9} \\| SHA: ${sha9} \\| File:`;
const idxContentRegex9 = new RegExp(`${regexPrefix9} \\/index\\.html$`);
const barContentRegex9 = new RegExp(`${regexPrefix9} \\/foo\\/bar\\.js$`);
h.createDummyBuild(pr9, sha0);
h.createDummyArchive(pr9, sha9, archivePath);
uploadBuild(pr9, sha9, archivePath).
then(() => Promise.all([
getFile(pr9, sha0, 'index.html').then(h.verifyResponse(200, idxContentRegex0)),
getFile(pr9, sha0, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex0)),
getFile(pr9, sha9, 'index.html').then(h.verifyResponse(200, idxContentRegex9)),
getFile(pr9, sha9, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex9)),
])).
then(done);
});
it('should not be able to overwrite a build', done => {
const regexPrefix9 = `^PR: ${pr9} \\| SHA: ${sha9} \\| File:`;
const idxContentRegex9 = new RegExp(`${regexPrefix9} \\/index\\.html$`);
const barContentRegex9 = new RegExp(`${regexPrefix9} \\/foo\\/bar\\.js$`);
h.createDummyBuild(pr9, sha9);
h.createDummyArchive(pr9, sha9, archivePath);
uploadBuild(pr9, sha9, archivePath).
then(h.verifyResponse(409)).
then(() => Promise.all([
getFile(pr9, sha9, 'index.html').then(h.verifyResponse(200, idxContentRegex9)),
getFile(pr9, sha9, 'foo/bar.js').then(h.verifyResponse(200, barContentRegex9)),
])).
then(done);
});
}));
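
The fixtures used here (`createDummyBuild()`, `createDummyArchive()`, `deletePrDir()`) also live in the unshown helper. Judging purely by the content regexes in these specs, `createDummyBuild()` appears to produce a tree along the following lines; the explicit `buildsDir` parameter and the exact file contents are inferred, and `createDummyArchive()` would additionally tar and gzip a similar tree whose `PR:` field reads `uploaded/<pr>`:

// Illustration only: fixture layout inferred from the `PR: ... | SHA: ... | File: ...` regexes above.
import * as fs from 'fs';
import * as path from 'path';
import * as shell from 'shelljs';

export function createDummyBuild(buildsDir: string, pr: string, sha: string): void {
  const shaDir = path.join(buildsDir, pr, sha);
  shell.mkdir('-p', path.join(shaDir, 'foo'));
  fs.writeFileSync(path.join(shaDir, 'index.html'), `PR: ${pr} | SHA: ${sha} | File: /index.html`);
  fs.writeFileSync(path.join(shaDir, 'foo', 'bar.js'), `PR: ${pr} | SHA: ${sha} | File: /foo/bar.js`);
}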


@@ -1,266 +0,0 @@
// Imports
import * as fs from 'fs';
import * as path from 'path';
import {CmdResult, helper as h} from './helper';
// Tests
describe('upload-server (on HTTP)', () => {
const hostname = h.uploadHostname;
const port = h.uploadPort;
const host = `${hostname}:${port}`;
const pr = '9';
const sha9 = '9'.repeat(40);
const sha0 = '0'.repeat(40);
beforeEach(() => jasmine.DEFAULT_TIMEOUT_INTERVAL = 10000);
afterEach(() => h.cleanUp());
describe(`${host}/create-build/<pr>/<sha>`, () => {
const authorizationHeader = `--header "Authorization: Token FOO"`;
const xFileHeader = `--header "X-File: ${h.buildsDir}/snapshot.tar.gz"`;
const curl = `curl -iL ${authorizationHeader} ${xFileHeader}`;
it('should disallow non-GET requests', done => {
const url = `http://${host}/create-build/${pr}/${sha9}`;
const bodyRegex = /^Unsupported method/;
Promise.all([
h.runCmd(`curl -iLX PUT ${url}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX POST ${url}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX PATCH ${url}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX DELETE ${url}`).then(h.verifyResponse(405, bodyRegex)),
]).then(done);
});
it('should reject requests without an \'AUTHORIZATION\' header', done => {
const headers1 = '';
const headers2 = '--header "AUTHORIXATION: "';
const url = `http://${host}/create-build/${pr}/${sha9}`;
const bodyRegex = /^Missing or empty 'AUTHORIZATION' header/;
Promise.all([
h.runCmd(`curl -iL ${headers1} ${url}`).then(h.verifyResponse(401, bodyRegex)),
h.runCmd(`curl -iL ${headers2} ${url}`).then(h.verifyResponse(401, bodyRegex)),
]).then(done);
});
it('should reject requests without an \'X-FILE\' header', done => {
const headers1 = authorizationHeader;
const headers2 = `${authorizationHeader} --header "X-FILE: "`;
const url = `http://${host}/create-build/${pr}/${sha9}`;
const bodyRegex = /^Missing or empty 'X-FILE' header/;
Promise.all([
h.runCmd(`curl -iL ${headers1} ${url}`).then(h.verifyResponse(400, bodyRegex)),
h.runCmd(`curl -iL ${headers2} ${url}`).then(h.verifyResponse(400, bodyRegex)),
]).then(done);
});
it('should respond with 404 for unknown paths', done => {
const cmdPrefix = `${curl} http://${host}`;
Promise.all([
h.runCmd(`${cmdPrefix}/foo/create-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/foo-create-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/fooncreate-build/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/foo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build-foo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-buildnfoo/${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/pr${pr}/${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${cmdPrefix}/create-build/${pr}/${sha9}42`).then(h.verifyResponse(404)),
]).then(done);
});
it('should reject PRs with leading zeros', done => {
h.runCmd(`${curl} http://${host}/create-build/0${pr}/${sha9}`).
then(h.verifyResponse(404)).
then(done);
});
it('should accept SHAs with leading zeros (but not trim the zeros)', done => {
Promise.all([
h.runCmd(`${curl} http://${host}/create-build/${pr}/0${sha9}`).then(h.verifyResponse(404)),
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`).then(h.verifyResponse(500)),
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha0}`).then(h.verifyResponse(500)),
]).then(done);
});
it('should not overwrite existing builds', done => {
h.createDummyBuild(pr, sha9);
expect(h.readBuildFile(pr, sha9, 'index.html')).toContain('index.html');
h.writeBuildFile(pr, sha9, 'index.html', 'My content');
expect(h.readBuildFile(pr, sha9, 'index.html')).toBe('My content');
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`).
then(h.verifyResponse(409, /^Request to overwrite existing directory/)).
then(() => expect(h.readBuildFile(pr, sha9, 'index.html')).toBe('My content')).
then(done);
});
it('should delete the PR directory on error (for new PR)', done => {
const prDir = path.join(h.buildsDir, pr);
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`).
then(h.verifyResponse(500)).
then(() => expect(fs.existsSync(prDir)).toBe(false)).
then(done);
});
it('should only delete the SHA directory on error (for existing PR)', done => {
const prDir = path.join(h.buildsDir, pr);
const shaDir = path.join(prDir, sha9);
h.createDummyBuild(pr, sha0);
h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`).
then(h.verifyResponse(500)).
then(() => {
expect(fs.existsSync(shaDir)).toBe(false);
expect(fs.existsSync(prDir)).toBe(true);
}).
then(done);
});
describe('on successful upload', () => {
const archivePath = path.join(h.buildsDir, 'snapshot.tar.gz');
let uploadPromise: Promise<CmdResult>;
beforeEach(() => {
h.createDummyArchive(pr, sha9, archivePath);
uploadPromise = h.runCmd(`${curl} http://${host}/create-build/${pr}/${sha9}`);
});
afterEach(() => h.deletePrDir(pr));
it('should respond with 201', done => {
uploadPromise.then(h.verifyResponse(201)).then(done);
});
it('should extract the contents of the uploaded file', done => {
uploadPromise.
then(() => {
expect(h.readBuildFile(pr, sha9, 'index.html')).toContain(`uploaded/${pr}`);
expect(h.readBuildFile(pr, sha9, 'foo/bar.js')).toContain(`uploaded/${pr}`);
}).
then(done);
});
it(`should create files/directories owned by '${h.wwwUser}'`, done => {
const shaDir = path.join(h.buildsDir, pr, sha9);
const idxPath = path.join(shaDir, 'index.html');
const barPath = path.join(shaDir, 'foo', 'bar.js');
uploadPromise.
then(() => Promise.all([
h.runCmd(`find ${shaDir}`),
h.runCmd(`find ${shaDir} -user ${h.wwwUser}`),
])).
then(([{stdout: allFiles}, {stdout: userFiles}]) => {
expect(userFiles).toBe(allFiles);
expect(userFiles).toContain(shaDir);
expect(userFiles).toContain(idxPath);
expect(userFiles).toContain(barPath);
}).
then(done);
});
it('should delete the uploaded file', done => {
expect(fs.existsSync(archivePath)).toBe(true);
uploadPromise.
then(() => expect(fs.existsSync(archivePath)).toBe(false)).
then(done);
});
it('should make the build directory non-writable', done => {
const shaDir = path.join(h.buildsDir, pr, sha9);
const idxPath = path.join(shaDir, 'index.html');
const barPath = path.join(shaDir, 'foo', 'bar.js');
// See https://github.com/nodejs/node-v0.x-archive/issues/3045#issuecomment-4862588.
const isNotWritable = (fileOrDir: string) => {
const mode = fs.statSync(fileOrDir).mode;
// tslint:disable-next-line: no-bitwise
return !(mode & parseInt('222', 8));
};
uploadPromise.
then(() => {
expect(isNotWritable(shaDir)).toBe(true);
expect(isNotWritable(idxPath)).toBe(true);
expect(isNotWritable(barPath)).toBe(true);
}).
then(done);
});
});
});
describe(`${host}/health-check`, () => {
it('should respond with 200', done => {
Promise.all([
h.runCmd(`curl -iL http://${host}/health-check`).then(h.verifyResponse(200)),
h.runCmd(`curl -iL http://${host}/health-check/`).then(h.verifyResponse(200)),
]).then(done);
});
it('should respond with 404 if the path does not match exactly', done => {
Promise.all([
h.runCmd(`curl -iL http://${host}/health-check/foo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/health-check-foo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/health-checknfoo`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/foo/health-check`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/foo-health-check`).then(h.verifyResponse(404)),
h.runCmd(`curl -iL http://${host}/foonhealth-check`).then(h.verifyResponse(404)),
]).then(done);
});
});
describe(`${host}/*`, () => {
it('should respond with 404 for GET requests to unknown URLs', done => {
const bodyRegex = /^Unknown resource/;
Promise.all([
h.runCmd(`curl -iL http://${host}/index.html`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iL http://${host}/`).then(h.verifyResponse(404, bodyRegex)),
h.runCmd(`curl -iL http://${host}`).then(h.verifyResponse(404, bodyRegex)),
]).then(done);
});
it('should respond with 405 for non-GET requests to any URL', done => {
const bodyRegex = /^Unsupported method/;
Promise.all([
h.runCmd(`curl -iLX PUT http://${host}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX POST http://${host}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX PATCH http://${host}`).then(h.verifyResponse(405, bodyRegex)),
h.runCmd(`curl -iLX DELETE http://${host}`).then(h.verifyResponse(405, bodyRegex)),
]).then(done);
});
});
});


@@ -1,43 +0,0 @@
{
"name": "aio-scripts-js",
"version": "1.0.0",
"description": "Performing various tasks on PR build artifacts for angular.io.",
"repository": "https://github.com/angular/angular.git",
"author": "Angular",
"license": "MIT",
"scripts": {
"prebuild": "yarn clean-dist",
"build": "tsc",
"build-watch": "yarn tsc -- --watch",
"clean-dist": "node --eval \"require('shelljs').rm('-rf', 'dist')\"",
"dev": "concurrently --kill-others --raw --success first \"yarn build-watch\" \"yarn test-watch\"",
"lint": "tslint --project tsconfig.json",
"pre~~test-only": "yarn lint",
"~~test-only": "node dist/test",
"pretest": "yarn build",
"test": "yarn ~~test-only",
"pretest-watch": "yarn build",
"test-watch": "nodemon --exec \"yarn ~~test-only\" --watch dist"
},
"dependencies": {
"express": "^4.14.1",
"jasmine": "^2.5.3",
"jsonwebtoken": "^7.3.0",
"shelljs": "^0.7.6"
},
"devDependencies": {
"@types/express": "^4.0.35",
"@types/jasmine": "^2.5.43",
"@types/jsonwebtoken": "^7.2.0",
"@types/node": "^7.0.5",
"@types/shelljs": "^0.7.0",
"@types/supertest": "^2.0.0",
"concurrently": "^3.3.0",
"eslint": "^3.15.0",
"eslint-plugin-jasmine": "^2.2.0",
"nodemon": "^1.11.0",
"supertest": "^3.0.0",
"tslint": "^4.4.2",
"typescript": "^2.1.6"
}
}


@@ -1,318 +0,0 @@
// Imports
import * as fs from 'fs';
import * as shell from 'shelljs';
import {BuildCleaner} from '../../lib/clean-up/build-cleaner';
import {GithubPullRequests} from '../../lib/common/github-pull-requests';
// Tests
describe('BuildCleaner', () => {
let cleaner: BuildCleaner;
beforeEach(() => cleaner = new BuildCleaner('/foo/bar', 'baz/qux', '12345'));
describe('constructor()', () => {
it('should throw if \'buildsDir\' is empty', () => {
expect(() => new BuildCleaner('', '/baz/qux', '12345')).
toThrowError('Missing or empty required parameter \'buildsDir\'!');
});
it('should throw if \'repoSlug\' is empty', () => {
expect(() => new BuildCleaner('/foo/bar', '', '12345')).
toThrowError('Missing or empty required parameter \'repoSlug\'!');
});
it('should throw if \'githubToken\' is empty', () => {
expect(() => new BuildCleaner('/foo/bar', 'baz/qux', '')).
toThrowError('Missing or empty required parameter \'githubToken\'!');
});
});
describe('cleanUp()', () => {
let cleanerGetExistingBuildNumbersSpy: jasmine.Spy;
let cleanerGetOpenPrNumbersSpy: jasmine.Spy;
let cleanerRemoveUnnecessaryBuildsSpy: jasmine.Spy;
let existingBuildsDeferred: {resolve: Function, reject: Function};
let openPrsDeferred: {resolve: Function, reject: Function};
let promise: Promise<void>;
beforeEach(() => {
cleanerGetExistingBuildNumbersSpy = spyOn(cleaner as any, 'getExistingBuildNumbers').and.callFake(() => {
return new Promise((resolve, reject) => existingBuildsDeferred = {resolve, reject});
});
cleanerGetOpenPrNumbersSpy = spyOn(cleaner as any, 'getOpenPrNumbers').and.callFake(() => {
return new Promise((resolve, reject) => openPrsDeferred = {resolve, reject});
});
cleanerRemoveUnnecessaryBuildsSpy = spyOn(cleaner as any, 'removeUnnecessaryBuilds');
promise = cleaner.cleanUp();
});
it('should return a promise', () => {
expect(promise).toEqual(jasmine.any(Promise));
});
it('should get the existing builds', () => {
expect(cleanerGetExistingBuildNumbersSpy).toHaveBeenCalled();
});
it('should get the open PRs', () => {
expect(cleanerGetOpenPrNumbersSpy).toHaveBeenCalled();
});
it('should reject if \'getExistingBuildNumbers()\' rejects', done => {
promise.catch(err => {
expect(err).toBe('Test');
done();
});
existingBuildsDeferred.reject('Test');
});
it('should reject if \'getOpenPrNumbers()\' rejects', done => {
promise.catch(err => {
expect(err).toBe('Test');
done();
});
openPrsDeferred.reject('Test');
});
it('should reject if \'removeUnnecessaryBuilds()\' rejects', done => {
promise.catch(err => {
expect(err).toBe('Test');
done();
});
cleanerRemoveUnnecessaryBuildsSpy.and.returnValue(Promise.reject('Test'));
existingBuildsDeferred.resolve();
openPrsDeferred.resolve();
});
it('should pass existing builds and open PRs to \'removeUnnecessaryBuilds()\'', done => {
promise.then(() => {
expect(cleanerRemoveUnnecessaryBuildsSpy).toHaveBeenCalledWith('foo', 'bar');
done();
});
existingBuildsDeferred.resolve('foo');
openPrsDeferred.resolve('bar');
});
it('should resolve with the value returned by \'removeUnnecessaryBuilds()\'', done => {
promise.then(result => {
expect(result).toBe('Test');
done();
});
cleanerRemoveUnnecessaryBuildsSpy.and.returnValue(Promise.resolve('Test'));
existingBuildsDeferred.resolve();
openPrsDeferred.resolve();
});
});
// Protected methods
describe('getExistingBuildNumbers()', () => {
let fsReaddirSpy: jasmine.Spy;
let readdirCb: (err: any, files?: string[]) => void;
let promise: Promise<number[]>;
beforeEach(() => {
fsReaddirSpy = spyOn(fs, 'readdir').and.callFake((_: string, cb: typeof readdirCb) => readdirCb = cb);
promise = (cleaner as any).getExistingBuildNumbers();
});
it('should return a promise', () => {
expect(promise).toEqual(jasmine.any(Promise));
});
it('should get the contents of the builds directory', () => {
expect(fsReaddirSpy).toHaveBeenCalled();
expect(fsReaddirSpy.calls.argsFor(0)[0]).toBe('/foo/bar');
});
it('should reject if an error occurs while getting the files', done => {
promise.catch(err => {
expect(err).toBe('Test');
done();
});
readdirCb('Test');
});
it('should resolve with the returned files (as numbers)', done => {
promise.then(result => {
expect(result).toEqual([12, 34, 56]);
done();
});
readdirCb(null, ['12', '34', '56']);
});
it('should ignore files with non-numeric (or zero) names', done => {
promise.then(result => {
expect(result).toEqual([12, 34, 56]);
done();
});
readdirCb(null, ['12', 'foo', '34', 'bar', '56', '000']);
});
});
describe('getOpenPrNumbers()', () => {
let prDeferred: {resolve: Function, reject: Function};
let promise: Promise<number[]>;
beforeEach(() => {
spyOn(GithubPullRequests.prototype, 'fetchAll').and.callFake(() => {
return new Promise((resolve, reject) => prDeferred = {resolve, reject});
});
promise = (cleaner as any).getOpenPrNumbers();
});
it('should return a promise', () => {
expect(promise).toEqual(jasmine.any(Promise));
});
it('should fetch open PRs via \'GithubPullRequests\'', () => {
expect(GithubPullRequests.prototype.fetchAll).toHaveBeenCalledWith('open');
});
it('should reject if an error occurs while fetching PRs', done => {
promise.catch(err => {
expect(err).toBe('Test');
done();
});
prDeferred.reject('Test');
});
it('should resolve with the numbers of the fetched PRs', done => {
promise.then(prNumbers => {
expect(prNumbers).toEqual([1, 2, 3]);
done();
});
prDeferred.resolve([{id: 0, number: 1}, {id: 1, number: 2}, {id: 2, number: 3}]);
});
});
describe('removeDir()', () => {
let shellChmodSpy: jasmine.Spy;
let shellRmSpy: jasmine.Spy;
beforeEach(() => {
shellChmodSpy = spyOn(shell, 'chmod');
shellRmSpy = spyOn(shell, 'rm');
});
it('should remove the specified directory and its content', () => {
(cleaner as any).removeDir('/foo/bar');
expect(shellRmSpy).toHaveBeenCalledWith('-rf', '/foo/bar');
});
it('should make the directory and its content writable before removing', () => {
shellRmSpy.and.callFake(() => expect(shellChmodSpy).toHaveBeenCalledWith('-R', 'a+w', '/foo/bar'));
(cleaner as any).removeDir('/foo/bar');
expect(shellRmSpy).toHaveBeenCalled();
});
it('should catch errors and log them', () => {
const consoleErrorSpy = spyOn(console, 'error');
shellRmSpy.and.callFake(() => { throw 'Test'; });
(cleaner as any).removeDir('/foo/bar');
expect(consoleErrorSpy).toHaveBeenCalled();
expect(consoleErrorSpy.calls.argsFor(0)[0]).toContain('Unable to remove \'/foo/bar\'');
expect(consoleErrorSpy.calls.argsFor(0)[1]).toBe('Test');
});
});
describe('removeUnnecessaryBuilds()', () => {
let consoleLogSpy: jasmine.Spy;
let cleanerRemoveDirSpy: jasmine.Spy;
beforeEach(() => {
consoleLogSpy = spyOn(console, 'log');
cleanerRemoveDirSpy = spyOn(cleaner as any, 'removeDir');
});
it('should log the number of existing builds, open PRs and builds to be removed', () => {
(cleaner as any).removeUnnecessaryBuilds([1, 2, 3], [3, 4, 5, 6]);
expect(console.log).toHaveBeenCalledWith('Existing builds: 3');
expect(console.log).toHaveBeenCalledWith('Open pull requests: 4');
expect(console.log).toHaveBeenCalledWith('Removing 2 build(s): 1, 2');
});
it('should construct full paths to directories (by prepending \'buildsDir\')', () => {
(cleaner as any).removeUnnecessaryBuilds([1, 2, 3], []);
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/1');
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/2');
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/3');
});
it('should remove the builds that do not correspond to open PRs', () => {
(cleaner as any).removeUnnecessaryBuilds([1, 2, 3, 4], [2, 4]);
expect(cleanerRemoveDirSpy).toHaveBeenCalledTimes(2);
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/1');
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/3');
cleanerRemoveDirSpy.calls.reset();
(cleaner as any).removeUnnecessaryBuilds([1, 2, 3, 4], [1, 2, 3, 4]);
expect(cleanerRemoveDirSpy).toHaveBeenCalledTimes(0);
cleanerRemoveDirSpy.calls.reset();
(cleaner as any).removeUnnecessaryBuilds([1, 2, 3, 4], []);
expect(cleanerRemoveDirSpy).toHaveBeenCalledTimes(4);
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/1');
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/2');
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/3');
expect(cleanerRemoveDirSpy).toHaveBeenCalledWith('/foo/bar/4');
cleanerRemoveDirSpy.calls.reset();
});
});
});
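
The expectations above pin down the observable behaviour of `removeUnnecessaryBuilds()`: the three log lines, the full paths handed to `removeDir()`, and which builds get removed. A sketch consistent with them, written as a free function so it is self-contained; the real method lives on `BuildCleaner` and takes only the two number arrays:

// Sketch: keep builds whose PR is still open, remove the rest (log wording copied from the specs).
import * as path from 'path';

export function removeUnnecessaryBuilds(buildsDir: string, existingBuilds: number[], openPrs: number[],
                                        removeDir: (dir: string) => void): void {
  const toRemove = existingBuilds.filter(build => openPrs.indexOf(build) === -1);
  console.log(`Existing builds: ${existingBuilds.length}`);
  console.log(`Open pull requests: ${openPrs.length}`);
  console.log(`Removing ${toRemove.length} build(s): ${toRemove.join(', ')}`);
  toRemove.forEach(build => removeDir(path.join(buildsDir, `${build}`)));
}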


@@ -1,410 +0,0 @@
// Imports
import {EventEmitter} from 'events';
import {ClientRequest, IncomingMessage} from 'http';
import * as https from 'https';
import {GithubApi} from '../../lib/common/github-api';
// Tests
describe('GithubApi', () => {
let api: GithubApi;
beforeEach(() => api = new GithubApi('12345'));
describe('constructor()', () => {
it('should throw if \'githubToken\' is missing or empty', () => {
expect(() => new GithubApi('')).toThrowError('Missing or empty required parameter \'githubToken\'!');
});
});
describe('get()', () => {
let apiBuildPathSpy: jasmine.Spy;
let apiRequestSpy: jasmine.Spy;
beforeEach(() => {
apiBuildPathSpy = spyOn(api as any, 'buildPath');
apiRequestSpy = spyOn(api as any, 'request');
});
it('should call \'buildPath()\' with the pathname and params', () => {
api.get('/foo', {bar: 'baz'});
expect(apiBuildPathSpy).toHaveBeenCalled();
expect(apiBuildPathSpy.calls.argsFor(0)).toEqual(['/foo', {bar: 'baz'}]);
});
it('should call \'request()\' with the correct method', () => {
api.get('/foo');
expect(apiRequestSpy).toHaveBeenCalled();
expect(apiRequestSpy.calls.argsFor(0)[0]).toBe('get');
});
it('should call \'request()\' with the correct path', () => {
apiBuildPathSpy.and.returnValue('/foo/bar');
api.get('foo');
expect(apiRequestSpy).toHaveBeenCalled();
expect(apiRequestSpy.calls.argsFor(0)[1]).toBe('/foo/bar');
});
it('should not pass data to \'request()\'', () => {
(api.get as Function)('foo', {}, {});
expect(apiRequestSpy).toHaveBeenCalled();
expect(apiRequestSpy.calls.argsFor(0)[2]).toBeUndefined();
});
});
describe('post()', () => {
let apiBuildPathSpy: jasmine.Spy;
let apiRequestSpy: jasmine.Spy;
beforeEach(() => {
apiBuildPathSpy = spyOn(api as any, 'buildPath');
apiRequestSpy = spyOn(api as any, 'request');
});
it('should call \'buildPath()\' with the pathname and params', () => {
api.post('/foo', {bar: 'baz'});
expect(apiBuildPathSpy).toHaveBeenCalled();
expect(apiBuildPathSpy.calls.argsFor(0)).toEqual(['/foo', {bar: 'baz'}]);
});
it('should call \'request()\' with the correct method', () => {
api.post('/foo');
expect(apiRequestSpy).toHaveBeenCalled();
expect(apiRequestSpy.calls.argsFor(0)[0]).toBe('post');
});
it('should call \'request()\' with the correct path', () => {
apiBuildPathSpy.and.returnValue('/foo/bar');
api.post('/foo');
expect(apiRequestSpy).toHaveBeenCalled();
expect(apiRequestSpy.calls.argsFor(0)[1]).toBe('/foo/bar');
});
it('should pass the data to \'request()\'', () => {
api.post('/foo', {}, {bar: 'baz'});
expect(apiRequestSpy).toHaveBeenCalled();
expect(apiRequestSpy.calls.argsFor(0)[2]).toEqual({bar: 'baz'});
});
});
// Protected methods
describe('buildPath()', () => {
it('should return the pathname if no params', () => {
expect((api as any).buildPath('/foo')).toBe('/foo');
expect((api as any).buildPath('/foo', undefined)).toBe('/foo');
expect((api as any).buildPath('/foo', null)).toBe('/foo');
});
it('should append the params to the pathname', () => {
expect((api as any).buildPath('/foo', {bar: 'baz'})).toBe('/foo?bar=baz');
});
it('should join the params with \'&\'', () => {
expect((api as any).buildPath('/foo', {bar: 1, baz: 2})).toBe('/foo?bar=1&baz=2');
});
it('should ignore undefined/null params', () => {
expect((api as any).buildPath('/foo', {bar: undefined, baz: null})).toBe('/foo');
});
it('should encode param values as URI components', () => {
expect((api as any).buildPath('/foo', {bar: 'b a&z'})).toBe('/foo?bar=b%20a%26z');
});
});
describe('getPaginated()', () => {
let deferreds: {resolve: Function, reject: Function}[];
beforeEach(() => {
deferreds = [];
spyOn(api, 'get').and.callFake(() => new Promise((resolve, reject) => deferreds.push({resolve, reject})));
});
it('should return a promise', () => {
expect((api as any).getPaginated()).toEqual(jasmine.any(Promise));
});
it('should call \'get()\' with the correct pathname and params', () => {
(api as any).getPaginated('/foo/bar');
(api as any).getPaginated('/foo/bar', {baz: 'qux'});
expect(api.get).toHaveBeenCalledWith('/foo/bar', {page: 0, per_page: 100});
expect(api.get).toHaveBeenCalledWith('/foo/bar', {baz: 'qux', page: 0, per_page: 100});
});
it('should reject if the request fails', done => {
(api as any).getPaginated('/foo/bar').catch((err: any) => {
expect(err).toBe('Test');
done();
});
deferreds[0].reject('Test');
});
it('should resolve with the returned items', done => {
const items = [{id: 1}, {id: 2}];
(api as any).getPaginated('/foo/bar').then((data: any) => {
expect(data).toEqual(items);
done();
});
deferreds[0].resolve(items);
});
it('should iteratively call \'get()\' to fetch all items', done => {
// Create an array of 250 objects.
const allItems = '.'.repeat(250).split('').map((_, i) => ({id: i}));
const apiGetSpy = api.get as jasmine.Spy;
(api as any).getPaginated('/foo/bar', {baz: 'qux'}).then((data: any) => {
const paramsForPage = (page: number) => ({baz: 'qux', page, per_page: 100});
expect(apiGetSpy).toHaveBeenCalledTimes(3);
expect(apiGetSpy.calls.argsFor(0)).toEqual(['/foo/bar', paramsForPage(0)]);
expect(apiGetSpy.calls.argsFor(1)).toEqual(['/foo/bar', paramsForPage(1)]);
expect(apiGetSpy.calls.argsFor(2)).toEqual(['/foo/bar', paramsForPage(2)]);
expect(data).toEqual(allItems);
done();
});
deferreds[0].resolve(allItems.slice(0, 100));
setTimeout(() => {
deferreds[1].resolve(allItems.slice(100, 200));
setTimeout(() => {
deferreds[2].resolve(allItems.slice(200));
}, 0);
}, 0);
});
});
describe('request()', () => {
let httpsRequestSpy: jasmine.Spy;
let latestRequest: ClientRequest;
beforeEach(() => {
const originalRequest = https.request;
httpsRequestSpy = spyOn(https, 'request').and.callFake((...args: any[]) => {
latestRequest = originalRequest.apply(https, args);
spyOn(latestRequest, 'on').and.callThrough();
spyOn(latestRequest, 'end');
return latestRequest;
});
});
it('should return a promise', () => {
expect((api as any).request()).toEqual(jasmine.any(Promise));
});
it('should call \'https.request()\' with the correct options', () => {
(api as any).request('method', 'path');
expect(httpsRequestSpy).toHaveBeenCalled();
expect(httpsRequestSpy.calls.argsFor(0)[0]).toEqual(jasmine.objectContaining({
headers: jasmine.objectContaining({
'User-Agent': `Node/${process.versions.node}`,
}),
host: 'api.github.com',
method: 'method',
path: 'path',
}));
});
it('should specify an \'Authorization\' header if \'githubToken\' is present', () => {
(api as any).request('method', 'path');
expect(httpsRequestSpy).toHaveBeenCalled();
expect(httpsRequestSpy.calls.argsFor(0)[0].headers).toEqual(jasmine.objectContaining({
Authorization: 'token 12345',
}));
});
it('should reject on request error', done => {
(api as any).request('method', 'path').catch((err: any) => {
expect(err).toBe('Test');
done();
});
latestRequest.emit('error', 'Test');
});
it('should send the request (i.e. call \'end()\')', () => {
(api as any).request('method', 'path');
expect(latestRequest.end).toHaveBeenCalled();
});
it('should \'JSON.stringify\' and send the data along with the request', () => {
(api as any).request('method', 'path');
expect(latestRequest.end).toHaveBeenCalledWith(null);
(api as any).request('method', 'path', {key: 'value'});
expect(latestRequest.end).toHaveBeenCalledWith('{"key":"value"}');
});
describe('onResponse', () => {
let promise: Promise<void>;
let respond: (statusCode: number) => IncomingMessage;
beforeEach(() => {
promise = (api as any).request('method', 'path');
respond = (statusCode: number) => {
const mockResponse = new EventEmitter() as IncomingMessage;
mockResponse.statusCode = statusCode;
const onResponse = httpsRequestSpy.calls.argsFor(0)[1];
onResponse(mockResponse);
return mockResponse;
};
});
it('should reject on response error', done => {
promise.catch(err => {
expect(err).toBe('Test');
done();
});
const res = respond(200);
res.emit('error', 'Test');
});
it('should reject if returned statusCode is <200', done => {
promise.catch(err => {
expect(err).toContain('failed');
expect(err).toContain('status: 199');
done();
});
const res = respond(199);
res.emit('end');
});
it('should reject if returned statusCode is >=400', done => {
promise.catch(err => {
expect(err).toContain('failed');
expect(err).toContain('status: 400');
done();
});
const res = respond(400);
res.emit('end');
});
it('should include the response text in the rejection message', done => {
promise.catch(err => {
expect(err).toContain('Test');
done();
});
const res = respond(500);
res.emit('data', 'Test');
res.emit('end');
});
it('should resolve if returned statusCode is >=200 and <400', done => {
promise.then(done);
const res = respond(200);
res.emit('data', '{}');
res.emit('end');
});
it('should resolve with the response text \'JSON.parsed\'', done => {
promise.then(data => {
expect(data).toEqual({foo: 'bar'});
done();
});
const res = respond(300);
res.emit('data', '{"foo":"bar"}');
res.emit('end');
});
it('should collect and concatenate the whole response text', done => {
promise.then(data => {
expect(data).toEqual({foo: 'bar', baz: 'qux'});
done();
});
const res = respond(300);
res.emit('data', '{"foo":');
res.emit('data', '"bar","baz"');
res.emit('data', ':"qux"}');
res.emit('end');
});
it('should reject if the response text is malformed JSON', done => {
promise.catch(err => {
expect(err).toEqual(jasmine.any(SyntaxError));
done();
});
const res = respond(300);
res.emit('data', '}');
res.emit('end');
});
});
});
});
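
The `getPaginated()` specs describe a straightforward accumulation loop: request pages of 100 items via `get()` until a page comes back short. A compact sketch of that loop; the real method belongs to `GithubApi`, so `get` is passed in here only to keep the snippet self-contained:

// Sketch: fetch pages of 100 until a short page arrives, concatenating the results.
type Params = {[key: string]: string | number};

async function getPaginated<T>(get: (pathname: string, params: Params) => Promise<T[]>,
                               pathname: string, baseParams: Params = {}): Promise<T[]> {
  const perPage = 100;
  const items: T[] = [];
  for (let page = 0; ; page++) {
    const pageItems = await get(pathname, {...baseParams, page, per_page: perPage});
    items.push(...pageItems);
    if (pageItems.length < perPage) {
      return items;
    }
  }
}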


@@ -1,117 +0,0 @@
// Imports
import {GithubPullRequests} from '../../lib/common/github-pull-requests';
// Tests
describe('GithubPullRequests', () => {
describe('constructor()', () => {
it('should throw if \'githubToken\' is missing or empty', () => {
expect(() => new GithubPullRequests('', 'foo/bar')).
toThrowError('Missing or empty required parameter \'githubToken\'!');
});
it('should throw if \'repoSlug\' is missing or empty', () => {
expect(() => new GithubPullRequests('12345', '')).
toThrowError('Missing or empty required parameter \'repoSlug\'!');
});
});
describe('addComment()', () => {
let prs: GithubPullRequests;
let deferred: {resolve: Function, reject: Function};
beforeEach(() => {
prs = new GithubPullRequests('12345', 'foo/bar');
spyOn(prs, 'post').and.callFake(() => new Promise((resolve, reject) => deferred = {resolve, reject}));
});
it('should return a promise', () => {
expect(prs.addComment(42, 'body')).toEqual(jasmine.any(Promise));
});
it('should throw if the PR number is invalid', () => {
expect(() => prs.addComment(-1337, 'body')).toThrowError(`Invalid PR number: -1337`);
expect(() => prs.addComment(NaN, 'body')).toThrowError(`Invalid PR number: NaN`);
});
it('should throw if the comment body is invalid or empty', () => {
expect(() => prs.addComment(42, '')).toThrowError(`Invalid or empty comment body: `);
});
it('should call \'post()\' with the correct pathname, params and data', () => {
prs.addComment(42, 'body');
expect(prs.post).toHaveBeenCalledWith('/repos/foo/bar/issues/42/comments', null, {body: 'body'});
});
it('should reject if the request fails', done => {
prs.addComment(42, 'body').catch(err => {
expect(err).toBe('Test');
done();
});
deferred.reject('Test');
});
it('should resolve with the returned response', done => {
prs.addComment(42, 'body').then(data => {
expect(data).toEqual('Test');
done();
});
deferred.resolve('Test');
});
});
describe('fetchAll()', () => {
let prs: GithubPullRequests;
let prsGetPaginatedSpy: jasmine.Spy;
beforeEach(() => {
prs = new GithubPullRequests('12345', 'foo/bar');
prsGetPaginatedSpy = spyOn(prs as any, 'getPaginated');
spyOn(console, 'log');
});
it('should call \'getPaginated()\' with the correct pathname and params', () => {
const expectedPathname = '/repos/foo/bar/pulls';
prs.fetchAll('all');
prs.fetchAll('closed');
prs.fetchAll('open');
expect(prsGetPaginatedSpy).toHaveBeenCalledTimes(3);
expect(prsGetPaginatedSpy.calls.argsFor(0)).toEqual([expectedPathname, {state: 'all'}]);
expect(prsGetPaginatedSpy.calls.argsFor(1)).toEqual([expectedPathname, {state: 'closed'}]);
expect(prsGetPaginatedSpy.calls.argsFor(2)).toEqual([expectedPathname, {state: 'open'}]);
});
it('should default to \'all\' if no state is specified', () => {
prs.fetchAll();
expect(prsGetPaginatedSpy).toHaveBeenCalledWith('/repos/foo/bar/pulls', {state: 'all'});
});
it('should forward the value returned by \'getPaginated()\'', () => {
prsGetPaginatedSpy.and.returnValue('Test');
expect(prs.fetchAll()).toBe('Test');
});
});
});
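
Taken together, the assertions above (pathnames, validation messages, the `'all'` default) describe `GithubPullRequests` almost completely. A minimal sketch consistent with them; `post()` and `getPaginated()` stand in for the inherited `GithubApi` helpers, and their loose signatures here are assumptions:

// Sketch consistent with the specs above; the validation messages are copied from the expectations.
class GithubPullRequestsSketch {
  constructor(private repoSlug: string,
              private post: (pathname: string, params: any, data: any) => Promise<any>,
              private getPaginated: (pathname: string, params: any) => Promise<any[]>) {}

  addComment(pr: number, body: string): Promise<any> {
    if (!(pr > 0)) {
      throw new Error(`Invalid PR number: ${pr}`);
    }
    if (!body) {
      throw new Error(`Invalid or empty comment body: ${body}`);
    }
    return this.post(`/repos/${this.repoSlug}/issues/${pr}/comments`, null, {body});
  }

  fetchAll(state: 'all' | 'closed' | 'open' = 'all'): Promise<any[]> {
    return this.getPaginated(`/repos/${this.repoSlug}/pulls`, {state});
  }
}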


@@ -1,232 +0,0 @@
// Imports
import {GithubTeams} from '../../lib/common/github-teams';
// Tests
describe('GithubTeams', () => {
describe('constructor()', () => {
it('should throw if \'githubToken\' is missing or empty', () => {
expect(() => new GithubTeams('', 'org')).
toThrowError('Missing or empty required parameter \'githubToken\'!');
});
it('should throw if \'organization\' is missing or empty', () => {
expect(() => new GithubTeams('12345', '')).
toThrowError('Missing or empty required parameter \'organization\'!');
});
});
describe('fetchAll()', () => {
let teams: GithubTeams;
let teamsGetPaginatedSpy: jasmine.Spy;
beforeEach(() => {
teams = new GithubTeams('12345', 'foo');
teamsGetPaginatedSpy = spyOn(teams as any, 'getPaginated');
});
it('should call \'getPaginated()\' with the correct pathname and params', () => {
teams.fetchAll();
expect(teamsGetPaginatedSpy).toHaveBeenCalledWith('/orgs/foo/teams');
});
it('should forward the value returned by \'getPaginated()\'', () => {
teamsGetPaginatedSpy.and.returnValue('Test');
expect(teams.fetchAll()).toBe('Test');
});
});
describe('isMemberById()', () => {
let teams: GithubTeams;
let teamsGetSpy: jasmine.Spy;
beforeEach(() => {
teams = new GithubTeams('12345', 'foo');
teamsGetSpy = spyOn(teams, 'get');
});
it('should return a promise', () => {
expect(teams.isMemberById('user', [1])).toEqual(jasmine.any(Promise));
});
it('should resolve with false if called with an empty array', done => {
teams.isMemberById('user', []).then(isMember => {
expect(isMember).toBe(false);
expect(teamsGetSpy).not.toHaveBeenCalled();
done();
});
});
it('should call \'get()\' with the correct pathname', done => {
teamsGetSpy.and.returnValue(Promise.resolve(null));
teams.isMemberById('user', [1]).then(() => {
expect(teamsGetSpy).toHaveBeenCalledWith('/teams/1/memberships/user');
done();
});
});
it('should resolve with false if \'get()\' rejects', done => {
teamsGetSpy.and.returnValue(Promise.reject(null));
teams.isMemberById('user', [1]).then(isMember => {
expect(isMember).toBe(false);
expect(teamsGetSpy).toHaveBeenCalled();
done();
});
});
it('should resolve with false if the membership is not active', done => {
teamsGetSpy.and.returnValue(Promise.resolve({state: 'pending'}));
teams.isMemberById('user', [1]).then(isMember => {
expect(isMember).toBe(false);
expect(teamsGetSpy).toHaveBeenCalled();
done();
});
});
it('should resolve with true if the membership is active', done => {
teamsGetSpy.and.returnValue(Promise.resolve({state: 'active'}));
teams.isMemberById('user', [1]).then(isMember => {
expect(isMember).toBe(true);
done();
});
});
it('should sequentially call \'get()\' until an active membership is found', done => {
const trainedResponses: {[pathname: string]: Promise<{state: string}>} = {
'/teams/1/memberships/user': Promise.resolve({state: 'pending'}),
'/teams/2/memberships/user': Promise.reject(null),
'/teams/3/memberships/user': Promise.resolve({state: 'active'}),
};
teamsGetSpy.and.callFake((pathname: string) => trainedResponses[pathname]);
teams.isMemberById('user', [1, 2, 3, 4]).then(isMember => {
expect(isMember).toBe(true);
expect(teamsGetSpy).toHaveBeenCalledTimes(3);
expect(teamsGetSpy.calls.argsFor(0)[0]).toBe('/teams/1/memberships/user');
expect(teamsGetSpy.calls.argsFor(1)[0]).toBe('/teams/2/memberships/user');
expect(teamsGetSpy.calls.argsFor(2)[0]).toBe('/teams/3/memberships/user');
done();
});
});
it('should resolve with false if no active membership is found', done => {
const trainedResponses: {[pathname: string]: Promise<{state: string}>} = {
'/teams/1/memberships/user': Promise.resolve({state: 'pending'}),
'/teams/2/memberships/user': Promise.reject(null),
'/teams/3/memberships/user': Promise.resolve({state: 'not active'}),
'/teams/4/memberships/user': Promise.reject(null),
};
teamsGetSpy.and.callFake((pathname: string) => trainedResponses[pathname]);
teams.isMemberById('user', [1, 2, 3, 4]).then(isMember => {
expect(isMember).toBe(false);
expect(teamsGetSpy).toHaveBeenCalledTimes(4);
expect(teamsGetSpy.calls.argsFor(0)[0]).toBe('/teams/1/memberships/user');
expect(teamsGetSpy.calls.argsFor(1)[0]).toBe('/teams/2/memberships/user');
expect(teamsGetSpy.calls.argsFor(2)[0]).toBe('/teams/3/memberships/user');
expect(teamsGetSpy.calls.argsFor(3)[0]).toBe('/teams/4/memberships/user');
done();
});
});
});
describe('isMemberBySlug()', () => {
let teams: GithubTeams;
let teamsFetchAllSpy: jasmine.Spy;
let teamsIsMemberByIdSpy: jasmine.Spy;
beforeEach(() => {
teams = new GithubTeams('12345', 'foo');
const mockResponse = Promise.resolve([{id: 1, slug: 'team1'}, {id: 2, slug: 'team2'}]);
teamsFetchAllSpy = spyOn(teams, 'fetchAll').and.returnValue(mockResponse);
teamsIsMemberByIdSpy = spyOn(teams, 'isMemberById');
});
it('should return a promise', () => {
expect(teams.isMemberBySlug('user', ['team-slug'])).toEqual(jasmine.any(Promise));
});
it('should call \'fetchAll()\'', () => {
teams.isMemberBySlug('user', ['team-slug']);
expect(teamsFetchAllSpy).toHaveBeenCalled();
});
it('should resolve with false if \'fetchAll()\' rejects', done => {
teamsFetchAllSpy.and.returnValue(Promise.reject(null));
teams.isMemberBySlug('user', ['team-slug']).then(isMember => {
expect(isMember).toBe(false);
done();
});
});
it('should call \'isMemberById()\' with the correct params if no team is found', done => {
teams.isMemberBySlug('user', ['no-match']).then(() => {
expect(teamsIsMemberByIdSpy).toHaveBeenCalledWith('user', []);
done();
});
});
it('should call \'isMemberById()\' with the correct params if teams are found', done => {
const spy = teamsIsMemberByIdSpy;
Promise.all([
teams.isMemberBySlug('user', ['team1']).then(() => expect(spy).toHaveBeenCalledWith('user', [1])),
teams.isMemberBySlug('user', ['team2']).then(() => expect(spy).toHaveBeenCalledWith('user', [2])),
teams.isMemberBySlug('user', ['team1', 'team2']).then(() => expect(spy).toHaveBeenCalledWith('user', [1, 2])),
]).then(done);
});
it('should resolve with false if \'isMemberById()\' rejects', done => {
teamsIsMemberByIdSpy.and.returnValue(Promise.reject(null));
teams.isMemberBySlug('user', ['team1']).then(isMember => {
expect(isMember).toBe(false);
expect(teamsIsMemberByIdSpy).toHaveBeenCalled();
done();
});
});
it('should resolve with the value \'isMemberById()\' resolves with', done => {
teamsIsMemberByIdSpy.and.returnValues(Promise.resolve(false), Promise.resolve(true));
Promise.all([
teams.isMemberBySlug('user', ['team1']).then(isMember => expect(isMember).toBe(false)),
teams.isMemberBySlug('user', ['team1']).then(isMember => expect(isMember).toBe(true)),
]).then(() => {
expect(teamsIsMemberByIdSpy).toHaveBeenCalledTimes(2);
done();
});
});
});
});
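
The core of `GithubTeams` is the sequential membership probe exercised above: check each team id in turn and stop at the first active membership. A sketch of that loop; `get` stands in for the inherited `GithubApi#get()`, and the endpoint path is taken from the expected pathnames in the specs:

// Sketch: try each team in order; an 'active' membership wins, anything else moves on.
async function isMemberById(get: (pathname: string) => Promise<{state: string}>,
                            username: string, teamIds: number[]): Promise<boolean> {
  for (const id of teamIds) {
    try {
      const membership = await get(`/teams/${id}/memberships/${username}`);
      if (membership.state === 'active') {
        return true;
      }
    } catch (err) {
      // A failed lookup just means "not a member of this team"; keep checking the rest.
    }
  }
  return false;
}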


@@ -1,81 +0,0 @@
// Imports
import {assertNotMissingOrEmpty, getEnvVar} from '../../lib/common/utils';
// Tests
describe('utils', () => {
describe('assertNotMissingOrEmpty()', () => {
it('should throw if passed an empty value', () => {
expect(() => assertNotMissingOrEmpty('foo', undefined)).
toThrowError('Missing or empty required parameter \'foo\'!');
expect(() => assertNotMissingOrEmpty('bar', null)).toThrowError('Missing or empty required parameter \'bar\'!');
expect(() => assertNotMissingOrEmpty('baz', '')).toThrowError('Missing or empty required parameter \'baz\'!');
});
it('should not throw if passed a non-empty value', () => {
expect(() => assertNotMissingOrEmpty('foo', ' ')).not.toThrow();
expect(() => assertNotMissingOrEmpty('bar', 'bar')).not.toThrow();
expect(() => assertNotMissingOrEmpty('baz', 'b a z')).not.toThrow();
});
});
describe('getEnvVar()', () => {
const emptyVar = '$$test_utils_getEnvVar_empty$$';
const nonEmptyVar = '$$test_utils_getEnvVar_nonEmpty$$';
const undefinedVar = '$$test_utils_getEnvVar_undefined$$';
beforeEach(() => {
process.env[emptyVar] = '';
process.env[nonEmptyVar] = 'foo';
});
afterEach(() => {
delete process.env[emptyVar];
delete process.env[nonEmptyVar];
});
it('should return an environment variable', () => {
expect(getEnvVar(nonEmptyVar)).toBe('foo');
});
it('should exit with an error if the environment variable is not defined', () => {
const consoleErrorSpy = spyOn(console, 'error');
const processExitSpy = spyOn(process, 'exit');
getEnvVar(undefinedVar);
expect(consoleErrorSpy).toHaveBeenCalled();
expect(consoleErrorSpy.calls.argsFor(0)[0]).toContain(undefinedVar);
expect(processExitSpy).toHaveBeenCalledWith(1);
});
it('should exit with an error if the environment variable is empty', () => {
const consoleErrorSpy = spyOn(console, 'error');
const processExitSpy = spyOn(process, 'exit');
getEnvVar(emptyVar);
expect(consoleErrorSpy).toHaveBeenCalled();
expect(consoleErrorSpy.calls.argsFor(0)[0]).toContain(emptyVar);
expect(processExitSpy).toHaveBeenCalledWith(1);
});
it('should return an empty string if an undefined variable is optional', () => {
expect(getEnvVar(undefinedVar, true)).toBe('');
});
it('should return an empty string if an empty variable is optional', () => {
expect(getEnvVar(emptyVar, true)).toBe('');
});
});
});
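
Both utilities are small enough that the specs spell out their behaviour almost entirely. A sketch consistent with them; the exact `console.error` wording in `getEnvVar()` is an assumption beyond "mentions the variable name":

// Sketch matching the assertions above; the thrown error wording is copied from the specs.
export const assertNotMissingOrEmpty = (name: string, value: string | null | undefined): void => {
  if (!value) {
    throw new Error(`Missing or empty required parameter '${name}'!`);
  }
};

export const getEnvVar = (name: string, isOptional = false): string => {
  const value = process.env[name];
  if (!isOptional && !value) {
    console.error(`ERROR: Missing or empty required environment variable '${name}'!`);
    process.exit(1);
  }
  return value || '';
};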


@@ -1,6 +0,0 @@
declare namespace jasmine {
export interface DoneFn extends Function {
(): void;
fail: (message: Error | string) => void;
}
}


@@ -1,7 +0,0 @@
// Imports
import {runTests} from '../lib/common/run-tests';
// Run
const specFiles = [`${__dirname}/**/*.spec.js`];
const helpers = [`${__dirname}/helpers.js`];
runTests(specFiles, helpers);


@@ -1,320 +0,0 @@
// Imports
import * as cp from 'child_process';
import {EventEmitter} from 'events';
import * as fs from 'fs';
import * as shell from 'shelljs';
import {BuildCreator} from '../../lib/upload-server/build-creator';
import {CreatedBuildEvent} from '../../lib/upload-server/build-events';
import {UploadError} from '../../lib/upload-server/upload-error';
import {expectToBeUploadError} from './helpers';
// Tests
describe('BuildCreator', () => {
const pr = '9';
const sha = '9'.repeat(40);
const archive = 'snapshot.tar.gz';
const buildsDir = 'builds/dir';
const prDir = `${buildsDir}/${pr}`;
const shaDir = `${prDir}/${sha}`;
let bc: BuildCreator;
beforeEach(() => bc = new BuildCreator(buildsDir));
describe('constructor()', () => {
it('should throw if \'buildsDir\' is missing or empty', () => {
expect(() => new BuildCreator('')).toThrowError('Missing or empty required parameter \'buildsDir\'!');
});
it('should extend EventEmitter', () => {
expect(bc).toEqual(jasmine.any(BuildCreator));
expect(bc).toEqual(jasmine.any(EventEmitter));
expect(Object.getPrototypeOf(bc)).toBe(BuildCreator.prototype);
});
});
describe('create()', () => {
let bcEmitSpy: jasmine.Spy;
let bcExistsSpy: jasmine.Spy;
let bcExtractArchiveSpy: jasmine.Spy;
let shellMkdirSpy: jasmine.Spy;
let shellRmSpy: jasmine.Spy;
beforeEach(() => {
bcEmitSpy = spyOn(bc, 'emit');
bcExistsSpy = spyOn(bc as any, 'exists');
bcExtractArchiveSpy = spyOn(bc as any, 'extractArchive');
shellMkdirSpy = spyOn(shell, 'mkdir');
shellRmSpy = spyOn(shell, 'rm');
});
it('should return a promise', done => {
const promise = bc.create(pr, sha, archive);
promise.then(done); // Do not complete the test (and release the spies) synchronously
// to avoid running the actual `extractArchive()`.
expect(promise).toEqual(jasmine.any(Promise));
});
it('should throw if the build already exists', done => {
bcExistsSpy.and.returnValue(true);
bc.create(pr, sha, archive).catch(err => {
expectToBeUploadError(err, 409, `Request to overwrite existing directory: ${shaDir}`);
done();
});
});
it('should create the build directory (and any parent directories)', done => {
bc.create(pr, sha, archive).
then(() => expect(shellMkdirSpy).toHaveBeenCalledWith('-p', shaDir)).
then(done);
});
it('should extract the archive contents into the build directory', done => {
bc.create(pr, sha, archive).
then(() => expect(bcExtractArchiveSpy).toHaveBeenCalledWith(archive, shaDir)).
then(done);
});
it('should emit a CreatedBuildEvent on success', done => {
let emitted = false;
bcEmitSpy.and.callFake((type: string, evt: CreatedBuildEvent) => {
expect(type).toBe(CreatedBuildEvent.type);
expect(evt).toEqual(jasmine.any(CreatedBuildEvent));
expect(evt.pr).toBe(+pr);
expect(evt.sha).toBe(sha);
emitted = true;
});
bc.create(pr, sha, archive).
then(() => expect(emitted).toBe(true)).
then(done);
});
describe('on error', () => {
it('should abort and skip further operations if it fails to create the directories', done => {
shellMkdirSpy.and.throwError('');
bc.create(pr, sha, archive).catch(() => {
expect(shellMkdirSpy).toHaveBeenCalled();
expect(bcExtractArchiveSpy).not.toHaveBeenCalled();
expect(bcEmitSpy).not.toHaveBeenCalled();
done();
});
});
it('should abort and skip further operations if it fails to extract the archive', done => {
bcExtractArchiveSpy.and.throwError('');
bc.create(pr, sha, archive).catch(() => {
expect(shellMkdirSpy).toHaveBeenCalled();
expect(bcExtractArchiveSpy).toHaveBeenCalled();
expect(bcEmitSpy).not.toHaveBeenCalled();
done();
});
});
it('should delete the PR directory (for new PR)', done => {
bcExtractArchiveSpy.and.throwError('');
bc.create(pr, sha, archive).catch(() => {
expect(shellRmSpy).toHaveBeenCalledWith('-rf', prDir);
done();
});
});
it('should delete the SHA directory (for existing PR)', done => {
bcExistsSpy.and.callFake((path: string) => path !== shaDir);
bcExtractArchiveSpy.and.throwError('');
bc.create(pr, sha, archive).catch(() => {
expect(shellRmSpy).toHaveBeenCalledWith('-rf', shaDir);
done();
});
});
it('should reject with an UploadError', done => {
shellMkdirSpy.and.callFake(() => {throw 'Test'; });
bc.create(pr, sha, archive).catch(err => {
expectToBeUploadError(err, 500, `Error while uploading to directory: ${shaDir}\nTest`);
done();
});
});
it('should pass UploadError instances unmodified', done => {
shellMkdirSpy.and.callFake(() => { throw new UploadError(543, 'Test'); });
bc.create(pr, sha, archive).catch(err => {
expectToBeUploadError(err, 543, 'Test');
done();
});
});
});
});
// Protected methods
describe('exists()', () => {
let fsAccessSpy: jasmine.Spy;
let fsAccessCbs: Function[];
beforeEach(() => {
fsAccessCbs = [];
fsAccessSpy = spyOn(fs, 'access').and.callFake((_: string, cb: Function) => fsAccessCbs.push(cb));
});
it('should return a promise', () => {
expect((bc as any).exists('foo')).toEqual(jasmine.any(Promise));
});
it('should call \'fs.access()\' with the specified argument', () => {
(bc as any).exists('foo');
expect(fs.access).toHaveBeenCalledWith('foo', jasmine.any(Function));
});
it('should resolve with \'true\' if \'fs.access()\' succeeds', done => {
Promise.
all([(bc as any).exists('foo'), (bc as any).exists('bar')]).
then(results => expect(results).toEqual([true, true])).
then(done);
fsAccessCbs[0]();
fsAccessCbs[1](null);
});
it('should resolve with \'false\' if \'fs.access()\' errors', done => {
Promise.
all([(bc as any).exists('foo'), (bc as any).exists('bar')]).
then(results => expect(results).toEqual([false, false])).
then(done);
fsAccessCbs[0]('Error');
fsAccessCbs[1](new Error());
});
});
describe('extractArchive()', () => {
let consoleWarnSpy: jasmine.Spy;
let shellChmodSpy: jasmine.Spy;
let shellRmSpy: jasmine.Spy;
let cpExecSpy: jasmine.Spy;
let cpExecCbs: Function[];
beforeEach(() => {
cpExecCbs = [];
consoleWarnSpy = spyOn(console, 'warn');
shellChmodSpy = spyOn(shell, 'chmod');
shellRmSpy = spyOn(shell, 'rm');
cpExecSpy = spyOn(cp, 'exec').and.callFake((_: string, cb: Function) => cpExecCbs.push(cb));
});
it('should return a promise', () => {
expect((bc as any).extractArchive('foo', 'bar')).toEqual(jasmine.any(Promise));
});
it('should "gunzip" and "untar" the input file into the output directory', () => {
const cmd = 'tar --extract --gzip --directory "output/dir" --file "input/file"';
(bc as any).extractArchive('input/file', 'output/dir');
expect(cpExecSpy).toHaveBeenCalledWith(cmd, jasmine.any(Function));
});
it('should log (as a warning) any stderr output if extracting succeeded', done => {
(bc as any).extractArchive('foo', 'bar').
then(() => expect(consoleWarnSpy).toHaveBeenCalledWith('This is stderr')).
then(done);
cpExecCbs[0](null, 'This is stdout', 'This is stderr');
});
it('should make the build directory non-writable', done => {
(bc as any).extractArchive('foo', 'bar').
then(() => expect(shellChmodSpy).toHaveBeenCalledWith('-R', 'a-w', 'bar')).
then(done);
cpExecCbs[0]();
});
it('should delete the uploaded file on success', done => {
(bc as any).extractArchive('input/file', 'output/dir').
then(() => expect(shellRmSpy).toHaveBeenCalledWith('-f', 'input/file')).
then(done);
cpExecCbs[0]();
});
describe('on error', () => {
it('should abort and skip further operations if it fails to extract the archive', done => {
(bc as any).extractArchive('foo', 'bar').catch((err: any) => {
expect(shellChmodSpy).not.toHaveBeenCalled();
expect(shellRmSpy).not.toHaveBeenCalled();
expect(err).toBe('Test');
done();
});
cpExecCbs[0]('Test');
});
it('should abort and skip further operations if it fails to make non-writable', done => {
(bc as any).extractArchive('foo', 'bar').catch((err: any) => {
expect(shellChmodSpy).toHaveBeenCalled();
expect(shellRmSpy).not.toHaveBeenCalled();
expect(err).toBe('Test');
done();
});
shellChmodSpy.and.callFake(() => { throw 'Test'; });
cpExecCbs[0]();
});
it('should abort and reject if it fails to remove the uploaded file', done => {
(bc as any).extractArchive('foo', 'bar').catch((err: any) => {
expect(shellChmodSpy).toHaveBeenCalled();
expect(shellRmSpy).toHaveBeenCalled();
expect(err).toBe('Test');
done();
});
shellRmSpy.and.callFake(() => { throw 'Test'; });
cpExecCbs[0]();
});
});
});
});
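
The extraction path is fully described by the expectations above: run `tar`, warn on any stderr, lock the output directory down, then delete the uploaded archive. A sketch of that sequence with simplified error handling; the real method is a protected member of `BuildCreator`:

// Sketch: the command string and follow-up shell calls are copied from the expectations above.
import * as cp from 'child_process';
import * as shell from 'shelljs';

function extractArchive(inputFile: string, outputDir: string): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    const cmd = `tar --extract --gzip --directory "${outputDir}" --file "${inputFile}"`;
    cp.exec(cmd, (err, _stdout, stderr) => {
      if (err) {
        return reject(err);
      }
      try {
        if (stderr) {
          console.warn(stderr);               // extraction succeeded; surface stderr as a warning
        }
        shell.chmod('-R', 'a-w', outputDir);  // make the extracted build non-writable
        shell.rm('-f', inputFile);            // the uploaded archive is no longer needed
        resolve();
      } catch (cleanupErr) {
        reject(cleanupErr);
      }
    });
  });
}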


@@ -1,61 +0,0 @@
// Imports
import {BuildEvent, CreatedBuildEvent} from '../../lib/upload-server/build-events';
// Tests
describe('BuildEvent', () => {
let evt: BuildEvent;
beforeEach(() => evt = new BuildEvent('foo', 42, 'bar'));
it('should have a \'type\' property', () => {
expect(evt.type).toBe('foo');
});
it('should have a \'pr\' property', () => {
expect(evt.pr).toBe(42);
});
it('should have a \'sha\' property', () => {
expect(evt.sha).toBe('bar');
});
});
describe('CreatedBuildEvent', () => {
let evt: CreatedBuildEvent;
beforeEach(() => evt = new CreatedBuildEvent(42, 'bar'));
it('should have a static \'type\' property', () => {
expect(CreatedBuildEvent.type).toBe('build.created');
});
it('should extend BuildEvent', () => {
expect(evt).toEqual(jasmine.any(CreatedBuildEvent));
expect(evt).toEqual(jasmine.any(BuildEvent));
expect(Object.getPrototypeOf(evt)).toBe(CreatedBuildEvent.prototype);
});
it('should automatically set the \'type\'', () => {
expect(evt.type).toBe(CreatedBuildEvent.type);
});
it('should have a \'pr\' property', () => {
expect(evt.pr).toBe(42);
});
it('should have a \'sha\' property', () => {
expect(evt.sha).toBe('bar');
});
});
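
For reference, a pair of classes consistent with every expectation in these specs (constructor parameter order follows the calls above):

// Sketch: a plain event base class plus a subclass that fixes the `type` to 'build.created'.
export class BuildEvent {
  constructor(public type: string, public pr: number, public sha: string) {}
}

export class CreatedBuildEvent extends BuildEvent {
  static type = 'build.created';
  constructor(pr: number, sha: string) {
    super(CreatedBuildEvent.type, pr, sha);
  }
}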


@@ -1,261 +0,0 @@
// Imports
import * as jwt from 'jsonwebtoken';
import {GithubPullRequests} from '../../lib/common/github-pull-requests';
import {GithubTeams} from '../../lib/common/github-teams';
import {BuildVerifier} from '../../lib/upload-server/build-verifier';
import {expectToBeUploadError} from './helpers';
// Tests
describe('BuildVerifier', () => {
const defaultConfig = {
allowedTeamSlugs: ['team1', 'team2'],
githubToken: 'githubToken',
organization: 'organization',
repoSlug: 'repo/slug',
secret: 'secret',
};
let bv: BuildVerifier;
// Helpers
const createBuildVerifier = (partialConfig: Partial<typeof defaultConfig> = {}) => {
const cfg = {...defaultConfig, ...partialConfig};
return new BuildVerifier(cfg.secret, cfg.githubToken, cfg.repoSlug, cfg.organization,
cfg.allowedTeamSlugs);
};
beforeEach(() => bv = createBuildVerifier());
describe('constructor()', () => {
['secret', 'githubToken', 'repoSlug', 'organization', 'allowedTeamSlugs'].forEach(param => {
it(`should throw if '${param}' is missing or empty`, () => {
expect(() => createBuildVerifier({[param]: ''})).
toThrowError(`Missing or empty required parameter '${param}'!`);
});
});
it('should throw if \'allowedTeamSlugs\' is an empty array', () => {
expect(() => createBuildVerifier({allowedTeamSlugs: []})).
toThrowError('Missing or empty required parameter \'allowedTeamSlugs\'!');
});
});
describe('verify()', () => {
const pr = 9;
const defaultJwt = {
'exp': Math.floor(Date.now() / 1000) + 30,
'iat': Math.floor(Date.now() / 1000) - 30,
'iss': 'Travis CI, GmbH',
'pull-request': pr,
'slug': defaultConfig.repoSlug,
};
let bvGetPrAuthorTeamMembership: jasmine.Spy;
// Helpers
const createAuthHeader = (partialJwt: Partial<typeof defaultJwt> = {}, secret: string = defaultConfig.secret) =>
`Token ${jwt.sign({...defaultJwt, ...partialJwt}, secret)}`;
beforeEach(() => {
bvGetPrAuthorTeamMembership = spyOn(bv, 'getPrAuthorTeamMembership').
and.returnValue(Promise.resolve({author: 'some-author', isMember: true}));
});
it('should return a promise', done => {
const promise = bv.verify(pr, createAuthHeader());
promise.then(done); // Do not complete the test (and release the spies) synchronously
// to avoid running the actual `bvGetPrAuthorTeamMembership()`.
expect(promise).toEqual(jasmine.any(Promise));
});
it('should fail if the authorization header is invalid', done => {
bv.verify(pr, 'foo').catch(err => {
const errorMessage = 'Error while verifying upload for PR 9: jwt malformed';
expectToBeUploadError(err, 403, errorMessage);
done();
});
});
it('should fail if the secret is invalid', done => {
bv.verify(pr, createAuthHeader({}, 'foo')).catch(err => {
const errorMessage = 'Error while verifying upload for PR 9: invalid signature';
expectToBeUploadError(err, 403, errorMessage);
done();
});
});
it('should fail if the issuer is invalid', done => {
bv.verify(pr, createAuthHeader({iss: 'not valid'})).catch(err => {
const errorMessage = 'Error while verifying upload for PR 9: ' +
`jwt issuer invalid. expected: ${defaultJwt.iss}`;
expectToBeUploadError(err, 403, errorMessage);
done();
});
});
it('should fail if the token has expired', done => {
bv.verify(pr, createAuthHeader({exp: 0})).catch(err => {
const errorMessage = 'Error while verifying upload for PR 9: jwt expired';
expectToBeUploadError(err, 403, errorMessage);
done();
});
});
it('should fail if the repo slug does not match', done => {
bv.verify(pr, createAuthHeader({slug: 'foo/bar'})).catch(err => {
const errorMessage = 'Error while verifying upload for PR 9: ' +
`jwt slug invalid. expected: ${defaultConfig.repoSlug}`;
expectToBeUploadError(err, 403, errorMessage);
done();
});
});
it('should fail if the PR does not match', done => {
bv.verify(pr, createAuthHeader({'pull-request': 1337})).catch(err => {
const errorMessage = 'Error while verifying upload for PR 9: ' +
`jwt pull-request invalid. expected: ${pr}`;
expectToBeUploadError(err, 403, errorMessage);
done();
});
});
it('should not fail if the token is valid', done => {
bv.verify(pr, createAuthHeader()).then(done);
});
it('should not fail even if the token has been issued in the future', done => {
const in30s = Math.floor(Date.now() / 1000) + 30;
bv.verify(pr, createAuthHeader({iat: in30s})).then(done);
});
it('should call \'getPrAuthorTeamMembership()\' if the token is valid', done => {
bv.verify(pr, createAuthHeader()).then(() => {
expect(bvGetPrAuthorTeamMembership).toHaveBeenCalledWith(pr);
done();
});
});
it('should fail if \'getPrAuthorTeamMembership()\' rejects', done => {
bvGetPrAuthorTeamMembership.and.callFake(() => Promise.reject('Test'));
bv.verify(pr, createAuthHeader()).catch(err => {
expectToBeUploadError(err, 403, `Error while verifying upload for PR ${pr}: Test`);
done();
});
});
it('should fail if \'getPrAuthorTeamMembership()\' reports no membership', done => {
const errorMessage = `Error while verifying upload for PR ${pr}: User 'test' is not an active member of any of ` +
'the following teams: team1, team2';
bvGetPrAuthorTeamMembership.and.returnValue(Promise.resolve({author: 'test', isMember: false}));
bv.verify(pr, createAuthHeader()).catch(err => {
expectToBeUploadError(err, 403, errorMessage);
done();
});
});
it('should succeed if everything checks out', done => {
bv.verify(pr, createAuthHeader()).then(done);
});
});
describe('getPrAuthorTeamMembership()', () => {
const pr = 9;
let prsFetchSpy: jasmine.Spy;
let teamsIsMemberBySlugSpy: jasmine.Spy;
beforeEach(() => {
prsFetchSpy = spyOn(GithubPullRequests.prototype, 'fetch').
and.returnValue(Promise.resolve({user: {login: 'username'}}));
teamsIsMemberBySlugSpy = spyOn(GithubTeams.prototype, 'isMemberBySlug').
and.returnValue(Promise.resolve(true));
});
it('should return a promise', done => {
const promise = bv.getPrAuthorTeamMembership(pr);
promise.then(done); // Do not complete the test (and release the spies) synchronously
// to avoid running the actual `GithubTeams#isMemberBySlug()`.
expect(promise).toEqual(jasmine.any(Promise));
});
it('should fetch the corresponding PR', done => {
bv.getPrAuthorTeamMembership(pr).then(() => {
expect(prsFetchSpy).toHaveBeenCalledWith(pr);
done();
});
});
it('should fail if fetching the PR errors', done => {
prsFetchSpy.and.callFake(() => Promise.reject('Test'));
bv.getPrAuthorTeamMembership(pr).catch(err => {
expect(err).toBe('Test');
done();
});
});
it('should verify the PR author\'s membership in the specified teams', done => {
bv.getPrAuthorTeamMembership(pr).then(() => {
expect(teamsIsMemberBySlugSpy).toHaveBeenCalledWith('username', ['team1', 'team2']);
done();
});
});
it('should fail if verifying membership errors', done => {
teamsIsMemberBySlugSpy.and.callFake(() => Promise.reject('Test'));
bv.getPrAuthorTeamMembership(pr).catch(err => {
expect(err).toBe('Test');
done();
});
});
it('should return the PR\'s author and whether they are members', done => {
teamsIsMemberBySlugSpy.and.returnValues(Promise.resolve(true), Promise.resolve(false));
Promise.all([
bv.getPrAuthorTeamMembership(pr).then(({author, isMember}) => {
expect(author).toBe('username');
expect(isMember).toBe(true);
}),
bv.getPrAuthorTeamMembership(pr).then(({author, isMember}) => {
expect(author).toBe('username');
expect(isMember).toBe(false);
}),
]).then(done);
});
});
});


@ -1,11 +0,0 @@
import {UploadError} from '../../lib/upload-server/upload-error';
export const expectToBeUploadError = (actual: UploadError, status?: number, message?: string) => {
expect(actual).toEqual(jasmine.any(UploadError));
if (status != null) {
expect(actual.status).toBe(status);
}
if (message != null) {
expect(actual.message).toBe(message);
}
};


@ -1,39 +0,0 @@
// Imports
import {UploadError} from '../../lib/upload-server/upload-error';
// Tests
describe('UploadError', () => {
let err: UploadError;
beforeEach(() => err = new UploadError(999, 'message'));
it('should extend Error', () => {
expect(err).toEqual(jasmine.any(UploadError));
expect(err).toEqual(jasmine.any(Error));
expect(Object.getPrototypeOf(err)).toBe(UploadError.prototype);
});
it('should have a \'status\' property', () => {
expect(err.status).toBe(999);
});
it('should have a \'message\' property', () => {
expect(err.message).toBe('message');
});
it('should have a 500 \'status\' by default', () => {
expect(new UploadError().status).toBe(500);
});
it('should have an empty \'message\' by default', () => {
expect(new UploadError().message).toBe('');
expect(new UploadError(999).message).toBe('');
});
});


@ -1,403 +0,0 @@
// Imports
import * as express from 'express';
import * as http from 'http';
import * as supertest from 'supertest';
import {GithubPullRequests} from '../../lib/common/github-pull-requests';
import {BuildCreator} from '../../lib/upload-server/build-creator';
import {CreatedBuildEvent} from '../../lib/upload-server/build-events';
import {BuildVerifier} from '../../lib/upload-server/build-verifier';
import {uploadServerFactory as usf} from '../../lib/upload-server/upload-server-factory';
// Tests
describe('uploadServerFactory', () => {
const defaultConfig = {
buildsDir: 'builds/dir',
domainName: 'domain.name',
githubOrganization: 'organization',
githubTeamSlugs: ['team1', 'team2'],
githubToken: '12345',
repoSlug: 'repo/slug',
secret: 'secret',
};
// Helpers
const createUploadServer = (partialConfig: Partial<typeof defaultConfig> = {}) =>
usf.create({...defaultConfig, ...partialConfig});
describe('create()', () => {
let usfCreateMiddlewareSpy: jasmine.Spy;
beforeEach(() => {
usfCreateMiddlewareSpy = spyOn(usf as any, 'createMiddleware').and.callThrough();
});
it('should throw if \'buildsDir\' is missing or empty', () => {
expect(() => createUploadServer({buildsDir: ''})).
toThrowError('Missing or empty required parameter \'buildsDir\'!');
});
it('should throw if \'domainName\' is missing or empty', () => {
expect(() => createUploadServer({domainName: ''})).
toThrowError('Missing or empty required parameter \'domainName\'!');
});
it('should throw if \'githubToken\' is missing or empty', () => {
expect(() => createUploadServer({githubToken: ''})).
toThrowError('Missing or empty required parameter \'githubToken\'!');
});
it('should throw if \'githubOrganization\' is missing or empty', () => {
expect(() => createUploadServer({githubOrganization: ''})).
toThrowError('Missing or empty required parameter \'organization\'!');
});
it('should throw if \'githubTeamSlugs\' is missing or empty', () => {
expect(() => createUploadServer({githubTeamSlugs: []})).
toThrowError('Missing or empty required parameter \'allowedTeamSlugs\'!');
});
it('should throw if \'repoSlug\' is missing or empty', () => {
expect(() => createUploadServer({repoSlug: ''})).
toThrowError('Missing or empty required parameter \'repoSlug\'!');
});
it('should throw if \'secret\' is missing or empty', () => {
expect(() => createUploadServer({secret: ''})).
toThrowError('Missing or empty required parameter \'secret\'!');
});
it('should return an http.Server', () => {
const httpCreateServerSpy = spyOn(http, 'createServer').and.callThrough();
const server = createUploadServer();
expect(server).toBe(httpCreateServerSpy.calls.mostRecent().returnValue);
});
it('should create and use an appropriate BuildCreator', () => {
const usfCreateBuildCreatorSpy = spyOn(usf as any, 'createBuildCreator').and.callThrough();
createUploadServer();
const buildCreator: BuildCreator = usfCreateBuildCreatorSpy.calls.mostRecent().returnValue;
expect(usfCreateMiddlewareSpy).toHaveBeenCalledWith(jasmine.any(BuildVerifier), buildCreator);
expect(usfCreateBuildCreatorSpy).toHaveBeenCalledWith('builds/dir', '12345', 'repo/slug', 'domain.name');
});
it('should create and use an appropriate middleware', () => {
const httpCreateServerSpy = spyOn(http, 'createServer').and.callThrough();
createUploadServer();
const middleware: express.Express = usfCreateMiddlewareSpy.calls.mostRecent().returnValue;
const buildVerifier = jasmine.any(BuildVerifier);
const buildCreator = jasmine.any(BuildCreator);
expect(httpCreateServerSpy).toHaveBeenCalledWith(middleware);
expect(usfCreateMiddlewareSpy).toHaveBeenCalledWith(buildVerifier, buildCreator);
});
it('should log the server address info on \'listening\'', () => {
const consoleInfoSpy = spyOn(console, 'info');
const server = createUploadServer();
server.address = () => ({address: 'foo', family: '', port: 1337});
expect(consoleInfoSpy).not.toHaveBeenCalled();
server.emit('listening');
expect(consoleInfoSpy).toHaveBeenCalledWith('Up and running (and listening on foo:1337)...');
});
});
// Protected methods
describe('createBuildCreator()', () => {
let buildCreator: BuildCreator;
beforeEach(() => {
buildCreator = (usf as any).createBuildCreator(
defaultConfig.buildsDir,
defaultConfig.githubToken,
defaultConfig.repoSlug,
defaultConfig.domainName,
);
});
it('should pass the \'buildsDir\' to the BuildCreator', () => {
expect((buildCreator as any).buildsDir).toBe('builds/dir');
});
it('should post a comment on GitHub on \'build.created\'', () => {
const prsAddCommentSpy = spyOn(GithubPullRequests.prototype, 'addComment');
const commentBody = 'The angular.io preview for 1234567890 is available [here][1].\n\n' +
'[1]: https://pr42-1234567890.domain.name/';
buildCreator.emit(CreatedBuildEvent.type, {pr: 42, sha: '1234567890'});
expect(prsAddCommentSpy).toHaveBeenCalledWith(42, commentBody);
});
it('should pass the correct \'githubToken\' and \'repoSlug\' to GithubPullRequests', () => {
const prsAddCommentSpy = spyOn(GithubPullRequests.prototype, 'addComment');
buildCreator.emit(CreatedBuildEvent.type, {pr: 42, sha: '1234567890'});
const prs = prsAddCommentSpy.calls.mostRecent().object;
expect(prs).toEqual(jasmine.any(GithubPullRequests));
expect((prs as any).repoSlug).toBe('repo/slug');
expect((prs as any).requestHeaders.Authorization).toContain('12345');
});
});
describe('createMiddleware()', () => {
let buildVerifier: BuildVerifier;
let buildCreator: BuildCreator;
let agent: supertest.SuperTest<supertest.Test>;
// Helpers
const promisifyRequest = (req: supertest.Request) =>
new Promise((resolve, reject) => req.end(err => err ? reject(err) : resolve()));
const verifyRequests = (reqs: supertest.Request[], done: jasmine.DoneFn) =>
Promise.all(reqs.map(promisifyRequest)).then(done, done.fail);
beforeEach(() => {
buildVerifier = new BuildVerifier(
defaultConfig.secret,
defaultConfig.githubToken,
defaultConfig.repoSlug,
defaultConfig.githubOrganization,
defaultConfig.githubTeamSlugs,
);
buildCreator = new BuildCreator(defaultConfig.buildsDir);
agent = supertest.agent((usf as any).createMiddleware(buildVerifier, buildCreator));
spyOn(console, 'error');
});
describe('GET /create-build/<pr>/<sha>', () => {
const pr = '9';
const sha = '9'.repeat(40);
let buildVerifierVerifySpy: jasmine.Spy;
let buildCreatorCreateSpy: jasmine.Spy;
beforeEach(() => {
buildVerifierVerifySpy = spyOn(buildVerifier, 'verify').and.returnValue(Promise.resolve());
buildCreatorCreateSpy = spyOn(buildCreator, 'create').and.returnValue(Promise.resolve());
});
it('should respond with 405 for non-GET requests', done => {
verifyRequests([
agent.put(`/create-build/${pr}/${sha}`).expect(405),
agent.post(`/create-build/${pr}/${sha}`).expect(405),
agent.patch(`/create-build/${pr}/${sha}`).expect(405),
agent.delete(`/create-build/${pr}/${sha}`).expect(405),
], done);
});
it('should respond with 401 for requests without an \'AUTHORIZATION\' header', done => {
const url = `/create-build/${pr}/${sha}`;
const responseBody = `Missing or empty 'AUTHORIZATION' header in request: GET ${url}`;
verifyRequests([
agent.get(url).expect(401, responseBody),
agent.get(url).set('AUTHORIZATION', '').expect(401, responseBody),
], done);
});
it('should respond with 400 for requests without an \'X-FILE\' header', done => {
const url = `/create-build/${pr}/${sha}`;
const responseBody = `Missing or empty 'X-FILE' header in request: GET ${url}`;
const request1 = agent.get(url).set('AUTHORIZATION', 'foo');
const request2 = agent.get(url).set('AUTHORIZATION', 'foo').set('X-FILE', '');
verifyRequests([
request1.expect(400, responseBody),
request2.expect(400, responseBody),
], done);
});
it('should respond with 404 for unknown paths', done => {
verifyRequests([
agent.get(`/foo/create-build/${pr}/${sha}`).expect(404),
agent.get(`/foo-create-build/${pr}/${sha}`).expect(404),
agent.get(`/fooncreate-build/${pr}/${sha}`).expect(404),
agent.get(`/create-build/foo/${pr}/${sha}`).expect(404),
agent.get(`/create-build-foo/${pr}/${sha}`).expect(404),
agent.get(`/create-buildnfoo/${pr}/${sha}`).expect(404),
agent.get(`/create-build/pr${pr}/${sha}`).expect(404),
agent.get(`/create-build/${pr}/${sha}42`).expect(404),
], done);
});
it('should call \'BuildVerifier#verify()\' with the correct arguments', done => {
const req = agent.
get(`/create-build/${pr}/${sha}`).
set('AUTHORIZATION', 'foo').
set('X-FILE', 'bar');
promisifyRequest(req).
then(() => expect(buildVerifierVerifySpy).toHaveBeenCalledWith(9, 'foo')).
then(done, done.fail);
});
it('should propagate errors from BuildVerifier', done => {
buildVerifierVerifySpy.and.callFake(() => Promise.reject('Test'));
const req = agent.
get(`/create-build/${pr}/${sha}`).
set('AUTHORIZATION', 'foo').
set('X-FILE', 'bar').
expect(500, 'Test');
promisifyRequest(req).
then(() => {
expect(buildVerifierVerifySpy).toHaveBeenCalledWith(9, 'foo');
expect(buildCreatorCreateSpy).not.toHaveBeenCalled();
}).
then(done, done.fail);
});
it('should call \'BuildCreator#create()\' with the correct arguments', done => {
const req = agent.
get(`/create-build/${pr}/${sha}`).
set('AUTHORIZATION', 'foo').
set('X-FILE', 'bar');
promisifyRequest(req).
then(() => expect(buildCreatorCreateSpy).toHaveBeenCalledWith(pr, sha, 'bar')).
then(done, done.fail);
});
it('should propagate errors from BuildCreator', done => {
buildCreatorCreateSpy.and.callFake(() => Promise.reject('Test'));
const req = agent.
get(`/create-build/${pr}/${sha}`).
set('AUTHORIZATION', 'foo').
set('X-FILE', 'bar').
expect(500, 'Test');
verifyRequests([req], done);
});
it('should respond with 201 on successful upload', done => {
const req = agent.
get(`/create-build/${pr}/${sha}`).
set('AUTHORIZATION', 'foo').
set('X-FILE', 'bar').
expect(201, http.STATUS_CODES[201]);
verifyRequests([req], done);
});
it('should reject PRs with leading zeros', done => {
verifyRequests([agent.get(`/create-build/0${pr}/${sha}`).expect(404)], done);
});
it('should accept SHAs with leading zeros (but not trim the zeros)', done => {
const sha40 = '0'.repeat(40);
const sha41 = `0${sha40}`;
const request40 = agent.get(`/create-build/${pr}/${sha40}`).set('AUTHORIZATION', 'foo').set('X-FILE', 'bar');
const request41 = agent.get(`/create-build/${pr}/${sha41}`).set('AUTHORIZATION', 'baz').set('X-FILE', 'qux');
Promise.all([
promisifyRequest(request40.expect(201)),
promisifyRequest(request41.expect(404)),
]).then(done, done.fail);
});
});
describe('GET /health-check', () => {
it('should respond with 200', done => {
verifyRequests([
agent.get('/health-check').expect(200),
agent.get('/health-check/').expect(200),
], done);
});
it('should respond with 405 for non-GET requests', done => {
verifyRequests([
agent.put('/health-check').expect(405),
agent.post('/health-check').expect(405),
agent.patch('/health-check').expect(405),
agent.delete('/health-check').expect(405),
], done);
});
it('should respond with 404 if the path does not match exactly', done => {
verifyRequests([
agent.get('/health-check/foo').expect(404),
agent.get('/health-check-foo').expect(404),
agent.get('/health-checknfoo').expect(404),
agent.get('/foo/health-check').expect(404),
agent.get('/foo-health-check').expect(404),
agent.get('/foonhealth-check').expect(404),
], done);
});
});
describe('GET *', () => {
it('should respond with 404', done => {
const responseBody = 'Unknown resource in request: GET /some/url';
verifyRequests([agent.get('/some/url').expect(404, responseBody)], done);
});
});
describe('ALL *', () => {
it('should respond with 405', done => {
const responseFor = (method: string) => `Unsupported method in request: ${method.toUpperCase()} /some/url`;
verifyRequests([
agent.put('/some/url').expect(405, responseFor('put')),
agent.post('/some/url').expect(405, responseFor('post')),
agent.patch('/some/url').expect(405, responseFor('patch')),
agent.delete('/some/url').expect(405, responseFor('delete')),
], done);
});
});
});
});


@ -1,28 +0,0 @@
{
"compilerOptions": {
"alwaysStrict": true,
"forceConsistentCasingInFileNames": true,
"inlineSourceMap": true,
"lib": [
"es2016"
],
"noImplicitAny": true,
"noImplicitReturns": true,
"noImplicitThis": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"outDir": "dist",
"pretty": true,
"rootDir": ".",
"skipLibCheck": true,
"strictNullChecks": true,
"target": "es5",
"typeRoots": [
"node_modules/@types"
]
},
"include": [
"lib/**/*",
"test/**/*"
]
}


@ -1,15 +0,0 @@
{
"extends": "tslint:recommended",
"rules": {
"array-type": [true, "array"],
"arrow-parens": [true, "ban-single-arg-parens"],
"interface-name": [true, "never-prefix"],
"max-classes-per-file": [true, 4],
"no-consecutive-blank-lines": [true, 2],
"no-console": false,
"no-namespace": [true, "allow-declarations"],
"no-string-literal": false,
"quotemark": [true, "single"],
"variable-name": [true, "ban-keywords", "check-format", "allow-leading-underscore"]
}
}

File diff suppressed because it is too large


@ -1,8 +0,0 @@
#!/bin/bash
set -e -o pipefail
# Set up env variables
export AIO_GITHUB_TOKEN=$(head -c -1 /aio-secrets/GITHUB_TOKEN 2>/dev/null)
# Run the clean-up
node $AIO_SCRIPTS_JS_DIR/dist/lib/clean-up >> /var/log/aio/clean-up.log 2>&1


@ -1,53 +0,0 @@
#!/bin/bash
set +e -o pipefail
# Variables
exitCode=0
# Helpers
function reportStatus {
local lastExitCode=$?
echo "$1: $([[ $lastExitCode -eq 0 ]] && echo OK || echo NOT OK)"
[[ $lastExitCode -eq 0 ]] || exitCode=1
}
# Check services
services=(
rsyslog
cron
nginx
pm2-root
)
for s in ${services[@]}; do
service $s status > /dev/null
reportStatus "Service '$s'"
done
# Check servers
origins=(
http://$AIO_UPLOAD_HOSTNAME:$AIO_UPLOAD_PORT
http://$AIO_NGINX_HOSTNAME:$AIO_NGINX_PORT_HTTP
https://$AIO_NGINX_HOSTNAME:$AIO_NGINX_PORT_HTTPS
)
for o in ${origins[@]}; do
curl --fail --silent $o/health-check > /dev/null
reportStatus "Server '$o'"
done
# Check resolution of external URLs
origins=(
https://google.com
)
for o in ${origins[@]}; do
curl --fail --silent $o > /dev/null
reportStatus "External URL '$o'"
done
# Exit
exit $exitCode


@ -1,18 +0,0 @@
#!/bin/bash
set -e -o pipefail
exec >> /var/log/aio/init.log
exec 2>&1
# Start the services
echo [`date`] - Starting services...
mkdir -p $AIO_NGINX_LOGS_DIR
mkdir -p $TEST_AIO_NGINX_LOGS_DIR
service rsyslog start
service cron start
service dnsmasq start
service nginx start
service pm2-root start
aio-upload-server-prod start
echo [`date`] - Services started successfully.


@ -1,15 +0,0 @@
#!/bin/bash
set -e -o pipefail
# Set up env variables for production
export AIO_GITHUB_TOKEN=$(head -c -1 /aio-secrets/GITHUB_TOKEN 2>/dev/null)
export AIO_PREVIEW_DEPLOYMENT_TOKEN=$(head -c -1 /aio-secrets/PREVIEW_DEPLOYMENT_TOKEN 2>/dev/null)
# Start the upload-server instance
# TODO(gkalpak): Ideally, the upload server should be run as a non-privileged user.
# (Currently, there doesn't seem to be a straightforward way.)
action=$([ "$1" == "stop" ] && echo "stop" || echo "start")
pm2 $action $AIO_SCRIPTS_JS_DIR/dist/lib/upload-server \
--log /var/log/aio/upload-server-prod.log \
--name aio-upload-server-prod \
${@:2}


@ -1,29 +0,0 @@
#!/bin/bash
set -e -o pipefail
# Set up env variables for testing
export AIO_BUILDS_DIR=$TEST_AIO_BUILDS_DIR
export AIO_DOMAIN_NAME=$TEST_AIO_DOMAIN_NAME
export AIO_GITHUB_ORGANIZATION=$TEST_AIO_GITHUB_ORGANIZATION
export AIO_GITHUB_TEAM_SLUGS=$TEST_AIO_GITHUB_TEAM_SLUGS
export AIO_PREVIEW_DEPLOYMENT_TOKEN=$TEST_AIO_PREVIEW_DEPLOYMENT_TOKEN
export AIO_REPO_SLUG=$TEST_AIO_REPO_SLUG
export AIO_UPLOAD_HOSTNAME=$TEST_AIO_UPLOAD_HOSTNAME
export AIO_UPLOAD_PORT=$TEST_AIO_UPLOAD_PORT
export AIO_GITHUB_TOKEN=$(head -c -1 /aio-secrets/TEST_GITHUB_TOKEN 2>/dev/null || echo "TEST_GITHUB_TOKEN")
export AIO_PREVIEW_DEPLOYMENT_TOKEN=$(head -c -1 /aio-secrets/TEST_PREVIEW_DEPLOYMENT_TOKEN 2>/dev/null || echo "TEST_PREVIEW_DEPLOYMENT_TOKEN")
# Start the upload-server instance
# TODO(gkalpak): Ideally, the upload server should be run as a non-privileged user.
# (Currently, there doesn't seem to be a straightforward way.)
appName=aio-upload-server-test
if [[ "$1" == "stop" ]]; then
pm2 delete $appName
else
pm2 start $AIO_SCRIPTS_JS_DIR/dist/lib/upload-server/index-test.js \
--log /var/log/aio/upload-server-test.log \
--name $appName \
--no-autorestart \
${@:2}
fi


@ -1,40 +0,0 @@
#!/bin/bash
set -e -o pipefail
logFile=/var/log/aio/verify-setup.log
uploadServerLogFile=/var/log/aio/upload-server-verify-setup.log
exec 3>&1
exec >> $logFile
exec 2>&1
echo "[`date`] - Starting verification..."
# Helpers
function countdown {
message=$1
secs=$2
while [ $secs -gt 0 ]; do
echo -ne "$message in $secs...\033[0K\r"
sleep 1
: $((secs--))
done
echo -ne "\033[0K\r"
}
function onExit {
aio-upload-server-test stop
echo -e "Full logs in '$logFile'.\n" > /dev/fd/3
}
# Setup EXIT trap
trap 'onExit' EXIT
# Start an upload-server instance for testing
aio-upload-server-test start --log $uploadServerLogFile
# Give the upload-server some time to start :(
countdown "Starting" 5 > /dev/fd/3
# Run the tests
node $AIO_SCRIPTS_JS_DIR/dist/lib/verify-setup | tee /dev/fd/3


@ -1,32 +0,0 @@
# VM Setup Instructions
## Overview
- [General overview](overview--general.md)
- [Security model](overview--security-model.md)
- [Available Commands](overview--scripts-and-commands.md)
## Setting up the VM
- [Set up secrets](vm-setup--set-up-secrets.md)
- [Set up docker](vm-setup--set-up-docker.md)
- [Attach persistent disk](vm-setup--attach-persistent-disk.md)
- [Create host directories and files](vm-setup--create-host-dirs-and-files.md)
- [Create docker image](vm-setup--create-docker-image.md)
## Configuring the docker image
- [Available environment variables](image-config--environment-variables.md)
## Starting the docker container
- [Start docker container](vm-setup--start-docker-container.md)
## Updating the docker container
- [Update docker container](vm-setup--update-docker-container.md)
## Miscellaneous
- [Debug docker container](misc--debug-docker-container.md)
- [Integrate with CI](misc--integrate-with-ci.md)


@ -1,52 +0,0 @@
# Image config - Environment variables
Below is a list of environment variables that can be configured when creating the docker image (as
described [here](vm-setup--create-docker-image.md)). An up-to-date list of the configurable
environment variables and their default values can be found in the
[Dockerfile](../dockerbuild/Dockerfile).
**Note:**
Each variable has a `TEST_` prefixed counterpart, which is used for testing purposes. In most cases
you don't need to specify values for those.
- `AIO_BUILDS_DIR`:
The directory (inside the container) where the uploaded build artifacts are kept.
- `AIO_DOMAIN_NAME`:
The domain name of the server.
- `AIO_GITHUB_ORGANIZATION`:
The GitHub organization whose teams are whitelisted for accepting uploads.
See also `AIO_GITHUB_TEAM_SLUGS`.
- `AIO_GITHUB_TEAM_SLUGS`:
A comma-separated list of teams whose members are allowed to upload PRs.
See also `AIO_GITHUB_ORGANIZATION`.
- `AIO_NGINX_HOSTNAME`:
The internal hostname for accessing the nginx server. This is mostly used for performing a
periodic health-check.
- `AIO_NGINX_PORT_HTTP`:
The port number on which nginx listens for HTTP connections. This should be mapped to the
corresponding port on the host VM (as described [here](vm-setup--start-docker-container.md)).
- `AIO_NGINX_PORT_HTTPS`:
The port number on which nginx listens for HTTPS connections. This should be mapped to the
corresponding port on the host VM (as described [here](vm-setup--start-docker-container.md)).
- `AIO_REPO_SLUG`:
The repository slug (in the form `<user>/<repo>`) for which PRs will be uploaded.
- `AIO_UPLOAD_HOSTNAME`:
The internal hostname for accessing the Node.js upload-server. This is used by nginx for
delegating upload requests and also for performing a periodic health-check.
- `AIO_UPLOAD_MAX_SIZE`:
The maximum allowed size for the uploaded gzip archive containing the build artifacts. Files
larger than this will be rejected.
- `AIO_UPLOAD_PORT`:
The port number on which the Node.js upload-server listens for HTTP connections. This is used by
nginx for delegating upload requests and also for performing a periodic health-check.
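For illustration only, here is a minimal TypeScript sketch of how a script could read the variables listed above at runtime (the `getEnvVar` helper is hypothetical, not part of the actual code):
```
const getEnvVar = (name: string, optional = false): string => {
  const value = process.env[name] || '';
  if (!value && !optional) {
    throw new Error(`Missing required environment variable '${name}'!`);
  }
  return value;
};

const config = {
  buildsDir: getEnvVar('AIO_BUILDS_DIR'),
  domainName: getEnvVar('AIO_DOMAIN_NAME'),
  githubOrganization: getEnvVar('AIO_GITHUB_ORGANIZATION'),
  githubTeamSlugs: getEnvVar('AIO_GITHUB_TEAM_SLUGS').split(','),
  repoSlug: getEnvVar('AIO_REPO_SLUG'),
  uploadPort: +getEnvVar('AIO_UPLOAD_PORT'),
};
```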


@ -1,12 +0,0 @@
# Miscellaneous - Debug docker container
TODO (gkalpak): Add docs. Mention:
- `aio-health-check`
- `aio-verify-setup`
- Test nginx accessible at:
- `http://$TEST_AIO_NGINX_HOSTNAME:$TEST_AIO_NGINX_PORT_HTTP`
- `https://$TEST_AIO_NGINX_HOSTNAME:$TEST_AIO_NGINX_PORT_HTTPS`
- Test upload-server accessible at:
- `http://$TEST_AIO_UPLOAD_HOSTNAME:$TEST_AIO_UPLOAD_PORT`
- Local DNS (via dnsmasq) maps the above hostnames to 127.0.0.1


@ -1,12 +0,0 @@
# Miscellaneous - Integrate with CI
TODO (gkalpak): Add docs. Mention:
- Travis' JWT addon (+ limitations).
Relevant files: `.travis.yml`
- Testing on CI.
Relevant files: `ci/test-aio.sh`, `aio/aio-builds-setup/scripts/test.sh`
- Preverifying on CI.
Relevant files: `ci/deploy.sh`, `aio/aio-builds-setup/scripts/travis-preverify-pr.sh`
- Deploying from CI.
Relevant files: `ci/deploy.sh`, `aio/scripts/deploy-preview.sh`


@ -1,84 +0,0 @@
# Overview - General
## Objective
Whenever a PR job is run on Travis, we want to build `angular.io` and upload the build artifacts to
a publicly accessible server so that collaborators (developers, designers, authors, etc) can preview
the changes without having to check out and build the app locally.
## Source code
In order to make it easier to administer the server and version-control the setup, we are using
[docker](https://www.docker.com) to run a container on a VM. The Dockerfile and all other files
necessary for creating the docker container are stored (and versioned) along with the angular.io
project's source code (currently part of the angular/angular repo) in the `aio-builds-setup/`
directory.
## Setup
The VM is hosted on [Google Compute Engine](https://cloud.google.com/compute/). The host OS is
debian:jessie. For more info on how to set up the host VM, take a look at the "Setting up the VM"
section in [TOC](_TOC.md).
## Security model
Since we are managing a public server, it is important to take appropriate measures in order to
prevent abuse. For more details on the challenges and the chosen approach take a look at the
[security model](overview--security-model.md).
## The 10,000-foot view
This section gives a brief summary of the several operations performed on CI and by the docker
container:
### On CI (Travis)
- Build job completes successfully (i.e. build succeeds and tests pass).
- The CI script checks whether the build job was initiated by a PR against the angular/angular
master branch.
- The CI script checks whether the PR has touched any files inside the angular.io project directory
(currently `aio/`).
- The CI script checks whether the author of the PR is a member of one of the whitelisted GitHub
teams (and therefore allowed to upload).
**Note:**
For security reasons, the same checks will be performed on the server as well. This is an optional
step with the purpose of:
1. Avoiding the wasted overhead associated with uploads that are going to be rejected (e.g.
building the artifacts, sending them to the server, running checks on the server, etc).
2. Avoiding failing the build (due to an error response from the server) or requiring additional
logic for detecting the reasons of the failure.
- The CI script gzips and uploads the build artifacts to the server.
More info on how to set things up on CI can be found [here](misc--integrate-with-ci.md).
### Uploading build artifacts
- nginx receives upload request.
- nginx checks that the uploaded gzip archive does not exceed the specified max file size, stores it
in a temporary location and passes the filepath to the Node.js upload-server.
- The upload-server verifies that the uploaded file is not trying to overwrite an existing build,
and runs several checks to determine whether the request should be accepted (more details can be
found [here](overview--security-model.md)).
- The upload-server deploys the artifacts to a sub-directory named after the PR number and SHA:
`<PR>/<SHA>/`
- The upload-server posts a comment on the corresponding PR on GitHub, mentioning the SHA and the
link where the preview can be found. (A minimal sketch of this upload flow is shown below.)
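For illustration only, here is a rough TypeScript sketch of the `/create-build/<pr>/<sha>` handling described above (the `buildVerifier`/`buildCreator` objects are simplified stand-ins for the real classes; this is not the actual upload-server source):
```
import * as express from 'express';

// Simplified stand-ins for the real `BuildVerifier`/`BuildCreator` instances.
const buildVerifier = {verify: (pr: number, authHeader: string) => Promise.resolve()};
const buildCreator = {create: (pr: string, sha: string, archivePath: string) => Promise.resolve()};

const middleware = express();

middleware.get(/^\/create-build\/([1-9][0-9]*)\/([0-9a-f]{40})$/, (req, res) => {
  const pr = req.params[0];
  const sha = req.params[1];
  const authHeader = req.header('AUTHORIZATION') || '';
  const archivePath = req.header('X-FILE') || '';  // Set by nginx to the path of the stored archive.

  if (!authHeader) {
    res.status(401).end(`Missing or empty 'AUTHORIZATION' header in request: GET ${req.url}`);
  } else if (!archivePath) {
    res.status(400).end(`Missing or empty 'X-FILE' header in request: GET ${req.url}`);
  } else {
    buildVerifier.verify(+pr, authHeader).
      then(() => buildCreator.create(pr, sha, archivePath)).
      then(() => res.sendStatus(201)).
      catch(err => res.status((err && err.status) || 500).end(String(err)));
  }
});
```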
### Serving build artifacts
- nginx receives a request for an uploaded resource on a subdomain corresponding to the PR and SHA.
E.g.: `pr<PR>-<SHA>.ngbuilds.io/path/to/resource`
- nginx maps the subdomain to the correct sub-directory and serves the resource.
E.g.: `/<PR>/<SHA>/path/to/resource`
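The exact rewrite rules live in the nginx configuration; purely for illustration, the mapping rule can be expressed as a small TypeScript function (the function name is an assumption):
```
// Illustrative only: the subdomain-to-directory mapping applied by nginx.
function previewUrlToFilePath(host: string, urlPath: string): string | null {
  const match = /^pr(\d+)-([0-9a-f]+)\./.exec(host);
  return match ? `/${match[1]}/${match[2]}${urlPath}` : null;
}
// previewUrlToFilePath('pr42-1234567890.ngbuilds.io', '/path/to/resource') --> '/42/1234567890/path/to/resource'
```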
### Removing obsolete artifacts
In order to avoid flooding the disk with unnecessary build artifacts, there is a cronjob that runs a
clean-up task once a day. The task retrieves all open PRs from GitHub and removes all directories
that do not correspond to an open PR. (A short sketch of this task follows.)
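For illustration, a rough TypeScript sketch of that task (the helper names are hypothetical; the real implementation is the `clean-up` script run by the `aio-clean-up` command):
```
import * as fs from 'fs';
import * as path from 'path';

// Rough sketch only: remove build directories that do not belong to a currently open PR.
function cleanUp(buildsDir: string, openPrNumbers: number[]): void {
  const openPrDirs = new Set(openPrNumbers.map(pr => String(pr)));
  for (const dirName of fs.readdirSync(buildsDir)) {
    if (!openPrDirs.has(dirName)) {
      removeDir(path.join(buildsDir, dirName));  // This PR is no longer open.
    }
  }
}

// Placeholder for an actual recursive removal (e.g. via `shelljs` or `rimraf`).
function removeDir(dirPath: string): void {
  console.log(`Would remove: ${dirPath}`);
}
```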
### Health-check
The docker service runs a periodic health-check that verifies the running conditions of the
container. This includes verifying the status of specific system services, the responsiveness of
nginx and the upload-server, and internet connectivity.


@ -1,58 +0,0 @@
# Overview - Scripts and Commands
This is an overview of the available scripts and commands.
## Scripts
The scripts are located inside `<aio-builds-setup-dir>/scripts/`. The following scripts are
available:
- `create-image.sh`:
Can be used for creating a preconfigured docker image.
See [here](vm-setup--create-docker-image.md) for more info.
- `test.sh`
Can be used for running the tests for `<aio-builds-setup-dir>/dockerbuild/scripts-js/`. This is
useful for CI integration. See [here](misc--integrate-with-ci.md) for more info.
- `travis-preverify-pr.sh`
Can be used for "preverifying" a PR before uploading the artifacts to the server. It checks that
the author of the PR is a member of one of the specified GitHub teams and therefore allowed to
upload build artifacts. This is useful for CI integration. See [here](misc--integrate-with-ci.md)
for more info.
- `update-preview-server.sh`
Can be used for updating the docker container (and image) based on the latest changes checked out
from a git repository. See [here](vm-setup--update-docker-container.md) for more info.
## Commands
The following commands are available globally from inside the docker container. They are either used
by the container to perform its various operations or can be used ad-hoc, mainly for testing
purposes. Each command is backed by a corresponding script inside
`<aio-builds-setup-dir>/dockerbuild/scripts-sh/`.
- `aio-clean-up`:
Cleans up the builds directory by removing the artifacts that do not correspond to an open PR.
_It is run as a daily cronjob._
- `aio-health-check`:
Runs a basic health-check, verifying that the necessary services are running, the servers are
responding and there is a working internet connection.
_It is used periodically by docker for determining the container's health status._
- `aio-init`:
Initializes the container (mainly by starting the necessary services).
_It is run (by default) when starting the container._
- `aio-upload-server-prod`:
Spins up a Node.js upload-server instance.
_It is used in `aio-init` (see above) during initialization._
- `aio-upload-server-test`:
Spins up a Node.js upload-server instance for tests.
_It is used in `aio-verify-setup` (see below) for running tests._
- `aio-verify-setup`:
Runs a suite of e2e-like tests, mainly verifying the correct (inter)operation of nginx and the
Node.js upload-server.


@ -1,116 +0,0 @@
# Overview - Security model
Whenever a PR job is run on Travis, we want to build `angular.io` and upload the build artifacts to
a publicly accessible server so that collaborators (developers, designers, authors, etc) can preview
the changes without having to check out and build the app locally.
This document discusses the security considerations associated with uploading build artifacts as
part of the CI setup and serving them publicly.
## Security objectives
- **Prevent uploading arbitrary content to our servers.**
Since there is no restriction on who can submit a PR, we cannot allow just any PR's build artifacts to
be uploaded.
- **Prevent overwriting other people's uploaded content.**
There needs to be a mechanism in place to ensure that the uploaded content does indeed correspond
to the PR indicated by its URL.
- **Prevent arbitrary access on the server.**
Since the PR author has full control over the build artifacts that would be uploaded, we must
ensure that the uploaded files will not enable arbitrary access to the server or expose sensitive
info.
## Issues / Caveats
- Because the PR author can change the scripts run on CI, any security mechanisms must be immune to
such changes.
- For security reasons, encrypted Travis variables are not available to PRs, so we can't rely on
them to implement security.
## Implemented approach
### In a nutshell
The implemented approach can be broken up into the following sub-tasks:
1. Verify which PR the uploaded artifacts correspond to.
2. Determine the author of the PR.
3. Check whether the PR author is a member of some whitelisted GitHub team.
4. Deploy the artifacts to the corresponding PR's directory.
5. Prevent overwriting previously deployed artifacts (which ensures that the guarantees established
during deployment will remain valid until the artifacts are removed).
6. Prevent uploaded files from accessing anything outside their directory.
### Implementation details
This section describes how each of the aforementioned sub-tasks is accomplished:
1. **Verify which PR the uploaded artifacts correspond to.**
We are taking advantage of Travis' [JWT addon](https://docs.travis-ci.com/user/jwt). By sharing
a secret between Travis (which keeps it private but uses it to sign a JWT) and the server (which
uses it to verify the authenticity of the JWT), we can accomplish the following:
a. Verify that the upload request comes from Travis.
b. Determine the PR that these artifacts correspond to (since Travis puts that information into
the JWT, without the PR author being able to modify it).
_Note:_
_There are currently certain limitations in the implementation of the JWT addon._
_See the next section for more details; a minimal verification sketch is also shown after this list._
2. **Determine the author of the PR.**
Once we have securely associated the uploaded artifacts with a PR, we retrieve the PR's metadata -
including the author's username - using the [GitHub API](https://developer.github.com/v3/).
To avoid rate-limit restrictions, we use a Personal Access Token (issued by
[@mary-poppins](https://github.com/mary-poppins)).
3. **Check whether the PR author is a member of some whitelisted GitHub team.**
Again using the GitHub API, we can verify the author's membership in one of the
whitelisted/trusted GitHub teams. For this operation, we need a Personal Access Token with the
`read:org` scope issued by a user that can "see" the specified GitHub organization.
Here too, we use token by @mary-poppins.
4. **Deploy the artifacts to the corresponding PR's directory.**
With the preceding steps, we have verified that the uploaded artifacts have been uploaded by
Travis and correspond to a PR whose author is a member of a trusted team. Essentially, as long as
sub-tasks 1, 2 and 3 can be securely accomplished, it is possible to "project" the trust we have
in a team's members through the PR and Travis to the build artifacts.
5. **Prevent overwriting previously deployed artifacts**.
In order to enforce this restriction (and ensure that the deployed artifacts' validity is
preserved throughout their "lifetime"), the server that handles the upload (currently a Node.js
Express server) rejects uploads that target an existing directory.
_Note: A PR can contain multiple uploads; one for each SHA that was built on Travis._
6. **Prevent uploaded files from accessing anything outside their directory.**
Nginx (which is used to serve the uploaded artifacts) has been configured to not follow symlinks
outside of the directory where the build artifacts are stored.
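For illustration, here is a minimal TypeScript sketch of the token verification in sub-task 1, assuming the `jsonwebtoken` package (names and error handling are simplified; this is not the actual server code):
```
import * as jwt from 'jsonwebtoken';

interface UploadJwtPayload {
  'pull-request': number;
  slug: string;  // e.g. 'angular/angular'
}

function verifyUploadToken(authHeader: string, secret: string, repoSlug: string, pr: number): void {
  const token = authHeader.replace(/^Token +/i, '');
  // `verify()` checks the signature, the expiration time and the expected issuer.
  const payload = jwt.verify(token, secret, {issuer: 'Travis CI, GmbH'}) as UploadJwtPayload;
  if (payload.slug !== repoSlug) {
    throw new Error(`jwt slug invalid. expected: ${repoSlug}`);
  } else if (payload['pull-request'] !== pr) {
    throw new Error(`jwt pull-request invalid. expected: ${pr}`);
  }
}
```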
## Assumptions / Things to keep in mind
- Each trusted PR author has full control over the content that is uploaded for their PRs. Part of
the security model relies on the trustworthiness of these authors.
- If anyone gets access to the `PREVIEW_DEPLOYMENT_TOKEN` (a.k.a. `NGBUILDS_IO_KEY` on
angular/angular) variable generated for each Travis job, they will be able to impersonate the
corresponding PR's author on the preview server for as long as the token is valid (currently 90
mins). Because of this, the value of the `PREVIEW_DEPLOYMENT_TOKEN` should not be made publicly
accessible (e.g. by printing it on the Travis job log).
- Travis only allows specific whitelisted property names to be used with the JWT addon. The only
known such property at the time is `SAUCE_ACCESS_KEY` (used for integration with SauceLabs). In
order to be able to actually use the JWT addon we had to name the encrypted variable
`SAUCE_ACCESS_KEY` (which we later re-assign to `NGBUILDS_IO_KEY`).


@ -1,20 +0,0 @@
# VM setup - Attach persistent disk
## Create `aio-builds` persistent disk (if it does not already exist)
- Follow instructions [here](https://cloud.google.com/compute/docs/disks/add-persistent-disk#create_disk).
- `sudo mkfs.ext4 -F -E lazy_itable_init=0,lazy_journal_init=0,discard /dev/disk/by-id/google-aio-builds`
## Mount disk
- `sudo mkdir -p /mnt/disks/aio-builds`
- `sudo mount -o discard,defaults /dev/disk/by-id/google-aio-builds /mnt/disks/aio-builds`
- `sudo chmod a+w /mnt/disks/aio-builds`
## Mount disk on boot
- Run:
```
echo UUID=`sudo blkid -s UUID -o value /dev/disk/by-id/google-aio-builds` \
/mnt/disks/aio-builds ext4 discard,defaults,nofail 0 2 | sudo tee -a /etc/fstab
```


@ -1,32 +0,0 @@
# VM setup - Create docker image
## Checkout repository
- `git clone <repo-url>`
## Build docker image
- `<aio-builds-setup-dir>/scripts/create-image.sh [<name>[:<tag>] [--build-arg <NAME>=<value> ...]]`
- You can override the default environment variables inside the image by passing new values using
`--build-arg`.
**Note:** The script has to execute docker commands with `sudo`.
## Example
The following commands would create a docker image from GitHub repo `foo/bar` to be deployed on the
`foobar-builds.io` domain and accepting PR deployments from authors that are members of the
`bar-core` and `bar-docs-authors` teams of organization `foo`:
- `git clone https://github.com/foo/bar.git foobar`
- Run:
```
./foobar/aio-builds-setup/scripts/create-image.sh foobar-builds \
--build-arg AIO_REPO_SLUG=foo/bar \
--build-arg AIO_DOMAIN_NAME=foobar-builds.io \
--build-arg AIO_GITHUB_ORGANIZATION=foo \
--build-arg AIO_GITHUB_TEAM_SLUGS=bar-core,bar-docs-authors
```
A full list of the available environment variables can be found
[here](image-config--environment-variables.md).


@ -1,75 +0,0 @@
# VM setup - Create host directories and files
## Create directory with secrets
For security reasons, sensitive info (such as tokens and passwords) is not hardcoded into the
docker image, nor passed as environment variables at runtime. They are passed to the docker
container from the host VM as files inside a directory. Each file's name is the name of the variable
and the file content is the value. These are read from inside the running container when necessary.
More info on how to create `secrets` directory and files can be found
[here](vm-setup--set-up-secrets.md).
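In practice, the startup scripts read these files and expose the values to the Node.js scripts (e.g. as environment variables); the following TypeScript snippet only illustrates the one-file-per-variable layout (the `getSecret` helper is hypothetical):
```
import * as fs from 'fs';

// Each secret lives in `/aio-secrets/<NAME>`; the file content is the value.
const getSecret = (name: string): string =>
  fs.readFileSync(`/aio-secrets/${name}`, 'utf8').trim();

const githubToken = getSecret('GITHUB_TOKEN');
console.log(`GITHUB_TOKEN is ${githubToken ? 'set' : 'missing'}.`);
```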
## Create directory for build artifacts
The uploaded build artifacts should be kept in a directory outside the docker container, so it is
easier to replace the container without losing the uploaded builds. For portability across VMs a
persistent disk can be used (as described [here](vm-setup--attach-persistent-disk.md)).
**Note:** The directories created inside that directory will be owned by user `www-data`.
## Create SSL certificates (Optional for dev)
The host VM can attach a directory containing the SSL certificate and key to be used by the nginx
server for serving the uploaded build artifacts. More info on how to attach the directory when
starting the container can be found [here](vm-setup--start-docker-container.md).
In order for the container to be able to find the certificate and key, they should be named
`<DOMAIN_NAME>.crt` and `<DOMAIN_NAME>.key` respectively. For example, for a domain name
`ngbuilds.io`, nginx will look for files `ngbuilds.io.crt` and `ngbuilds.io.key`. For more info on how to
specify the domain name, see [here](vm-setup--create-docker-image.md).
If no directory is attached, nginx will use an internal self-signed certificate. This is convenient
during development, but is not suitable for production.
**Note:**
Since nginx needs to be able to serve requests for both the main domain as well as any subdomain
(e.g. `ngbuilds.io/` and `foo-bar.ngbuilds.io/`), the provided certificate needs to be a wildcard
certificate covering both the domain and subdomains.
## Create directory for logs (Optional)
Optionally, a logs directory can be passed to the docker container for storing non-system-related
logs. If not provided, the logs are kept locally on the container and will be lost whenever the
container is replaced (e.g. when updating to use a newer version of the docker image). Log files are
rotated and retained for 6 months.
The following log files are kept in this directory:
- `clean-up.log`:
Output of the `aio-clean-up` command, run as a cronjob for cleaning up the build artifacts of
closed PRs.
- `init.log`:
Output of the `aio-init` command, run (by default) when starting the container.
- `nginx/{access,error}.log`:
The access and error logs produced by the nginx server while serving "production" files.
- `nginx-test/{access,error}.log`:
The access and error logs produced by the nginx server while serving "test" files. This is only
used when running tests locally from inside the container, e.g. with the `aio-verify-setup`
command. (See [here](overview--scripts-and-commands.md) for more info.)
- `upload-server-{prod,test,verify-setup}-*.log`:
The logs produced by the Node.js upload-server while serving either:
- `-prod`: "Production" files (g.g during normal operation).
- `-test`: "Test" files (e.g. when a test instance is started with the `aio-upload-server-test`
command).
- `-verify-setup`: "Test" files, but while running `aio-verify-setup`.
(See [here](overview--scripts-and-commands.md) for more info on the commands mentioned above.)
- `verify-setup.log`:
The output of the `aio-verify-setup` command (e.g. Jasmine output), except for upload-server
output which is logged to `upload-server-verify-setup-*.log` (see above).


@ -1,35 +0,0 @@
# VM Setup - Set up docker
## Install docker
_Debian (jessie):_
- `sudo apt-get update`
- `sudo apt-get install -y apt-transport-https ca-certificates curl git software-properties-common`
- `curl -fsSL https://apt.dockerproject.org/gpg | sudo apt-key add -`
- `apt-key fingerprint 58118E89F3A912897C070ADBF76221572C52609D`
- `sudo add-apt-repository "deb https://apt.dockerproject.org/repo/ debian-$(lsb_release -cs) main"`
- `sudo apt-get update`
- `sudo apt-get -y install docker-engine`
_Ubuntu (16.04):_
- `sudo apt-get update`
- `sudo apt-get install -y curl git linux-image-extra-$(uname -r) linux-image-extra-virtual`
- `sudo apt-get install -y apt-transport-https ca-certificates`
- `curl -fsSL https://yum.dockerproject.org/gpg | sudo apt-key add -`
- `apt-key fingerprint 58118E89F3A912897C070ADBF76221572C52609D`
- `sudo add-apt-repository "deb https://apt.dockerproject.org/repo/ ubuntu-$(lsb_release -cs) main"`
- `sudo apt-get update`
- `sudo apt-get -y install docker-engine`
## Start the docker service
- `sudo service docker start`
## Test docker
- `sudo docker run hello-world`
## Start docker on boot
- `sudo systemctl enable docker`


@ -1,52 +0,0 @@
# VM Setup - Set up secrets
## Overview
Necessary secrets:
1. `GITHUB_TOKEN`
- Used for:
- Retrieving open PRs without rate-limiting.
- Retrieving PR author.
- Retrieving members of the `angular-core` team.
- Posting comments with preview links on PRs.
2. `PREVIEW_DEPLOYMENT_TOKEN`
- Used for:
- Decoding the JWT tokens received with `/create-build` requests.
**Note:**
`TEST_GITHUB_TOKEN` and `TEST_PREVIEW_DEPLOYMENT_TOKEN` can also be created similarly to their
non-TEST counterparts, and they will be loaded when running `aio-verify-setup`, but it is currently
not clear if/how they can be used in tests.
## Create secrets
1. `GITHUB_TOKEN`
- Visit https://github.com/settings/tokens.
- Generate new token with the `public_repo` scope.
2. `PREVIEW_DEPLOYMENT_TOKEN`
- Just generate a hard-to-guess character sequence.
- Add it to `.travis.yml` under `addons -> jwt -> secure`.
Can be added automatically with: `travis encrypt --add addons.jwt PREVIEW_DEPLOYMENT_TOKEN=<access-key>`
**Note:**
Due to [travis-ci/travis-ci#7223](https://github.com/travis-ci/travis-ci/issues/7223) it is not
currently possible to use the JWT addon (as described above) for anything other than the
`SAUCE_ACCESS_KEY` variable. You can get creative, though...
**WARNING**
To avoid arbitrary uploads, make sure the `PREVIEW_DEPLOYMENT_TOKEN` is NOT printed in the Travis log.
## Save secrets on the VM
- `sudo mkdir /aio-secrets`
- `sudo touch /aio-secrets/GITHUB_TOKEN`
- Insert `<github-token>` into `/aio-secrets/GITHUB_TOKEN`.
- `sudo touch /aio-secrets/PREVIEW_DEPLOYMENT_TOKEN`
- Insert `<access-token>` into `/aio-secrets/PREVIEW_DEPLOYMENT_TOKEN`.
- `sudo chmod 400 /aio-secrets/*`


@ -1,92 +0,0 @@
# VM setup - Start docker container
## The `docker run` command
Once everything has been set up and configured, a docker container can be started with the following
command:
```
sudo docker run \
--detach \
--dns 127.0.0.1 \
--name <instance-name> \
--publish 80:80 \
--publish 443:443 \
--restart unless-stopped \
[--volume <host-cert-dir>:/etc/ssl/localcerts:ro] \
--volume <host-secrets-dir>:/aio-secrets:ro \
--volume <host-builds-dir>:/var/www/aio-builds \
[--volume <host-logs-dir>:/var/log/aio] \
<name>[:<tag>]
```
Below is the same command with inline comments explaining each option. The API docs for `docker run`
can be found [here](https://docs.docker.com/engine/reference/run/).
```
sudo docker run \
# Start as a daemon.
--detach \
# Use the local DNS server.
# (This is necessary for mapping internal URLs, e.g. for the Node.js upload-server.)
--dns 127.0.0.1 \
# Use `<instance-name>` as an alias for the container.
# Useful for running `docker` commands, e.g.: `docker stop <instance-name>`
--name <instance-name> \
# Map ports of the host VM (left) to ports of the docker container (right)
--publish 80:80 \
--publish 443:443 \
# Automatically restart the container (unless it was explicitly stopped by the user).
# (This ensures that the container will be automatically started on boot.)
--restart unless-stopped \
# The directory that contains the SSL certificates.
# (See [here](vm-setup--create-host-dirs-and-files.md) for more info.)
# If not provided, the container will use self-signed certificates.
[--volume <host-cert-dir>:/etc/ssl/localcerts:ro] \
# The directory that contains the secrets (e.g. GitHub token, JWT secret, etc).
# (See [here](vm-setup--set-up-secrets.md) for more info.)
--volume <host-secrets-dir>:/aio-secrets:ro \
# The uploaded build artifacts will be stored in and served from this directory.
# (If you are using a persistent disk - as described [here](vm-setup--attach-persistent-disk.md) -
# this will be a directory inside the disk.)
--volume <host-builds-dir>:/var/www/aio-builds \
# The directory where the logs are being kept.
# (See [here](vm-setup--create-host-dirs-and-files.md) for more info.)
# If not provided, the logs will be kept inside the container, which means they will be lost
# whenever a new container is created.
[--volume <host-logs-dir>:/var/log/aio] \
# The name of the docker image to use (and an optional tag; defaults to `latest`).
# (See [here](vm-setup--create-docker-image.md) for instructions on how to create the image.)
<name>[:<tag>]
```
## Example
The following command would start a docker container based on the previously created `foobar-builds`
docker image, alias it as 'foobar-builds-1' and map predefined directories on the host VM to be used
by the container for accessing secrets and SSL certificates and keeping the build artifacts and logs.
```
sudo docker run \
--detach \
--dns 127.0.0.1 \
--name foobar-builds-1 \
--publish 80:80 \
--publish 443:443 \
--restart unless-stopped \
--volume /etc/ssl/localcerts:/etc/ssl/localcerts:ro \
--volume /foobar-secrets:/aio-secrets:ro \
--volume /mnt/disks/foobar-builds:/var/www/aio-builds \
--volume /foobar-logs:/var/log/aio \
foobar-builds
```


@ -1,52 +0,0 @@
# VM setup - Update docker container
## Overview
Assuming you have cloned the repository containing the preview server code (as described
[here](vm-setup--create-docker-image.md)), you can use the `update-preview-server.sh` script on the
VM host to update the preview server based on changes in the source code.
The script will pull the latest changes from the origin's master branch and examine if there have
been any changes in files inside the preview server source code directory (see below). If there are,
it will create a new image and verify that it works as expected. Finally, it will stop and remove
the old docker container and image, create a new container based on the new image and start it.
The script assumes that the preview server source code is in the repository's
`aio/aio-builds-setup/` directory and expects the following inputs:
- **$1**: `HOST_REPO_DIR`
- **$2**: `HOST_LOCALCERTS_DIR`
- **$3**: `HOST_SECRETS_DIR`
- **$4**: `HOST_BUILDS_DIR`
- **$5**: `HOST_LOGS_DIR`
See [here](vm-setup--create-host-dirs-and-files.md) for more info on what each input directory is
used for.
**Note 1:** The script has to execute docker commands with `sudo`.
**Note 2:** Make sure the user that executes the script has access to update the repository.
## Run the script manually
You may choose to manually run the script, when necessary. Example:
```
update-preview-server.sh \
/path/to/repo \
/path/to/localcerts \
/path/to/secrets \
/path/to/builds \
/path/to/logs
```
## Run the script automatically
You may choose to automatically trigger the script, e.g. using a cronjob. For example, the following
cronjob entry would run the script every 30 minutes and update the preview server (assuming the user has
the necessary permissions):
```
# Periodically check for changes and update the preview server (if necessary)
*/30 * * * * /path/to/update-preview-server.sh /path/to/repo /path/to/localcerts /path/to/secrets /path/to/builds /path/to/logs
```


@ -1,5 +0,0 @@
#!/bin/bash
readonly THIS_DIR=$(cd $(dirname $0); pwd)
readonly DOCKERBUILD_DIR="$THIS_DIR/../dockerbuild"
readonly SCRIPTS_JS_DIR="$DOCKERBUILD_DIR/scripts-js"


@ -1,10 +0,0 @@
#!/bin/bash
set -eux -o pipefail
# Set up env
source "`dirname $0`/_env.sh"
readonly defaultImageNameAndTag="aio-builds:latest"
# Create docker image
readonly nameAndOptionalTag=${1:-$defaultImageNameAndTag}
sudo docker build --tag $nameAndOptionalTag ${@:2} $DOCKERBUILD_DIR


@ -1,12 +0,0 @@
#!/bin/bash
set -eux -o pipefail
# Set up env
source "`dirname $0`/_env.sh"
# Test `scripts-js/`
(
cd "$SCRIPTS_JS_DIR"
yarn install
yarn test
)


@ -1,20 +0,0 @@
#!/bin/bash
set -eux -o pipefail
# Set up env
source "`dirname $0`/_env.sh"
# Build `scripts-js/`
(
cd "$SCRIPTS_JS_DIR"
yarn install
yarn build
)
# Preverify PR
AIO_GITHUB_ORGANIZATION="angular" \
AIO_GITHUB_TEAM_SLUGS="angular-core,aio-contributors" \
AIO_GITHUB_TOKEN=$(echo ${GITHUB_TEAM_MEMBERSHIP_CHECK_KEY} | rev) \
AIO_REPO_SLUG=$TRAVIS_REPO_SLUG \
AIO_PREVERIFY_PR=$TRAVIS_PULL_REQUEST \
node "$SCRIPTS_JS_DIR/dist/lib/upload-server/index-preverify-pr"


@ -1,70 +0,0 @@
#!/usr/bin/env bash
set -eux -o pipefail
exec 3>&1
echo "[`date`] - Updating the preview server..."
# Input
readonly HOST_REPO_DIR=$1
readonly HOST_LOCALCERTS_DIR=$2
readonly HOST_SECRETS_DIR=$3
readonly HOST_BUILDS_DIR=$4
readonly HOST_LOGS_DIR=$5
# Constants
readonly PROVISIONAL_IMAGE_NAME=aio-builds:provisional
readonly LATEST_IMAGE_NAME=aio-builds:latest
readonly CONTAINER_NAME=aio
# Run
(
cd "$HOST_REPO_DIR"
readonly lastDeployedCommit=$(git rev-parse HEAD)
echo "Currently at commit $lastDeployedCommit."
# Pull latest master from origin.
git pull origin master
# Do not update the server unless files inside `aio-builds-setup/` have changed
# or the last attempt failed (identified by the provisional image still being around).
readonly relevantChangedFilesCount=$(git diff --name-only $lastDeployedCommit...HEAD | grep -P "^aio/aio-builds-setup/" | wc -l)
readonly lastAttemptFailed=$(sudo docker rmi "$PROVISIONAL_IMAGE_NAME" >> /dev/fd/3 && echo "true" || echo "false")
if [[ $relevantChangedFilesCount -eq 0 ]] && [[ "$lastAttemptFailed" != "true" ]]; then
echo "Skipping update because no relevant files have been touched."
exit 0
fi
# Create and verify a new docker image.
aio/aio-builds-setup/scripts/create-image.sh "$PROVISIONAL_IMAGE_NAME"
readonly imageVerified=$(sudo docker run --dns 127.0.0.1 --rm --volume $HOST_SECRETS_DIR:/aio-secrets:ro "$PROVISIONAL_IMAGE_NAME" /bin/bash -c "aio-init && aio-health-check && aio-verify-setup" >> /dev/fd/3 && echo "true" || echo "false")
if [[ "$imageVerified" != "true" ]]; then
echo "Failed to verify new docker image. Aborting update!"
exit 1
fi
# Remove the old container and replace the docker image.
sudo docker stop "$CONTAINER_NAME" || true
sudo docker rm "$CONTAINER_NAME" || true
sudo docker rmi "$LATEST_IMAGE_NAME" || true
sudo docker tag "$PROVISIONAL_IMAGE_NAME" "$LATEST_IMAGE_NAME"
sudo docker rmi "$PROVISIONAL_IMAGE_NAME"
# Create and start a docker container based on the new image.
sudo docker run \
--detach \
--dns 127.0.0.1 \
--name "$CONTAINER_NAME" \
--publish 80:80 \
--publish 443:443 \
--restart unless-stopped \
--volume $HOST_LOCALCERTS_DIR:/etc/ssl/localcerts:ro \
--volume $HOST_SECRETS_DIR:/aio-secrets:ro \
--volume $HOST_BUILDS_DIR:/var/www/aio-builds \
--volume $HOST_LOGS_DIR:/var/log/aio \
"$LATEST_IMAGE_NAME"
echo "The new docker image has been successfully deployed."
)


@ -1,71 +0,0 @@
# boilerplate files
**/src/styles.css
**/src/systemjs-angular-loader.js
**/src/systemjs.config.js
**/src/tsconfig.json
**/bs-config.e2e.json
**/bs-config.json
**/package.json
**/tslint.json
**/karma.conf.js
**/karma-test-shim.js
**/browser-test-shim.js
**/node_modules
# built files
*.map
_test-output
protractor-helpers.js
*/e2e-spec.js
**/*.js
**/ts/**/*.js
**/js-es6*/**/*.js
dist/
# special
!/*
!*.1.*
!*.2.*
!*.3.*
*.1.js
*.2.js
*.3.js
*.1.js.map
*.2.js.map
*.3.js.map
!systemjs.config.*.js
!karma-test-shim.*.js
!copy-dist-files.js
# AngularJS files
!**/*.ajs.js
**/app/**/*.ajs.js
# aot
**/*.ngfactory.ts
**/*.ngsummary.json
**/*.ngsummary.ts
**/*.shim.ngstyle.ts
**/*.metadata.json
!aot/bs-config.json
!aot/index.html
!rollup-config.js
# testing
!testing/src/browser-test-shim.js
!testing/karma*.js
# TS to JS
!ts-to-js/js*/**/*.js
ts-to-js/js*/**/system*.js
# webpack
!webpack/**/config/*.js
!webpack/**/*webpack*.js
# styleguide
!styleguide/src/systemjs.custom.js
# plunkers
*plnkr.no-link.html


@ -1,115 +0,0 @@
'use strict'; // necessary for es6 output in node
import { browser, element, by } from 'protractor';
describe('AngularJS to Angular Quick Reference Tests', function () {
beforeAll(function () {
browser.get('');
});
it('should display no poster images after bootstrap', function () {
testImagesAreDisplayed(false);
});
it('should display proper movie data', function () {
// We check only a few samples
let expectedSamples: any[] = [
{row: 0, column: 0, element: 'img', attr: 'src', value: 'images/hero.png', contains: true},
{row: 0, column: 2, value: 'Celeritas'},
{row: 1, column: 3, matches: /Dec 1[678], 2015/}, // absorb timezone diff; we care about date format
{row: 1, column: 5, value: '$14.95'},
{row: 2, column: 4, value: 'PG-13'},
{row: 2, column: 7, value: '100%'},
{row: 2, column: 0, element: 'img', attr: 'src', value: 'images/ng-logo.png', contains: true},
];
// Go through the samples
let movieRows = getMovieRows();
for (let i = 0; i < expectedSamples.length; i++) {
let sample = expectedSamples[i];
let tableCell = movieRows.get(sample.row)
.all(by.tagName('td')).get(sample.column);
// Check the cell or its nested element
let elementToCheck = sample.element
? tableCell.element(by.tagName(sample.element))
: tableCell;
// Check element attribute or text
let valueToCheck = sample.attr
? elementToCheck.getAttribute(sample.attr)
: elementToCheck.getText();
// Test for equals/contains/match
if (sample.contains) {
expect(valueToCheck).toContain(sample.value);
} else if (sample.matches) {
expect(valueToCheck).toMatch(sample.matches);
} else {
expect(valueToCheck).toEqual(sample.value);
}
}
});
it('should display images after Show Poster', function () {
testPosterButtonClick('Show Poster', true);
});
it('should hide images after Hide Poster', function () {
testPosterButtonClick('Hide Poster', false);
});
it('should display no movie when no favorite hero is specified', function () {
testFavoriteHero(null, 'Please enter your favorite hero.');
});
it('should display no movie for Magneta', function () {
testFavoriteHero('Magneta', 'No movie, sorry!');
});
it('should display a movie for Mr. Nice', function () {
testFavoriteHero('Mr. Nice', 'Excellent choice!');
});
function testImagesAreDisplayed(isDisplayed: boolean) {
let expectedMovieCount = 3;
let movieRows = getMovieRows();
expect(movieRows.count()).toBe(expectedMovieCount);
for (let i = 0; i < expectedMovieCount; i++) {
let movieImage = movieRows.get(i).element(by.css('td > img'));
expect(movieImage.isDisplayed()).toBe(isDisplayed);
}
}
function testPosterButtonClick(expectedButtonText: string, isDisplayed: boolean) {
let posterButton = element(by.css('movie-list tr > th > button'));
expect(posterButton.getText()).toBe(expectedButtonText);
posterButton.click().then(function () {
testImagesAreDisplayed(isDisplayed);
});
}
function getMovieRows() {
return element.all(by.css('movie-list tbody > tr'));
}
function testFavoriteHero(heroName: string, expectedLabel: string) {
let movieListComp = element(by.tagName('movie-list'));
let heroInput = movieListComp.element(by.tagName('input'));
let favoriteHeroLabel = movieListComp.element(by.tagName('h3'));
let resultLabel = movieListComp.element(by.css('span > p'));
heroInput.clear().then(function () {
heroInput.sendKeys(heroName || '');
expect(resultLabel.getText()).toBe(expectedLabel);
if (heroName) {
expect(favoriteHeroLabel.isDisplayed()).toBe(true);
expect(favoriteHeroLabel.getText()).toContain(heroName);
} else {
expect(favoriteHeroLabel.isDisplayed()).toBe(false);
}
});
}
});

@@ -1,10 +0,0 @@
{
  "description": "AngularJS to Angular Quick Reference",
  "basePath": "src/",
  "files": [
    "!**/*.d.ts",
    "!**/*.js",
    "!**/*.[1].*"
  ],
  "tags": ["cookbook", "angularjs"]
}

@@ -1,16 +0,0 @@
// #docregion
import { NgModule } from '@angular/core';
import { Routes, RouterModule } from '@angular/router';

import { MovieListComponent } from './movie-list.component';

const routes: Routes = [
  { path: '', redirectTo: '/movies', pathMatch: 'full' },
  { path: 'movies', component: MovieListComponent }
];

@NgModule({
  imports: [RouterModule.forRoot(routes)],
  exports: [RouterModule]
})
export class AppRoutingModule {}

@@ -1,9 +0,0 @@
.active {font-style: italic;}
.shazam {font-weight: bold;}
img {height: 100px;}
table td {
  padding: 4px;
  border: 1px solid #e0e0e0;
}

@@ -1,112 +0,0 @@
<!-- #docplaster -->
<h1>{{title}}</h1>

<h3>Routed Movies</h3>
<nav>
  <!-- #docregion router-link -->
  <a [routerLink]="['/movies']">Movies</a>
  <!-- #enddocregion router-link -->
</nav>
<router-outlet></router-outlet>

<hr>
<h1>Example Snippets</h1>

<!-- #docregion ngClass -->
<div [ngClass]="{active: isActive}">
<!-- #enddocregion ngClass -->
  [ngClass] active
</div>

<!-- #docregion ngClass -->
<div [ngClass]="{active: isActive,
                 shazam: isImportant}">
<!-- #enddocregion ngClass -->
  [ngClass] active and boldly important
</div>

<!-- #docregion ngClass -->
<div [class.active]="isActive">
<!-- #enddocregion ngClass -->
  [class.active]
</div>

<p></p>
<!-- #docregion href -->
<a [href]="angularDocsUrl">Angular Docs</a>
<!-- #enddocregion href -->

<p></p>
<div>
  <!-- #docregion event-binding -->
  <button (click)="toggleImage()">
  <!-- #enddocregion event-binding -->
    Image Toggle #1</button>
  <!-- #docregion event-binding -->
  <button (click)="toggleImage($event)">
  <!-- #enddocregion event-binding -->
    Image Toggle #2</button>
  <p>Image toggle event type was {{eventType}}</p>
</div>

<p></p>
<div *ngIf="showImage">
  <!-- #docregion src -->
  <img [src]="movie.imageurl">
  <!-- #enddocregion src -->
</div>

<p></p>
<!-- #docregion ngStyle -->
<div [ngStyle]="{color: colorPreference}">
<!-- #enddocregion ngStyle -->
  color preference #1
</div>

<!-- #docregion ngStyle -->
<div [style.color]="colorPreference">
<!-- #enddocregion ngStyle -->
  color preference #2
</div>

<h3>Movie as JSON</h3>
<!-- #docregion json -->
<pre>{{movie | json}}</pre>
<!-- #enddocregion json -->

<h3>Movie Titles via local variable</h3>
<table>
  <!-- #docregion local -->
  <tr *ngFor="let movie of movies">
    <td>{{movie.title}}</td>
  </tr>
  <!-- #enddocregion local -->
</table>

<h3>Sliced Movies with pipes</h3>
<table>
  <!-- #docregion slice -->
  <tr *ngFor="let movie of movies | slice:0:2">
  <!-- #enddocregion slice -->
    <!-- #docregion uppercase -->
    <td>{{movie.title | uppercase}}</td>
    <!-- #enddocregion uppercase -->
    <!-- #docregion lowercase -->
    <td>{{movie.title | lowercase}}</td>
    <!-- #enddocregion lowercase -->
    <!-- #docregion date -->
    <td>{{movie.releaseDate | date}}</td>
    <!-- #enddocregion date -->
    <!-- #docregion currency -->
    <td>{{movie.price | currency:'USD':true}}</td>
    <!-- #enddocregion currency -->
    <!-- #docregion number -->
    <td>{{movie.starRating | number}}</td>
    <td>{{movie.starRating | number:'1.1-2'}}</td>
    <td>{{movie.approvalRating | percent: '1.0-2'}}</td>
    <!-- #enddocregion number -->
  </tr>
</table>

@@ -1,32 +0,0 @@
import { Component } from '@angular/core';

import { MovieService } from './movie.service';
import { IMovie } from './movie';

@Component({
  selector: 'my-app',
  templateUrl: './app.component.html',
  styleUrls: [ './app.component.css' ],
  providers: [ MovieService ]
})
export class AppComponent {
  angularDocsUrl = 'https://angular.io/';
  colorPreference = 'red';
  eventType = '<not clicked yet>';
  isActive = true;
  isImportant = true;
  movie: IMovie = null;
  movies: IMovie[] = [];
  showImage = true;
  title = 'AngularJS to Angular Quick Ref Cookbook';

  toggleImage(event: UIEvent) {
    this.showImage = !this.showImage;
    this.eventType = (event && event.type) || 'not provided';
  }

  constructor(movieService: MovieService) {
    this.movies = movieService.getMovies();
    this.movie = this.movies[0];
  }
}

Some files were not shown because too many files have changed in this diff.