
refactor: relocate verdaccio-aws-storage plugin (#1977)

Juan Picado 2020-10-23 23:58:39 +02:00
parent 4024205829
commit eb686fbcaf
26 changed files with 2648 additions and 0 deletions

@@ -0,0 +1,3 @@
{
"extends": "../../../.babelrc"
}

@@ -0,0 +1,5 @@
{
"rules": {
"jest/no-mocks-import": 0
}
}

@@ -0,0 +1,329 @@
# Change Log
All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.
## [9.7.2](https://github.com/verdaccio/monorepo/compare/v9.7.1...v9.7.2) (2020-07-20)
**Note:** Version bump only for package verdaccio-aws-s3-storage
## [9.7.1](https://github.com/verdaccio/monorepo/compare/v9.7.0...v9.7.1) (2020-07-10)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [9.7.0](https://github.com/verdaccio/monorepo/compare/v9.6.1...v9.7.0) (2020-06-24)
**Note:** Version bump only for package verdaccio-aws-s3-storage
## [9.6.1](https://github.com/verdaccio/monorepo/compare/v9.6.0...v9.6.1) (2020-06-07)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [9.6.0](https://github.com/verdaccio/monorepo/compare/v9.5.1...v9.6.0) (2020-06-07)
### Features
* allow providing session token in config ([#362](https://github.com/verdaccio/monorepo/issues/362)) ([acef36f](https://github.com/verdaccio/monorepo/commit/acef36f99c9028742bf417ee9879ed80bfbb7a8d))
# [9.5.0](https://github.com/verdaccio/monorepo/compare/v9.4.1...v9.5.0) (2020-05-02)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [9.4.0](https://github.com/verdaccio/monorepo/compare/v9.3.4...v9.4.0) (2020-03-21)
**Note:** Version bump only for package verdaccio-aws-s3-storage
## [9.3.2](https://github.com/verdaccio/monorepo/compare/v9.3.1...v9.3.2) (2020-03-08)
### Bug Fixes
* update dependencies ([#332](https://github.com/verdaccio/monorepo/issues/332)) ([b6165ae](https://github.com/verdaccio/monorepo/commit/b6165aea9b7e4012477081eae68bfa7159c58f56))
## [9.3.1](https://github.com/verdaccio/monorepo/compare/v9.3.0...v9.3.1) (2020-02-23)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [9.3.0](https://github.com/verdaccio/monorepo/compare/v9.2.0...v9.3.0) (2020-01-29)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [9.2.0](https://github.com/verdaccio/monorepo/compare/v9.1.0...v9.2.0) (2020-01-28)
### Features
* **verdaccio-aws-s3-storage:** Allow endpoint to be configurable ([#319](https://github.com/verdaccio/monorepo/issues/319)) ([1191dcd](https://github.com/verdaccio/monorepo/commit/1191dcd829b7d9f2dd0b4fab4910f4dc9d697565))
# [9.1.0](https://github.com/verdaccio/monorepo/compare/v9.0.0...v9.1.0) (2020-01-25)
### Features
* **verdaccio-aws-s3-storage:** separate s3 subfolders (key prefix for different packages) ([#313](https://github.com/verdaccio/monorepo/issues/313)) ([6639a71](https://github.com/verdaccio/monorepo/commit/6639a71c2d2056f93e913c71e27b4453acb029aa))
* **verdaccio-aws-s3-storage:** supporting environment variables ([#315](https://github.com/verdaccio/monorepo/issues/315)) ([0c532f0](https://github.com/verdaccio/monorepo/commit/0c532f0198aba786a3292e866e7a2d933a06d2fa))
# [9.0.0](https://github.com/verdaccio/monorepo/compare/v8.5.3...v9.0.0) (2020-01-07)
### chore
* update dependencies ([68add74](https://github.com/verdaccio/monorepo/commit/68add743159867f678ddb9168d2bc8391844de47))
### Features
* **eslint-config:** enable eslint curly ([#308](https://github.com/verdaccio/monorepo/issues/308)) ([91acb12](https://github.com/verdaccio/monorepo/commit/91acb121847018e737c21b367fcaab8baa918347))
### BREAKING CHANGES
* @verdaccio/eslint-config requires ESLint >=6.8.0 and Prettier >=1.19.1 to fix compatibility with overrides.extends config
## [8.5.3](https://github.com/verdaccio/monorepo/compare/v8.5.2...v8.5.3) (2019-12-27)
### Bug Fixes
* verdaccio/verdaccio/issues/1435 ([#289](https://github.com/verdaccio/monorepo/issues/289)) ([7a130ca](https://github.com/verdaccio/monorepo/commit/7a130ca0281ac2a008091753341baae4f17fb71a)), closes [/github.com/verdaccio/verdaccio/issues/1435#issuecomment-559977118](https://github.com//github.com/verdaccio/verdaccio/issues/1435/issues/issuecomment-559977118)
## [8.5.2](https://github.com/verdaccio/monorepo/compare/v8.5.1...v8.5.2) (2019-12-25)
### Bug Fixes
* add types for storage handler ([#307](https://github.com/verdaccio/monorepo/issues/307)) ([c35746e](https://github.com/verdaccio/monorepo/commit/c35746ebba071900db172608dedff66a7d27c23d))
## [8.5.1](https://github.com/verdaccio/monorepo/compare/v8.5.0...v8.5.1) (2019-12-24)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [8.5.0](https://github.com/verdaccio/monorepo/compare/v8.4.2...v8.5.0) (2019-12-22)
**Note:** Version bump only for package verdaccio-aws-s3-storage
## [8.4.2](https://github.com/verdaccio/monorepo/compare/v8.4.1...v8.4.2) (2019-11-23)
**Note:** Version bump only for package verdaccio-aws-s3-storage
## [8.4.1](https://github.com/verdaccio/monorepo/compare/v8.4.0...v8.4.1) (2019-11-22)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [8.4.0](https://github.com/verdaccio/monorepo/compare/v8.3.0...v8.4.0) (2019-11-22)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [8.3.0](https://github.com/verdaccio/monorepo/compare/v8.2.0...v8.3.0) (2019-10-27)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [8.2.0](https://github.com/verdaccio/monorepo/compare/v8.2.0-next.0...v8.2.0) (2019-10-23)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [8.2.0-next.0](https://github.com/verdaccio/monorepo/compare/v8.1.4...v8.2.0-next.0) (2019-10-08)
### Bug Fixes
* fixed lint errors ([5e677f7](https://github.com/verdaccio/monorepo/commit/5e677f7))
## [8.1.2](https://github.com/verdaccio/monorepo/compare/v8.1.1...v8.1.2) (2019-09-29)
**Note:** Version bump only for package verdaccio-aws-s3-storage
## [8.1.1](https://github.com/verdaccio/monorepo/compare/v8.1.0...v8.1.1) (2019-09-26)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [8.1.0](https://github.com/verdaccio/monorepo/compare/v8.0.1-next.1...v8.1.0) (2019-09-07)
### Features
* **verdaccio-aws-s3-storage:** update @verdaccio/types and add new required methods ([f39b7a2](https://github.com/verdaccio/monorepo/commit/f39b7a2))
## [8.0.1-next.1](https://github.com/verdaccio/monorepo/compare/v8.0.1-next.0...v8.0.1-next.1) (2019-08-29)
**Note:** Version bump only for package verdaccio-aws-s3-storage
## [8.0.1-next.0](https://github.com/verdaccio/monorepo/compare/v8.0.0...v8.0.1-next.0) (2019-08-29)
### Bug Fixes
* **package:** update aws-sdk to version 2.516.0 ([82f7117](https://github.com/verdaccio/monorepo/commit/82f7117))
* **package:** update aws-sdk to version 2.517.0 ([39183eb](https://github.com/verdaccio/monorepo/commit/39183eb))
* **package:** update aws-sdk to version 2.518.0 ([c4f18a6](https://github.com/verdaccio/monorepo/commit/c4f18a6))
# [8.0.0](https://github.com/verdaccio/verdaccio-aws-s3-storage/compare/v8.0.0-next.4...v8.0.0) (2019-08-22)
### Bug Fixes
* **package:** update aws-sdk to version 2.514.0 ([16860e6](https://github.com/verdaccio/verdaccio-aws-s3-storage/commit/16860e6))
* **package:** update aws-sdk to version 2.515.0 ([eed8547](https://github.com/verdaccio/verdaccio-aws-s3-storage/commit/eed8547))
# [8.0.0-next.4](https://github.com/verdaccio/verdaccio-aws-s3-storage/compare/v8.0.0-next.3...v8.0.0-next.4) (2019-08-18)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# [8.0.0-next.2](https://github.com/verdaccio/verdaccio-aws-s3-storage/compare/v8.0.0-next.1...v8.0.0-next.2) (2019-08-03)
**Note:** Version bump only for package verdaccio-aws-s3-storage
# Changelog
All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.
### [0.1.2](https://github.com/verdaccio/verdaccio-aws-s3-storage/compare/v0.1.1...v0.1.2) (2019-07-15)
### Build System
* update dependencies @verdaccio/commons-api ([151e4df](https://github.com/verdaccio/verdaccio-aws-s3-storage/commit/151e4df))
### [0.1.1](https://github.com/verdaccio/verdaccio-aws-s3-storage/compare/v0.1.0...v0.1.1) (2019-07-12)
### Build System
* update dependencies ([7a7c3b7](https://github.com/verdaccio/verdaccio-aws-s3-storage/commit/7a7c3b7))
## 0.1.0 (2019-06-25)
### Features
* add aws s3 plugin in typescrip ([2e4df1d](https://github.com/verdaccio/verdaccio-aws-s3-storage/commit/2e4df1d))
* add logging ([#5](https://github.com/verdaccio/verdaccio-aws-s3-storage/issues/5)) ([16b9e0f](https://github.com/verdaccio/verdaccio-aws-s3-storage/commit/16b9e0f))

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2019 Verdaccio
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

@@ -0,0 +1,124 @@
# verdaccio-aws-s3-storage
📦 AWS S3 storage plugin for Verdaccio
[![verdaccio (latest)](https://img.shields.io/npm/v/verdaccio-aws-s3-storage/latest.svg)](https://www.npmjs.com/package/verdaccio-aws-s3-storage)
[![CircleCI](https://circleci.com/gh/verdaccio/verdaccio-aws-s3-storage/tree/master.svg?style=svg)](https://circleci.com/gh/verdaccio/verdaccio-aws-s3-storage/tree/master)
[![Known Vulnerabilities](https://snyk.io/test/github/verdaccio/verdaccio-aws-s3-storage/badge.svg?targetFile=package.json)](https://snyk.io/test/github/verdaccio/verdaccio-aws-s3-storage?targetFile=package.json)
[![codecov](https://codecov.io/gh/verdaccio/verdaccio-aws-s3-storage/branch/master/graph/badge.svg)](https://codecov.io/gh/verdaccio/verdaccio-aws-s3-storage)
[![backers](https://opencollective.com/verdaccio/tiers/backer/badge.svg?label=Backer&color=brightgreen)](https://opencollective.com/verdaccio)
[![discord](https://img.shields.io/discord/388674437219745793.svg)](http://chat.verdaccio.org/)
![MIT](https://img.shields.io/github/license/mashape/apistatus.svg)
[![node](https://img.shields.io/node/v/verdaccio-aws-s3-storage/latest.svg)](https://www.npmjs.com/package/verdaccio-aws-s3-storage)
[![Twitter followers](https://img.shields.io/twitter/follow/verdaccio_npm.svg?style=social&label=Follow)](https://twitter.com/verdaccio_npm)
[![Github](https://img.shields.io/github/stars/verdaccio/verdaccio.svg?style=social&label=Stars)](https://github.com/verdaccio/verdaccio/stargazers)
[![backers](https://opencollective.com/verdaccio/tiers/backer/badge.svg?label=Backer&color=brightgreen)](https://opencollective.com/verdaccio)
[![stackshare](https://img.shields.io/badge/Follow%20on-StackShare-blue.svg?logo=stackshare&style=flat)](https://stackshare.io/verdaccio)
> This plugin is a fork of [`verdaccio-s3-storage`](https://github.com/Remitly/verdaccio-s3-storage), rewritten in TypeScript with further features added over time.
> The two plugins may have diverged in behaviour since then; we recommend the AWS plugin in this repository, since it is
> maintained by the Verdaccio community and constantly updated.
## See it in action
- Test on [Docker + LocalStack + Verdaccio 4 + S3 Plugin example](https://github.com/verdaccio/docker-examples/tree/master/amazon-s3-docker-example).
- Using `docker-compose` in this repo, based on [**verdaccio-minio**](https://github.com/barolab/verdaccio-minio), developed by [barolab](https://github.com/barolab).
- Feel free to propose new ways to run this plugin.
### Basic Requirements
- AWS Account (in case you are using the cloud)
- Verdaccio server (4.0) (for 3.x use `verdaccio-s3-storage` instead)
```
npm install -g verdaccio
```
## Usage
```
npm install verdaccio-aws-s3-storage
```
The plugin pulls AWS credentials from your environment.
In your Verdaccio config, configure the store as follows:
```yaml
store:
  aws-s3-storage:
    bucket: your-s3-bucket
    keyPrefix: some-prefix # optional, has the effect of nesting all files in a subdirectory
    region: us-west-2 # optional, will use the AWS S3 default behavior if not specified
    endpoint: https://{service}.{region}.amazonaws.com # optional, will use the AWS S3 default behavior if not specified
    s3ForcePathStyle: false # optional, set to true to force path-style URLs for S3 objects
    accessKeyId: your-access-key-id # optional, AWS accessKeyId for a private S3 bucket
    secretAccessKey: your-secret-access-key # optional, AWS secretAccessKey for a private S3 bucket
    sessionToken: your-session-token # optional, AWS sessionToken for a private S3 bucket
```
For the following options, the configured value can be either the literal value or the name of an environment variable that contains the value:
- `bucket`
- `keyPrefix`
- `region`
- `endpoint`
- `accessKeyId`
- `secretAccessKey`
- `sessionToken`
```yaml
store:
  aws-s3-storage:
    bucket: S3_BUCKET # if an environment variable named S3_BUCKET is set, its value is used; otherwise the bucket is literally named 'S3_BUCKET'
    keyPrefix: S3_KEY_PREFIX # if an environment variable named S3_KEY_PREFIX is set, its value is used; otherwise the key prefix is literally 'S3_KEY_PREFIX'
    endpoint: S3_ENDPOINT # if an environment variable named S3_ENDPOINT is set, its value is used; otherwise the endpoint is literally 'S3_ENDPOINT'
    ...
```
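For reference, the resolution rule is small; a minimal sketch (the actual helper shipped in this commit is `setConfigValue`, shown further down, and the variable names here are only examples):

```ts
// Sketch: if an environment variable with the configured name exists, its value
// wins; otherwise the literal string from the config file is used as-is.
const resolve = (configValue: string): string => process.env[configValue] || configValue;

// With `export S3_BUCKET=my-registry-bucket`, `bucket: S3_BUCKET` resolves to
// 'my-registry-bucket'; without it, the bucket is literally named 'S3_BUCKET'.
const bucket = resolve('S3_BUCKET');
```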
Store properties can also be defined per package; the `storage` value corresponds to a folder inside the S3 bucket:
```yaml
packages:
  '@scope/*':
    access: $all
    publish: $all
    storage: 'scoped'
  '**':
    access: $all
    publish: $all
    proxy: npmjs
    storage: 'public'
```
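With the configuration above, the plugin composes S3 object keys from the key prefix, the per-package `storage` folder, and the package name. A sketch of the resulting layout (bucket contents and names are placeholders):

```ts
// Illustrative only: how the S3 object key for a package manifest is composed.
// keyPrefix comes from the store config, storage from the matched packages section.
const keyPrefix = 'some-prefix/';
const storage = 'scoped/'; // packages['@scope/*'].storage, with a trailing slash added
const packageName = '@scope/example'; // hypothetical package name

// => 'some-prefix/scoped/@scope/example/package.json'
const manifestKey = `${keyPrefix}${storage}${packageName}/package.json`;
```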
## Developer Testing
For local testing, this project is self-sufficient. The four main ingredients are:
- `config.yaml`, see the [verdaccio documentation](https://verdaccio.org/docs/en/configuration.html)
- the provided Dockerfile, which lets you test the plugin without running the main Verdaccio application
- the provided docker-compose setup, which also runs MinIO as a local substitute for the S3 backend
- a `registry.envs` file with the content below; it does not exist in the repo and has to be created manually after cloning the project:
```
AWS_ACCESS_KEY_ID=foobar
AWS_SECRET_ACCESS_KEY=1234567e
AWS_DEFAULT_REGION=eu-central-1
AWS_S3_ENDPOINT=https://localhost:9000/
AWS_S3_PATH_STYLE=true
```
## Execute the docker image for testing
> You need a recent Docker installed on your computer.
```bash
docker-compose up
```
> By default no bucket is created. **You might need to browse `http://127.0.0.1:9000/minio/` and manually create
> the example bucket named `rise`**, then restart `docker-compose up`.
The default values should work out of the box. If you change anything, make sure the corresponding variables are
updated in the other configuration files as well.
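If you prefer not to use the web UI, the bucket can also be created programmatically with the same `aws-sdk` the plugin depends on. A minimal sketch, assuming the credentials from `registry.envs` and a local MinIO endpoint (adjust both to your setup):

```ts
import { S3 } from 'aws-sdk';

// Assumes the local MinIO endpoint and the credentials from registry.envs above.
const s3 = new S3({
  endpoint: 'http://127.0.0.1:9000',
  accessKeyId: 'foobar',
  secretAccessKey: '1234567e',
  s3ForcePathStyle: true,
});

// Create the example bucket expected by the docker-compose setup.
s3.createBucket({ Bucket: 'rise' }, (err) => {
  if (err) {
    console.error('bucket creation failed', err);
  } else {
    console.log('bucket "rise" created');
  }
});
```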

@@ -0,0 +1,5 @@
const config = require('../../../jest/config');
module.exports = Object.assign({}, config, {
collectCoverage: true,
});

@@ -0,0 +1,45 @@
{
"name": "verdaccio-aws-s3-storage",
"version": "10.0.0-beta",
"description": "AWS S3 storage implementation for Verdaccio",
"keywords": [
"verdaccio",
"plugin",
"storage",
"aws"
],
"author": "Juan Picado <juanpicado19@gmail.com>",
"license": "MIT",
"homepage": "https://verdaccio.org",
"repository": {
"type": "https",
"url": "https://github.com/verdaccio/verdaccio",
"directory": "packages/plugins/aws-storage"
},
"bugs": {
"url": "https://github.com/verdaccio/verdaccio/issues"
},
"main": "build/index.js",
"types": "build/index.d.ts",
"dependencies": {
"@verdaccio/commons-api": "workspace:*",
"@verdaccio/streams": "workspace:*",
"aws-sdk": "^2.607.0"
},
"devDependencies": {
"@verdaccio/types": "workspace:*",
"recursive-readdir": "2.2.2"
},
"scripts": {
"clean": "rimraf ./build",
"type-check": "tsc --noEmit -p tsconfig.build.json",
"build:types": "tsc --emitDeclarationOnly -p tsconfig.build.json",
"build:js": "babel src/ --out-dir build/ --copy-files --extensions \".ts,.tsx\" --source-maps",
"build": "pnpm run build:js && pnpm run build:types",
"test": "cross-env NODE_ENV=test BABEL_ENV=test jest"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/verdaccio"
}
}

@@ -0,0 +1,3 @@
export default (path?: string): string => {
return path != null ? (path.endsWith('/') ? path : `${path}/`) : '';
};

@@ -0,0 +1,12 @@
import { Config } from '@verdaccio/types';
export interface S3Config extends Config {
bucket: string;
keyPrefix: string;
endpoint?: string;
region?: string;
s3ForcePathStyle?: boolean;
accessKeyId?: string;
secretAccessKey?: string;
sessionToken?: string;
}
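For orientation, the `aws-s3-storage` section of the Verdaccio YAML maps onto the extra fields of this interface. A sketch with placeholder values (not a full `Config`):

```ts
import { S3Config } from './config';

// Placeholder values only: the plugin-specific part of an S3Config, i.e. what
// Verdaccio merges in from config.store['aws-s3-storage'].
const s3Settings: Pick<S3Config, 'bucket' | 'keyPrefix' | 'region' | 'endpoint' | 's3ForcePathStyle'> = {
  bucket: 'your-s3-bucket',
  keyPrefix: 'some-prefix/',
  region: 'us-west-2',
  endpoint: 'https://s3.us-west-2.amazonaws.com',
  s3ForcePathStyle: false,
};
```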

@@ -0,0 +1,39 @@
import { S3 } from 'aws-sdk';
import { convertS3Error, create404Error } from './s3Errors';
interface DeleteKeyPrefixOptions {
Bucket: string;
Prefix: string;
}
export function deleteKeyPrefix(
s3: S3,
options: DeleteKeyPrefixOptions,
callback: (err: Error | null) => void
): void {
s3.listObjectsV2(options, (err, data) => {
if (err) {
callback(convertS3Error(err));
} else if (data.KeyCount) {
const objectsToDelete: S3.ObjectIdentifierList = data.Contents
? data.Contents.map((s3Object) => ({ Key: s3Object.Key as S3.ObjectKey }))
: [];
s3.deleteObjects(
{
Bucket: options.Bucket,
Delete: { Objects: objectsToDelete },
},
(err) => {
if (err) {
callback(convertS3Error(err));
} else {
callback(null);
}
}
);
} else {
callback(create404Error());
}
});
}
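A usage sketch for this helper, wrapping the callback API in a Promise the same way the tests in this commit do (bucket and prefix are placeholders):

```ts
import { S3 } from 'aws-sdk';
import { deleteKeyPrefix } from './deleteKeyPrefix';

// Remove every object stored under a given prefix, e.g. when a package is removed.
function removeAllUnderPrefix(s3: S3, bucket: string, prefix: string): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    deleteKeyPrefix(s3, { Bucket: bucket, Prefix: prefix }, (err) => (err ? reject(err) : resolve()));
  });
}

// removeAllUnderPrefix(new S3(), 'my-bucket', 'some-prefix/@scope/example');
```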

@@ -0,0 +1,264 @@
import {
LocalStorage,
Logger,
Config,
Callback,
IPluginStorage,
PluginOptions,
Token,
TokenFilter,
} from '@verdaccio/types';
import { getInternalError, VerdaccioError, getServiceUnavailable } from '@verdaccio/commons-api';
import { S3 } from 'aws-sdk';
import { S3Config } from './config';
import S3PackageManager from './s3PackageManager';
import { convertS3Error, is404Error } from './s3Errors';
import addTrailingSlash from './addTrailingSlash';
import setConfigValue from './setConfigValue';
export default class S3Database implements IPluginStorage<S3Config> {
public logger: Logger;
public config: S3Config;
private s3: S3;
private _localData: LocalStorage | null;
public constructor(config: Config, options: PluginOptions<S3Config>) {
this.logger = options.logger;
// merge the plugin settings from `store['aws-s3-storage']` into the top-level config
if (!config) {
throw new Error('s3 storage missing config. Add `store.aws-s3-storage` to your config file');
}
this.config = Object.assign(config, config.store['aws-s3-storage']);
if (!this.config.bucket) {
throw new Error('s3 storage requires a bucket');
}
this.config.bucket = setConfigValue(this.config.bucket);
this.config.keyPrefix = setConfigValue(this.config.keyPrefix);
this.config.endpoint = setConfigValue(this.config.endpoint);
this.config.region = setConfigValue(this.config.region);
this.config.accessKeyId = setConfigValue(this.config.accessKeyId);
this.config.secretAccessKey = setConfigValue(this.config.secretAccessKey);
this.config.sessionToken = setConfigValue(this.config.sessionToken);
const configKeyPrefix = this.config.keyPrefix;
this._localData = null;
this.config.keyPrefix = addTrailingSlash(configKeyPrefix);
this.logger.debug(
{ config: JSON.stringify(this.config, null, 4) },
's3: configuration: @{config}'
);
this.s3 = new S3({
endpoint: this.config.endpoint,
region: this.config.region,
s3ForcePathStyle: this.config.s3ForcePathStyle,
accessKeyId: this.config.accessKeyId,
secretAccessKey: this.config.secretAccessKey,
sessionToken: this.config.sessionToken,
});
}
public async getSecret(): Promise<string> {
return Promise.resolve((await this._getData()).secret);
}
public async setSecret(secret: string): Promise<void> {
(await this._getData()).secret = secret;
await this._sync();
}
public add(name: string, callback: Callback): void {
this.logger.debug({ name }, 's3: [add] private package @{name}');
this._getData().then(async (data) => {
if (data.list.indexOf(name) === -1) {
data.list.push(name);
this.logger.trace({ name }, 's3: [add] @{name} has been added');
try {
await this._sync();
callback(null);
} catch (err) {
callback(err);
}
} else {
callback(null);
}
});
}
public async search(onPackage: Function, onEnd: Function): Promise<void> {
this.logger.debug('s3: [search]');
const storage = await this._getData();
const storageInfoMap = storage.list.map(this._fetchPackageInfo.bind(this, onPackage));
this.logger.debug({ l: storageInfoMap.length }, 's3: [search] storageInfoMap length is @{l}');
await Promise.all(storageInfoMap);
onEnd();
}
private async _fetchPackageInfo(onPackage: Function, packageName: string): Promise<void> {
const { bucket, keyPrefix } = this.config;
this.logger.debug({ packageName }, 's3: [_fetchPackageInfo] @{packageName}');
this.logger.trace(
{ keyPrefix, bucket },
's3: [_fetchPackageInfo] bucket: @{bucket} prefix: @{keyPrefix}'
);
return new Promise((resolve): void => {
this.s3.headObject(
{
Bucket: bucket,
Key: `${keyPrefix + packageName}/package.json`,
},
(err, response) => {
if (err) {
this.logger.debug({ err }, 's3: [_fetchPackageInfo] error: @{err}');
return resolve();
}
if (response.LastModified) {
const { LastModified } = response;
this.logger.trace(
{ LastModified },
's3: [_fetchPackageInfo] LastModified: @{LastModified}'
);
return onPackage(
{
name: packageName,
path: packageName,
time: LastModified.getTime(),
},
resolve
);
}
resolve();
}
);
});
}
public remove(name: string, callback: Callback): void {
this.logger.debug({ name }, 's3: [remove] @{name}');
this.get(async (err, data) => {
if (err) {
this.logger.error({ err }, 's3: [remove] error: @{err}');
callback(getInternalError('something went wrong on remove a package'));
}
const pkgName = data.indexOf(name);
if (pkgName !== -1) {
const data = await this._getData();
data.list.splice(pkgName, 1);
this.logger.debug({ pkgName }, 's3: [remove] successfully removed @{pkgName}');
}
try {
this.logger.trace('s3: [remove] starting sync');
await this._sync();
this.logger.trace('s3: [remove] finish sync');
callback(null);
} catch (err) {
this.logger.error({ err }, 's3: [remove] sync error: @{err}');
callback(err);
}
});
}
public get(callback: Callback): void {
this.logger.debug('s3: [get]');
this._getData().then((data) => callback(null, data.list));
}
// Create/write database file to s3
private async _sync(): Promise<void> {
await new Promise((resolve, reject): void => {
const { bucket, keyPrefix } = this.config;
this.logger.debug(
{ keyPrefix, bucket },
's3: [_sync] bucket: @{bucket} prefix: @{keyPrefix}'
);
this.s3.putObject(
{
Bucket: this.config.bucket,
Key: `${this.config.keyPrefix}verdaccio-s3-db.json`,
Body: JSON.stringify(this._localData),
},
(err) => {
if (err) {
this.logger.error({ err }, 's3: [_sync] error: @{err}');
reject(err);
return;
}
this.logger.debug('s3: [_sync] success');
resolve();
}
);
});
}
// returns an instance of a class managing the storage for a single package
public getPackageStorage(packageName: string): S3PackageManager {
this.logger.debug({ packageName }, 's3: [getPackageStorage] @{packageName}');
return new S3PackageManager(this.config, packageName, this.logger);
}
private async _getData(): Promise<LocalStorage> {
if (!this._localData) {
this._localData = await new Promise((resolve, reject): void => {
const { bucket, keyPrefix } = this.config;
this.logger.debug(
{ keyPrefix, bucket },
's3: [_getData] bucket: @{bucket} prefix: @{keyPrefix}'
);
this.logger.trace('s3: [_getData] get database object');
this.s3.getObject(
{
Bucket: bucket,
Key: `${keyPrefix}verdaccio-s3-db.json`,
},
(err, response) => {
if (err) {
const s3Err: VerdaccioError = convertS3Error(err);
this.logger.error({ err: s3Err.message }, 's3: [_getData] err: @{err}');
if (is404Error(s3Err)) {
this.logger.error('s3: [_getData] err 404 create new database');
resolve({ list: [], secret: '' });
} else {
reject(err);
}
return;
}
const body = response.Body ? response.Body.toString() : '';
const data = JSON.parse(body);
this.logger.trace({ body }, 's3: [_getData] get data @{body}');
resolve(data);
}
);
});
} else {
this.logger.trace('s3: [_getData] already exist');
}
return this._localData as LocalStorage;
}
public saveToken(token: Token): Promise<void> {
this.logger.warn({ token }, 'save token has not been implemented yet @{token}');
return Promise.reject(getServiceUnavailable('[saveToken] method not implemented'));
}
public deleteToken(user: string, tokenKey: string): Promise<void> {
this.logger.warn({ tokenKey, user }, 'delete token has not been implemented yet @{user}');
return Promise.reject(getServiceUnavailable('[deleteToken] method not implemented'));
}
public readTokens(filter: TokenFilter): Promise<Token[]> {
this.logger.warn({ filter }, 'read tokens has not been implemented yet @{filter}');
return Promise.reject(getServiceUnavailable('[readTokens] method not implemented'));
}
}
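Verdaccio constructs this plugin itself; for a quick mental model (and much like the unit tests below), instantiation looks roughly like the following sketch. The console-backed logger and hand-built config are stand-ins, not the real Verdaccio objects:

```ts
import { Config, Logger } from '@verdaccio/types';
import S3Database from './index';

// Stand-ins for the objects Verdaccio normally passes to the plugin.
const logger = (console as unknown) as Logger;
const config = ({
  store: { 'aws-s3-storage': { bucket: 'your-s3-bucket', keyPrefix: 'some-prefix/' } },
} as unknown) as Config;

const db = new S3Database(config, { config: config as any, logger });

// The package list and the registry secret live in verdaccio-s3-db.json inside the bucket.
db.getSecret().then((secret) => console.log('secret loaded, length:', secret.length));
db.get((err, packageList) => console.log(err || packageList));
```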

@@ -0,0 +1,48 @@
import { AWSError } from 'aws-sdk';
import {
getNotFound,
getCode,
getInternalError,
getConflict,
API_ERROR,
HTTP_STATUS,
VerdaccioError,
} from '@verdaccio/commons-api';
export function is404Error(err: VerdaccioError): boolean {
return err.code === HTTP_STATUS.NOT_FOUND;
}
export function create404Error(): VerdaccioError {
return getNotFound('no such package available');
}
export function is409Error(err: VerdaccioError): boolean {
return err.code === HTTP_STATUS.CONFLICT;
}
export function create409Error(): VerdaccioError {
return getConflict('file already exists');
}
export function is503Error(err: VerdaccioError): boolean {
return err.code === HTTP_STATUS.SERVICE_UNAVAILABLE;
}
export function create503Error(): VerdaccioError {
return getCode(HTTP_STATUS.SERVICE_UNAVAILABLE, 'resource temporarily unavailable');
}
export function convertS3Error(err: AWSError): VerdaccioError {
switch (err.code) {
case 'NoSuchKey':
case 'NotFound':
return getNotFound();
case 'StreamContentLengthMismatch':
return getInternalError(API_ERROR.CONTENT_MISMATCH);
case 'RequestAbortedError':
return getInternalError('request aborted');
default:
return getCode(err.statusCode, err.message);
}
}
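For example, a missing-object failure from the SDK maps onto Verdaccio's "not found" error. A small sketch (the AWS error object is abbreviated and cast, mirroring the style used in the tests below):

```ts
import { AWSError } from 'aws-sdk';
import { convertS3Error, is404Error } from './s3Errors';

// An aws-sdk 'NoSuchKey' failure becomes Verdaccio's 404 error.
const awsErr = ({ code: 'NoSuchKey', message: 'The specified key does not exist.' } as unknown) as AWSError;
const verdaccioErr = convertS3Error(awsErr);
is404Error(verdaccioErr); // => true
```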

@@ -0,0 +1,502 @@
import { S3, AWSError } from 'aws-sdk';
import { UploadTarball, ReadTarball } from '@verdaccio/streams';
import { HEADERS, HTTP_STATUS, VerdaccioError } from '@verdaccio/commons-api';
import {
Callback,
Logger,
Package,
ILocalPackageManager,
CallbackAction,
ReadPackageCallback,
} from '@verdaccio/types';
import { HttpError } from 'http-errors';
import { is404Error, convertS3Error, create409Error } from './s3Errors';
import { deleteKeyPrefix } from './deleteKeyPrefix';
import { S3Config } from './config';
import addTrailingSlash from './addTrailingSlash';
const pkgFileName = 'package.json';
export default class S3PackageManager implements ILocalPackageManager {
public config: S3Config;
public logger: Logger;
private readonly packageName: string;
private readonly s3: S3;
private readonly packagePath: string;
public constructor(config: S3Config, packageName: string, logger: Logger) {
this.config = config;
this.packageName = packageName;
this.logger = logger;
const {
endpoint,
region,
s3ForcePathStyle,
accessKeyId,
secretAccessKey,
sessionToken,
} = config;
this.s3 = new S3({
endpoint,
region,
s3ForcePathStyle,
accessKeyId,
secretAccessKey,
sessionToken,
});
this.logger.trace(
{ packageName },
's3: [S3PackageManager constructor] packageName @{packageName}'
);
this.logger.trace({ endpoint }, 's3: [S3PackageManager constructor] endpoint @{endpoint}');
this.logger.trace({ region }, 's3: [S3PackageManager constructor] region @{region}');
this.logger.trace(
{ s3ForcePathStyle },
's3: [S3PackageManager constructor] s3ForcePathStyle @{s3ForcePathStyle}'
);
this.logger.trace(
{ accessKeyId },
's3: [S3PackageManager constructor] accessKeyId @{accessKeyId}'
);
this.logger.trace(
{ secretAccessKey },
's3: [S3PackageManager constructor] secretAccessKey @{secretAccessKey}'
);
this.logger.trace(
{ sessionToken },
's3: [S3PackageManager constructor] sessionToken @{sessionToken}'
);
const packageAccess = this.config.getMatchedPackagesSpec(packageName);
if (packageAccess) {
const storage = packageAccess.storage;
const packageCustomFolder = addTrailingSlash(storage);
this.packagePath = `${this.config.keyPrefix}${packageCustomFolder}${this.packageName}`;
} else {
this.packagePath = `${this.config.keyPrefix}${this.packageName}`;
}
}
public updatePackage(
name: string,
updateHandler: Callback,
onWrite: Callback,
transformPackage: Function,
onEnd: Callback
): void {
this.logger.debug({ name }, 's3: [S3PackageManager updatePackage init] @{name}');
(async (): Promise<any> => {
try {
const json = await this._getData();
updateHandler(json, (err) => {
if (err) {
this.logger.error(
{ err },
's3: [S3PackageManager updatePackage updateHandler onEnd] @{err}'
);
onEnd(err);
} else {
const transformedPackage = transformPackage(json);
this.logger.debug(
{ transformedPackage },
's3: [S3PackageManager updatePackage updateHandler onWrite] @{transformedPackage}'
);
onWrite(name, transformedPackage, onEnd);
}
});
} catch (err) {
this.logger.error(
{ err },
's3: [S3PackageManager updatePackage updateHandler onEnd catch] @{err}'
);
return onEnd(err);
}
})();
}
private async _getData(): Promise<unknown> {
this.logger.debug('s3: [S3PackageManager _getData init]');
return await new Promise((resolve, reject): void => {
this.s3.getObject(
{
Bucket: this.config.bucket,
Key: `${this.packagePath}/${pkgFileName}`,
},
(err, response) => {
if (err) {
this.logger.error({ err: err.message }, 's3: [S3PackageManager _getData] aws @{err}');
const error: HttpError = convertS3Error(err);
this.logger.error({ error: err.message }, 's3: [S3PackageManager _getData] @{error}');
reject(error);
return;
}
const body = response.Body ? response.Body.toString() : '';
let data;
try {
data = JSON.parse(body);
} catch (e) {
this.logger.error({ body }, 's3: [S3PackageManager _getData] error parsing: @{body}');
reject(e);
return;
}
this.logger.trace({ data }, 's3: [S3PackageManager _getData body] @{data.name}');
resolve(data);
}
);
});
}
public deletePackage(fileName: string, callback: Callback): void {
this.s3.deleteObject(
{
Bucket: this.config.bucket,
Key: `${this.packagePath}/${fileName}`,
},
(err) => {
if (err) {
callback(err);
} else {
callback(null);
}
}
);
}
public removePackage(callback: CallbackAction): void {
deleteKeyPrefix(
this.s3,
{
Bucket: this.config.bucket,
Prefix: `${this.packagePath}`,
},
function (err) {
if (err && is404Error(err as VerdaccioError)) {
callback(null);
} else {
callback(err);
}
}
);
}
public createPackage(name: string, value: Package, callback: CallbackAction): void {
this.logger.debug(
{ name, packageName: this.packageName },
's3: [S3PackageManager createPackage init] name @{name}/@{packageName}'
);
this.logger.trace({ value }, 's3: [S3PackageManager createPackage init] name @value');
this.s3.headObject(
{
Bucket: this.config.bucket,
Key: `${this.packagePath}/${pkgFileName}`,
},
(err, data) => {
if (err) {
const s3Err = convertS3Error(err);
// only allow saving if this file doesn't exist already
if (is404Error(s3Err)) {
this.logger.debug(
{ s3Err },
's3: [S3PackageManager createPackage] 404 package not found]'
);
this.savePackage(name, value, callback);
this.logger.trace(
{ data },
's3: [S3PackageManager createPackage] package saved data from s3: @data'
);
} else {
this.logger.error(
{ s3Err: s3Err.message },
's3: [S3PackageManager createPackage error] @s3Err'
);
callback(s3Err);
}
} else {
this.logger.debug('s3: [S3PackageManager createPackage ] package exist already');
callback(create409Error());
}
}
);
}
public savePackage(name: string, value: Package, callback: CallbackAction): void {
this.logger.debug(
{ name, packageName: this.packageName },
's3: [S3PackageManager savePackage init] name @{name}/@{packageName}'
);
this.logger.trace({ value }, 's3: [S3PackageManager savePackage ] init value @value');
this.s3.putObject(
{
// TODO: not sure whether saving the object with spaces will increase storage size
Body: JSON.stringify(value, null, ' '),
Bucket: this.config.bucket,
Key: `${this.packagePath}/${pkgFileName}`,
},
callback
);
}
public readPackage(name: string, callback: ReadPackageCallback): void {
this.logger.debug(
{ name, packageName: this.packageName },
's3: [S3PackageManager readPackage init] name @{name}/@{packageName}'
);
(async (): Promise<void> => {
try {
const data: Package = (await this._getData()) as Package;
this.logger.trace(
{ data, packageName: this.packageName },
's3: [S3PackageManager readPackage] packageName: @{packageName} / data @data'
);
callback(null, data);
} catch (err) {
this.logger.error({ err: err.message }, 's3: [S3PackageManager readPackage] @{err}');
callback(err);
}
})();
}
public writeTarball(name: string): UploadTarball {
this.logger.debug(
{ name, packageName: this.packageName },
's3: [S3PackageManager writeTarball init] name @{name}/@{packageName}'
);
const uploadStream = new UploadTarball({});
let streamEnded = 0;
uploadStream.on('end', () => {
this.logger.debug(
{ name, packageName: this.packageName },
's3: [S3PackageManager writeTarball event: end] name @{name}/@{packageName}'
);
streamEnded = 1;
});
const baseS3Params = {
Bucket: this.config.bucket,
Key: `${this.packagePath}/${name}`,
};
// NOTE: use headObject to check whether the tarball already exists, so the full
// object never has to be downloaded with getObject; only the metadata is needed here.
this.s3.headObject(
{
Bucket: this.config.bucket,
Key: `${this.packagePath}/${name}`,
},
(err) => {
if (err) {
const convertedErr = convertS3Error(err);
this.logger.error(
{ error: convertedErr.message },
's3: [S3PackageManager writeTarball headObject] @{error}'
);
if (is404Error(convertedErr) === false) {
this.logger.error(
{
error: convertedErr.message,
},
's3: [S3PackageManager writeTarball headObject] not a 404, emit error: @{error}'
);
uploadStream.emit('error', convertedErr);
} else {
this.logger.debug('s3: [S3PackageManager writeTarball managedUpload] init stream');
const managedUpload = this.s3.upload(
Object.assign({}, baseS3Params, { Body: uploadStream })
);
// NOTE: there's a managedUpload.promise, but it doesn't seem to work
const promise = new Promise((resolve): void => {
this.logger.debug('s3: [S3PackageManager writeTarball managedUpload] send');
managedUpload.send((err, data) => {
if (err) {
const error: HttpError = convertS3Error(err);
this.logger.error(
{ error: error.message },
's3: [S3PackageManager writeTarball managedUpload send] emit error @{error}'
);
uploadStream.emit('error', error);
} else {
this.logger.trace(
{ data },
's3: [S3PackageManager writeTarball managedUpload send] response @{data}'
);
resolve();
}
});
this.logger.debug(
{ name },
's3: [S3PackageManager writeTarball uploadStream] emit open @{name}'
);
uploadStream.emit('open');
});
uploadStream.done = (): void => {
const onEnd = async (): Promise<void> => {
try {
await promise;
this.logger.debug(
's3: [S3PackageManager writeTarball uploadStream done] emit success'
);
uploadStream.emit('success');
} catch (err) {
// already emitted in the promise above, necessary because of some issues
// with promises in jest
this.logger.error(
{ err },
's3: [S3PackageManager writeTarball uploadStream done] error @{err}'
);
}
};
if (streamEnded) {
this.logger.trace(
{ name },
's3: [S3PackageManager writeTarball uploadStream] streamEnded true @{name}'
);
onEnd();
} else {
this.logger.trace(
{ name },
's3: [S3PackageManager writeTarball uploadStream] streamEnded false emit end @{name}'
);
uploadStream.on('end', onEnd);
}
};
uploadStream.abort = (): void => {
this.logger.debug('s3: [S3PackageManager writeTarball uploadStream abort] init');
try {
this.logger.debug('s3: [S3PackageManager writeTarball managedUpload abort]');
managedUpload.abort();
} catch (err) {
const error: HttpError = convertS3Error(err);
uploadStream.emit('error', error);
this.logger.error(
{ error },
's3: [S3PackageManager writeTarball uploadStream error] emit error @{error}'
);
} finally {
this.logger.debug(
{ name, baseS3Params },
's3: [S3PackageManager writeTarball uploadStream abort] s3.deleteObject @{name}/@baseS3Params'
);
this.s3.deleteObject(baseS3Params);
}
};
}
} else {
this.logger.debug(
{ name },
's3: [S3PackageManager writeTarball headObject] emit error @{name} 409'
);
uploadStream.emit('error', create409Error());
}
}
);
return uploadStream;
}
public readTarball(name: string): ReadTarball {
this.logger.debug(
{ name, packageName: this.packageName },
's3: [S3PackageManager readTarball init] name @{name}/@{packageName}'
);
const readTarballStream = new ReadTarball({});
const request = this.s3.getObject({
Bucket: this.config.bucket,
Key: `${this.packagePath}/${name}`,
});
let headersSent = false;
const readStream = request
.on('httpHeaders', (statusCode, headers) => {
// don't process status code errors here, we'll do that in readStream.on('error'),
// otherwise they'll be processed twice
// verdaccio force garbage collects a stream on 404, so we can't emit more
// than one error or it'll fail
// https://github.com/verdaccio/verdaccio/blob/c1bc261/src/lib/storage.js#L178
this.logger.debug(
{ name, packageName: this.packageName },
's3: [S3PackageManager readTarball httpHeaders] name @{name}/@{packageName}'
);
this.logger.trace(
{ headers },
's3: [S3PackageManager readTarball httpHeaders event] headers @headers'
);
this.logger.trace(
{ statusCode },
's3: [S3PackageManager readTarball httpHeaders event] statusCode @statusCode'
);
if (statusCode !== HTTP_STATUS.NOT_FOUND) {
if (headers[HEADERS.CONTENT_LENGTH]) {
const contentLength = parseInt(headers[HEADERS.CONTENT_LENGTH], 10);
// not sure this is necessary
if (headersSent) {
return;
}
headersSent = true;
this.logger.debug(
's3: [S3PackageManager readTarball readTarballStream event] emit content-length'
);
readTarballStream.emit(HEADERS.CONTENT_LENGTH, contentLength);
// we know there's content, so open the stream
readTarballStream.emit('open');
this.logger.debug(
's3: [S3PackageManager readTarball readTarballStream event] emit open'
);
}
} else {
this.logger.trace(
's3: [S3PackageManager readTarball httpHeaders event] not found, avoid emit open file'
);
}
})
.createReadStream();
readStream.on('error', (err) => {
const error: HttpError = convertS3Error(err as AWSError);
readTarballStream.emit('error', error);
this.logger.error(
{ error: error.message },
's3: [S3PackageManager readTarball readTarballStream event] error @{error}'
);
});
this.logger.trace('s3: [S3PackageManager readTarball readTarballStream event] pipe');
readStream.pipe(readTarballStream);
readTarballStream.abort = (): void => {
this.logger.debug('s3: [S3PackageManager readTarball readTarballStream event] request abort');
request.abort();
this.logger.debug(
's3: [S3PackageManager readTarball readTarballStream event] request destroy'
);
readStream.destroy();
};
return readTarballStream;
}
}

@@ -0,0 +1,4 @@
export default (configValue: any): string => {
const envValue = process.env[configValue];
return envValue || configValue;
};

@@ -0,0 +1,56 @@
import { Package } from '@verdaccio/types';
const json: Package = {
_id: '@scope/pk1-test',
name: '@scope/pk1-test',
description: '',
'dist-tags': {
latest: '1.0.6',
},
versions: {
'1.0.6': {
name: '@scope/pk1-test',
version: '1.0.6',
description: '',
main: 'index.js',
scripts: {
test: 'echo "Error: no test specified" && exit 1',
},
keywords: [],
author: {
name: 'Juan Picado',
email: 'juan@jotadeveloper.com',
},
license: 'ISC',
dependencies: {
verdaccio: '^2.7.2',
},
readme: '# test',
readmeFilename: 'README.md',
_id: '@scope/pk1-test@1.0.6',
// @ts-ignore
_npmVersion: '5.5.1',
_nodeVersion: '8.7.0',
_npmUser: {
name: '',
},
dist: {
integrity:
'sha512-6gHiERpiDgtb3hjqpQH5/i7zRmvYi9pmCjQf2ZMy3QEa9wVk9RgdZaPWUt7ZOnWUPFjcr9cmE6dUBf+XoPoH4g==',
shasum: '2c03764f651a9f016ca0b7620421457b619151b9',
tarball: 'http://localhost:5555/@scope/pk1-test/-/@scope/pk1-test-1.0.6.tgz',
},
},
},
readme: '# test',
_attachments: {
'@scope/pk1-test-1.0.6.tgz': {
content_type: 'application/octet-stream',
data:
'H4sIAAAAAAAAE+2W32vbMBDH85y/QnjQp9qxLEeBMsbGlocNBmN7bFdQ5WuqxJaEpGQdo//79KPeQsnIw5KUDX/9IOvurLuz/DHSjK/YAiY6jcXSKjk6sMqypHWNdtmD6hlBI0wqQmo8nVbVqMR4OsNoVB66kF1aW8eML+Vv10m9oF/jP6IfY4QyyTrILlD2eqkcm+gVzpdrJrPz4NuAsULJ4MZFWdBkbcByI7R79CRjx0ScCdnAvf+SkjUFWu8IubzBgXUhDPidQlfZ3BhlLpBUKDiQ1cDFrYDmKkNnZwjuhUM4808+xNVW8P2bMk1Y7vJrtLC1u1MmLPjBF40+Cc4ahV6GDmI/DWygVRpMwVX3KtXUCg7Sxp7ff3nbt6TBFy65gK1iffsN41yoEHtdFbOiisWMH8bPvXUH0SP3k+KG3UBr+DFy7OGfEJr4x5iWVeS/pLQe+D+FIv/agIWI6GX66kFuIhT+1gDjrp/4d7WAvAwEJPh0u14IufWkM0zaW2W6nLfM2lybgJ4LTJ0/jWiAK8OcMjt8MW3OlfQppcuhhQ6k+2OgkK2Q8DssFPi/IHpU9fz3/+xj5NjDf8QFE39VmE4JDfzPCBn4P4X6/f88f/Pu47zomiPk2Lv/dOv8h+P/34/D/p9CL+Kp67mrGDRo0KBBp9ZPsETQegASAAA=',
length: 512,
},
},
};
export default json;

@@ -0,0 +1,56 @@
{
"name": "readme-test",
"versions": {
"0.0.0": {
"name": "test-readme",
"version": "0.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": ""
},
"author": "",
"license": "ISC",
"_id": "test-readme@0.0.0",
"dist": {
"shasum": "8ee7331cbc641581b1a8cecd9d38d744a8feb863",
"tarball": "http://localhost:1234/test-readme/-/test-readme-0.0.0.tgz"
},
"_from": ".",
"_npmVersion": "1.3.1",
"_npmUser": {
"name": "alex",
"email": "alex@kocharin.ru"
},
"maintainers": [
{
"name": "juan",
"email": "juanpicado19@gmail.com"
}
]
}
},
"dist-tags": {
"foo": "0.0.0",
"latest": "0.0.0"
},
"time": {
"modified": "2017-10-06T20:30:38.721Z",
"created": "2017-10-06T20:30:38.721Z",
"0.0.0": "2017-10-06T20:30:38.721Z"
},
"_distfiles": {},
"_attachments": {
"test-readme-0.0.0.tgz": {
"shasum": "8ee7331cbc641581b1a8cecd9d38d744a8feb863",
"version": "0.0.0"
}
},
"_uplinks": {},
"_rev": "5-d647003b88ff08a0",
"readme": "this is a readme"
}

@@ -0,0 +1,58 @@
export default class Config {
constructor() {
this.storage = './test-storage';
this.listen = 'http://localhost:1443/';
this.auth = {
htpasswd: {
file: './htpasswd',
max_users: 1000,
},
};
this.uplinks = {
npmjs: {
url: 'https://registry.npmjs.org',
cache: true,
},
};
this.packages = {
'@*/*': {
access: ['$all'],
publish: ['$authenticated'],
proxy: [],
},
'*': {
access: ['$all'],
publish: ['$authenticated'],
proxy: ['npmjs'],
},
'**': {
access: [],
publish: [],
proxy: [],
},
};
this.logs = [
{
type: 'stdout',
format: 'pretty',
level: 35,
},
];
this.self_path = './src/___tests___/__fixtures__/config.yaml';
this.https = {
enable: false,
};
this.user_agent = 'verdaccio/3.0.0-alpha.7';
this.users = {};
this.server_id = 'severMockId';
this.checkSecretKey = (secret): string => {
if (!secret) {
const newSecret = 'superNewSecret';
this.secret = newSecret;
return newSecret;
}
return secret;
};
}
}

@@ -0,0 +1,13 @@
import { Logger } from '@verdaccio/types';
const logger: Logger = {
error: (e) => console.warn(e),
info: (e) => console.warn(e),
debug: (e) => console.warn(e),
warn: (e) => console.warn(e),
child: (e) => console.warn(e),
http: (e) => console.warn(e),
trace: (e) => console.warn(e),
};
export default logger;

@@ -0,0 +1,142 @@
import { S3 } from 'aws-sdk';
import { IPluginStorage } from '@verdaccio/types';
import S3Database from '../src/index';
import { deleteKeyPrefix } from '../src/deleteKeyPrefix';
import { is404Error } from '../src/s3Errors';
import { S3Config } from '../src/config';
import logger from './__mocks__/Logger';
import Config from './__mocks__/Config';
describe.skip('Local Database', () => {
let db: IPluginStorage<S3Config>;
let config;
// random key for testing
const keyPrefix = `test/${Math.floor(Math.random() * Math.pow(10, 8))}`;
const bucket = process.env.VERDACCIO_TEST_BUCKET;
if (!bucket) {
throw new Error('no bucket specified via VERDACCIO_TEST_BUCKET env var');
}
beforeEach(() => {
config = Object.assign(new Config(), {
store: {
's3-storage': {
bucket,
keyPrefix,
},
},
});
db = new S3Database(config, { logger, config });
});
afterEach(async () => {
const s3 = new S3();
// snapshot test the final state of s3
await new Promise((resolve, reject): void => {
s3.listObjectsV2(
{ Bucket: bucket, Prefix: config.store['s3-storage'].keyPrefix },
(err, data) => {
if (err) {
reject(err);
return;
}
expect(data.IsTruncated).toBe(false); // none of the tests we do should create this much data
// remove the stuff that changes from the results
expect(
data.Contents.map(({ Key, Size }) => ({
Key: Key.split(keyPrefix)[1],
Size,
}))
).toMatchSnapshot();
resolve();
}
);
});
// clean up s3
try {
await new Promise((resolve, reject): void => {
deleteKeyPrefix(
s3,
{
Bucket: bucket,
Prefix: keyPrefix,
},
(err) => {
if (err) {
reject(err);
} else {
resolve();
}
}
);
});
} catch (err) {
if (is404Error(err)) {
// ignore
} else {
throw err;
}
}
});
test('should create an instance', () => {
expect(db).toBeDefined();
});
describe('manages a secret', () => {
test('should create get secret', async () => {
const secretKey = await db.getSecret();
expect(secretKey).toBeDefined();
expect(typeof secretKey === 'string').toBeTruthy();
});
test('should create set secret', async () => {
await db.setSecret(config.checkSecretKey());
expect(config.secret).toBeDefined();
expect(typeof config.secret === 'string').toBeTruthy();
const fetchedSecretKey = await db.getSecret();
expect(config.secret).toBe(fetchedSecretKey);
});
});
describe('Database CRUD', () => {
test('should add an item to database', (done) => {
const pgkName = 'jquery';
db.get((err, data) => {
expect(err).toBeNull();
expect(data).toHaveLength(0);
db.add(pgkName, (err) => {
expect(err).toBeNull();
db.get((err, data) => {
expect(err).toBeNull();
expect(data).toHaveLength(1);
done();
});
});
});
});
test('should remove an item to database', (done) => {
const pgkName = 'jquery';
db.get((err, data) => {
expect(err).toBeNull();
expect(data).toHaveLength(0);
db.add(pgkName, (err) => {
expect(err).toBeNull();
db.remove(pgkName, (err) => {
expect(err).toBeNull();
db.get((err, data) => {
expect(err).toBeNull();
expect(data).toHaveLength(0);
done();
});
});
});
});
});
});
});

@@ -0,0 +1,332 @@
import path from 'path';
import fs from 'fs';
import { S3 } from 'aws-sdk';
import rReadDir from 'recursive-readdir';
import { Package } from '@verdaccio/types';
import S3PackageManager from '../src/s3PackageManager';
import { deleteKeyPrefix } from '../src/deleteKeyPrefix';
import { create404Error, create409Error, is404Error } from '../src/s3Errors';
import { S3Config } from '../src/config';
import logger from './__mocks__/Logger';
import pkg from './__fixtures__/pkg';
const pkgFileName = 'package.json';
describe.skip('S3 package manager', () => {
// random key for testing
const keyPrefix = `test/${Math.floor(Math.random() * Math.pow(10, 8))}`;
const bucket = process.env.VERDACCIO_TEST_BUCKET;
if (!bucket) {
throw new Error('no bucket specified via VERDACCIO_TEST_BUCKET env var');
}
const config: S3Config = {
bucket,
keyPrefix: `${keyPrefix}/`,
} as S3Config;
afterEach(async () => {
const s3 = new S3();
// snapshot test the final state of s3
await new Promise((resolve, reject): void => {
s3.listObjectsV2({ Bucket: bucket, Prefix: config.keyPrefix }, (err, data) => {
if (err) {
reject(err);
return;
}
expect(data.IsTruncated).toBe(false); // none of the tests we do should create this much data
// remove the stuff that changes from the results
expect(
data.Contents.map(({ Key, Size }) => ({
Key: Key.split(keyPrefix)[1],
Size,
}))
).toMatchSnapshot();
resolve();
});
});
// clean up s3
try {
await new Promise((resolve, reject): void => {
deleteKeyPrefix(
s3,
{
Bucket: bucket,
Prefix: keyPrefix,
},
(err) => {
if (err) {
reject(err);
} else {
resolve();
}
}
);
});
} catch (err) {
if (is404Error(err)) {
// ignore
} else {
throw err;
}
}
});
describe('savePackage() group', () => {
test('savePackage()', (done) => {
const data = ('{data:5}' as unknown) as Package;
const packageManager = new S3PackageManager(config, 'first-package', logger);
packageManager.savePackage('pkg.1.0.0.tar.gz', data, (err) => {
expect(err).toBeNull();
done();
});
});
});
async function syncFixtureDir(fixture): Promise<void> {
const s3 = new S3();
const dir = path.join(__dirname, '__fixtures__');
const filenames = await new Promise<string[]>((resolve, reject): void =>
rReadDir(path.join(dir, fixture), (err, filenames) => {
if (err) {
reject(err);
return;
}
resolve(filenames);
})
);
await Promise.all(
filenames.map(
(filename) =>
new Promise((resolve, reject): void => {
const key = `${config.keyPrefix}${path.relative(dir, filename)}`;
fs.readFile(filename, (err, data) => {
if (err) {
reject(err);
return;
}
s3.upload({ Bucket: bucket, Key: key, Body: data }).send((err) => {
if (err) {
reject(err);
return;
}
resolve();
});
});
})
)
);
}
describe('readPackage() group', () => {
test('readPackage() success', async (done) => {
await syncFixtureDir('readme-test');
const packageManager = new S3PackageManager(config, 'readme-test', logger);
packageManager.readPackage(pkgFileName, (err) => {
expect(err).toBeNull();
done();
});
});
test('readPackage() fails', async (done) => {
await syncFixtureDir('readme-test');
const packageManager = new S3PackageManager(config, 'readme-test', logger);
packageManager.readPackage(pkgFileName, (err) => {
expect(err).toBeTruthy();
done();
});
});
test('readPackage() fails corrupt', async (done) => {
await syncFixtureDir('readme-test-corrupt');
const packageManager = new S3PackageManager(config, 'readme-test-corrupt', logger);
packageManager.readPackage('corrupt.js', (err) => {
expect(err).toBeTruthy();
done();
});
});
});
describe('createPackage() group', () => {
test('createPackage()', (done) => {
const packageManager = new S3PackageManager(config, 'createPackage', logger);
packageManager.createPackage('package5', pkg, (err) => {
expect(err).toBeNull();
done();
});
});
test('createPackage() fails by fileExist', (done) => {
const packageManager = new S3PackageManager(config, 'createPackage', logger);
packageManager.createPackage('package5', pkg, (err) => {
expect(err).toBeNull();
packageManager.createPackage('package5', pkg, (err) => {
expect(err).not.toBeNull();
expect(err.code).toBe(create409Error().code); // file exists
done();
});
});
});
describe('deletePackage() group', () => {
test('deletePackage()', (done) => {
const packageManager = new S3PackageManager(config, 'createPackage', logger);
// verdaccio removes the package.json rather than the package name
packageManager.deletePackage('package.json', (err) => {
expect(err).toBeNull();
done();
});
});
});
});
describe('removePackage() group', () => {
test('removePackage() success', (done) => {
const packageManager = new S3PackageManager(config, '_toDelete', logger);
packageManager.createPackage('package5', pkg, (err) => {
expect(err).toBeNull();
packageManager.removePackage((error) => {
expect(error).toBeNull();
done();
});
});
});
test('removePackage() fails', (done) => {
const packageManager = new S3PackageManager(config, '_toDelete_fake', logger);
packageManager.removePackage((error) => {
expect(error).toBeTruthy();
expect(error.code).toBe(create404Error().code); // package not found
done();
});
});
});
describe('readTarball() group', () => {
test('readTarball() success', async (done) => {
await syncFixtureDir('readme-test');
const packageManager = new S3PackageManager(config, 'readme-test', logger);
const readTarballStream = packageManager.readTarball('test-readme-0.0.0.tgz');
readTarballStream.on('error', (err) => {
expect(err).toBeNull();
});
readTarballStream.on('content-length', (content) => {
expect(content).toBe(352);
});
readTarballStream.on('end', () => {
done();
});
readTarballStream.on('data', (data) => {
expect(data).toBeDefined();
});
});
test('readTarball() fails', async (done) => {
await syncFixtureDir('readme-test');
const packageManager = new S3PackageManager(config, 'readme-test', logger);
const readTarballStream = packageManager.readTarball('file-does-not-exist-0.0.0.tgz');
readTarballStream.on('error', function (err) {
expect(err).toBeTruthy();
done();
});
});
});
describe('writeTarball() group', () => {
test('writeTarball() success', async (done) => {
await syncFixtureDir('readme-test');
const newFileName = 'new-readme-0.0.0.tgz';
const readmeStorage = new S3PackageManager(config, 'readme-test', logger);
const writeStorage = new S3PackageManager(config, 'write-storage', logger);
const readTarballStream = readmeStorage.readTarball('test-readme-0.0.0.tgz');
const writeTarballStream = writeStorage.writeTarball(newFileName);
writeTarballStream.on('error', function (err) {
expect(err).toBeNull();
done.fail(new Error("shouldn't have errored"));
});
writeTarballStream.on('success', () => {
done();
});
readTarballStream.on('end', () => {
writeTarballStream.done();
});
writeTarballStream.on('data', (data) => {
expect(data).toBeDefined();
});
readTarballStream.on('error', (err) => {
expect(err).toBeNull();
done.fail(new Error("shouldn't have errored"));
});
readTarballStream.pipe(writeTarballStream);
});
test('writeTarball() fails on existing file', async (done) => {
await syncFixtureDir('readme-test');
const newFileName = 'test-readme-0.0.0.tgz';
const storage = new S3PackageManager(config, 'readme-test', logger);
const readTarballStream = storage.readTarball('test-readme-0.0.0.tgz');
const writeTarballStream = storage.writeTarball(newFileName);
writeTarballStream.on('error', (err) => {
expect(err).toBeTruthy();
expect(err.code).toBe('EEXISTS');
done();
});
readTarballStream.pipe(writeTarballStream);
});
test('writeTarball() abort', async (done) => {
await syncFixtureDir('readme-test');
const newFileName = 'new-readme-abort-0.0.0.tgz';
const readmeStorage = new S3PackageManager(config, 'readme-test', logger);
const writeStorage = new S3PackageManager(config, 'write-storage', logger);
const readTarballStream = readmeStorage.readTarball('test-readme-0.0.0.tgz');
const writeTarballStream = writeStorage.writeTarball(newFileName);
writeTarballStream.on('error', (err) => {
expect(err).toBeTruthy();
done();
});
writeTarballStream.on('data', (data) => {
expect(data).toBeDefined();
writeTarballStream.abort();
});
readTarballStream.pipe(writeTarballStream);
});
});
});

@@ -0,0 +1,443 @@
import { PackageAccess } from '@verdaccio/types';
import S3PackageManager from '../src/s3PackageManager';
import { S3Config } from '../src/config';
import logger from './__mocks__/Logger';
import pkg from './__fixtures__/pkg';
const mockHeadObject = jest.fn();
const mockPutObject = jest.fn();
const mockDeleteObject = jest.fn();
const mockListObject = jest.fn();
const mockDeleteObjects = jest.fn();
const mockGetObject = jest.fn();
jest.mock('aws-sdk', () => ({
S3: jest.fn().mockImplementation(() => ({
headObject: mockHeadObject,
putObject: mockPutObject,
deleteObject: mockDeleteObject,
listObjectsV2: mockListObject,
deleteObjects: mockDeleteObjects,
getObject: mockGetObject,
})),
}));
describe('S3PackageManager with mocked s3', function () {
beforeEach(() => {
mockHeadObject.mockClear();
mockPutObject.mockClear();
mockDeleteObject.mockClear();
mockDeleteObjects.mockClear();
mockListObject.mockClear();
mockGetObject.mockClear();
});
test('existing packages on s3 are not recreated', (done) => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'keyPrefix/',
getMatchedPackagesSpec: jest.fn(() => null) as PackageAccess,
} as S3Config;
mockHeadObject.mockImplementation((params, callback) => {
callback();
});
const testPackageManager = new S3PackageManager(config, 'test-package', logger);
testPackageManager.createPackage('test-0.0.0.tgz', pkg, (err) => {
expect(err.message).toEqual('file already exists');
done();
});
});
test('new package is created on s3', (done) => {
expect.assertions(2);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'keyPrefix/',
getMatchedPackagesSpec: jest.fn(() => null) as PackageAccess,
} as S3Config;
mockHeadObject.mockImplementation((params, callback) => {
callback({ code: 'NoSuchKey' }, 'some data');
});
mockPutObject.mockImplementation((params, callback) => {
callback();
});
const testPackageManager = new S3PackageManager(config, 'test-package', logger);
testPackageManager.createPackage('test-0.0.0.tgz', pkg, (err) => {
expect(err).toBeUndefined();
expect(mockPutObject).toHaveBeenCalled();
done();
});
});
test('new package is uploaded to keyprefix if custom storage is not specified', (done) => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => null) as PackageAccess,
} as S3Config;
mockHeadObject.mockImplementation((params, callback) => {
callback({ code: 'NoSuchKey' }, 'some data');
});
mockPutObject.mockImplementation((params, callback) => {
callback();
});
const testPackageManager = new S3PackageManager(config, 'test-package', logger);
testPackageManager.createPackage('test-0.0.0.tgz', pkg, () => {
expect(mockPutObject).toHaveBeenCalledWith(
expect.objectContaining({
Bucket: 'test-bucket',
Key: 'testKeyPrefix/test-package/package.json',
}),
expect.any(Function)
);
done();
});
});
test('new package is uploaded to custom storage prefix as specified on package section in config', (done) => {
expect.assertions(2);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => ({
storage: 'customFolder',
})) as PackageAccess,
} as S3Config;
mockHeadObject.mockImplementation((params, callback) => {
callback({ code: 'NoSuchKey' }, 'some data');
});
mockPutObject.mockImplementation((params, callback) => {
callback();
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.createPackage('test-0.0.0.tgz', pkg, () => {
expect(mockHeadObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Key: 'testKeyPrefix/customFolder/@company/test-package/package.json',
},
expect.any(Function)
);
expect(mockPutObject).toHaveBeenCalledWith(
expect.objectContaining({
Bucket: 'test-bucket',
Key: 'testKeyPrefix/customFolder/@company/test-package/package.json',
}),
expect.any(Function)
);
done();
});
});
test('delete package with custom folder from s3 bucket', (done) => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => ({
storage: 'customFolder',
})) as PackageAccess,
} as S3Config;
mockDeleteObject.mockImplementation((params, callback) => {
callback();
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.deletePackage('test-0.0.0.tgz', () => {
expect(mockDeleteObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Key: 'testKeyPrefix/customFolder/@company/test-package/test-0.0.0.tgz',
},
expect.any(Function)
);
done();
});
});
test('delete package from s3 bucket', (done) => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => ({})) as PackageAccess,
} as S3Config;
mockDeleteObject.mockImplementation((params, callback) => {
callback();
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.deletePackage('test-0.0.0.tgz', () => {
expect(mockDeleteObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Key: 'testKeyPrefix/@company/test-package/test-0.0.0.tgz',
},
expect.any(Function)
);
done();
});
});
test('remove packages from s3 bucket', (done) => {
expect.assertions(2);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => ({})) as PackageAccess,
} as S3Config;
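// The mocked listing reports a KeyCount but no Contents, so deleteObjects is expected to receive an empty Objects list.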
mockListObject.mockImplementation((params, callback) => {
callback(null, { KeyCount: 1 });
});
mockDeleteObjects.mockImplementation((params, callback) => {
callback();
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.removePackage(() => {
expect(mockDeleteObjects).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Delete: { Objects: [] },
},
expect.any(Function)
);
expect(mockListObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Prefix: 'testKeyPrefix/@company/test-package',
},
expect.any(Function)
);
done();
});
});
test('remove packages with custom storage from s3 bucket', (done) => {
expect.assertions(2);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => ({
storage: 'customFolder',
})) as PackageAccess,
} as S3Config;
mockListObject.mockImplementation((params, callback) => {
callback(null, { KeyCount: 1 });
});
mockDeleteObjects.mockImplementation((params, callback) => {
callback();
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.removePackage(() => {
expect(mockDeleteObjects).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Delete: { Objects: [] },
},
expect.any(Function)
);
expect(mockListObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Prefix: 'testKeyPrefix/customFolder/@company/test-package',
},
expect.any(Function)
);
done();
});
});
test('read packages with custom storage from s3 bucket', (done) => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => ({
storage: 'customStorage',
})) as PackageAccess,
} as S3Config;
mockGetObject.mockImplementation((params, callback) => {
callback(null, { Body: JSON.stringify({ some: 'data' }) });
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.readPackage('some package', () => {
expect(mockGetObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Key: 'testKeyPrefix/customStorage/@company/test-package/package.json',
},
expect.any(Function)
);
done();
});
});
test('read packages from s3 bucket', (done) => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => null) as PackageAccess,
} as S3Config;
mockGetObject.mockImplementation((params, callback) => {
callback(null, { Body: JSON.stringify({ some: 'data' }) });
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.readPackage('some package', () => {
expect(mockGetObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Key: 'testKeyPrefix/@company/test-package/package.json',
},
expect.any(Function)
);
done();
});
});
test('read tarballs from s3 bucket', () => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => null) as PackageAccess,
} as S3Config;
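// Mimic the AWS request-object chain the manager uses for tarballs: getObject(params).on(...).createReadStream().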
mockGetObject.mockImplementation((params) => {
return {
on: jest.fn(() => ({
createReadStream: jest.fn(() => ({
on: jest.fn(),
pipe: jest.fn(),
})),
})),
};
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.readTarball('tarballfile.gz');
expect(mockGetObject).toHaveBeenCalledWith({
Bucket: 'test-bucket',
Key: 'testKeyPrefix/@company/test-package/tarballfile.gz',
});
});
test('read tarballs for a custom folder from s3 bucket', () => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => ({
storage: 'customStorage',
})) as PackageAccess,
} as S3Config;
mockGetObject.mockImplementation((params) => {
return {
on: jest.fn(() => ({
createReadStream: jest.fn(() => ({
on: jest.fn(),
pipe: jest.fn(),
})),
})),
};
});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.readTarball('tarballfile.gz');
expect(mockGetObject).toHaveBeenCalledWith({
Bucket: 'test-bucket',
Key: 'testKeyPrefix/customStorage/@company/test-package/tarballfile.gz',
});
});
test('write tarballs from s3 bucket', () => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => null) as PackageAccess,
} as S3Config;
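// writeTarball first checks whether the object already exists, so stubbing headObject is enough to assert the key it targets.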
mockHeadObject.mockImplementation(() => {});
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
testPackageManager.writeTarball('tarballfile.gz');
expect(mockHeadObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Key: 'testKeyPrefix/@company/test-package/tarballfile.gz',
},
expect.any(Function)
);
});
test('write tarballs with custom storage from s3 bucket', () => {
expect.assertions(1);
const config: S3Config = {
bucket: 'test-bucket',
keyPrefix: 'testKeyPrefix/',
getMatchedPackagesSpec: jest.fn(() => ({
storage: 'customStorage',
})) as PackageAccess,
} as S3Config;
const testPackageManager = new S3PackageManager(config, '@company/test-package', logger);
mockHeadObject.mockImplementation(() => {});
testPackageManager.writeTarball('tarballfile.gz');
expect(mockHeadObject).toHaveBeenCalledWith(
{
Bucket: 'test-bucket',
Key: 'testKeyPrefix/customStorage/@company/test-package/tarballfile.gz',
},
expect.any(Function)
);
});
});

@ -0,0 +1,36 @@
import setConfigValue from '../src/setConfigValue';
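// setConfigValue(name) resolves to process.env[name] when that variable is set and falls back to the literal value otherwise.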
describe('Setting config values', () => {
const bucket = 'TEST_AWS_S3_BUCKET_NAME';
const keyPrefix = 'TEST_AWS_S3_BUCKET_PREFIX';
const sessionToken = 'TEST_AWS_S3_SESSION_TOKEN';
afterEach(() => {
delete process.env[bucket];
delete process.env[keyPrefix];
delete process.env[sessionToken];
});
test('should fall back to value if environment variable is not set', () => {
const expected = bucket;
const actual = setConfigValue(bucket);
expect(actual).toBe(expected);
});
test('should use the environment variable value', () => {
const expected = 'someBucket';
process.env[bucket] = expected;
const actual = setConfigValue(bucket);
expect(actual).toBe(expected);
});
// The session token is temporary, so users will typically supply it through an environment variable; verify that path works.
test('should use the environment variable value for session token', () => {
const expected = 'mySessionToken';
process.env[sessionToken] = expected;
const actual = setConfigValue(sessionToken);
expect(actual).toBe(expected);
});
});

@ -0,0 +1,9 @@
{
"extends": "../../../tsconfig.base",
"compilerOptions": {
"rootDir": "./src",
"outDir": "./build"
},
"include": ["src/**/*.ts"],
"exclude": ["src/**/*.test.ts"]
}

@ -0,0 +1,20 @@
{
"extends": "../../../tsconfig.reference.json",
"compilerOptions": {
"rootDir": "./src",
"outDir": "./build"
},
"include": ["src/**/*", "types/*.d.ts"],
"exclude": ["src/**/*.test.ts"],
"references": [
{
"path": "../../core/commons-api"
},
{
"path": "../../core/streams"
},
{
"path": "../../core/types"
}
]
}

@ -545,6 +545,20 @@ importers:
specifiers:
'@verdaccio/commons-api': 'workspace:*'
'@verdaccio/types': 'workspace:*'
packages/plugins/aws-storage:
dependencies:
'@verdaccio/commons-api': 'link:../../core/commons-api'
'@verdaccio/streams': 'link:../../core/streams'
aws-sdk: 2.778.0
devDependencies:
'@verdaccio/types': 'link:../../core/types'
recursive-readdir: 2.2.2
specifiers:
'@verdaccio/commons-api': 'workspace:*'
'@verdaccio/streams': 'workspace:*'
'@verdaccio/types': 'workspace:*'
aws-sdk: ^2.607.0
recursive-readdir: 2.2.2
packages/plugins/memory:
dependencies:
'@verdaccio/commons-api': 'link:../../core/commons-api'
@ -7033,6 +7047,22 @@ packages:
hasBin: true
resolution:
integrity: sha512-XrvP4VVHdRBCdX1S3WXVD8+RyG9qeb1D5Sn1DeLiG2xfSpzellk5k54xbUERJ3M5DggQxes39UGOTP8CFrEGbg==
/aws-sdk/2.778.0:
dependencies:
buffer: 4.9.2
events: 1.1.1
ieee754: 1.1.13
jmespath: 0.15.0
querystring: 0.2.0
sax: 1.2.1
url: 0.10.3
uuid: 3.3.2
xml2js: 0.4.19
dev: false
engines:
node: '>= 0.8.0'
resolution:
integrity: sha512-sIJRO7tMaztLs+gvHF/Wo+iek/rhH99+2OzharQJMS0HATPl5/EdhKgWGv1n/bNpVH+kD3n0QMQgdFu0FNUt0Q==
/aws-sign2/0.7.0:
resolution:
integrity: sha1-tG6JCTSpWR8tL2+G1+ap8bP+dqg=
@ -10998,6 +11028,12 @@ packages:
dev: false
resolution:
integrity: sha512-s3GJL04SQoM+gn2c14oyqxvZ3Pcq7cduSDqy3sBFXx6UPSUmgVYwQM9zwkTn9je0lrfg0gHEwR42pF3Q2dCQkQ==
/events/1.1.1:
dev: false
engines:
node: '>=0.4.x'
resolution:
integrity: sha1-nr23Y1rQmccNzEwqH1AEKI6L2SQ=
/events/3.2.0:
dev: false
engines:
@ -15191,6 +15227,12 @@ packages:
dev: false
resolution:
integrity: sha512-8BXU+J8+SPmwwyq9ELihpSV4dWPTiOKBWCEgtkbnxxAVMjXdf3yGmyaLSshBfXc8sP/JQ9OZj5R8nZzz2wPXgA==
/jmespath/0.15.0:
dev: false
engines:
node: '>= 0.6.0'
resolution:
integrity: sha1-o/Iiqarp+Wb10nx5ZRDigJF2Qhc=
/jpeg-js/0.4.2:
dev: false
resolution:
@ -19588,6 +19630,14 @@ packages:
node: '>=0.10.0'
resolution:
integrity: sha1-kO8jHQd4xc4JPJpI105cVCLROpk=
/recursive-readdir/2.2.2:
dependencies:
minimatch: 3.0.4
dev: true
engines:
node: '>=0.10.0'
resolution:
integrity: sha512-nRCcW9Sj7NuZwa2XvH9co8NPeXUBhZP7CRKJtU+cS6PW9FpCIFoI5ib0NT1ZrbNuPoRy0ylyCaUL8Gih4LSyFg==
/redent/1.0.0:
dependencies:
indent-string: 2.1.0
@ -20246,6 +20296,10 @@ packages:
dev: false
resolution:
integrity: sha512-VvY1hxVvMXzSos/LzqeBl9/KYu3mkEOtl5NMwz6jER318dSHDCig0AOjZOtnoCwAC3HMs9LhfWkPCmQGttb4ng==
/sax/1.2.1:
dev: false
resolution:
integrity: sha1-e45lYZCyKOgaZq6nSEgNgozS03o=
/sax/1.2.4:
dev: false
resolution:
@ -22488,6 +22542,13 @@ packages:
node: '>= 4'
resolution:
integrity: sha1-FQWgOiiaSMvXpDTvuu7FBV9WM6k=
/url/0.10.3:
dependencies:
punycode: 1.3.2
querystring: 0.2.0
dev: false
resolution:
integrity: sha1-Ah5NnHcF8hu/N9A861h2dAJ3TGQ=
/url/0.11.0:
dependencies:
punycode: 1.3.2
@ -22581,6 +22642,11 @@ packages:
node: '>= 0.4.0'
resolution:
integrity: sha1-n5VxD1CiZ5R7LMwSR0HBAoQn5xM=
/uuid/3.3.2:
dev: false
hasBin: true
resolution:
integrity: sha512-yXJmeNaw3DnnKAOKJE51sL/ZaYfWJRl1pK9dr19YFCu0ObS231AB1/LbqTKRAQ5kw8A90rA6fr4riOUpTZvQZA==
/uuid/3.4.0:
hasBin: true
resolution:
@ -23313,6 +23379,13 @@ packages:
dev: true
resolution:
integrity: sha1-eLpyAgApxbyHuKgaPPzXS0ovweU=
/xml2js/0.4.19:
dependencies:
sax: 1.2.4
xmlbuilder: 9.0.7
dev: false
resolution:
integrity: sha512-esZnJZJOiJR9wWKMyuvSE1y6Dq5LCuJanqhxslH2bxM6duahNZ+HMpCLhBQGZkbX6xRf8x1Y2eJlgt2q3qo49Q==
/xml2js/0.4.23:
dependencies:
sax: 1.2.4
@ -23328,6 +23401,12 @@ packages:
node: '>=4.0'
resolution:
integrity: sha512-fDlsI/kFEx7gLvbecc0/ohLG50fugQp8ryHzMTuW9vSa1GJ0XYWKnhsUx7oie3G98+r56aTQIUB4kht42R3JvA==
/xmlbuilder/9.0.7:
dev: false
engines:
node: '>=4.0'
resolution:
integrity: sha1-Ey7mPS7FVlxVfiD0wi35rKaGsQ0=
/xmlchars/2.2.0:
resolution:
integrity: sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw==