Mirror of https://github.com/TryGhost/Ghost.git
Synced 2025-01-27 22:49:56 -05:00
Commit 7eb316b786
* 🛠 bookshelf tarball, bson-objectid
* 🎨 schema changes
- change increment type to string
- add a default fallback of string length 191, to avoid repeating this logic on every single column that uses an ID (sketched below)
- remove uuid, because ID now represents a global resource identifier
- keep uuid for posts, because we use it as the preview id
- keep uuid for clients for now - we use this param for Ghost-Auth
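A minimal sketch of that fallback idea - the helper name and column-spec shape are illustrative, not the exact schema code:

```js
// Sketch only: default string columns to maxlength 191 so ID columns
// don't each have to declare it themselves.
function addColumn(table, columnName, columnSpec) {
    if (columnSpec.type === 'string') {
        // knex signature: table.string(name, length)
        return table.string(columnName, columnSpec.maxlength || 191);
    }
    // other column types are handled elsewhere
}
```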
* ✨ base model: generate ObjectId on creating event
- each new resource gets an auto-generated ObjectId (a sketch of the hook follows below)
- this logic doesn't cover attached models yet; that change comes in a later commit
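Roughly, the idea looks like this - a sketch assuming a configured `ghostBookshelf` instance, not the exact base model code:

```js
var ObjectId = require('bson-objectid');

// Sketch only: assign a generated ObjectId string in Bookshelf's "creating"
// event, so every new resource has its id before the INSERT happens.
// `ghostBookshelf` is assumed to be the configured Bookshelf instance.
ghostBookshelf.Model = ghostBookshelf.Model.extend({
    initialize: function initialize() {
        this.on('creating', function (newObj) {
            if (!newObj.id) {
                newObj.set('id', ObjectId.generate());
            }
        });
    }
});
```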
* 🎨 centralised attach method
When attaching models, there are two important things to know:
1. To be able to assign an ObjectId, we need to register the `onCreating` event on the fetched target model - the model we are attaching the new model to. This is caused by the Bookshelf design in general.
2. We need to fetch the target model manually, because Bookshelf has a weird behaviour (a known bug, see https://github.com/tgriesser/bookshelf/issues/629). The most important property when attaching a model is `parentFk`, the foreign key, and it can be null when the model was fetched with the `withRelated` option. To ensure quality and consistency, the custom attach wrapper therefore always fetches the target model manually. Fetching the target model (again) costs a little performance, but it has clear advantages: we can register the event and unregister it again right away, which keeps the code clean.
Important: please only use the custom attach wrapper in the future (a sketch follows below).
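A hedged sketch of that wrapper - the helper name, event wiring and option handling are illustrative, not the exact models-layer code:

```js
var ObjectId = require('bson-objectid');

// Sketch only: fetch the target model manually (so parentFk is populated),
// register the "creating" hook so the joining rows get an ObjectId,
// attach, then remove the hook again.
function attach(Model, effectiveId, relation, modelsToAttach, options) {
    options = options || {};

    var fetchedTarget;

    return Model.forge({id: effectiveId}).fetch(options)
        .then(function (_fetchedTarget) {
            fetchedTarget = _fetchedTarget;

            fetchedTarget.related(relation).on('creating', function (collection, data) {
                data.id = ObjectId.generate();
            });

            return fetchedTarget.related(relation).attach(modelsToAttach, options);
        })
        .finally(function () {
            if (fetchedTarget) {
                fetchedTarget.related(relation).off('creating');
            }
        });
}
```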
* 🎨 token model had overridden the onCreating function because of the created_at field
- we need to ensure that the base onCreating hook gets triggered for ALL models
- if not, they don't get an ObjectId assigned
- in this case: be smart and check whether the target model actually has a created_at field (see the sketch below)
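Roughly - a sketch with a made-up helper name, assuming the schema definition exposes the table columns:

```js
// Sketch only: instead of each model overriding onCreating, the base hook can
// check whether the table actually defines created_at before setting it.
function setCreatedAtIfNeeded(model, tableName, schemaTables) {
    if (schemaTables[tableName].hasOwnProperty('created_at') && !model.get('created_at')) {
        model.set('created_at', new Date());
    }
}
```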
* 🎨 we don't have a uuid field anymore, remove the usages
- no default uuid creation in models
- I am pretty sure we have some more definitions in our tests (for example in the export JSON files), but deleting them all is too much work
* 🎨 do not parse ID to Number
- we had various occurrences of parsing all IDs to numbers
- we don't need this behaviour anymore
- ID is string
- I will adapt the ID validation in the next commit
* 🎨 change ID regex for validation
- we only allow: ID as ObjectId, ID as 1 and ID as me
- we need to keep ID 1, because our whole software relies on ID 1 (permissions etc.) - the allowed shapes are sketched below
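The allowed shapes boil down to something like this (the exact pattern in the validator may differ slightly):

```js
// Sketch: a 24-character ObjectId hex string, the literal "1", or "me".
var ID_REGEX = /^([a-f0-9]{24}|1|me)$/;

ID_REGEX.test('5951f5fca366002ebd5dbef7'); // true  - ObjectId
ID_REGEX.test('1');                        // true  - static owner id
ID_REGEX.test('me');                       // true  - the current user
ID_REGEX.test('42');                       // false - plain incremental ids are gone
```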
* 🎨 owner fixture
- roles: [4] does not work anymore
- 4 meant the static id 4
- that only worked in an auto-increment system (and not even in a system with distributed writes)
- with ObjectId we generate each ID automatically (for static and dynamic resources)
- it is still possible to define fixed ids for static resources, but then we would need to know which IDs are already in use, and for consistency we would have to define ObjectIds for those static resources too
- so no static ids anymore, except for: id 1 for the owner and id 0 for external usage (because our permission system requires them)
- NOTE: please read through the comment in the user model
* 🎨 tests: DataGenerator and test utils
First of all: we need to ensure the tests use ObjectIds. If they don't, we can't verify that ObjectIds work properly.
This commit replaces a lot of statically defined ids with dynamically generated ones.
In one of the next commits, I will adapt all the tests.
* 🚨 remove counter in Notification API
- no need to add a counter
- we simply generate ObjectIds (they are roughly auto-incrementing as well, see below)
- our id validator only allows ObjectId, 1 and me as id
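Why ObjectIds can stand in for a counter - they start with a timestamp plus an incrementing counter, so ids generated later compare greater (a quick illustration):

```js
var ObjectId = require('bson-objectid');

var first = ObjectId.generate();
var second = ObjectId.generate();

// ObjectIds begin with a timestamp and end with an incrementing counter, so
// string comparison of ids generated by the same process follows creation order.
console.log(second > first); // true
```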
* 🎨 extend contextUser in Base Model
- remove the isNumber check, because ids are no longer numbers, except for id 0/1
- use existing isExternalUser
- support id 0/1 as string or number
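The comparison has to accept both representations, for example (illustrative helpers only; `isExternalUser` already exists in the base model):

```js
// Sketch: id 0 ("external") and id 1 ("internal"/owner) can arrive as a number
// or as a string, so compare both forms.
function isExternalUser(id) {
    return id === 0 || id === '0';
}

function isInternalUser(id) {
    return id === 1 || id === '1';
}
```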
* ✨ Ghost Owner has id 1
- ensure we define this id in the fixtures.json
- doesn't matter if number or string
* 🎨 functional tests adaptions
- use dynamic ids
* 🎨 fix unit tests
* 🎨 integration tests adaptions
* 🎨 change importer utils
- all our export examples (test/fixtures/exports) contain ids as numbers
- fact: we ignore them anyway when inserting into the database, see https://github.com/TryGhost/Ghost/blob/master/core/server/data/import/utils.js#L249
- in 0e6ed957cd (diff-70f514a06347c048648be464819503c4L67) I removed parsing ids to integers
- I realised that this check only existed because userIdToMap was an object key, and object keys are always strings!
- I think this logic is a little complicated, but I don't want to refactor it now
- this commit ensures that the id comparison works again when trying to find the user
- I've added more documentation to explain this logic ;)
- plus I renamed an attribute to improve readability
* 🎨 Data-Generator: add more defaults to createUser
- if I use the function DataGenerator.forKnex.createUser, I would like to get a full set of defaults
* 🎨 test utils: change/extend function set for functional tests
- functional tests work a bit different
- they boot Ghost and seed the database
- some functional tests have misused the test setup
- the test setup needs two sections: integration/unit and functional tests
- any functional test is allowed to either add more data or change data in the existing Ghost db
- but what it should not do is add test fixtures like roles or users from our DataGenerator and cross its fingers that it will work
- this commit adds a clean method for functional tests to add extra users
* 🎨 functional tests adaptions
- use the helper from the last commit to insert users for functional tests cleanly
- tidy up usage of testUtils.setup or testUtils.doAuth
* 🐛 test utils: reset database before init
- ensure we don't have any leftover data from other tests in the database when starting Ghost
* 🐛 fix test (unrelated to this PR)
- fixes a random failure
- return statement was missing
* 🎨 make changes for invites
var Promise = require('bluebird'),
    _ = require('lodash'),
    fs = require('fs-extra'),
    path = require('path'),
    Module = require('module'),
    debug = require('debug')('ghost:test'),
    ObjectId = require('bson-objectid'),
    uuid = require('node-uuid'),
    KnexMigrator = require('knex-migrator'),
    ghost = require('../../server'),
    errors = require('../../server/errors'),
    db = require('../../server/data/db'),
    fixtureUtils = require('../../server/data/migration/fixtures/utils'),
    models = require('../../server/models'),
    SettingsAPI = require('../../server/api/settings'),
    permissions = require('../../server/permissions'),
    sequence = require('../../server/utils/sequence'),
    DataGenerator = require('./fixtures/data-generator'),
    filterData = require('./fixtures/filter-param'),
    API = require('./api'),
    fork = require('./fork'),
    mocks = require('./mocks'),
    config = require('../../server/config'),
    knexMigrator = new KnexMigrator(),
    fixtures,
    getFixtureOps,
    toDoList,
    originalRequireFn,
    postsInserted = 0,

    mockNotExistingModule,
    unmockNotExistingModule,
    teardown,
    setup,
    doAuth,
    createUser,
    login,
    togglePermalinks,
    startGhost,

    initFixtures,
    initData,
    clearData,
    clearBruteData;

// Require additional assertions which help us keep our tests small and clear
require('./assertions');

/** TEST FIXTURES **/
fixtures = {
    insertPosts: function insertPosts(posts) {
        return Promise.resolve(db.knex('posts').insert(posts));
    },

    insertPostsAndTags: function insertPostsAndTags() {
        return Promise.resolve(db.knex('posts').insert(DataGenerator.forKnex.posts)).then(function () {
            return db.knex('tags').insert(DataGenerator.forKnex.tags);
        }).then(function () {
            return db.knex('posts_tags').insert(DataGenerator.forKnex.posts_tags);
        });
    },

    insertMultiAuthorPosts: function insertMultiAuthorPosts(max) {
        /*jshint unused:false*/
        var author,
            authors,
            i, j, k = postsInserted,
            posts = [];

        max = max || 50;
        // insert users of different roles
        return Promise.resolve(fixtures.createUsersWithRoles()).then(function () {
            // create the tags
            return db.knex('tags').insert(DataGenerator.forKnex.tags);
        }).then(function () {
            return db.knex('users').select('id');
        }).then(function (results) {
            authors = _.map(results, 'id');

            // Let's insert posts with random authors
            for (i = 0; i < max; i += 1) {
                author = authors[i % authors.length];
                posts.push(DataGenerator.forKnex.createGenericPost(k, null, null, author));
                k = k + 1;
            }

            // Keep track so we can run this function again safely
            postsInserted = k;

            return sequence(_.times(posts.length, function (index) {
                return function () {
                    return db.knex('posts').insert(posts[index]);
                };
            }));
        }).then(function () {
            return Promise.all([
                db.knex('posts').orderBy('id', 'asc').select('id'),
                db.knex('tags').select('id')
            ]);
        }).then(function (results) {
            var posts = _.map(results[0], 'id'),
                tags = _.map(results[1], 'id'),
                promises = [],
                i;

            if (max > posts.length) {
                throw new Error('Trying to add more posts_tags than the number of posts. ' + max + ' ' + posts.length);
            }

            for (i = 0; i < max; i += 1) {
                promises.push(DataGenerator.forKnex.createPostsTags(posts[i], tags[i % tags.length]));
            }

            return sequence(_.times(promises.length, function (index) {
                return function () {
                    return db.knex('posts_tags').insert(promises[index]);
                };
            }));
        });
    },

    insertMorePosts: function insertMorePosts(max) {
        var lang,
            status,
            posts = [],
            i, j, k = postsInserted;

        max = max || 50;

        for (i = 0; i < 2; i += 1) {
            lang = i % 2 ? 'en' : 'fr';
            posts.push(DataGenerator.forKnex.createGenericPost(k, null, lang));
            k = k + 1;

            for (j = 0; j < max; j += 1) {
                status = j % 2 ? 'draft' : 'published';
                posts.push(DataGenerator.forKnex.createGenericPost(k, status, lang));
                k = k + 1;
            }
        }

        // Keep track so we can run this function again safely
        postsInserted = k;

        return sequence(_.times(posts.length, function (index) {
            return function () {
                return db.knex('posts').insert(posts[index]);
            };
        }));
    },

    insertMoreTags: function insertMoreTags(max) {
        max = max || 50;
        var tags = [],
            tagName,
            i;

        for (i = 0; i < max; i += 1) {
            tagName = uuid.v4().split('-')[0];
            tags.push(DataGenerator.forKnex.createBasic({name: tagName, slug: tagName}));
        }

        return sequence(_.times(tags.length, function (index) {
            return function () {
                return db.knex('tags').insert(tags[index]);
            };
        }));
    },

    insertMorePostsTags: function insertMorePostsTags(max) {
        max = max || 50;

        return Promise.all([
            db.knex('posts').orderBy('id', 'asc').select('id'),
            db.knex('tags').select('id', 'name')
        ]).then(function (results) {
            var posts = _.map(results[0], 'id'),
                injectionTagId = _.chain(results[1])
                    .filter({name: 'injection'})
                    .map('id')
                    .value()[0],
                promises = [],
                i;

            if (max > posts.length) {
                throw new Error('Trying to add more posts_tags than the number of posts.');
            }

            for (i = 0; i < max; i += 1) {
                promises.push(DataGenerator.forKnex.createPostsTags(posts[i], injectionTagId));
            }

            return sequence(_.times(promises.length, function (index) {
                return function () {
                    return db.knex('posts_tags').insert(promises[index]);
                };
            }));
        });
    },

    insertRoles: function insertRoles() {
        return db.knex('roles').insert(DataGenerator.forKnex.roles);
    },

    initOwnerUser: function initOwnerUser() {
        var user = DataGenerator.Content.users[0];

        user = DataGenerator.forKnex.createBasic(user);
        user = _.extend({}, user, {status: 'inactive'});

        return db.knex('roles').insert(DataGenerator.forKnex.roles).then(function () {
            return db.knex('users').insert(user);
        }).then(function () {
            return db.knex('roles_users').insert(DataGenerator.forKnex.roles_users[0]);
        });
    },

    insertOwnerUser: function insertOwnerUser() {
        var user;

        user = DataGenerator.forKnex.createUser(DataGenerator.Content.users[0]);

        return db.knex('users').insert(user).then(function () {
            return db.knex('roles_users').insert(DataGenerator.forKnex.roles_users[0]);
        });
    },

    overrideOwnerUser: function overrideOwnerUser(slug) {
        var user;
        user = DataGenerator.forKnex.createUser(DataGenerator.Content.users[0]);

        if (slug) {
            user.slug = slug;
        }

        return db.knex('users')
            .where('id', '=', models.User.ownerUser)
            .update(user);
    },

    createUsersWithRoles: function createUsersWithRoles() {
        return db.knex('roles').insert(DataGenerator.forKnex.roles).then(function () {
            return db.knex('users').insert(DataGenerator.forKnex.users);
        }).then(function () {
            return db.knex('roles_users').insert(DataGenerator.forKnex.roles_users);
        });
    },

    createUsersWithoutOwner: function createUsersWithoutOwner() {
        var usersWithoutOwner = DataGenerator.forKnex.users.slice(1);

        return db.knex('users').insert(usersWithoutOwner)
            .then(function () {
                return db.knex('roles_users').insert(DataGenerator.forKnex.roles_users);
            });
    },

    createExtraUsers: function createExtraUsers() {
        // grab 3 more users
        var extraUsers = DataGenerator.Content.users.slice(2, 5);

        extraUsers = _.map(extraUsers, function (user) {
            return DataGenerator.forKnex.createUser(_.extend({}, user, {
                id: ObjectId.generate(),
                email: 'a' + user.email,
                slug: 'a' + user.slug
            }));
        });

        // @TODO: remove when overhauling test env
        // tests need access to the extra created users (especially to the created id)
        // replacement for admin2, editor2 etc
        DataGenerator.Content.extraUsers = extraUsers;

        return db.knex('users').insert(extraUsers).then(function () {
            return db.knex('roles_users').insert([
                {id: ObjectId.generate(), user_id: extraUsers[0].id, role_id: DataGenerator.Content.roles[0].id},
                {id: ObjectId.generate(), user_id: extraUsers[1].id, role_id: DataGenerator.Content.roles[1].id},
                {id: ObjectId.generate(), user_id: extraUsers[2].id, role_id: DataGenerator.Content.roles[2].id}
            ]);
        });
    },

    // Creates a client, and access and refresh tokens for user 3 (author)
    createTokensForUser: function createTokensForUser() {
        return db.knex('clients').insert(DataGenerator.forKnex.clients).then(function () {
            return db.knex('accesstokens').insert(DataGenerator.forKnex.createToken({user_id: DataGenerator.Content.users[2].id}));
        }).then(function () {
            return db.knex('refreshtokens').insert(DataGenerator.forKnex.createToken({user_id: DataGenerator.Content.users[2].id}));
        });
    },

    insertOne: function insertOne(obj, fn, index) {
        return db.knex(obj)
            .insert(DataGenerator.forKnex[fn](DataGenerator.Content[obj][index || 0]));
    },

    insertApps: function insertApps() {
        return db.knex('apps').insert(DataGenerator.forKnex.apps).then(function () {
            return db.knex('app_fields').insert(DataGenerator.forKnex.app_fields);
        });
    },

    getImportFixturePath: function (filename) {
        return path.resolve(__dirname + '/fixtures/import/' + filename);
    },

    getExportFixturePath: function (filename) {
        return path.resolve(__dirname + '/fixtures/export/' + filename + '.json');
    },

    loadExportFixture: function loadExportFixture(filename) {
        var filePath = this.getExportFixturePath(filename),
            readFile = Promise.promisify(fs.readFile);

        return readFile(filePath).then(function (fileContents) {
            var data;

            // Parse the json data
            try {
                data = JSON.parse(fileContents);
            } catch (e) {
                return new Error('Failed to parse the file');
            }

            return data;
        });
    },

    permissionsFor: function permissionsFor(obj) {
        var permsToInsert = fixtureUtils.findModelFixtures('Permission', {object_type: obj}).entries,
            permsRolesToInsert = fixtureUtils.findPermissionRelationsForObject(obj).entries,
            actions = [],
            permissionsRoles = [],
            roles = {
                Administrator: DataGenerator.Content.roles[0].id,
                Editor: DataGenerator.Content.roles[1].id,
                Author: DataGenerator.Content.roles[2].id,
                Owner: DataGenerator.Content.roles[3].id
            };

        // CASE: if empty db will throw SQLITE_MISUSE, hard to debug
        if (_.isEmpty(permsToInsert)) {
            return Promise.reject(new Error('no permission found:' + obj));
        }

        permsToInsert = _.map(permsToInsert, function (perms) {
            perms.id = ObjectId.generate();

            actions.push({type: perms.action_type, permissionId: perms.id});
            return DataGenerator.forKnex.createBasic(perms);
        });

        _.each(permsRolesToInsert, function (perms, role) {
            if (perms[obj]) {
                if (perms[obj] === 'all') {
                    _.each(actions, function (action) {
                        permissionsRoles.push({
                            id: ObjectId.generate(),
                            permission_id: action.permissionId,
                            role_id: roles[role]
                        });
                    });
                } else {
                    _.each(perms[obj], function (action) {
                        permissionsRoles.push({
                            id: ObjectId.generate(),
                            permission_id: _.find(actions, {type: action}).permissionId,
                            role_id: roles[role]
                        });
                    });
                }
            }
        });

        return db.knex('permissions').insert(permsToInsert).then(function () {
            if (_.isEmpty(permissionsRoles)) {
                return Promise.resolve();
            }

            return db.knex('permissions_roles').insert(permissionsRoles);
        });
    },

    insertClients: function insertClients() {
        return db.knex('clients').insert(DataGenerator.forKnex.clients);
    },

    insertAccessToken: function insertAccessToken(override) {
        return db.knex('accesstokens').insert(DataGenerator.forKnex.createToken(override));
    },

    insertInvites: function insertInvites() {
        return db.knex('invites').insert(DataGenerator.forKnex.invites);
    }
};

/** Test Utility Functions **/
initData = function initData() {
    return knexMigrator.init();
};

clearBruteData = function clearBruteData() {
    return db.knex('brute').truncate();
};

// we must always try to delete all tables
clearData = function clearData() {
    debug('Database reset');
    return knexMigrator.reset();
};

toDoList = {
    app: function insertApp() { return fixtures.insertOne('apps', 'createApp'); },
    app_field: function insertAppField() {
        // TODO: use the actual app ID to create the field
        return fixtures.insertOne('apps', 'createApp').then(function () {
            return fixtures.insertOne('app_fields', 'createAppField');
        });
    },
    app_setting: function insertAppSetting() {
        // TODO: use the actual app ID to create the field
        return fixtures.insertOne('apps', 'createApp').then(function () {
            return fixtures.insertOne('app_settings', 'createAppSetting');
        });
    },
    permission: function insertPermission() { return fixtures.insertOne('permissions', 'createPermission'); },
    role: function insertRole() { return fixtures.insertOne('roles', 'createRole'); },
    roles: function insertRoles() { return fixtures.insertRoles(); },
    tag: function insertTag() { return fixtures.insertOne('tags', 'createTag'); },
    subscriber: function insertSubscriber() { return fixtures.insertOne('subscribers', 'createSubscriber'); },
    posts: function insertPostsAndTags() { return fixtures.insertPostsAndTags(); },
    'posts:mu': function insertMultiAuthorPosts() { return fixtures.insertMultiAuthorPosts(); },
    tags: function insertMoreTags() { return fixtures.insertMoreTags(); },
    apps: function insertApps() { return fixtures.insertApps(); },
    settings: function populateSettings() {
        return models.Settings.populateDefaults().then(function () { return SettingsAPI.updateSettingsCache(); });
    },
    'users:roles': function createUsersWithRoles() { return fixtures.createUsersWithRoles(); },
    'users:no-owner': function createUsersWithoutOwner() { return fixtures.createUsersWithoutOwner(); },
    users: function createExtraUsers() { return fixtures.createExtraUsers(); },
    'user:token': function createTokensForUser() { return fixtures.createTokensForUser(); },
    owner: function insertOwnerUser() { return fixtures.insertOwnerUser(); },
    'owner:pre': function initOwnerUser() { return fixtures.initOwnerUser(); },
    'owner:post': function overrideOwnerUser() { return fixtures.overrideOwnerUser(); },
    'perms:init': function initPermissions() { return permissions.init(); },
    perms: function permissionsFor(obj) {
        return function permissionsForObj() { return fixtures.permissionsFor(obj); };
    },
    clients: function insertClients() { return fixtures.insertClients(); },
    filter: function createFilterParamFixtures() { return filterData(DataGenerator); },
    invites: function insertInvites() { return fixtures.insertInvites(); }
};

/**
 * ## getFixtureOps
 *
 * Takes the arguments from a setup function and turns them into an array of promises to fullfil
 *
 * This is effectively a list of instructions with regard to which fixtures should be setup for this test.
 * * `default` - a special option which will cause the full suite of normal fixtures to be initialised
 * * `perms:init` - initialise the permissions object after having added permissions
 * * `perms:obj` - initialise permissions for a particular object type
 * * `users:roles` - create a full suite of users, one per role
 * @param {Object} toDos
 *
 * @TODO:
 * - key: migrations-kate
 * - call migration-runner
 */
getFixtureOps = function getFixtureOps(toDos) {
    // default = default fixtures, if it isn't present, init with tables only
    var tablesOnly = !toDos.default,
        fixtureOps = [];

    // Database initialisation
    if (toDos.init || toDos.default) {
        fixtureOps.push(function initDB() {
            // skip adding all fixtures!
            if (tablesOnly) {
                return knexMigrator.init({skip: 2});
            }

            return knexMigrator.init();
        });

        delete toDos.default;
        delete toDos.init;
    }

    // Go through our list of things to do, and add them to an array
    _.each(toDos, function (value, toDo) {
        var tmp;

        if (toDo !== 'perms:init' && toDo.indexOf('perms:') !== -1) {
            tmp = toDo.split(':');
            fixtureOps.push(toDoList[tmp[0]](tmp[1]));
        } else {
            if (!toDoList[toDo]) {
                throw new Error('setup todo does not exist - spell mistake?');
            }

            fixtureOps.push(toDoList[toDo]);
        }
    });

    return fixtureOps;
};

// ## Test Setup and Teardown

initFixtures = function initFixtures() {
    var options = _.merge({init: true}, _.transform(arguments, function (result, val) {
            result[val] = true;
        })),
        fixtureOps = getFixtureOps(options);

    return sequence(fixtureOps);
};

/**
 * ## Setup Integration Tests
 * Setup takes a list of arguments like: 'default', 'tag', 'perms:tag', 'perms:init'
 * Setup does 'init' (DB) by default
 * @returns {Function}
 */
setup = function setup() {
    var self = this,
        args = arguments;

    return function setup(done) {
        models.init();

        if (done) {
            initFixtures.apply(self, args).then(function () {
                done();
            }).catch(done);
        } else {
            return initFixtures.apply(self, args);
        }
    };
};

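// Illustrative usage (not part of this file): an integration test typically wires the
// setup helper with fixture names from the toDoList above, e.g.
//
//     describe('Tags API', function () {
//         before(testUtils.teardown);
//         beforeEach(testUtils.setup('users:roles', 'perms:tag', 'perms:init'));
//     });
//
// The names passed to setup() must exist as keys in toDoList (or use the
// `perms:<object>` form), otherwise getFixtureOps throws.
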
// ## Functions for Route Tests (!!)

/**
 * This function manages the work of ensuring we have an overridden owner user, and grabbing an access token
 * @returns {deferred.promise<AccessToken>}
 */
// TODO make this do the DB init as well
doAuth = function doAuth() {
    var options = arguments,
        request = arguments[0],
        fixtureOps;

    // Remove request from this list
    delete options[0];

    // No DB setup, but override the owner
    options = _.merge({'owner:post': true}, _.transform(options, function (result, val) {
        if (val) {
            result[val] = true;
        }
    }));

    fixtureOps = getFixtureOps(options);

    return sequence(fixtureOps).then(function () {
        return login(request);
    });
};

createUser = function createUser(options) {
    var user = options.user,
        role = options.role;

    return db.knex('users').insert(user)
        .then(function () {
            return db.knex('roles');
        })
        .then(function (roles) {
            return db.knex('roles_users').insert({
                id: ObjectId.generate(),
                role_id: _.find(roles, {name: role.name}).id,
                user_id: user.id
            });
        })
        .then(function () {
            return user;
        });
};

login = function login(request) {
    // CASE: by default we use the owner to login
    if (!request.user) {
        request.user = DataGenerator.Content.users[0];
    }

    return new Promise(function (resolve, reject) {
        request.post('/ghost/api/v0.1/authentication/token/')
            .set('Origin', config.get('url'))
            .send({
                grant_type: 'password',
                username: request.user.email,
                password: 'Sl1m3rson',
                client_id: 'ghost-admin',
                client_secret: 'not_available'
            }).then(function then(res) {
                if (res.statusCode !== 200) {
                    return reject(new errors.GhostError({
                        message: res.body.errors[0].message
                    }));
                }

                resolve(res.body.access_token);
            }, reject);
    });
};

togglePermalinks = function togglePermalinks(request, toggle) {
    var permalinkString = toggle === 'date' ? '/:year/:month/:day/:slug/' : '/:slug/';

    return new Promise(function (resolve, reject) {
        doAuth(request).then(function (token) {
            request.put('/ghost/api/v0.1/settings/')
                .set('Authorization', 'Bearer ' + token)
                .send({settings: [
                    {
                        uuid: '75e994ae-490e-45e6-9207-0eab409c1c04',
                        key: 'permalinks',
                        value: permalinkString,
                        type: 'blog',
                        created_at: '2014-10-16T17:39:16.005Z',
                        created_by: 1,
                        updated_at: '2014-10-20T19:44:18.077Z',
                        updated_by: 1
                    }
                ]})
                .end(function (err, res) {
                    if (err) {
                        return reject(err);
                    }

                    if (res.statusCode !== 200) {
                        return reject(res.body);
                    }

                    resolve(res.body);
                });
        });
    });
};

teardown = function teardown(done) {
    debug('Database reset');

    if (done) {
        knexMigrator.reset()
            .then(function () {
                done();
            })
            .catch(done);
    } else {
        return knexMigrator.reset();
    }
};

/**
 * offer helper functions for mocking
 * we start with a small function set to mock non existent modules
 */
originalRequireFn = Module.prototype.require;
mockNotExistingModule = function mockNotExistingModule(modulePath, module) {
    Module.prototype.require = function (path) {
        if (path.match(modulePath)) {
            return module;
        }

        return originalRequireFn.apply(this, arguments);
    };
};

unmockNotExistingModule = function unmockNotExistingModule() {
    Module.prototype.require = originalRequireFn;
};

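// Illustrative usage (not part of this file): a test can stub out a module that is not
// installed, then restore the original require afterwards. The module name below is
// made up for the example.
//
//     mockNotExistingModule(/some-not-installed-adapter/, function FakeAdapter() {});
//     // ... exercise code that requires the mocked module ...
//     unmockNotExistingModule();
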
/**
 * 1. sephiroth init db
 * 2. start ghost
 */
startGhost = function startGhost() {
    return knexMigrator.reset()
        .then(function initialiseDatabase() {
            return knexMigrator.init();
        })
        .then(function startGhost() {
            return ghost();
        });
};

module.exports = {
    startGhost: startGhost,
    teardown: teardown,
    setup: setup,
    doAuth: doAuth,
    createUser: createUser,
    login: login,
    togglePermalinks: togglePermalinks,

    mockNotExistingModule: mockNotExistingModule,
    unmockNotExistingModule: unmockNotExistingModule,

    initFixtures: initFixtures,
    initData: initData,
    clearData: clearData,
    clearBruteData: clearBruteData,

    mocks: mocks,

    fixtures: fixtures,

    DataGenerator: DataGenerator,
    filterData: filterData,
    API: API,

    fork: fork,

    // Helpers to make it easier to write tests which are easy to read
    context: {
        internal: {context: {internal: true}},
        external: {context: {external: true}},
        owner: {context: {user: DataGenerator.Content.users[0].id}},
        admin: {context: {user: DataGenerator.Content.users[1].id}},
        editor: {context: {user: DataGenerator.Content.users[2].id}},
        author: {context: {user: DataGenerator.Content.users[3].id}}
    },
    users: {
        ids: {
            owner: DataGenerator.Content.users[0].id,
            admin: DataGenerator.Content.users[1].id,
            editor: DataGenerator.Content.users[2].id,
            author: DataGenerator.Content.users[3].id
        }
    },
    roles: {
        ids: {
            owner: DataGenerator.Content.roles[3].id,
            admin: DataGenerator.Content.roles[0].id,
            editor: DataGenerator.Content.roles[1].id,
            author: DataGenerator.Content.roles[2].id
        }
    },

    cacheRules: {
        public: 'public, max-age=0',
        hour: 'public, max-age=' + 3600,
        day: 'public, max-age=' + 86400,
        year: 'public, max-age=' + 31536000,
        private: 'no-cache, private, no-store, must-revalidate, max-stale=0, post-check=0, pre-check=0'
    }
};