Dawid Dziurla 2020-03-26 15:37:35 +01:00
parent d9becc67b6
commit 9308795b8b
No known key found for this signature in database
GPG key ID: 7B6D8368172E9B0B
964 changed files with 104265 additions and 16 deletions

node_modules/pino/LICENSE generated vendored Normal file

@@ -0,0 +1,24 @@
The MIT License (MIT)
Copyright (c) 2016-2019 Matteo Collina, David Mark Clements and the Pino contributors
Pino contributors listed at https://github.com/pinojs/pino#the-team and in
the README file.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

node_modules/pino/README.md generated vendored Normal file

@@ -0,0 +1,152 @@
![banner](pino-banner.png)
# pino  [![Build Status](https://travis-ci.org/pinojs/pino.svg?branch=master)](https://travis-ci.org/pinojs/pino) [![Coverage Status](https://coveralls.io/repos/github/pinojs/pino/badge.svg?branch=master)](https://coveralls.io/github/pinojs/pino?branch=master) [![js-standard-style](https://img.shields.io/badge/code%20style-standard-brightgreen.svg?style=flat)](http://standardjs.com/) [![TypeScript definitions on DefinitelyTyped](https://definitelytyped.org/badges/standard.svg)](https://definitelytyped.org)
[Very low overhead](#low-overhead) Node.js logger, inspired by Bunyan.
## Documentation
* [Benchmarks ⇗](/docs/benchmarks.md)
* [API ⇗](/docs/api.md)
* [Browser API ⇗](/docs/browser.md)
* [Redaction ⇗](/docs/redaction.md)
* [Child Loggers ⇗](/docs/child-loggers.md)
* [Transports ⇗](/docs/transports.md)
* [Web Frameworks ⇗](/docs/web.md)
* [Pretty Printing ⇗](/docs/pretty.md)
* [Extreme Mode ⇗](/docs/extreme.md)
* [Ecosystem ⇗](/docs/ecosystem.md)
* [Legacy](/docs/legacy.md)
* [Help ⇗](/docs/help.md)
## Install
```
$ npm install pino
```
## Usage
```js
const logger = require('pino')()
logger.info('hello world')
const child = logger.child({ a: 'property' })
child.info('hello child!')
```
This produces:
```
{"level":30,"time":1531171074631,"msg":"hello world","pid":657,"hostname":"Davids-MBP-3.fritz.box","v":1}
{"level":30,"time":1531171082399,"msg":"hello child!","pid":657,"hostname":"Davids-MBP-3.fritz.box","a":"property","v":1}
```
For using Pino with a web framework see:
* [Pino with Fastify](docs/web.md#fastify)
* [Pino with Express](docs/web.md#express)
* [Pino with Hapi](docs/web.md#hapi)
* [Pino with Restify](docs/web.md#restify)
* [Pino with Koa](docs/web.md#koa)
* [Pino with Node core `http`](docs/web.md#http)
* [Pino with Nest](docs/web.md#nest)
<a name="essentials"></a>
## Essentials
### Development Formatting
The [`pino-pretty`](https://github.com/pinojs/pino-pretty) module can be used to
format logs during development:
![pretty demo](pretty-demo.png)
### Transports & Log Processing
Due to Node's single-threaded event-loop, it's highly recommended that sending,
alert triggering, reformatting and all other forms of log processing
are conducted in a separate process. In Pino parlance we call all log processors
"transports", and recommend that the transports be run as separate
processes, piping the stdout of the application to the stdin of the transport.
For more details see our [Transports⇗](docs/transports.md) document.
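In this model a transport is simply a process that reads newline-delimited JSON from its stdin. As a purely illustrative sketch (not how real transports such as `pino-pretty` are implemented), the core of one might be:

```javascript
// Sketch of the receiving side of the stdout-to-stdin pipe: parse each
// newline-delimited JSON log line and re-emit it in a human-friendly form.
function formatLine (line) {
  try {
    const log = JSON.parse(line)
    return `[${log.level}] ${log.msg}`
  } catch (e) {
    return line // pass through anything that is not JSON
  }
}

console.log(formatLine('{"level":30,"msg":"hello world"}'))
// → [30] hello world
```

A transport built this way would read `process.stdin` line by line (e.g. via the `readline` module) and be wired up as `node app.js | node transport.js`.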
### Low overhead
Using minimal resources for logging is very important. Log messages
tend to get added over time, and the resulting overhead can throttle
an application, for example by reducing its requests per second.
In many cases, Pino is over 5x faster than alternatives.
See the [Benchmarks](docs/benchmarks.md) document for comparisons.
<a name="team"></a>
## The Team
### Matteo Collina
<https://github.com/pinojs>
<https://www.npmjs.com/~matteo.collina>
<https://twitter.com/matteocollina>
### David Mark Clements
<https://github.com/davidmarkclements>
<https://www.npmjs.com/~davidmarkclements>
<https://twitter.com/davidmarkclem>
### James Sumners
<https://github.com/jsumners>
<https://www.npmjs.com/~jsumners>
<https://twitter.com/jsumners79>
### Thomas Watson Steen
<https://github.com/watson>
<https://www.npmjs.com/~watson>
<https://twitter.com/wa7son>
## Communication
### Chat on Gitter
<https://gitter.im/pinojs/pino>
### Chat on IRC
You'll find an active group of Pino users in the #pinojs channel on Freenode, including some of the contributors to this project.
## Contributing
Pino is an **OPEN Open Source Project**. This means that:
> Individuals making significant and valuable contributions are given commit-access to the project to contribute as they see fit. This project is more like an open wiki than a standard guarded open source project.
See the [CONTRIBUTING.md](https://github.com/pinojs/pino/blob/master/CONTRIBUTING.md) file for more details.
<a name="acknowledgements"></a>
## Acknowledgements
This project was kindly sponsored by [nearForm](http://nearform.com).
Logo and identity designed by Cosmic Fox Design: https://www.behance.net/cosmicfox.
## License
Licensed under [MIT](./LICENSE).

node_modules/pino/bin.js generated vendored Executable file

@@ -0,0 +1,6 @@
#!/usr/bin/env node
console.error(
'`pino` cli has been removed. Use `pino-pretty` cli instead.\n' +
'\nSee: https://github.com/pinojs/pino-pretty'
)
process.exit(1)

node_modules/pino/browser.js generated vendored Normal file

@@ -0,0 +1,325 @@
'use strict'
var format = require('quick-format-unescaped')
module.exports = pino
var _console = pfGlobalThisOrFallback().console || {}
var stdSerializers = {
mapHttpRequest: mock,
mapHttpResponse: mock,
wrapRequestSerializer: passthrough,
wrapResponseSerializer: passthrough,
wrapErrorSerializer: passthrough,
req: mock,
res: mock,
err: asErrValue
}
function pino (opts) {
opts = opts || {}
opts.browser = opts.browser || {}
var transmit = opts.browser.transmit
if (transmit && typeof transmit.send !== 'function') { throw Error('pino: transmit option must have a send function') }
var proto = opts.browser.write || _console
if (opts.browser.write) opts.browser.asObject = true
var serializers = opts.serializers || {}
var serialize = Array.isArray(opts.browser.serialize)
? opts.browser.serialize.filter(function (k) {
return k !== '!stdSerializers.err'
})
: opts.browser.serialize === true ? Object.keys(serializers) : false
var stdErrSerialize = opts.browser.serialize
if (
Array.isArray(opts.browser.serialize) &&
opts.browser.serialize.indexOf('!stdSerializers.err') > -1
) stdErrSerialize = false
var levels = ['error', 'fatal', 'warn', 'info', 'debug', 'trace']
if (typeof proto === 'function') {
proto.error = proto.fatal = proto.warn =
proto.info = proto.debug = proto.trace = proto
}
if (opts.enabled === false) opts.level = 'silent'
var level = opts.level || 'info'
var logger = Object.create(proto)
if (!logger.log) logger.log = noop
Object.defineProperty(logger, 'levelVal', {
get: getLevelVal
})
Object.defineProperty(logger, 'level', {
get: getLevel,
set: setLevel
})
var setOpts = {
transmit: transmit,
serialize: serialize,
asObject: opts.browser.asObject,
levels: levels
}
logger.levels = pino.levels
logger.level = level
logger.setMaxListeners = logger.getMaxListeners =
logger.emit = logger.addListener = logger.on =
logger.prependListener = logger.once =
logger.prependOnceListener = logger.removeListener =
logger.removeAllListeners = logger.listeners =
logger.listenerCount = logger.eventNames =
logger.write = logger.flush = noop
logger.serializers = serializers
logger._serialize = serialize
logger._stdErrSerialize = stdErrSerialize
logger.child = child
if (transmit) logger._logEvent = createLogEventShape()
function getLevelVal () {
return this.level === 'silent'
? Infinity
: this.levels.values[this.level]
}
function getLevel () {
return this._level
}
function setLevel (level) {
if (level !== 'silent' && !this.levels.values[level]) {
throw Error('unknown level ' + level)
}
this._level = level
set(setOpts, logger, 'error', 'log') // <-- must stay first
set(setOpts, logger, 'fatal', 'error')
set(setOpts, logger, 'warn', 'error')
set(setOpts, logger, 'info', 'log')
set(setOpts, logger, 'debug', 'log')
set(setOpts, logger, 'trace', 'log')
}
function child (bindings) {
if (!bindings) {
throw new Error('missing bindings for child Pino')
}
var bindingsSerializers = bindings.serializers
if (serialize && bindingsSerializers) {
var childSerializers = Object.assign({}, serializers, bindingsSerializers)
var childSerialize = opts.browser.serialize === true
? Object.keys(childSerializers)
: serialize
delete bindings.serializers
applySerializers([bindings], childSerialize, childSerializers, this._stdErrSerialize)
}
function Child (parent) {
this._childLevel = (parent._childLevel | 0) + 1
this.error = bind(parent, bindings, 'error')
this.fatal = bind(parent, bindings, 'fatal')
this.warn = bind(parent, bindings, 'warn')
this.info = bind(parent, bindings, 'info')
this.debug = bind(parent, bindings, 'debug')
this.trace = bind(parent, bindings, 'trace')
if (childSerializers) {
this.serializers = childSerializers
this._serialize = childSerialize
}
if (transmit) {
this._logEvent = createLogEventShape(
[].concat(parent._logEvent.bindings, bindings)
)
}
}
Child.prototype = this
return new Child(this)
}
return logger
}
pino.LOG_VERSION = 1
pino.levels = {
values: {
fatal: 60,
error: 50,
warn: 40,
info: 30,
debug: 20,
trace: 10
},
labels: {
10: 'trace',
20: 'debug',
30: 'info',
40: 'warn',
50: 'error',
60: 'fatal'
}
}
pino.stdSerializers = stdSerializers
function set (opts, logger, level, fallback) {
var proto = Object.getPrototypeOf(logger)
logger[level] = logger.levelVal > logger.levels.values[level] ? noop
: (proto[level] ? proto[level] : (_console[level] || _console[fallback] || noop))
wrap(opts, logger, level)
}
function wrap (opts, logger, level) {
if (!opts.transmit && logger[level] === noop) return
logger[level] = (function (write) {
return function LOG () {
var ts = Date.now()
var args = new Array(arguments.length)
var proto = (Object.getPrototypeOf && Object.getPrototypeOf(this) === _console) ? _console : this
for (var i = 0; i < args.length; i++) args[i] = arguments[i]
if (opts.serialize && !opts.asObject) {
applySerializers(args, this._serialize, this.serializers, this._stdErrSerialize)
}
if (opts.asObject) write.call(proto, asObject(this, level, args, ts))
else write.apply(proto, args)
if (opts.transmit) {
var transmitLevel = opts.transmit.level || logger.level
var transmitValue = pino.levels.values[transmitLevel]
var methodValue = pino.levels.values[level]
if (methodValue < transmitValue) return
transmit(this, {
ts: ts,
methodLevel: level,
methodValue: methodValue,
transmitLevel: transmitLevel,
transmitValue: pino.levels.values[opts.transmit.level || logger.level],
send: opts.transmit.send,
val: logger.levelVal
}, args)
}
}
})(logger[level])
}
function asObject (logger, level, args, ts) {
if (logger._serialize) applySerializers(args, logger._serialize, logger.serializers, logger._stdErrSerialize)
var argsCloned = args.slice()
var msg = argsCloned[0]
var o = { time: ts, level: pino.levels.values[level] }
var lvl = (logger._childLevel | 0) + 1
if (lvl < 1) lvl = 1
// deliberate, catching objects, arrays
if (msg !== null && typeof msg === 'object') {
while (lvl-- && typeof argsCloned[0] === 'object') {
Object.assign(o, argsCloned.shift())
}
msg = argsCloned.length ? format(argsCloned.shift(), argsCloned) : undefined
} else if (typeof msg === 'string') msg = format(argsCloned.shift(), argsCloned)
if (msg !== undefined) o.msg = msg
return o
}
function applySerializers (args, serialize, serializers, stdErrSerialize) {
for (var i in args) {
if (stdErrSerialize && args[i] instanceof Error) {
args[i] = pino.stdSerializers.err(args[i])
} else if (typeof args[i] === 'object' && !Array.isArray(args[i])) {
for (var k in args[i]) {
if (serialize && serialize.indexOf(k) > -1 && k in serializers) {
args[i][k] = serializers[k](args[i][k])
}
}
}
}
}
function bind (parent, bindings, level) {
return function () {
var args = new Array(1 + arguments.length)
args[0] = bindings
for (var i = 1; i < args.length; i++) {
args[i] = arguments[i - 1]
}
return parent[level].apply(this, args)
}
}
function transmit (logger, opts, args) {
var send = opts.send
var ts = opts.ts
var methodLevel = opts.methodLevel
var methodValue = opts.methodValue
var val = opts.val
var bindings = logger._logEvent.bindings
applySerializers(
args,
logger._serialize || Object.keys(logger.serializers),
logger.serializers,
logger._stdErrSerialize === undefined ? true : logger._stdErrSerialize
)
logger._logEvent.ts = ts
logger._logEvent.messages = args.filter(function (arg) {
// bindings can only be objects, so reference equality check via indexOf is fine
return bindings.indexOf(arg) === -1
})
logger._logEvent.level.label = methodLevel
logger._logEvent.level.value = methodValue
send(methodLevel, logger._logEvent, val)
logger._logEvent = createLogEventShape(bindings)
}
function createLogEventShape (bindings) {
return {
ts: 0,
messages: [],
bindings: bindings || [],
level: { label: '', value: 0 }
}
}
function asErrValue (err) {
var obj = {
type: err.constructor.name,
msg: err.message,
stack: err.stack
}
for (var key in err) {
if (obj[key] === undefined) {
obj[key] = err[key]
}
}
return obj
}
function mock () { return {} }
function passthrough (a) { return a }
function noop () {}
/* eslint-disable */
/* istanbul ignore next */
function pfGlobalThisOrFallback () {
function defd (o) { return typeof o !== 'undefined' && o }
try {
if (typeof globalThis !== 'undefined') return globalThis
Object.defineProperty(Object.prototype, 'globalThis', {
get: function () {
delete Object.prototype.globalThis
return (this.globalThis = this)
},
configurable: true
})
return globalThis
} catch (e) {
return defd(self) || defd(window) || defd(this) || {}
}
}
/* eslint-enable */

node_modules/pino/docs/api.md generated vendored Normal file

@@ -0,0 +1,868 @@
# API
* [pino() => logger](#export)
* [options](#options)
* [destination](#destination)
* [destination\[Symbol.for('pino.metadata')\]](#metadata)
* [Logger Instance](#logger)
* [logger.trace()](#trace)
* [logger.debug()](#debug)
* [logger.info()](#info)
* [logger.warn()](#warn)
* [logger.error()](#error)
* [logger.fatal()](#fatal)
* [logger.child()](#child)
* [logger.bindings()](#bindings)
* [logger.flush()](#flush)
* [logger.level](#level)
* [logger.isLevelEnabled()](#islevelenabled)
* [logger.levels](#levels)
* [logger\[Symbol.for('pino.serializers')\]](#serializers)
* [Event: 'level-change'](#level-change)
* [logger.version](#version)
* [logger.LOG_VERSION](#log_version)
* [Statics](#statics)
* [pino.destination()](#pino-destination)
* [pino.extreme()](#pino-extreme)
* [pino.final()](#pino-final)
* [pino.stdSerializers](#pino-stdserializers)
* [pino.stdTimeFunctions](#pino-stdtimefunctions)
* [pino.symbols](#pino-symbols)
* [pino.version](#pino-version)
* [pino.LOG_VERSION](#pino-LOG_VERSION)
<a id="export"></a>
## `pino([options], [destination]) => logger`
The exported `pino` function takes two optional arguments,
[`options`](#options) and [`destination`](#destination), and
returns a [logger instance](#logger).
<a id=options></a>
### `options` (Object)
#### `name` (String)
Default: `undefined`
The name of the logger. When set adds a `name` field to every JSON line logged.
#### `level` (String)
Default: `'info'`
One of `'fatal'`, `'error'`, `'warn'`, `'info'`, `'debug'`, `'trace'` or `'silent'`.
Additional levels can be added to the instance via the `customLevels` option.
* See [`customLevels` option](#opt-customlevels)
<a id=opt-customlevels></a>
#### `customLevels` (Object)
Default: `undefined`
Use this option to define additional logging levels.
The keys of the object correspond to the namespace of the log level,
and the values should be the numerical value of the level.
```js
const logger = pino({
customLevels: {
foo: 35
}
})
logger.foo('hi')
```
<a id=opt-useOnlyCustomLevels></a>
#### `useOnlyCustomLevels` (Boolean)
Default: `false`
Use this option to only use defined `customLevels` and omit Pino's levels.
The logger's default `level` must be changed to a value in `customLevels` in order to use `useOnlyCustomLevels`.
Warning: this option may not be supported by downstream transports.
```js
const logger = pino({
customLevels: {
foo: 35
},
useOnlyCustomLevels: true,
level: 'foo'
})
logger.foo('hi')
logger.info('hello') // Will throw an error because `info` is not found on the logger object
```
#### `mixin` (Function):
Default: `undefined`
If provided, the `mixin` function is called each time one of the active
logging methods is called. The function must synchronously return an
object. The properties of the returned object will be added to the
logged JSON.
```js
let n = 0
const logger = pino({
mixin () {
return { line: ++n }
}
})
logger.info('hello')
// {"level":30,"time":1573664685466,"pid":78742,"hostname":"x","line":1,"msg":"hello","v":1}
logger.info('world')
// {"level":30,"time":1573664685469,"pid":78742,"hostname":"x","line":2,"msg":"world","v":1}
```
#### `redact` (Array | Object):
Default: `undefined`
As an array, the `redact` option specifies paths that should
have their values redacted from any log output.
Each path must be a string using a syntax which corresponds to JavaScript dot and bracket notation.
If an object is supplied, three options can be specified:
* `paths` (array): Required. An array of paths. See [redaction - Path Syntax ⇗](/docs/redaction.md#paths) for specifics.
* `censor` (String|Function|Undefined): Optional. When supplied as a String, the `censor` option overwrites keys which are to be redacted. When set to `undefined`, the key will be removed entirely from the object.
The `censor` option may also be a mapping function. The (synchronous) mapping function is called with the unredacted value, and the value it returns becomes the applied censor value. Default: `'[Redacted]'`
* `remove` (Boolean): Optional. Instead of censoring the value, remove both the key and the value. Default: `false`
**WARNING**: Never allow user input to define redacted paths.
* See the [redaction ⇗](/docs/redaction.md) documentation.
* See [fast-redact#caveat ⇗](http://github.com/davidmarkclements/fast-redact#caveat)
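To make the path semantics concrete, here is a plain-JavaScript sketch of what redacting a single dot-notation path amounts to. (`redactPath` is a hypothetical helper written for this illustration; pino itself delegates redaction to `fast-redact`, which also handles bracket notation and wildcards.)

```javascript
// Replace the value at a dot-notation path with a censor string, in place.
function redactPath (obj, path, censor = '[Redacted]') {
  const keys = path.split('.')
  let node = obj
  for (let i = 0; i < keys.length - 1; i++) {
    if (node === null || typeof node !== 'object') return obj
    node = node[keys[i]]
  }
  const last = keys[keys.length - 1]
  if (node !== null && typeof node === 'object' && last in node) {
    node[last] = censor
  }
  return obj
}

const line = { req: { headers: { authorization: 'Bearer s3cr3t' } } }
redactPath(line, 'req.headers.authorization')
console.log(JSON.stringify(line))
// → {"req":{"headers":{"authorization":"[Redacted]"}}}
```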
<a id=opt-serializers></a>
#### `serializers` (Object)
Default: `{err: pino.stdSerializers.err}`
An object containing functions for custom serialization of objects.
These functions should return a JSONifiable object and they
should never throw. When logging an object, each top-level property
matching the exact key of a serializer will be serialized using the defined serializer.
* See [pino.stdSerializers](#pino-stdserializers)
##### `serializers[Symbol.for('pino.*')]` (Function)
Default: `undefined`
The `serializers` object may contain a key which is the global symbol: `Symbol.for('pino.*')`.
This will act upon the complete log object rather than corresponding to a particular key.
#### `base` (Object)
Default: `{pid: process.pid, hostname: os.hostname}`
Key-value object added as child bindings to each log line.
Set to `null` to avoid adding `pid`, `hostname` and `name` properties to each log.
#### `enabled` (Boolean)
Default: `true`
Set to `false` to disable logging.
#### `crlf` (Boolean)
Default: `false`
Set to `true` to log newline-delimited JSON with `\r\n` instead of `\n`.
<a id=opt-timestamp></a>
#### `timestamp` (Boolean | Function)
Default: `true`
Enables or disables the inclusion of a timestamp in the
log message. If a function is supplied, it must synchronously return a JSON string
representation of the time, e.g. `,"time":1493426328206` (which is the default).
If set to `false`, no timestamp will be included in the output.
See [stdTimeFunctions](#pino-stdtimefunctions) for a set of available functions
for passing in as a value for this option.
**Caution**: attempting to format time in-process will significantly impact logging performance.
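For illustration, a custom timestamp function must synchronously return a partial JSON string that begins with a comma, so it can be spliced directly into the log line. A sketch (the `isoTime` name is ours, not part of pino's API):

```javascript
// A custom timestamp function in the shape pino expects: it returns a
// partial JSON fragment such as `,"time":"2020-03-26T14:37:35.000Z"`.
const isoTime = () => `,"time":"${new Date().toISOString()}"`

// Would be passed as: require('pino')({ timestamp: isoTime })
console.log(isoTime())
```

Per the caution above, formatting time in-process (as this ISO example does) carries a performance cost; the default epoch-milliseconds form avoids it.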
<a id=opt-messagekey></a>
#### `messageKey` (String)
Default: `'msg'`
The string key for the 'message' in the JSON object.
<a id=opt-nestedkey></a>
#### `nestedKey` (String)
Default: `null`
If there's a chance that objects being logged have properties that conflict with those from pino itself (`level`, `timestamp`, `v`, `pid`, etc.)
and duplicate keys in your log records are undesirable, pino can be configured with a `nestedKey` option that causes any `object`s that are logged
to be placed under a key whose name is the value of `nestedKey`.
This way, when searching something like Kibana for values, one can consistently search under the configured `nestedKey` value instead of the root log record keys.
For example,
```js
const logger = require('pino')({
nestedKey: 'payload'
})
const thing = { level: 'hi', time: 'never', foo: 'bar'} // has pino-conflicting properties!
logger.info(thing)
// logs the following:
// {"level":30,"time":1578357790020,"pid":91736,"hostname":"x","payload":{"level":"hi","time":"never","foo":"bar"},"v":1}
```
In this way, logged objects' properties don't conflict with pino's standard logging properties,
and searching for logged objects can start from a consistent path.
<a id=prettyPrint></a>
#### `prettyPrint` (Boolean | Object)
Default: `false`
Enables pretty printing of logs. This is intended for non-production
configurations. This may be set to a configuration object as outlined in the
[`pino-pretty` documentation](https://github.com/pinojs/pino-pretty).
The options object may additionally contain a `prettifier` property to define
which prettifier module to use. When not present, `prettifier` defaults to
`'pino-pretty'`. Regardless of the value, the specified prettifier module
must be installed as a separate dependency:
```sh
npm install pino-pretty
```
<a id="useLevelLabels"></a>
#### `useLevelLabels` (Boolean)
Default: `false`
Enables printing of level labels instead of level values in the printed logs.
Warning: this option may not be supported by downstream transports.
<a id="changeLevelName"></a>
#### `changeLevelName` (String) - DEPRECATED
Use `levelKey` instead. This will be removed in v7.
<a id="levelKey"></a>
#### `levelKey` (String)
Default: `'level'`
Changes the property `level` to any string value you pass in:
```js
const logger = pino({
levelKey: 'priority'
})
logger.info('hello world')
// {"priority":30,"time":1531257112193,"msg":"hello world","pid":55956,"hostname":"x","v":1}
```
#### `browser` (Object)
Browser only, may have `asObject` and `write` keys. This option is separately
documented in the [Browser API ⇗](/docs/browser.md) documentation.
* See [Browser API ⇗](/docs/browser.md)
<a id="destination"></a>
### `destination` (SonicBoom | WritableStream | String)
Default: `pino.destination(1)` (STDOUT)
The `destination` parameter must, at a minimum, be an object with a `write` method.
An ordinary Node.js `stream` can be passed as the destination (such as the result
of `fs.createWriteStream`) but for peak log writing performance it is strongly
recommended to use `pino.destination` or `pino.extreme` to create the destination stream.
```js
// pino.destination(1) by default
const stdoutLogger = require('pino')()
// destination param may be in first position when no options:
const fileLogger = require('pino')(pino.destination('/log/path'))
// use the stderr file handle to log to stderr:
const opts = { name: 'my-logger' }
const stderrLogger = require('pino')(opts, pino.destination(2))
// a string path is automatically wrapped in pino.destination:
const pathLogger = require('pino')('/log/path')
```
However, there are some special instances where `pino.destination` is not used as the default:
+ When something, e.g. a process manager, has monkey-patched `process.stdout.write`.
In these cases `process.stdout` is used instead.
* See [`pino.destination`](#pino-destination)
* See [`pino.extreme`](#pino-extreme)
<a id="metadata"></a>
#### `destination[Symbol.for('pino.metadata')]`
Default: `false`
Using the global symbol `Symbol.for('pino.metadata')` as a key on the `destination`
parameter and setting it to `true` indicates that the following properties should be
set on the `destination` object after each log line is written:
* the last logging level as `destination.lastLevel`
* the last logging message as `destination.lastMsg`
* the last logging object as `destination.lastObj`
* the last time as `destination.lastTime`, which will be the partial string returned
by the time function.
* the last logger instance as `destination.lastLogger` (to support child
loggers)
For a full reference for using `Symbol.for('pino.metadata')`, see the [`pino-multi-stream` ⇗](https://github.com/pinojs/pino-multi-stream)
module.
The following is a succinct usage example:
```js
const dest = pino.destination('/dev/null')
dest[Symbol.for('pino.metadata')] = true
const logger = pino(dest)
logger.info({a: 1}, 'hi')
const { lastMsg, lastLevel, lastObj, lastTime} = dest
console.log(
'Logged message "%s" at level %d with object %o at time %s',
lastMsg, lastLevel, lastObj, lastTime
) // Logged message "hi" at level 30 with object { a: 1 } at time 1531590545089
```
* See [`pino-multi-stream` ⇗](https://github.com/pinojs/pino-multi-stream)
<a id="logger"></a>
## Logger Instance
The logger instance is the object returned by the main exported
[`pino`](#export) function.
The primary purpose of the logger instance is to provide logging methods.
The default logging methods are `trace`, `debug`, `info`, `warn`, `error`, and `fatal`.
Each logging method has the following signature:
`([mergingObject], [message], [...interpolationValues])`.
The parameters are explained below using the `logger.info` method but the same applies to all logging methods.
### Logging Method Parameters
<a id=mergingobject></a>
#### `mergingObject` (Object)
An object can optionally be supplied as the first parameter. Each enumerable key and value
of the `mergingObject` is copied in to the JSON log line.
```js
logger.info({MIX: {IN: true}})
// {"level":30,"time":1531254555820,"pid":55956,"hostname":"x","MIX":{"IN":true},"v":1}
```
<a id=message></a>
#### `message` (String)
A `message` string can optionally be supplied as the first parameter, or
as the second parameter after supplying a `mergingObject`.
By default, the contents of the `message` parameter will be merged into the
JSON log line under the `msg` key:
```js
logger.info('hello world')
// {"level":30,"time":1531257112193,"msg":"hello world","pid":55956,"hostname":"x","v":1}
```
The `message` parameter takes precedence over the `mergingObject`.
That is, if a `mergingObject` contains a `msg` property, and a `message` parameter
is supplied in addition, the `msg` property in the output log will be the value of
the `message` parameter, not the value of the `msg` property on the `mergingObject`.
The `messageKey` option can be used at instantiation time to change the namespace
from `msg` to another string as preferred.
The `message` string may contain a printf style string with support for
the following placeholders:
* `%s` string placeholder
* `%d` digit placeholder
* `%O`, `%o` and `%j` object placeholder
Values supplied as additional arguments to the logger method will
then be interpolated accordingly.
* See [`messageKey` pino option](#opt-messagekey)
* See [`...interpolationValues` log method parameter](#interpolationvalues)
<a id=interpolationvalues></a>
#### `...interpolationValues` (Any)
All arguments supplied after `message` are serialized and interpolated according
to any supplied printf-style placeholders (`%s`, `%d`, `%o`|`%O`|`%j`)
or else concatenated together with the `message` string to form the final
output `msg` value for the JSON log line.
```js
logger.info('hello', 'world')
// {"level":30,"time":1531257618044,"msg":"hello world","pid":55956,"hostname":"x","v":1}
```
```js
logger.info('hello', {worldly: 1})
// {"level":30,"time":1531257797727,"msg":"hello {\"worldly\":1}","pid":55956,"hostname":"x","v":1}
```
```js
logger.info('%o hello', {worldly: 1})
// {"level":30,"time":1531257826880,"msg":"{\"worldly\":1} hello","pid":55956,"hostname":"x","v":1}
```
* See [`message` log method parameter](#message)
<a id="trace"></a>
### `logger.trace([mergingObject], [message], [...interpolationValues])`
Write a `'trace'` level log, if the configured [`level`](#level) allows for it.
* See [`mergingObject` log method parameter](#mergingobject)
* See [`message` log method parameter](#message)
* See [`...interpolationValues` log method parameter](#interpolationvalues)
<a id="debug"></a>
### `logger.debug([mergingObject], [message], [...interpolationValues])`
Write a `'debug'` level log, if the configured `level` allows for it.
* See [`mergingObject` log method parameter](#mergingobject)
* See [`message` log method parameter](#message)
* See [`...interpolationValues` log method parameter](#interpolationvalues)
<a id="info"></a>
### `logger.info([mergingObject], [message], [...interpolationValues])`
Write an `'info'` level log, if the configured `level` allows for it.
* See [`mergingObject` log method parameter](#mergingobject)
* See [`message` log method parameter](#message)
* See [`...interpolationValues` log method parameter](#interpolationvalues)
<a id="warn"></a>
### `logger.warn([mergingObject], [message], [...interpolationValues])`
Write a `'warn'` level log, if the configured `level` allows for it.
* See [`mergingObject` log method parameter](#mergingobject)
* See [`message` log method parameter](#message)
* See [`...interpolationValues` log method parameter](#interpolationvalues)
<a id="error"></a>
### `logger.error([mergingObject], [message], [...interpolationValues])`
Write an `'error'` level log, if the configured `level` allows for it.
* See [`mergingObject` log method parameter](#mergingobject)
* See [`message` log method parameter](#message)
* See [`...interpolationValues` log method parameter](#interpolationvalues)
<a id="fatal"></a>
### `logger.fatal([mergingObject], [message], [...interpolationValues])`
Write a `'fatal'` level log, if the configured `level` allows for it.
Since `'fatal'` level messages are intended to be logged just prior to the process
exiting, the `fatal` method will always synchronously flush the destination.
It's therefore important not to misuse `fatal`: it incurs a performance
overhead if used for anything other than writing final log messages before
the process crashes or exits.
* See [`mergingObject` log method parameter](#mergingobject)
* See [`message` log method parameter](#message)
* See [`...interpolationValues` log method parameter](#interpolationvalues)
<a id="child"></a>
### `logger.child(bindings) => logger`
The `logger.child` method allows for the creation of stateful loggers,
where key-value pairs can be pinned to a logger causing them to be output
on every log line.
Child loggers use the same output stream as the parent and inherit
the current log level of the parent at the time they are spawned.
The log level of a child is mutable. It can be set independently
of the parent either by setting the [`level`](#level) accessor after creating
the child logger or using the reserved [`bindings.level`](#bindingslevel-string) key.
#### `bindings` (Object)
An object of key-value pairs to include in every log line output
via the returned child logger.
```js
const child = logger.child({ MIX: {IN: 'always'} })
child.info('hello')
// {"level":30,"time":1531258616689,"msg":"hello","pid":64849,"hostname":"x","MIX":{"IN":"always"},"v":1}
child.info('child!')
// {"level":30,"time":1531258617401,"msg":"child!","pid":64849,"hostname":"x","MIX":{"IN":"always"},"v":1}
```
The `bindings` object may contain any key except for reserved configuration keys `level` and `serializers`.
##### `bindings.level` (String)
If a `level` property is present in the `bindings` object passed to `logger.child`
it will override the child logger level.
```js
const logger = pino()
logger.debug('nope') // will not log, since default level is info
const child = logger.child({foo: 'bar', level: 'debug'})
child.debug('debug!') // will log as the `level` property set the level to debug
```
##### `bindings.serializers` (Object)
Child loggers inherit the [serializers](#opt-serializers) from the parent logger.
Setting the `serializers` key of the `bindings` object will override
any configured parent serializers.
```js
const logger = require('pino')()
logger.info({test: 'will appear'})
// {"level":30,"time":1531259759482,"pid":67930,"hostname":"x","test":"will appear","v":1}
const child = logger.child({serializers: {test: () => `child-only serializer`}})
child.info({test: 'will be overwritten'})
// {"level":30,"time":1531259784008,"pid":67930,"hostname":"x","test":"child-only serializer","v":1}
```
* See [`serializers` option](#opt-serializers)
* See [pino.stdSerializers](#pino-stdSerializers)
<a id="bindings"></a>
### `logger.bindings()`
Returns an object containing all the current bindings, cloned from the ones passed in via `logger.child()`.
```js
const child = logger.child({ foo: 'bar' })
console.log(child.bindings())
// { foo: 'bar' }
const anotherChild = child.child({ MIX: { IN: 'always' } })
console.log(anotherChild.bindings())
// { foo: 'bar', MIX: { IN: 'always' } }
```
<a id="flush"></a>
### `logger.flush()`
Flushes the content of the buffer when using a `pino.extreme` destination.
This is an asynchronous, fire and forget, operation.
The use case is primarily for Extreme mode logging, which may hold up to
4KiB of logs. The `logger.flush` method can be used to flush the logs
on a long interval, say ten seconds. Such a strategy can provide an
optimum balance between extremely efficient logging at high demand periods
and safer logging at low demand periods.
* See [`pino.extreme`](#pino-extreme)
* See [`destination` parameter](#destination)
* See [Extreme mode ⇗](/docs/extreme.md)
<a id="level"></a>
### `logger.level` (String) [Getter/Setter]
Set this property to the desired logging level.
The core levels and their values are as follows:
| | | | | | | | |
|:-----------|-------|-------|------|------|-------|-------|---------:|
| **Level:** | trace | debug | info | warn | error | fatal | silent |
| **Value:** | 10 | 20 | 30 | 40 | 50 | 60 | Infinity |
The logging level is a *minimum* level based on the associated value of that level.
For instance, if `logger.level` is `info` *(30)*, then the `info` *(30)*, `warn` *(40)*, `error` *(50)* and `fatal` *(60)* log methods will be enabled, while the `trace` *(10)* and `debug` *(20)* methods, being less than 30, will not.
The `silent` logging level is a specialized level that disables all logging;
there is no `silent` log method.
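As a rough sketch of the threshold semantics described above (using the documented level values, not Pino's actual implementation):

```javascript
// Minimal sketch of minimum-level gating, based on the table above.
const values = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60, silent: Infinity }

// A log method is enabled when its value meets or exceeds the current level's value.
function isEnabled (currentLevel, methodLevel) {
  return values[methodLevel] >= values[currentLevel]
}

console.log(isEnabled('info', 'warn'))    // true
console.log(isEnabled('info', 'debug'))   // false
console.log(isEnabled('silent', 'fatal')) // false
```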
<a id="islevelenabled"></a>
### `logger.isLevelEnabled(level)`
A utility method for determining if a given log level will write to the destination.
#### `level` (String)
The given level to check against:
```js
if (logger.isLevelEnabled('debug')) logger.debug('conditional log')
```
<a id="levelVal"></a>
### `logger.levelVal` (Number)
Supplies the integer value for the current logging level.
```js
if (logger.levelVal === 30) {
console.log('logger level is `info`')
}
```
<a id="levels"></a>
### `logger.levels` (Object)
Levels are mapped to values to determine the minimum threshold that a
logging method should be enabled at (see [`logger.level`](#level)).
The `logger.levels` property holds the mappings between levels and values,
and vice versa.
```sh
$ node -p "require('pino')().levels"
```
```js
{ labels:
{ '10': 'trace',
'20': 'debug',
'30': 'info',
'40': 'warn',
'50': 'error',
'60': 'fatal' },
values:
{ fatal: 60, error: 50, warn: 40, info: 30, debug: 20, trace: 10 } }
```
* See [`logger.level`](#level)
<a id="serializers"></a>
### logger\[Symbol.for('pino.serializers')\]
Returns the serializers as applied to the current logger instance. If a child logger did not
register its own serializers upon instantiation, the serializers of the parent will be returned.
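The key is built with `Symbol.for`, which uses the process-wide symbol registry; a short sketch of why that matters for integrators:

```javascript
// Symbol.for returns the same symbol for the same key across modules,
// so third-party code can construct pino's key without importing pino.
const key = Symbol.for('pino.serializers')
console.log(key === Symbol.for('pino.serializers')) // true
```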
<a id="level-change"></a>
### Event: 'level-change'
The logger instance is also an [`EventEmitter ⇗`](https://nodejs.org/dist/latest/docs/api/events.html#events_class_eventemitter).
A listener function can be attached to a logger via the `level-change` event.
The listener is passed four arguments:
* `levelLabel`: the new level string, e.g. `trace`
* `levelValue`: the new level number, e.g. `10`
* `previousLevelLabel`: the prior level string, e.g. `info`
* `previousLevelValue`: the prior level number, e.g. `30`
```js
const logger = require('pino')()
logger.on('level-change', (lvl, val, prevLvl, prevVal) => {
  console.log('%s (%d) was changed to %s (%d)', prevLvl, prevVal, lvl, val)
})
logger.level = 'trace' // trigger event
```
<a id="version"></a>
### `logger.version` (String)
Exposes the Pino package version. Also available on the exported `pino` function.
* See [`pino.version`](#pino-version)
<a id="log_version"></a>
### `logger.LOG_VERSION` (Number)
Holds the current log format version as output in the `v` property of each log record.
Also available on the exported `pino` function.
* See [`pino.LOG_VERSION`](#pino-LOG_VERSION)
## Statics
<a id="pino-destination"></a>
### `pino.destination([target]) => SonicBoom`
Create a Pino Destination instance: a stream-like object with
significantly more throughput (over 30%) than a standard Node.js stream.
```js
const pino = require('pino')
const logger = pino(pino.destination('./my-file'))
const logger2 = pino(pino.destination())
```
The `pino.destination` method may be passed a file path or a numerical file descriptor.
By default, `pino.destination` will use `process.stdout.fd` (1) as the file descriptor.
`pino.destination` is implemented on [`sonic-boom` ⇗](https://github.com/mcollina/sonic-boom).
A `pino.destination` instance can also be used to reopen closed files
(for example, for some log rotation scenarios), see [Reopening log files](/docs/help.md#reopening).
* See [`destination` parameter](#destination)
* See [`sonic-boom` ⇗](https://github.com/mcollina/sonic-boom)
* See [Reopening log files](/docs/help.md#reopening)
<a id="pino-extreme"></a>
### `pino.extreme([target]) => SonicBoom`
Create an extreme mode destination. This yields an additional 60% performance boost.
There are trade-offs that should be understood before usage.
```js
const pino = require('pino')
const logger = pino(pino.extreme('./my-file'))
const logger2 = pino(pino.extreme())
```
The `pino.extreme` method may be passed a file path or a numerical file descriptor.
By default, `pino.extreme` will use `process.stdout.fd` (1) as the file descriptor.
`pino.extreme` is implemented with the [`sonic-boom` ⇗](https://github.com/mcollina/sonic-boom)
module.
A `pino.extreme` instance can also be used to reopen closed files
(for example, for some log rotation scenarios), see [Reopening log files](/docs/help.md#reopening).
On AWS Lambda we recommend calling `extreme.flushSync()` at the end
of each function execution to avoid losing data.
* See [`destination` parameter](#destination)
* See [`sonic-boom` ⇗](https://github.com/mcollina/sonic-boom)
* See [Extreme mode ⇗](/docs/extreme.md)
* See [Reopening log files](/docs/help.md#reopening)
<a id="pino-final"></a>
### `pino.final(logger, [handler]) => Function | FinalLogger`
The `pino.final` method can be used to acquire a final logger instance
or create an exit listener function.
The `finalLogger` is a specialist logger that synchronously flushes
on every write. This is important to guarantee final log writes
when using a `pino.extreme` destination.
Since final log writes cannot be guaranteed with normal Node.js streams,
`pino.final` will throw if the `destination` parameter of the `logger`
supplied to it is a Node.js stream.
There is no need to use `pino.final` with `pino.destination`, as
`pino.destination` writes synchronously.
#### `pino.final(logger, handler) => Function`
In this case the `pino.final` method supplies an exit listener function that can be
supplied to process exit events such as `exit`, `uncaughtException`,
`SIGHUP` and so on.
The exit listener function will call the supplied `handler` function
with an error object (or else `null`), a `finalLogger` instance followed
by any additional arguments the `handler` may be called with.
```js
process.on('uncaughtException', pino.final(logger, (err, finalLogger) => {
finalLogger.error(err, 'uncaughtException')
process.exit(1)
}))
```
#### `pino.final(logger) => FinalLogger`
In this case the `pino.final` method returns a finalLogger instance.
```js
const finalLogger = pino.final(logger)
finalLogger.info('exiting...')
```
* See [`destination` parameter](#destination)
* See [Exit logging help](/docs/help.md#exit-logging)
* See [Extreme mode ⇗](/docs/extreme.md)
* See [Log loss prevention ⇗](/docs/extreme.md#log-loss-prevention)
<a id="pino-stdserializers"></a>
### `pino.stdSerializers` (Object)
The `pino.stdSerializers` object provides functions for serializing objects common to many projects. The standard serializers are directly imported from [pino-std-serializers](https://github.com/pinojs/pino-std-serializers).
* See [pino-std-serializers ⇗](https://github.com/pinojs/pino-std-serializers)
<a id="pino-stdtimefunctions"></a>
### `pino.stdTimeFunctions` (Object)
The [`timestamp`](#opt-timestamp) option can accept a function which determines the
`timestamp` value in a log line.
The `pino.stdTimeFunctions` object provides a very small set of common functions for generating the
`timestamp` property. These consist of the following
* `pino.stdTimeFunctions.epochTime`: Milliseconds since Unix epoch (Default)
* `pino.stdTimeFunctions.unixTime`: Seconds since Unix epoch
* `pino.stdTimeFunctions.nullTime`: Clears timestamp property (Used when `timestamp: false`)
* `pino.stdTimeFunctions.isoTime`: ISO 8601-formatted time in UTC
* See [`timestamp` option](#opt-timestamp)
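As a sketch of the shape such a function returns (this mirrors what `pino.stdTimeFunctions.epochTime` produces, but is not Pino's actual source): the return value is a partial JSON string that gets spliced into the log line.

```javascript
// Hedged sketch: a custom timestamp function returns a string fragment
// beginning with a comma, which pino appends into the log line.
const epochTime = () => `,"time":${Date.now()}`
const unixTime = () => `,"time":${Math.round(Date.now() / 1000)}`

console.log(epochTime()) // e.g. ,"time":1585233455123
```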
<a id="pino-symbols"></a>
### `pino.symbols` (Object)
For integration purposes with ecosystem and third party libraries `pino.symbols`
exposes the symbols used to hold non-public state and methods on the logger instance.
Access to the symbols allows logger state to be adjusted, and methods to be overridden or
proxied for performant integration where necessary.
The `pino.symbols` object is intended for library implementers and shouldn't be utilized
for general use.
<a id="pino-version"></a>
### `pino.version` (String)
Exposes the Pino package version. Also available on the logger instance.
* See [`logger.version`](#version)
<a id="pino-log_version"></a>
### `pino.LOG_VERSION` (Number)
Holds the current log format version as output in the `v` property of each log record. Also available on the logger instance.
* See [`logger.LOG_VERSION`](#log_version)

**`node_modules/pino/docs/benchmarks.md`**
# Benchmarks
`pino.info('hello world')`:
```
BASIC benchmark averages
Bunyan average: 549.042ms
Winston average: 467.873ms
Bole average: 201.529ms
Debug average: 253.724ms
LogLevel average: 282.653ms
Pino average: 188.956ms
PinoExtreme average: 108.809ms
```
`pino.info({'hello': 'world'})`:
```
OBJECT benchmark averages
BunyanObj average: 564.363ms
WinstonObj average: 464.824ms
BoleObj average: 230.220ms
LogLevelObject average: 474.857ms
PinoObj average: 201.442ms
PinoUnsafeObj average: 202.687ms
PinoExtremeObj average: 108.689ms
PinoUnsafeExtremeObj average: 106.718ms
```
`pino.info(aBigDeeplyNestedObject)`:
```
DEEPOBJECT benchmark averages
BunyanDeepObj average: 5293.279ms
WinstonDeepObj average: 9020.292ms
BoleDeepObj average: 9169.043ms
LogLevelDeepObj average: 15260.917ms
PinoDeepObj average: 8467.807ms
PinoUnsafeDeepObj average: 6159.227ms
PinoExtremeDeepObj average: 8354.557ms
PinoUnsafeExtremeDeepObj average: 6214.073ms
```
`pino.info('hello %s %j %d', 'world', {obj: true}, 4, {another: 'obj'})`:
```
BunyanInterpolateExtra average: 778.408ms
WinstonInterpolateExtra average: 627.956ms
BoleInterpolateExtra average: 429.757ms
PinoInterpolateExtra average: 316.043ms
PinoUnsafeInterpolateExtra average: 316.809ms
PinoExtremeInterpolateExtra average: 218.468ms
PinoUnsafeExtremeInterpolateExtra average: 215.040ms
```
For a fair comparison, [LogLevel](http://npm.im/loglevel) was extended
to include a timestamp and [bole](http://npm.im/bole) had
`fastTime` mode switched on.

**`node_modules/pino/docs/browser.md`**
# Browser API
Pino is compatible with [`browserify`](http://npm.im/browserify) for browser side usage:
This can be useful with isomorphic/universal JavaScript code.
By default, in the browser,
`pino` uses the corresponding [Log4j](https://en.wikipedia.org/wiki/Log4j)-style `console` methods (`console.error`, `console.warn`, `console.info`, `console.debug`, `console.trace`) and uses `console.error` for any `fatal` level logs.
## Options
Pino can be passed a `browser` object in the options object,
which can have the following properties:
### `asObject` (Boolean)
```js
const pino = require('pino')({browser: {asObject: true}})
```
The `asObject` option will create a pino-like log object instead of
passing all arguments to a console method, for instance:
```js
pino.info('hi') // creates and logs {msg: 'hi', level: 30, time: <ts>}
```
When `write` is set, `asObject` will always be `true`.
### `write` (Function | Object)
Instead of passing log messages to `console.log` they can be passed to
a supplied function.
If `write` is set to a single function, all logging objects are passed
to this function.
```js
const pino = require('pino')({
browser: {
write: (o) => {
// do something with o
}
}
})
```
If `write` is an object, it can have methods that correspond to the
levels. When a message is logged at a given level, the corresponding
method is called. If a method isn't present, the logging falls back
to using the `console`.
```js
const pino = require('pino')({
browser: {
write: {
info: function (o) {
//process info log object
},
error: function (o) {
//process error log object
}
}
}
})
```
### `serialize`: (Boolean | Array)
The serializers provided to `pino` are ignored by default in the browser, including
the standard serializers provided with Pino. Since the default destination for log
messages is the console, values such as `Error` objects are enhanced for inspection,
which they otherwise wouldn't be if the Error serializer was enabled.
We can turn all serializers on,
```js
const pino = require('pino')({
browser: {
serialize: true
}
})
```
Or we can selectively enable them via an array:
```js
const pino = require('pino')({
serializers: {
custom: myCustomSerializer,
another: anotherSerializer
},
browser: {
serialize: ['custom']
}
})
// following will apply myCustomSerializer to the custom property,
// but will not apply anotherSerializer to another key
pino.info({custom: 'a', another: 'b'})
```
When `serialize` is `true` the standard error serializer is also enabled (see https://github.com/pinojs/pino/blob/master/docs/api.md#stdSerializers).
This is a global serializer which will apply to any `Error` objects passed to the logger methods.
If `serialize` is an array, the standard error serializer is also automatically enabled. It can
be explicitly disabled by including the string `!stdSerializers.err` in the serialize array, like so:
```js
const pino = require('pino')({
serializers: {
custom: myCustomSerializer,
another: anotherSerializer
},
browser: {
serialize: ['!stdSerializers.err', 'custom'] //will not serialize Errors, will serialize `custom` keys
}
})
```
The `serialize` array also applies to any child logger serializers (see https://github.com/pinojs/pino/blob/master/docs/api.md#discussion-2
for how to set child-bound serializers).
Unlike server-side Pino, the serializers apply to every object passed to the logger method.
If the `asObject` option is `true`, the serializers apply only to the
first object (as in server-side Pino).
For more info on serializers see https://github.com/pinojs/pino/blob/master/docs/api.md#parameters.
### `transmit` (Object)
An object with `send` and `level` properties.
The `transmit.level` property specifies the minimum level (inclusive) at which the `send` function
should be called. If not supplied, the `send` function will be called based on the main logging `level`
(set via `options.level`, defaulting to `info`).
The `transmit` object must have a `send` function which will be called after
writing the log message. The `send` function is passed the level of the log
message and a `logEvent` object.
The `logEvent` object is a data structure representing a log message, it represents
the arguments passed to a logger statement, the level
at which they were logged and the hierarchy of child bindings.
The `logEvent` format is structured like so:
```js
{
  ts: Number,
  messages: Array,
  bindings: Array,
  level: { label: String, value: Number }
}
```
The `ts` property is a unix epoch timestamp in milliseconds, the time is taken from the moment the
logger method is called.
The `messages` array contains all arguments passed to the logger method (for instance `logger.info('a', 'b', 'c')`
would result in a `messages` array of `['a', 'b', 'c']`).
The `bindings` array represents each child logger (if any), and the relevant bindings.
For instance given `logger.child({a: 1}).child({b: 2}).info({c: 3})`, the bindings array
would hold `[{a: 1}, {b: 2}]` and the `messages` array would be `[{c: 3}]`. The `bindings`
are ordered according to their position in the child logger hierarchy, with the lowest index
being the top of the hierarchy.
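For example, the `logEvent` for `logger.child({a: 1}).child({b: 2}).info({c: 3})` would be shaped like this (the `ts` value is illustrative):

```javascript
// logEvent shape per the description above; ts is an arbitrary example value.
const logEvent = {
  ts: 1531258616689,
  messages: [{ c: 3 }],           // arguments passed to the logger method
  bindings: [{ a: 1 }, { b: 2 }], // lowest index = top of the child hierarchy
  level: { label: 'info', value: 30 }
}

console.log(logEvent.bindings[0]) // { a: 1 }
```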
By default serializers are not applied to log output in the browser, but they will *always* be
applied to `messages` and `bindings` in the `logEvent` object. This allows us to ensure a consistent
format for all values between server and client.
The `level` holds the label (for instance `info`), and the corresponding numerical value
(for instance `30`). This could be important in cases where client side level values and
labels differ from server side.
The point of the `send` function is to remotely record log messages:
```js
const pino = require('pino')({
browser: {
transmit: {
level: 'warn',
send: function (level, logEvent) {
if (level === 'warn') {
// maybe send the logEvent to a separate endpoint
// or maybe analyse the messages further before sending
}
// we could also use the `logEvent.level.value` property to determine
// numerical value
if (logEvent.level.value >= 50) { // covers error and fatal
// send the logEvent somewhere
}
}
}
}
})
```

**`node_modules/pino/docs/child-loggers.md`**
# Child loggers
Let's assume we want to have `"module":"foo"` added to every log within a
module `foo.js`.
To accomplish this, simply use a child logger:
```js
'use strict'
// imports a pino logger instance of `require('pino')()`
const parentLogger = require('./lib/logger')
const log = parentLogger.child({module: 'foo'})
function doSomething () {
log.info('doSomething invoked')
}
module.exports = {
doSomething
}
```
## Cost of child logging
Child logger creation is fast:
```
benchBunyanCreation*10000: 564.514ms
benchBoleCreation*10000: 283.276ms
benchPinoCreation*10000: 258.745ms
benchPinoExtremeCreation*10000: 150.506ms
```
Logging through a child logger has little performance penalty:
```
benchBunyanChild*10000: 556.275ms
benchBoleChild*10000: 288.124ms
benchPinoChild*10000: 231.695ms
benchPinoExtremeChild*10000: 122.117ms
```
Logging via the child logger of a child logger also has negligible overhead:
```
benchBunyanChildChild*10000: 559.082ms
benchPinoChildChild*10000: 229.264ms
benchPinoExtremeChildChild*10000: 127.753ms
```
## Duplicate keys caveat
It's possible for naming conflicts to arise between child loggers and
children of child loggers.
This isn't as bad as it sounds: even if the same keys are used between
parent and child loggers, Pino resolves the conflict in the sanest way.
For example, consider the following:
```js
const pino = require('pino')
pino(pino.destination('./my-log'))
.child({a: 'property'})
.child({a: 'prop'})
.info('howdy')
```
```sh
$ cat my-log
{"pid":95469,"hostname":"MacBook-Pro-3.home","level":30,"msg":"howdy","time":1459534114473,"a":"property","a":"prop","v":1}
```
Notice how there are two keys named `a` in the JSON output. The sub-child's properties
appear after the parent child's properties.
At some point the logs will most likely be processed (for instance with a [transport](transports.md)),
and this generally involves parsing. `JSON.parse` will return an object where the conflicting
namespace holds the final value assigned to it:
```sh
$ cat my-log | node -e "process.stdin.once('data', (line) => console.log(JSON.stringify(JSON.parse(line))))"
{"pid":95469,"hostname":"MacBook-Pro-3.home","level":30,"msg":"howdy","time":"2016-04-01T18:08:34.473Z","a":"prop","v":1}
```
Ultimately the conflict is resolved by taking the last value, which aligns with Bunyan's child logging
behavior.
There may be cases where this edge case becomes problematic if a JSON parser with alternative behavior
is used to process the logs. It's recommended to be conscious of namespace conflicts with child loggers,
in light of an expected log processing approach.
One of Pino's performance tricks is to avoid building objects and stringifying
them, so we're building strings instead. This is why duplicate keys between
parents and children will end up in log output.
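The parsing behavior described above can be checked in isolation, since `JSON.parse` keeps the last value for a repeated key:

```javascript
// JSON.parse resolves duplicate keys by keeping the last assigned value.
const line = '{"a":"property","a":"prop"}'
console.log(JSON.parse(line).a) // prop
```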

**`node_modules/pino/docs/ecosystem.md`**
# Pino Ecosystem
This is a list of ecosystem modules that integrate with `pino`.
Modules listed under [Core](#core) are maintained by the Pino team. Modules
listed under [Community](#community) are maintained by independent community
members.
Please send a PR to add new modules!
<a id="core"></a>
## Core
+ [`express-pino-logger`](https://github.com/pinojs/express-pino-logger): use
Pino to log requests within [express](https://expressjs.com/).
+ [`koa-pino-logger`](https://github.com/pinojs/koa-pino-logger): use Pino to
log requests within [Koa](http://koajs.com/).
+ [`pino-arborsculpture`](https://github.com/pinojs/pino-arborsculpture): change
log levels at runtime.
+ [`pino-caller`](https://github.com/pinojs/pino-caller): add callsite to the log line.
+ [`pino-clf`](https://github.com/pinojs/pino-clf): reformat Pino logs into
Common Log Format.
+ [`pino-debug`](https://github.com/pinojs/pino-debug): use Pino to interpret
[`debug`](https://npm.im/debug) logs.
+ [`pino-elasticsearch`](https://github.com/pinojs/pino-elasticsearch): send
Pino logs to an Elasticsearch instance.
+ [`pino-eventhub`](https://github.com/pinojs/pino-eventhub): send Pino logs
to an [Event Hub](https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-what-is-event-hubs).
+ [`pino-filter`](https://github.com/pinojs/pino-filter): filter Pino logs in
the same fashion as the [`debug`](https://npm.im/debug) module.
+ [`pino-gelf`](https://github.com/pinojs/pino-gelf): reformat Pino logs into
GELF format for Graylog.
+ [`pino-hapi`](https://github.com/pinojs/hapi-pino): use Pino as the logger
for [Hapi](https://hapijs.com/).
+ [`pino-http`](https://github.com/pinojs/pino-http): easily use Pino to log
requests with the core `http` module.
+ [`pino-http-print`](https://github.com/pinojs/pino-http-print): reformat Pino
logs into traditional [HTTPD](https://httpd.apache.org/) style request logs.
+ [`pino-multi-stream`](https://github.com/pinojs/pino-multi-stream): send
logs to multiple destination streams (slow!).
+ [`pino-mongodb`](https://github.com/pinojs/pino-mongodb): store Pino logs
in a MongoDB database.
+ [`pino-noir`](https://github.com/pinojs/pino-noir): redact sensitive information
in logs.
+ [`pino-pretty`](https://github.com/pinojs/pino-pretty): basic prettifier to
make log lines human readable.
+ [`pino-socket`](https://github.com/pinojs/pino-socket): send logs to TCP or UDP
destinations.
+ [`pino-std-serializers`](https://github.com/pinojs/pino-std-serializers): the
core object serializers used within Pino.
+ [`pino-syslog`](https://github.com/pinojs/pino-syslog): reformat Pino logs
to standard syslog format.
+ [`pino-tee`](https://github.com/pinojs/pino-tee): pipe Pino logs into files
based upon log levels.
+ [`pino-toke`](https://github.com/pinojs/pino-toke): reformat Pino logs
according to a given format string.
+ [`restify-pino-logger`](https://github.com/pinojs/restify-pino-logger): use
Pino to log requests within [restify](http://restify.com/).
+ [`rill-pino-logger`](https://github.com/pinojs/rill-pino-logger): use Pino as
the logger for the [Rill framework](https://rill.site/).
<a id="community"></a>
## Community
+ [`pino-colada`](https://github.com/lrlna/pino-colada): cute ndjson formatter for pino.
+ [`pino-fluentd`](https://github.com/davidedantonio/pino-fluentd): send Pino logs to Elasticsearch,
MongoDB and many [others](https://www.fluentd.org/dataoutputs) via Fluentd.
+ [`pino-pretty-min`](https://github.com/unjello/pino-pretty-min): a minimal
prettifier inspired by the [logrus](https://github.com/sirupsen/logrus) logger.
+ [`pino-rotating-file`](https://github.com/homeaway/pino-rotating-file): a hapi-pino log transport for splitting logs into separate, automatically rotating files.
+ [`cls-proxify`](https://github.com/keenondrums/cls-proxify): integration of pino and [CLS](https://github.com/jeff-lewis/cls-hooked). Useful for creating dynamically configured child loggers (e.g. with added trace ID) for each request.

**`node_modules/pino/docs/extreme.md`**
# Extreme Mode
In essence, extreme mode enables even faster performance by Pino.
In Pino's standard mode of operation log messages are directly written to the
output stream as the messages are generated. Extreme mode works by buffering
log messages and writing them in larger chunks.
## Caveats
This has a couple of important caveats:
* 4KB of spare RAM will be needed for logging
* As opposed to the default mode, there is not a one-to-one relationship between
calls to logging methods (e.g. `logger.info`) and writes to a log file
* There is a possibility of the most recently buffered log messages being lost
(up to 4KB of logs)
* For instance, a power cut will mean up to 4KB of buffered logs will be lost
So in summary, only use extreme mode when performing an extreme amount of
logging and it is acceptable to potentially lose the most recent logs.
* Pino will register handlers for the following process events/signals so that
Pino can flush the extreme mode buffer:
+ `beforeExit`
+ `exit`
+ `uncaughtException`
+ `SIGHUP`
+ `SIGINT`
+ `SIGQUIT`
+ `SIGTERM`
In all of these cases, except `SIGHUP`, the process is in a state that it
*must* terminate. Thus, if an `onTerminated` function isn't registered when
constructing a Pino instance (see [pino#constructor](api.md#constructor)),
then Pino will invoke `process.exit(0)` when no error has occurred, or
`process.exit(1)` otherwise. If an `onTerminated` function is supplied, it
is the responsibility of the `onTerminated` function to manually exit the process.
In the case of `SIGHUP`, we will look to see if any other handlers are
registered for the event. If not, we will proceed as we do with all other
signals. If there are more handlers registered than just our own, we will
simply flush the extreme mode buffer.
## Usage
The `pino.extreme()` method will provide an Extreme Mode destination.
```js
const pino = require('pino')
const dest = pino.extreme() // logs to stdout with no args
const logger = pino(dest)
```
<a id='log-loss-prevention'></a>
## Log loss prevention
The following strategy can be used to minimize log loss:
```js
const pino = require('pino')
const dest = pino.extreme() // no arguments
const logger = pino(dest)
// asynchronously flush every 10 seconds to keep the buffer empty
// in periods of low activity
setInterval(function () {
logger.flush()
}, 10000).unref()
// use pino.final to create a special logger that
// guarantees final tick writes
const handler = pino.final(logger, (err, finalLogger, evt) => {
finalLogger.info(`${evt} caught`)
if (err) finalLogger.error(err, 'error caused exit')
process.exit(err ? 1 : 0)
})
// catch all the ways node might exit
process.on('beforeExit', () => handler(null, 'beforeExit'))
process.on('exit', () => handler(null, 'exit'))
process.on('uncaughtException', (err) => handler(err, 'uncaughtException'))
process.on('SIGINT', () => handler(null, 'SIGINT'))
process.on('SIGQUIT', () => handler(null, 'SIGQUIT'))
process.on('SIGTERM', () => handler(null, 'SIGTERM'))
```
An extreme destination is an instance of
[`SonicBoom`](https://github.com/mcollina/sonic-boom) with `4096`
buffering.
* See [`pino.extreme` api](/docs/api.md#pino-extreme)
* See [`pino.final` api](/docs/api.md#pino-final)
* See [`destination` parameter](/docs/api.md#destination)

**`node_modules/pino/docs/help.md`**
# Help
* [Exit logging](#exit-logging)
* [Log rotation](#rotate)
* [Reopening log files](#reopening)
* [Saving to multiple files](#multiple)
* [Log filtering](#filter-logs)
* [Transports and systemd](#transport-systemd)
* [Duplicate keys](#dupe-keys)
* [Log levels as labels instead of numbers](#level-string)
* [Pino with `debug`](#debug)
* [Unicode and Windows terminal](#windows)
<a id="exit-logging"></a>
## Exit logging
When a Node.js process crashes from an uncaught exception, exits due to a signal,
or exits of its own accord, we may want to write some final logs, particularly
in cases of error.
Writing to a Node.js stream on exit is not necessarily guaranteed, and naively writing
to an Extreme Mode logger on exit will definitely lead to lost logs.
To write logs in an exit handler, create the handler with [`pino.final`](/docs/api.md#pino-final):
```js
process.on('uncaughtException', pino.final(logger, (err, finalLogger) => {
finalLogger.error(err, 'uncaughtException')
process.exit(1)
}))
process.on('unhandledRejection', pino.final(logger, (err, finalLogger) => {
finalLogger.error(err, 'unhandledRejection')
process.exit(1)
}))
```
The `finalLogger` is a special logger instance that will synchronously and reliably
flush every log line. This is important in exit handlers, since no more asynchronous
activity may be scheduled.
<a id="rotate"></a>
## Log rotation
Use a separate tool for log rotation:
We recommend [logrotate](https://github.com/logrotate/logrotate).
Consider we output our logs to `/var/log/myapp.log` like so:
```
$ node server.js > /var/log/myapp.log
```
We would rotate our log files with logrotate, by adding the following to `/etc/logrotate.d/myapp`:
```
/var/log/myapp.log {
su root
daily
rotate 7
delaycompress
compress
notifempty
missingok
copytruncate
}
```
The `copytruncate` configuration carries a very slight possibility of losing log lines,
due to the gap between copying and truncating: additional lines may be written after the
copy but before the truncate. To perform log rotation without `copytruncate`, see the
[Reopening log files](#reopening) section.
<a id="reopening"></a>
## Reopening log files
In cases where a log rotation tool doesn't offer copy-truncate capabilities,
or where using them is deemed inappropriate, the `pino.destination` and `pino.extreme`
destinations are able to reopen file paths after a file has been moved away.
One way to use this is to set up a `SIGUSR2` or `SIGHUP` signal handler that
reopens the log file destination, making sure to write the process PID out
somewhere so the log rotation tool knows where to send the signal.
```js
// write the process pid to a well known location for later
const fs = require('fs')
fs.writeFileSync('/var/run/myapp.pid', process.pid)
const dest = pino.destination('/log/file') // pino.extreme will also work
const logger = require('pino')(dest)
process.on('SIGHUP', () => dest.reopen())
```
The log rotation tool can then be configured to send this signal to the process
after a log rotation event has occurred.
Given a similar scenario as in the [Log rotation](#rotate) section a basic
`logrotate` config that aligns with this strategy would look similar to the following:
```
/var/log/myapp.log {
  su root
  daily
  rotate 7
  delaycompress
  compress
  notifempty
  missingok
  postrotate
    kill -HUP `cat /var/run/myapp.pid`
  endscript
}
```
<a id="multiple"></a>
## Saving to multiple files
Let's assume we want to store all error messages in a separate log file.
Install [pino-tee](http://npm.im/pino-tee) with:
```bash
npm i pino-tee -g
```
The following writes the log output of `app.js` to `./all-logs`, while
writing only warnings and errors to `./warn-logs`:
```bash
node app.js | pino-tee warn ./warn-logs > ./all-logs
```
<a id="filter-logs"></a>
## Log filtering
The Pino philosophy advocates common, pre-existing, system utilities.
Some recommendations in line with this philosophy are:
1. Use [`grep`](https://linux.die.net/man/1/grep):
```sh
$ # View all "INFO" level logs
$ node app.js | grep '"level":30'
```
1. Use [`jq`](https://stedolan.github.io/jq/):
```sh
$ # View all "ERROR" level logs
$ node app.js | jq 'select(.level == 50)'
```
<a id="transport-systemd"></a>
## Transports and systemd
`systemd` makes it complicated to use pipes in services. One method for overcoming
this challenge is to use a subshell:
```
ExecStart=/bin/sh -c '/path/to/node app.js | pino-transport'
```
<a id="dupe-keys"></a>
## How Pino handles duplicate keys
Duplicate keys are possible when a child logger logs an object with a key that
collides with a key in the child logger's bindings.
See the [child logger duplicate keys caveat](/docs/child-loggers.md#duplicate-keys-caveat)
for information on how this is handled.
<a id="level-string"></a>
## Log levels as labels instead of numbers
Pino log lines are meant to be parseable. Thus, Pino's default mode of operation
is to print the level value instead of the string name. However, while it is
possible to set the `useLevelLabels` option, we recommend using one of these
options instead if you are able:
1. If the only change desired is the name then a transport can be used. One such
transport is [`pino-text-level-transport`](https://npm.im/pino-text-level-transport).
1. Use a prettifier like [`pino-pretty`](https://npm.im/pino-pretty) to make
the logs human friendly.
<a id="debug"></a>
## Pino with `debug`
The popular [`debug`](http://npm.im/debug) module is used in many packages across the ecosystem.
The [`pino-debug`](http://github.com/pinojs/pino-debug) module
can capture calls to `debug` loggers and run them
through `pino` instead. This results in a 10x (20x in extreme mode)
performance improvement, even though `pino-debug` is logging additional
data and wrapping it in JSON.
To quickly enable this install [`pino-debug`](http://github.com/pinojs/pino-debug)
and preload it with the `-r` flag, enabling any `debug` logs with the
`DEBUG` environment variable:
```sh
$ npm i pino-debug
$ DEBUG=* node -r pino-debug app.js
```
[`pino-debug`](http://github.com/pinojs/pino-debug) also offers fine-grained control to map specific `debug`
namespaces to `pino` log levels. See [`pino-debug`](http://github.com/pinojs/pino-debug)
for more.
<a id="windows"></a>
## Unicode and Windows terminal
Pino uses [sonic-boom](https://github.com/mcollina/sonic-boom) to speed
up logging. Internally, it uses [`fs.write`](https://nodejs.org/dist/latest-v10.x/docs/api/fs.html#fs_fs_write_fd_string_position_encoding_callback) to write log lines directly to a file
descriptor. On Windows, Unicode output is not handled properly in the
terminal (both `cmd.exe` and PowerShell), and as such the output could
be rendered incorrectly if the log lines include UTF-8 characters. It
is possible to configure the terminal to visualize those characters
correctly with the use of [`chcp`](https://ss64.com/nt/chcp.html) by
executing in the terminal `chcp 65001`. This is a known limitation of
Node.js.

<!-- node_modules/pino/docs/legacy.md -->
# Legacy
## Legacy Node Support
### Node v4
Node v4 is supported on the [Pino v4](#pino-v4-documentation) line.
### Node v0.10-v0.12
Node v0.10 or Node v0.12 is supported on the [Pino v2](#pino-v2-documentation) line.
## Documentation
### Pino v4 Documentation
<https://github.com/pinojs/pino/tree/v4.x.x/docs>
### Pino v3 Documentation
<https://github.com/pinojs/pino/tree/v3.x.x/docs>
### Pino v2 Documentation
<https://github.com/pinojs/pino/tree/v2.x.x/docs>
## Migration
### Pino v4 to Pino v5
#### Logging Destination
In Pino v4 the destination could be set by passing a stream as the
second parameter to the exported `pino` function. This is still the
case in v5. However, it's strongly recommended to use `pino.destination`,
which will write logs ~30% faster.
##### v4
```js
const stdoutLogger = require('pino')()
const stderrLogger = require('pino')(process.stderr)
const fileLogger = require('pino')(fs.createWriteStream('/log/path'))
```
##### v5
```js
const stdoutLogger = require('pino')() // pino.destination by default
const stderrLogger = require('pino')(pino.destination(2))
const fileLogger = require('pino')(pino.destination('/log/path'))
```
Note: This is not a breaking change; `WritableStream` instances are still
supported, but they are slower than `pino.destination`, which
uses the high-speed [`sonic-boom` ⇗](https://github.com/mcollina/sonic-boom) library.
* See [`destination` parameter](/docs/api.md#destination)
#### Extreme Mode
The `extreme` setting does not exist as an option in Pino v5, instead use
a `pino.extreme` destination.
##### v4
```js
const stdoutLogger = require('pino')({extreme: true})
const stderrLogger = require('pino')({extreme: true}, process.stderr)
const fileLogger = require('pino')({extreme: true}, fs.createWriteStream('/log/path'))
```
##### v5
```js
const stdoutLogger = require('pino')(pino.extreme())
const stderrLogger = require('pino')(pino.extreme(2))
const fileLogger = require('pino')(pino.extreme('/log/path'))
```
* See [pino.extreme](/docs/api.md#pino-extreme)
* See [Extreme mode ⇗](/docs/extreme.md)
#### Pino CLI is now pino-pretty CLI
The Pino CLI was provided with Pino v4 for basic log prettification.
From Pino v5 the CLI is installed separately with `pino-pretty`.
##### v4
```sh
$ npm install -g pino
$ node app.js | pino
```
##### v5
```sh
$ npm install -g pino-pretty
$ node app.js | pino-pretty
```
* See [Pretty Printing documentation](/docs/pretty.md)
#### Programmatic Pretty Printing
The [`pino.pretty()`](https://github.com/pinojs/pino/blob/v4.x.x/docs/API.md#prettyoptions)
method has also been removed from Pino v5.
##### v4
```js
var pino = require('pino')
var pretty = pino.pretty()
pretty.pipe(process.stdout)
```
##### v5
Instead use the `prettyPrint` option (also available in v4):
```js
const logger = require('pino')({
  prettyPrint: process.env.NODE_ENV !== 'production'
})
```
In v5 the `pino-pretty` module must be installed to use the `prettyPrint` option:
```sh
npm install --save-dev pino-pretty
```
* See [prettyPrint option](/docs/api.md#prettyPrint)
* See [Pretty Printing documentation](/docs/pretty.md)
#### Slowtime
In Pino v4 a `slowtime` option was supplied, which allowed for full ISO dates
in the timestamps instead of milliseconds since the Epoch. In Pino v5 this
has been completely removed, along with the `pino.stdTimeFunctions.slowTime`
function. In order to achieve the equivalent in v5, a custom
time function should be supplied:
##### v4
```js
const pino = require('pino')
const logger = pino({slowtime: true})
// following avoids deprecation warning in v4:
const loggerAlt = pino({timestamp: pino.stdTimeFunctions.slowTime})
```
##### v5
```js
const logger = require('pino')({
  timestamp: () => ',"time":"' + (new Date()).toISOString() + '"'
})
```
Creating ISO dates in-process for logging purposes is strongly discouraged. Instead,
consider post-processing the logs or using a transport
to convert the timestamps.
* See [timestamp option](/docs/api.md#timestamp)

<!-- node_modules/pino/docs/pretty.md -->
# Pretty Printing
By default, Pino log lines are newline delimited JSON (NDJSON). This is perfect
for production usage and long term storage. It's not so great for development
environments. Thus, Pino logs can be prettified by using a Pino prettifier
module like [`pino-pretty`][pp]:
```sh
$ cat app.log | pino-pretty
```
For almost all situations, this is the recommended way to prettify logs. The
programmatic API, described in the next section, is primarily for integration
purposes with other CLI based prettifiers.
## Prettifier API
Pino prettifier modules are extra modules that provide a CLI for parsing NDJSON
log lines piped via `stdin` and expose an API which conforms to the Pino
[metadata streams](api.md#metadata) API.
The API requires that modules provide a factory function which returns a prettifier
function. This prettifier function must accept either a string of NDJSON or
a Pino log object. The uninitialized Pino instance is passed as `this` into the prettifier
factory function, so it can be accessed via closure by the returned prettifier function.
A pseudo-example of such a prettifier is:
```js
module.exports = function myPrettifier (options) {
  // `this` is bound to the pino instance
  // Deal with whatever options are supplied.
  return function prettifier (inputData) {
    let logObject
    if (typeof inputData === 'string') {
      const parsedData = someJsonParser(inputData)
      logObject = (isPinoLog(parsedData)) ? parsedData : undefined
    } else if (isObject(inputData) && isPinoLog(inputData)) {
      logObject = inputData
    }
    if (!logObject) return inputData
    // implement prettification
  }

  function isObject (input) {
    return Object.prototype.toString.apply(input) === '[object Object]'
  }

  function isPinoLog (log) {
    return log && (log.hasOwnProperty('v') && log.v === 1)
  }
}
```
The reference implementation of such a module is the [`pino-pretty`][pp] module.
To learn more about creating a custom prettifier module, refer to the
`pino-pretty` source code.
Note: if the prettifier returns `undefined` instead of a formatted line, nothing
will be written to the destination stream.
### API Example
> #### NOTE:
> For general usage, it is highly recommended that logs are piped into
> the prettifier instead. Prettified logs are not easily parsed and cannot
> be easily investigated at a later date.
1. Install a prettifier module as a separate dependency, e.g. `npm install pino-pretty`.
1. Instantiate the logger with pretty printing enabled:
```js
const pino = require('pino')
const log = pino({
  prettyPrint: {
    levelFirst: true
  },
  prettifier: require('pino-pretty')
})
```
Note: the default prettifier module is `pino-pretty`, so the preceding
example could be:
```js
const pino = require('pino')
const log = pino({
  prettyPrint: {
    levelFirst: true
  }
})
```
See the [`pino-pretty` documentation][pp] for more information on the options
that can be passed via `prettyPrint`.
[pp]: https://github.com/pinojs/pino-pretty

<!-- node_modules/pino/docs/redaction.md -->
# Redaction
> Redaction is not supported in the browser [#670](https://github.com/pinojs/pino/issues/670)
To redact sensitive information, supply paths to keys that hold sensitive data
using the `redact` option:
```js
const logger = require('.')({
  redact: ['key', 'path.to.key', 'stuff.thats[*].secret']
})
logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```
This will output:
```JSON
{"level":30,"time":1527777350011,"pid":3186,"hostname":"Davids-MacBook-Pro-3.local","key":"[Redacted]","path":{"to":{"key":"[Redacted]","another":"thing"}},"stuff":{"thats":[{"secret":"[Redacted]","logme":"will be logged"},{"secret":"[Redacted]","logme":"as will this"}]},"v":1}
```
The `redact` option can take an array (as shown in the above example) or
an object. This allows control over *how* information is redacted.
For instance, setting the censor:
```js
const logger = require('.')({
  redact: {
    paths: ['key', 'path.to.key', 'stuff.thats[*].secret'],
    censor: '**GDPR COMPLIANT**'
  }
})
logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```
This will output:
```JSON
{"level":30,"time":1527778563934,"pid":3847,"hostname":"Davids-MacBook-Pro-3.local","key":"**GDPR COMPLIANT**","path":{"to":{"key":"**GDPR COMPLIANT**","another":"thing"}},"stuff":{"thats":[{"secret":"**GDPR COMPLIANT**","logme":"will be logged"},{"secret":"**GDPR COMPLIANT**","logme":"as will this"}]},"v":1}
```
The `redact.remove` option also allows for the key and value to be removed from output:
```js
const logger = require('.')({
  redact: {
    paths: ['key', 'path.to.key', 'stuff.thats[*].secret'],
    remove: true
  }
})
logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```
This will output
```JSON
{"level":30,"time":1527782356751,"pid":5758,"hostname":"Davids-MacBook-Pro-3.local","path":{"to":{"another":"thing"}},"stuff":{"thats":[{"logme":"will be logged"},{"logme":"as will this"}]},"v":1}
```
See [pino options in API](/docs/api.md#redact-array-object) for `redact` API details.
<a name="paths"></a>
## Path Syntax
The syntax for paths supplied to the `redact` option conforms to the syntax of path lookups
in standard ECMAScript, with two additions:
* paths may start with bracket notation
* paths may contain the asterisk `*` to denote a wildcard
By way of example, the following are all valid paths:
* `a.b.c`
* `a["b-c"].d`
* `["a-b"].c`
* `a.b.*`
* `a[*].b`
## Overhead
Pino's redaction functionality is built on top of [`fast-redact`](http://github.com/davidmarkclements/fast-redact)
which adds about 2% overhead to `JSON.stringify` when using paths without wildcards.
When used with a Pino logger and a single redacted path, any overhead is within noise;
a way to deterministically measure its effect has not been found, because it is not a bottleneck.
However, wildcard redaction does carry a non-trivial cost relative to explicitly declaring the keys
(50% in a case where four keys are redacted across two objects). See
the [`fast-redact` benchmarks](https://github.com/davidmarkclements/fast-redact#benchmarks) for details.
## Safety
The `redact` option is intended as an initialization time configuration option.
It's extremely important that path strings do not originate from user input.
The `fast-redact` module uses a VM context to syntax-check the paths; user input
should never be combined with such an approach. See the [`fast-redact` Caveat](https://github.com/davidmarkclements/fast-redact#caveat)
and the [`fast-redact` Approach](https://github.com/davidmarkclements/fast-redact#approach) for in-depth information.

<!-- node_modules/pino/docs/transports.md -->
# Transports
A "transport" for Pino is a supplementary tool which consumes Pino logs.
Consider the following example:
```js
const split = require('split2')
const pump = require('pump')
const through = require('through2')
const myTransport = through.obj(function (chunk, enc, cb) {
  // do the necessary
  console.log(chunk)
  cb()
})
pump(process.stdin, split(JSON.parse), myTransport)
```
The above defines our "transport" as the file `my-transport-process.js`.
Logs can now be consumed using shell piping:
```sh
node my-app-which-logs-stuff-to-stdout.js | node my-transport-process.js
```
Ideally, a transport should consume logs in a separate process from the application.
Using transports in the same process causes unnecessary load and slows down
Node's single-threaded event loop.
## In-process transports
> **Pino *does not* natively support in-process transports.**
Pino does not support in-process transports because Node processes are
single threaded (ignoring some technical details). Given this
restriction, one of the methods Pino employs to achieve its speed is to
purposefully offload the handling of logs, and their ultimate destination, to
external processes so that the threading capabilities of the OS (or of other
CPUs) can be used.
One consequence of this methodology is that "error" logs do not get written to
`stderr`. However, since Pino logs are in a parseable format, it is possible to
use tools like [pino-tee][pino-tee] or [jq][jq] to work with the logs. For
example, to view only logs marked as "error" logs:
```
$ node an-app.js | jq 'select(.level == 50)'
```
In short, the way Pino generates logs:
1. Reduces the impact of logging on an application to the absolute minimum.
2. Gives greater flexibility in how logs are processed and stored.
Given all of the above, Pino recommends out-of-process log processing.
However, it is possible to wrap Pino and perform processing in-process.
For an example of this, see [pino-multi-stream][pinoms].
[pino-tee]: https://npm.im/pino-tee
[jq]: https://stedolan.github.io/jq/
[pinoms]: https://npm.im/pino-multi-stream
## Known Transports
PRs to this document are welcome for any new transports!
+ [pino-applicationinsights](#pino-applicationinsights)
+ [pino-azuretable](#pino-azuretable)
+ [pino-cloudwatch](#pino-cloudwatch)
+ [pino-couch](#pino-couch)
+ [pino-datadog](#pino-datadog)
+ [pino-elasticsearch](#pino-elasticsearch)
+ [pino-mq](#pino-mq)
+ [pino-mysql](#pino-mysql)
+ [pino-papertrail](#pino-papertrail)
+ [pino-redis](#pino-redis)
+ [pino-sentry](#pino-sentry)
+ [pino-socket](#pino-socket)
+ [pino-stackdriver](#pino-stackdriver)
+ [pino-syslog](#pino-syslog)
+ [pino-websocket](#pino-websocket)
+ [pino-http-send](#pino-http-send)
<a id="pino-applicationinsights"></a>
### pino-applicationinsights
The [pino-applicationinsights](https://www.npmjs.com/package/pino-applicationinsights) module is a transport that will forward logs to [Azure Application Insights](https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview).
Given an application `foo` that logs via pino, you would use `pino-applicationinsights` like so:
``` sh
$ node foo | pino-applicationinsights --key blablabla
```
For full documentation of command line switches read [readme](https://github.com/ovhemert/pino-applicationinsights#readme)
<a id="pino-azuretable"></a>
### pino-azuretable
The [pino-azuretable](https://www.npmjs.com/package/pino-azuretable) module is a transport that will forward logs to the [Azure Table Storage](https://azure.microsoft.com/en-us/services/storage/tables/).
Given an application `foo` that logs via pino, you would use `pino-azuretable` like so:
``` sh
$ node foo | pino-azuretable --account storageaccount --key blablabla
```
For full documentation of command line switches read [readme](https://github.com/ovhemert/pino-azuretable#readme)
<a id="pino-cloudwatch"></a>
### pino-cloudwatch
[pino-cloudwatch][pino-cloudwatch] is a transport that buffers and forwards logs to [Amazon CloudWatch][].
```sh
$ node app.js | pino-cloudwatch --group my-log-group
```
[pino-cloudwatch]: https://github.com/dbhowell/pino-cloudwatch
[Amazon CloudWatch]: https://aws.amazon.com/cloudwatch/
<a id="pino-couch"></a>
### pino-couch
[pino-couch][pino-couch] uploads each log line as a [CouchDB][CouchDB] document.
```sh
$ node app.js | pino-couch -U https://couch-server -d mylogs
```
[pino-couch]: https://github.com/IBM/pino-couch
[CouchDB]: https://couchdb.apache.org
<a id="pino-datadog"></a>
### pino-datadog
The [pino-datadog](https://www.npmjs.com/package/pino-datadog) module is a transport that will forward logs to [DataDog](https://www.datadoghq.com/) through its API.
Given an application `foo` that logs via pino, you would use `pino-datadog` like so:
``` sh
$ node foo | pino-datadog --key blablabla
```
For full documentation of command line switches read [readme](https://github.com/ovhemert/pino-datadog#readme)
<a id="pino-elasticsearch"></a>
### pino-elasticsearch
[pino-elasticsearch][pino-elasticsearch] uploads the log lines in bulk
to [Elasticsearch][elasticsearch], to be displayed in [Kibana][kibana].
It is extremely simple to use and set up:
```sh
$ node app.js | pino-elasticsearch
```
Assuming Elasticsearch is running on localhost.
To connect to an external elasticsearch instance (recommended for production):
* Check that `network.host` is defined in the `elasticsearch.yml` configuration file. See [elasticsearch Network Settings documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-network.html#common-network-settings) for more details.
* Launch:
```sh
$ node app.js | pino-elasticsearch --node http://192.168.1.42:9200
```
Assuming Elasticsearch is running on `192.168.1.42`.
To connect to AWS Elasticsearch:
```sh
$ node app.js | pino-elasticsearch --node https://es-url.us-east-1.es.amazonaws.com --es-version 6
```
Then [create an index pattern](https://www.elastic.co/guide/en/kibana/current/setup.html) on `'pino'` (the default index key for `pino-elasticsearch`) on the Kibana instance.
[pino-elasticsearch]: https://github.com/pinojs/pino-elasticsearch
[elasticsearch]: https://www.elastic.co/products/elasticsearch
[kibana]: https://www.elastic.co/products/kibana
<a id="pino-mq"></a>
### pino-mq
The `pino-mq` transport will take all messages received on `process.stdin` and send them over a message bus using JSON serialization.
This is useful for:
* moving backpressure from the application to the broker
* shifting the load of message transformation to another component
```
node app.js | pino-mq -u "amqp://guest:guest@localhost/" -q "pino-logs"
```
Alternatively a configuration file can be used:
```
node app.js | pino-mq -c pino-mq.json
```
A base configuration file can be initialized with:
```
pino-mq -g
```
For full documentation of command line switches and configuration see [the `pino-mq` readme](https://github.com/itavy/pino-mq#readme)
<a id="pino-papertrail"></a>
### pino-papertrail
pino-papertrail is a transport that will forward logs to the [papertrail](https://papertrailapp.com) log service through a UDPv4 socket.
Given an application `foo` that logs via pino, and a papertrail destination that collects logs on UDP port `12345` at address `bar.papertrailapp.com`, you would use `pino-papertrail`
like so:
```
node yourapp.js | pino-papertrail --host bar.papertrailapp.com --port 12345 --appname foo
```
For full documentation of command line switches read the [readme](https://github.com/ovhemert/pino-papertrail#readme)
<a id="pino-mysql"></a>
### pino-mysql
[pino-mysql][pino-mysql] loads pino logs into [MySQL][MySQL] and [MariaDB][MariaDB].
```sh
$ node app.js | pino-mysql -c db-configuration.json
```
`pino-mysql` can extract and save log fields into corresponding database field
and/or save the entire log stream as a [JSON Data Type][JSONDT].
For full documentation and command line switches read the [readme][pino-mysql].
[pino-mysql]: https://www.npmjs.com/package/pino-mysql
[MySQL]: https://www.mysql.com/
[MariaDB]: https://mariadb.org/
[JSONDT]: https://dev.mysql.com/doc/refman/8.0/en/json.html
<a id="pino-redis"></a>
### pino-redis
[pino-redis][pino-redis] loads pino logs into [Redis][Redis].
```sh
$ node app.js | pino-redis -U redis://username:password@localhost:6379
```
[pino-redis]: https://github.com/buianhthang/pino-redis
[Redis]: https://redis.io/
<a id="pino-sentry"></a>
### pino-sentry
[pino-sentry][pino-sentry] loads pino logs into [Sentry][Sentry].
```sh
$ node app.js | pino-sentry --dsn=https://******@sentry.io/12345
```
For full documentation of command line switches see the [pino-sentry readme](https://github.com/aandrewww/pino-sentry/blob/master/README.md)
[pino-sentry]: https://www.npmjs.com/package/pino-sentry
[Sentry]: https://sentry.io/
<a id="pino-socket"></a>
### pino-socket
[pino-socket][pino-socket] is a transport that will forward logs to an IPv4
UDP or TCP socket.
As an example, use `socat` to fake a listener:
```sh
$ socat -v udp4-recvfrom:6000,fork exec:'/bin/cat'
```
Then run an application that uses `pino` for logging:
```sh
$ node app.js | pino-socket -p 6000
```
Logs from the application should be observed on both consoles.
[pino-socket]: https://www.npmjs.com/package/pino-socket
#### Logstash
The [pino-socket][pino-socket] module can also be used to upload logs to
[Logstash][logstash] via:
```
$ node app.js | pino-socket -a 127.0.0.1 -p 5000 -m tcp
```
Assuming logstash is running on the same host and configured as
follows:
```
input {
  tcp {
    port => 5000
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
  }
}
See <https://www.elastic.co/guide/en/kibana/current/setup.html> to learn
how to set up [Kibana][kibana].
For Docker users, see
<https://github.com/deviantony/docker-elk> to set up an ELK stack.
<a id="pino-stackdriver"></a>
### pino-stackdriver
The [pino-stackdriver](https://www.npmjs.com/package/pino-stackdriver) module is a transport that will forward logs to the [Google Stackdriver](https://cloud.google.com/logging/) log service through its API.
Given an application `foo` that logs via pino, a stackdriver log project `bar` and credentials in the file `/credentials.json`, you would use `pino-stackdriver`
like so:
``` sh
$ node foo | pino-stackdriver --project bar --credentials /credentials.json
```
For full documentation of command line switches read [readme](https://github.com/ovhemert/pino-stackdriver#readme)
<a id="pino-syslog"></a>
### pino-syslog
[pino-syslog][pino-syslog] is a transforming transport that converts
`pino` NDJSON logs to [RFC3164][rfc3164]-compatible log messages. The `pino-syslog` module does not
forward the logs anywhere; it merely rewrites the messages to `stdout`. But
when used in combination with `pino-socket`, the log messages can be relayed to a syslog server:
```sh
$ node app.js | pino-syslog | pino-socket -a syslog.example.com
```
Example output for the "hello world" log:
```
<134>Apr 1 16:44:58 MacBook-Pro-3 none[94473]: {"pid":94473,"hostname":"MacBook-Pro-3","level":30,"msg":"hello world","time":1459529098958,"v":1}
```
[pino-syslog]: https://www.npmjs.com/package/pino-syslog
[rfc3164]: https://tools.ietf.org/html/rfc3164
[logstash]: https://www.elastic.co/products/logstash
<a id="pino-websocket"></a>
### pino-websocket
[pino-websocket](https://www.npmjs.com/package/@abeai/pino-websocket) is a transport that will forward each log line to a websocket server.
```sh
$ node app.js | pino-websocket -a my-websocket-server.example.com -p 3004
```
For full documentation of command line switches read [readme](https://github.com/abeai/pino-webscoket#README)
<a id="pino-http-send"></a>
### pino-http-send
[pino-http-send](https://npmjs.com/package/pino-http-send) is a configurable and low overhead
transport that will batch logs and send to a specified URL.
```console
$ node app.js | pino-http-send -u http://localhost:8080/logs
```

<!-- node_modules/pino/docs/web.md -->
# Web Frameworks
Since HTTP logging is a primary use case, Pino has first class support for the Node.js
web framework ecosystem.
+ [Pino with Fastify](#fastify)
+ [Pino with Express](#express)
+ [Pino with Hapi](#hapi)
+ [Pino with Restify](#restify)
+ [Pino with Koa](#koa)
+ [Pino with Node core `http`](#http)
+ [Pino with Nest](#nest)
<a id="fastify"></a>
## Pino with Fastify
The Fastify web framework comes bundled with Pino by default. Simply set Fastify's
`logger` option to `true` and use `request.log` or `reply.log` for log messages that correspond
to each individual request:
```js
const fastify = require('fastify')({
  logger: true
})
fastify.get('/', async (request, reply) => {
  request.log.info('something')
  return { hello: 'world' }
})
```
The `logger` option can also be set to an object, which will be passed through directly
as the [`pino` options object](/docs/api.md#options-object).
See the [fastify documentation](https://www.fastify.io/docs/latest/Logging/) for more information.
<a id="express"></a>
## Pino with Express
```sh
npm install express-pino-logger
```
```js
const app = require('express')()
const pino = require('express-pino-logger')()
app.use(pino)
app.get('/', function (req, res) {
  req.log.info('something')
  res.send('hello world')
})
app.listen(3000)
```
See the [express-pino-logger readme](http://npm.im/express-pino-logger) for more info.
<a id="hapi"></a>
## Pino with Hapi
```sh
npm install hapi-pino
```
```js
'use strict'
require('make-promises-safe')
const Hapi = require('hapi')
async function start () {
  // Create a server with a host and port
  const server = Hapi.server({
    host: 'localhost',
    port: 3000
  })

  // Add the route
  server.route({
    method: 'GET',
    path: '/',
    handler: async function (request, h) {
      // request.log is HAPI standard way of logging
      request.log(['a', 'b'], 'Request into hello world')

      // a pino instance can also be used, which will be faster
      request.logger.info('In handler %s', request.path)

      return 'hello world'
    }
  })

  await server.register({
    plugin: require('.'),
    options: {
      prettyPrint: process.env.NODE_ENV !== 'production'
    }
  })

  // also as a decorated API
  server.logger().info('another way for accessing it')

  // and through Hapi standard logging system
  server.log(['subsystem'], 'third way for accessing it')

  await server.start()

  return server
}

start().catch((err) => {
  console.log(err)
  process.exit(1)
})
```
See the [hapi-pino readme](http://npm.im/hapi-pino) for more info.
<a id="restify"></a>
## Pino with Restify
```sh
npm install restify-pino-logger
```
```js
const server = require('restify').createServer({name: 'server'})
const pino = require('restify-pino-logger')()
server.use(pino)
server.get('/', function (req, res) {
  req.log.info('something')
  res.send('hello world')
})
server.listen(3000)
```
See the [restify-pino-logger readme](http://npm.im/restify-pino-logger) for more info.
<a id="koa"></a>
## Pino with Koa
### Koa
```sh
npm install koa-pino-logger
```
```js
const Koa = require('koa')
const app = new Koa()
const pino = require('koa-pino-logger')()
app.use(pino)
app.use((ctx) => {
  ctx.log.info('something else')
  ctx.body = 'hello world'
})
app.listen(3000)
```
See the [koa-pino-logger readme](https://github.com/pinojs/koa-pino-logger) for more info.
<a id="http"></a>
## Pino with Node core `http`
```sh
npm install pino-http
```
```js
const http = require('http')
const server = http.createServer(handle)
const logger = require('pino-http')()
function handle (req, res) {
  logger(req, res)
  req.log.info('something else')
  res.end('hello world')
}
server.listen(3000)
```
See the [pino-http readme](http://npm.im/pino-http) for more info.
<a id="nest"></a>
## Pino with Nest
```sh
npm install nestjs-pino
```
```ts
import { NestFactory } from '@nestjs/core'
import { Controller, Get, Module } from '@nestjs/common'
import { LoggerModule, Logger } from 'nestjs-pino'
@Controller()
export class AppController {
constructor(private readonly logger: Logger) {}
@Get()
getHello() {
this.logger.log('something')
return `Hello world`
}
}
@Module({
controllers: [AppController],
imports: [LoggerModule.forRoot()]
})
class MyModule {}
async function bootstrap() {
const app = await NestFactory.create(MyModule)
await app.listen(3000)
}
bootstrap()
```
See the [nestjs-pino readme](http://npm.im/nestjs-pino) for more info.

node_modules/pino/example.js generated vendored Normal file
@ -0,0 +1,35 @@
'use strict'
const pino = require('./')()
pino.info('hello world')
pino.error('this is at error level')
pino.info('the answer is %d', 42)
pino.info({ obj: 42 }, 'hello world')
pino.info({ obj: 42, b: 2 }, 'hello world')
pino.info({ nested: { obj: 42 } }, 'nested')
setImmediate(() => {
pino.info('after setImmediate')
})
pino.error(new Error('an error'))
const child = pino.child({ a: 'property' })
child.info('hello child!')
const childsChild = child.child({ another: 'property' })
childsChild.info('hello baby..')
pino.debug('this should be mute')
pino.level = 'trace'
pino.debug('this is a debug statement')
pino.child({ another: 'property' }).debug('this is a debug statement via child')
pino.trace('this is a trace statement')
pino.debug('this is a "debug" statement with "')
pino.info(new Error('kaboom'))
pino.info(new Error('kaboom'), 'with', 'a', 'message')

node_modules/pino/lib/levels.js generated vendored Normal file
@ -0,0 +1,183 @@
'use strict'
const flatstr = require('flatstr')
const {
lsCacheSym,
levelValSym,
useLevelLabelsSym,
levelKeySym,
useOnlyCustomLevelsSym,
streamSym
} = require('./symbols')
const { noop, genLog } = require('./tools')
const levels = {
trace: 10,
debug: 20,
info: 30,
warn: 40,
error: 50,
fatal: 60
}
const logFatal = genLog(levels.fatal)
const levelMethods = {
fatal (...args) {
const stream = this[streamSym]
logFatal.call(this, ...args)
if (typeof stream.flushSync === 'function') {
try {
stream.flushSync()
} catch (e) {
// https://github.com/pinojs/pino/pull/740#discussion_r346788313
}
}
},
error: genLog(levels.error),
warn: genLog(levels.warn),
info: genLog(levels.info),
debug: genLog(levels.debug),
trace: genLog(levels.trace)
}
const nums = Object.keys(levels).reduce((o, k) => {
o[levels[k]] = k
return o
}, {})
const initialLsCache = Object.keys(nums).reduce((o, k) => {
o[k] = flatstr('{"level":' + Number(k))
return o
}, {})
function genLsCache (instance) {
const levelName = instance[levelKeySym]
instance[lsCacheSym] = Object.keys(instance.levels.labels).reduce((o, k) => {
o[k] = instance[useLevelLabelsSym]
? `{"${levelName}":"${instance.levels.labels[k]}"`
: flatstr(`{"${levelName}":` + Number(k))
return o
}, Object.assign({}, instance[lsCacheSym]))
return instance
}
function isStandardLevel (level, useOnlyCustomLevels) {
if (useOnlyCustomLevels) {
return false
}
switch (level) {
case 'fatal':
case 'error':
case 'warn':
case 'info':
case 'debug':
case 'trace':
return true
default:
return false
}
}
function setLevel (level) {
const { labels, values } = this.levels
if (typeof level === 'number') {
    if (labels[level] === undefined) throw Error('unknown level value ' + level)
level = labels[level]
}
if (values[level] === undefined) throw Error('unknown level ' + level)
const preLevelVal = this[levelValSym]
const levelVal = this[levelValSym] = values[level]
const useOnlyCustomLevelsVal = this[useOnlyCustomLevelsSym]
for (var key in values) {
if (levelVal > values[key]) {
this[key] = noop
continue
}
this[key] = isStandardLevel(key, useOnlyCustomLevelsVal) ? levelMethods[key] : genLog(values[key])
}
this.emit(
'level-change',
level,
levelVal,
labels[preLevelVal],
preLevelVal
)
}
function getLevel (level) {
const { levels, levelVal } = this
return levels.labels[levelVal]
}
function isLevelEnabled (logLevel) {
const { values } = this.levels
const logLevelVal = values[logLevel]
return logLevelVal !== undefined && (logLevelVal >= this[levelValSym])
}
function mappings (customLevels = null, useOnlyCustomLevels = false) {
const customNums = customLevels ? Object.keys(customLevels).reduce((o, k) => {
o[customLevels[k]] = k
return o
}, {}) : null
const labels = Object.assign(
Object.create(Object.prototype, { Infinity: { value: 'silent' } }),
useOnlyCustomLevels ? null : nums,
customNums
)
const values = Object.assign(
Object.create(Object.prototype, { silent: { value: Infinity } }),
useOnlyCustomLevels ? null : levels,
customLevels
)
return { labels, values }
}
function assertDefaultLevelFound (defaultLevel, customLevels, useOnlyCustomLevels) {
if (typeof defaultLevel === 'number') {
const values = [].concat(
Object.keys(customLevels || {}).map(key => customLevels[key]),
useOnlyCustomLevels ? [] : Object.keys(nums).map(level => +level),
Infinity
)
if (!values.includes(defaultLevel)) {
throw Error(`default level:${defaultLevel} must be included in custom levels`)
}
return
}
const labels = Object.assign(
Object.create(Object.prototype, { silent: { value: Infinity } }),
useOnlyCustomLevels ? null : levels,
customLevels
)
if (!(defaultLevel in labels)) {
throw Error(`default level:${defaultLevel} must be included in custom levels`)
}
}
function assertNoLevelCollisions (levels, customLevels) {
const { labels, values } = levels
for (const k in customLevels) {
if (k in values) {
throw Error('levels cannot be overridden')
}
if (customLevels[k] in labels) {
throw Error('pre-existing level values cannot be used for new levels')
}
}
}
module.exports = {
initialLsCache,
genLsCache,
levelMethods,
getLevel,
setLevel,
isLevelEnabled,
mappings,
assertNoLevelCollisions,
assertDefaultLevelFound
}

node_modules/pino/lib/meta.js generated vendored Normal file
@ -0,0 +1,7 @@
'use strict'
const { version } = require('../package.json')
const LOG_VERSION = 1
module.exports = { version, LOG_VERSION }

node_modules/pino/lib/proto.js generated vendored Normal file
@ -0,0 +1,167 @@
'use strict'
/* eslint no-prototype-builtins: 0 */
const { EventEmitter } = require('events')
const SonicBoom = require('sonic-boom')
const flatstr = require('flatstr')
const {
lsCacheSym,
levelValSym,
setLevelSym,
getLevelSym,
chindingsSym,
mixinSym,
asJsonSym,
messageKeySym,
writeSym,
timeSym,
timeSliceIndexSym,
streamSym,
serializersSym,
useOnlyCustomLevelsSym,
needsMetadataGsym
} = require('./symbols')
const {
getLevel,
setLevel,
isLevelEnabled,
mappings,
initialLsCache,
genLsCache,
assertNoLevelCollisions
} = require('./levels')
const {
asChindings,
asJson
} = require('./tools')
const {
version,
LOG_VERSION
} = require('./meta')
// note: use of class is satirical
// https://github.com/pinojs/pino/pull/433#pullrequestreview-127703127
const constructor = class Pino {}
const prototype = {
constructor,
child,
bindings,
setBindings,
flush,
isLevelEnabled,
version,
get level () { return this[getLevelSym]() },
set level (lvl) { return this[setLevelSym](lvl) },
get levelVal () { return this[levelValSym] },
set levelVal (n) { throw Error('levelVal is read-only') },
[lsCacheSym]: initialLsCache,
[writeSym]: write,
[asJsonSym]: asJson,
[getLevelSym]: getLevel,
[setLevelSym]: setLevel,
LOG_VERSION
}
Object.setPrototypeOf(prototype, EventEmitter.prototype)
module.exports = prototype
function child (bindings) {
const { level } = this
const serializers = this[serializersSym]
const chindings = asChindings(this, bindings)
const instance = Object.create(this)
if (bindings.hasOwnProperty('serializers') === true) {
instance[serializersSym] = Object.create(null)
for (var k in serializers) {
instance[serializersSym][k] = serializers[k]
}
const parentSymbols = Object.getOwnPropertySymbols(serializers)
for (var i = 0; i < parentSymbols.length; i++) {
const ks = parentSymbols[i]
instance[serializersSym][ks] = serializers[ks]
}
for (var bk in bindings.serializers) {
instance[serializersSym][bk] = bindings.serializers[bk]
}
const bindingsSymbols = Object.getOwnPropertySymbols(bindings.serializers)
for (var bi = 0; bi < bindingsSymbols.length; bi++) {
const bks = bindingsSymbols[bi]
instance[serializersSym][bks] = bindings.serializers[bks]
}
} else instance[serializersSym] = serializers
if (bindings.hasOwnProperty('customLevels') === true) {
assertNoLevelCollisions(this.levels, bindings.customLevels)
instance.levels = mappings(bindings.customLevels, instance[useOnlyCustomLevelsSym])
genLsCache(instance)
}
instance[chindingsSym] = chindings
const childLevel = bindings.level || level
instance[setLevelSym](childLevel)
return instance
}
function bindings () {
const chindings = this[chindingsSym]
var chindingsJson = `{${chindings.substr(1)}}` // at least contains ,"pid":7068,"hostname":"myMac"
var bindingsFromJson = JSON.parse(chindingsJson)
delete bindingsFromJson.pid
delete bindingsFromJson.hostname
return bindingsFromJson
}
function setBindings (newBindings) {
const chindings = asChindings(this, newBindings)
this[chindingsSym] = chindings
}
function write (_obj, msg, num) {
const t = this[timeSym]()
const messageKey = this[messageKeySym]
const mixin = this[mixinSym]
const objError = _obj instanceof Error
var obj
if (_obj === undefined || _obj === null) {
obj = mixin ? mixin() : {}
obj[messageKey] = msg
} else {
obj = Object.assign(mixin ? mixin() : {}, _obj)
if (msg) {
obj[messageKey] = msg
} else if (objError) {
obj[messageKey] = _obj.message
}
if (objError) {
obj.stack = _obj.stack
if (!obj.type) {
obj.type = 'Error'
}
}
}
const s = this[asJsonSym](obj, num, t)
const stream = this[streamSym]
if (stream[needsMetadataGsym] === true) {
stream.lastLevel = num
// TODO remove in the next major release,
// it is not needed anymore
stream.lastMsg = msg
stream.lastObj = obj
stream.lastTime = t.slice(this[timeSliceIndexSym])
stream.lastLogger = this // for child loggers
}
if (stream instanceof SonicBoom) stream.write(s)
else stream.write(flatstr(s))
}
function flush () {
const stream = this[streamSym]
if ('flush' in stream) stream.flush()
}

node_modules/pino/lib/redaction.js generated vendored Normal file
@ -0,0 +1,104 @@
'use strict'
const fastRedact = require('fast-redact')
const { redactFmtSym, wildcardFirstSym } = require('./symbols')
const { rx, validator } = fastRedact
const validate = validator({
ERR_PATHS_MUST_BE_STRINGS: () => 'pino redacted paths must be strings',
ERR_INVALID_PATH: (s) => `pino redact paths array contains an invalid path (${s})`
})
const CENSOR = '[Redacted]'
const strict = false // TODO should this be configurable?
function redaction (opts, serialize) {
const { paths, censor } = handle(opts)
const shape = paths.reduce((o, str) => {
rx.lastIndex = 0
const first = rx.exec(str)
const next = rx.exec(str)
// ns is the top-level path segment, brackets + quoting removed.
let ns = first[1] !== undefined
? first[1].replace(/^(?:"|'|`)(.*)(?:"|'|`)$/, '$1')
: first[0]
if (ns === '*') {
ns = wildcardFirstSym
}
// top level key:
if (next === null) {
o[ns] = null
return o
}
// path with at least two segments:
// if ns is already redacted at the top level, ignore lower level redactions
if (o[ns] === null) {
return o
}
const { index } = next
const nextPath = `${str.substr(index, str.length - 1)}`
o[ns] = o[ns] || []
// shape is a mix of paths beginning with literal values and wildcard
// paths [ "a.b.c", "*.b.z" ] should reduce to a shape of
// { "a": [ "b.c", "b.z" ], *: [ "b.z" ] }
// note: "b.z" is in both "a" and * arrays because "a" matches the wildcard.
// (* entry has wildcardFirstSym as key)
if (ns !== wildcardFirstSym && o[ns].length === 0) {
// first time ns's get all '*' redactions so far
o[ns].push(...(o[wildcardFirstSym] || []))
}
if (ns === wildcardFirstSym) {
// new * path gets added to all previously registered literal ns's.
Object.keys(o).forEach(function (k) {
if (o[k]) {
o[k].push(nextPath)
}
})
}
o[ns].push(nextPath)
return o
}, {})
// the redactor assigned to the format symbol key
// provides top level redaction for instances where
// an object is interpolated into the msg string
const result = {
[redactFmtSym]: fastRedact({ paths, censor, serialize, strict })
}
const topCensor = (...args) =>
typeof censor === 'function' ? serialize(censor(...args)) : serialize(censor)
return [...Object.keys(shape), ...Object.getOwnPropertySymbols(shape)].reduce((o, k) => {
// top level key:
if (shape[k] === null) o[k] = topCensor
else o[k] = fastRedact({ paths: shape[k], censor, serialize, strict })
return o
}, result)
}
function handle (opts) {
if (Array.isArray(opts)) {
opts = { paths: opts, censor: CENSOR }
validate(opts)
return opts
}
var { paths, censor = CENSOR, remove } = opts
if (Array.isArray(paths) === false) { throw Error('pino redact must contain an array of strings') }
if (remove === true) censor = undefined
validate({ paths, censor })
return { paths, censor }
}
module.exports = redaction

node_modules/pino/lib/symbols.js generated vendored Normal file
@ -0,0 +1,64 @@
'use strict'
const setLevelSym = Symbol('pino.setLevel')
const getLevelSym = Symbol('pino.getLevel')
const levelValSym = Symbol('pino.levelVal')
const useLevelLabelsSym = Symbol('pino.useLevelLabels')
const levelKeySym = Symbol('pino.levelKey')
const useOnlyCustomLevelsSym = Symbol('pino.useOnlyCustomLevels')
const mixinSym = Symbol('pino.mixin')
const lsCacheSym = Symbol('pino.lsCache')
const chindingsSym = Symbol('pino.chindings')
const parsedChindingsSym = Symbol('pino.parsedChindings')
const asJsonSym = Symbol('pino.asJson')
const writeSym = Symbol('pino.write')
const redactFmtSym = Symbol('pino.redactFmt')
const timeSym = Symbol('pino.time')
const timeSliceIndexSym = Symbol('pino.timeSliceIndex')
const streamSym = Symbol('pino.stream')
const stringifySym = Symbol('pino.stringify')
const stringifiersSym = Symbol('pino.stringifiers')
const endSym = Symbol('pino.end')
const formatOptsSym = Symbol('pino.formatOpts')
const messageKeySym = Symbol('pino.messageKey')
const nestedKeySym = Symbol('pino.nestedKey')
const wildcardFirstSym = Symbol('pino.wildcardFirst')
// public symbols, no need to use the same pino
// version for these
const serializersSym = Symbol.for('pino.serializers')
const wildcardGsym = Symbol.for('pino.*')
const needsMetadataGsym = Symbol.for('pino.metadata')
module.exports = {
setLevelSym,
getLevelSym,
levelValSym,
useLevelLabelsSym,
mixinSym,
lsCacheSym,
chindingsSym,
parsedChindingsSym,
asJsonSym,
writeSym,
serializersSym,
redactFmtSym,
timeSym,
timeSliceIndexSym,
streamSym,
stringifySym,
stringifiersSym,
endSym,
formatOptsSym,
messageKeySym,
nestedKeySym,
wildcardFirstSym,
levelKeySym,
wildcardGsym,
needsMetadataGsym,
useOnlyCustomLevelsSym
}

node_modules/pino/lib/time.js generated vendored Normal file
@ -0,0 +1,11 @@
'use strict'
const nullTime = () => ''
const epochTime = () => `,"time":${Date.now()}`
const unixTime = () => `,"time":${Math.round(Date.now() / 1000.0)}`
const isoTime = () => `,"time":"${new Date(Date.now()).toISOString()}"` // using Date.now() for testability
module.exports = { nullTime, epochTime, unixTime, isoTime }

node_modules/pino/lib/tools.js generated vendored Normal file
@ -0,0 +1,386 @@
'use strict'
/* eslint no-prototype-builtins: 0 */
const format = require('quick-format-unescaped')
const { mapHttpRequest, mapHttpResponse } = require('pino-std-serializers')
const SonicBoom = require('sonic-boom')
const stringifySafe = require('fast-safe-stringify')
const {
lsCacheSym,
chindingsSym,
parsedChindingsSym,
writeSym,
serializersSym,
formatOptsSym,
endSym,
stringifiersSym,
stringifySym,
wildcardFirstSym,
needsMetadataGsym,
wildcardGsym,
redactFmtSym,
streamSym,
nestedKeySym
} = require('./symbols')
function noop () {}
function genLog (z) {
return function LOG (o, ...n) {
if (typeof o === 'object' && o !== null) {
if (o.method && o.headers && o.socket) {
o = mapHttpRequest(o)
} else if (typeof o.setHeader === 'function') {
o = mapHttpResponse(o)
}
if (this[nestedKeySym]) o = { [this[nestedKeySym]]: o }
this[writeSym](o, format(null, n, this[formatOptsSym]), z)
} else this[writeSym](null, format(o, n, this[formatOptsSym]), z)
}
}
// magically escape strings for json
// relying on their charCodeAt
// everything below 32 needs JSON.stringify()
// 34 and 92 happen all the time, so we
// have a fast case for them
function asString (str) {
var result = ''
var last = 0
var found = false
var point = 255
const l = str.length
if (l > 100) {
return JSON.stringify(str)
}
for (var i = 0; i < l && point >= 32; i++) {
point = str.charCodeAt(i)
if (point === 34 || point === 92) {
result += str.slice(last, i) + '\\'
last = i
found = true
}
}
if (!found) {
result = str
} else {
result += str.slice(last)
}
return point < 32 ? JSON.stringify(str) : '"' + result + '"'
}
function asJson (obj, num, time) {
const stringify = this[stringifySym]
const stringifiers = this[stringifiersSym]
const end = this[endSym]
const chindings = this[chindingsSym]
const serializers = this[serializersSym]
var data = this[lsCacheSym][num] + time
// we need the child bindings added to the output first so instance logged
// objects can take precedence when JSON.parse-ing the resulting log line
data = data + chindings
var value
var notHasOwnProperty = obj.hasOwnProperty === undefined
if (serializers[wildcardGsym]) {
obj = serializers[wildcardGsym](obj)
}
const wildcardStringifier = stringifiers[wildcardFirstSym]
for (var key in obj) {
value = obj[key]
if ((notHasOwnProperty || obj.hasOwnProperty(key)) && value !== undefined) {
value = serializers[key] ? serializers[key](value) : value
const stringifier = stringifiers[key] || wildcardStringifier
switch (typeof value) {
case 'undefined':
case 'function':
continue
case 'number':
/* eslint no-fallthrough: "off" */
if (Number.isFinite(value) === false) {
value = null
}
        // this case explicitly falls through to the next one
case 'boolean':
if (stringifier) value = stringifier(value)
break
case 'string':
value = (stringifier || asString)(value)
break
default:
value = (stringifier || stringify)(value)
}
if (value === undefined) continue
data += ',"' + key + '":' + value
}
}
return data + end
}
function asChindings (instance, bindings) {
if (!bindings) {
throw Error('missing bindings for child Pino')
}
var key
var value
var data = instance[chindingsSym]
const stringify = instance[stringifySym]
const stringifiers = instance[stringifiersSym]
const serializers = instance[serializersSym]
if (serializers[wildcardGsym]) {
bindings = serializers[wildcardGsym](bindings)
}
for (key in bindings) {
value = bindings[key]
const valid = key !== 'level' &&
key !== 'serializers' &&
key !== 'customLevels' &&
bindings.hasOwnProperty(key) &&
value !== undefined
if (valid === true) {
value = serializers[key] ? serializers[key](value) : value
value = (stringifiers[key] || stringify)(value)
if (value === undefined) continue
data += ',"' + key + '":' + value
}
}
return data
}
function getPrettyStream (opts, prettifier, dest, instance) {
if (prettifier && typeof prettifier === 'function') {
prettifier = prettifier.bind(instance)
return prettifierMetaWrapper(prettifier(opts), dest)
}
try {
var prettyFactory = require('pino-pretty')
prettyFactory.asMetaWrapper = prettifierMetaWrapper
return prettifierMetaWrapper(prettyFactory(opts), dest)
} catch (e) {
throw Error('Missing `pino-pretty` module: `pino-pretty` must be installed separately')
}
}
function prettifierMetaWrapper (pretty, dest) {
var warned = false
return {
[needsMetadataGsym]: true,
lastLevel: 0,
lastMsg: null,
lastObj: null,
lastLogger: null,
flushSync () {
if (warned) {
return
}
warned = true
setMetadataProps(dest, this)
dest.write(pretty(Object.assign({
level: 40, // warn
msg: 'pino.final with prettyPrint does not support flushing',
time: Date.now()
}, this.chindings())))
},
chindings () {
const lastLogger = this.lastLogger
var chindings = null
// protection against flushSync being called before logging
// anything
if (!lastLogger) {
return null
}
if (lastLogger.hasOwnProperty(parsedChindingsSym)) {
chindings = lastLogger[parsedChindingsSym]
} else {
chindings = JSON.parse('{"v":1' + lastLogger[chindingsSym] + '}')
lastLogger[parsedChindingsSym] = chindings
}
return chindings
},
write (chunk) {
const lastLogger = this.lastLogger
const chindings = this.chindings()
var time = this.lastTime
if (time.match(/^\d+/)) {
time = parseInt(time)
}
var lastObj = this.lastObj
var errorProps = null
const obj = Object.assign({
level: this.lastLevel,
time
}, chindings, lastObj, errorProps)
const serializers = lastLogger[serializersSym]
const keys = Object.keys(serializers)
var key
for (var i = 0; i < keys.length; i++) {
key = keys[i]
if (obj[key] !== undefined) {
obj[key] = serializers[key](obj[key])
}
}
const stringifiers = lastLogger[stringifiersSym]
const redact = stringifiers[redactFmtSym]
const formatted = pretty(typeof redact === 'function' ? redact(obj) : obj)
if (formatted === undefined) return
setMetadataProps(dest, this)
dest.write(formatted)
}
}
}
function hasBeenTampered (stream) {
return stream.write !== stream.constructor.prototype.write
}
function buildSafeSonicBoom (dest, buffer = 0, sync = true) {
const stream = new SonicBoom(dest, buffer, sync)
stream.on('error', filterBrokenPipe)
return stream
function filterBrokenPipe (err) {
// TODO verify on Windows
if (err.code === 'EPIPE') {
// If we get EPIPE, we should stop logging here
// however we have no control to the consumer of
// SonicBoom, so we just overwrite the write method
stream.write = noop
stream.end = noop
stream.flushSync = noop
stream.destroy = noop
return
}
stream.removeListener('error', filterBrokenPipe)
stream.emit('error', err)
}
}
function createArgsNormalizer (defaultOptions) {
return function normalizeArgs (instance, opts = {}, stream) {
// support stream as a string
if (typeof opts === 'string') {
stream = buildSafeSonicBoom(opts)
opts = {}
} else if (typeof stream === 'string') {
stream = buildSafeSonicBoom(stream)
} else if (opts instanceof SonicBoom || opts.writable || opts._writableState) {
stream = opts
opts = null
}
opts = Object.assign({}, defaultOptions, opts)
if ('extreme' in opts) {
throw Error('The extreme option has been removed, use pino.extreme instead')
}
if ('onTerminated' in opts) {
throw Error('The onTerminated option has been removed, use pino.final instead')
}
if ('changeLevelName' in opts) {
process.emitWarning(
'The changeLevelName option is deprecated and will be removed in v7. Use levelKey instead.',
{ code: 'changeLevelName_deprecation' }
)
opts.levelKey = opts.changeLevelName
delete opts.changeLevelName
}
const { enabled, prettyPrint, prettifier, messageKey } = opts
if (enabled === false) opts.level = 'silent'
stream = stream || process.stdout
if (stream === process.stdout && stream.fd >= 0 && !hasBeenTampered(stream)) {
stream = buildSafeSonicBoom(stream.fd)
}
if (prettyPrint) {
const prettyOpts = Object.assign({ messageKey }, prettyPrint)
stream = getPrettyStream(prettyOpts, prettifier, stream, instance)
}
return { opts, stream }
}
}
function final (logger, handler) {
if (typeof logger === 'undefined' || typeof logger.child !== 'function') {
throw Error('expected a pino logger instance')
}
const hasHandler = (typeof handler !== 'undefined')
if (hasHandler && typeof handler !== 'function') {
throw Error('if supplied, the handler parameter should be a function')
}
const stream = logger[streamSym]
if (typeof stream.flushSync !== 'function') {
throw Error('final requires a stream that has a flushSync method, such as pino.destination and pino.extreme')
}
const finalLogger = new Proxy(logger, {
get: (logger, key) => {
if (key in logger.levels.values) {
return (...args) => {
logger[key](...args)
stream.flushSync()
}
}
return logger[key]
}
})
if (!hasHandler) {
return finalLogger
}
return (err = null, ...args) => {
try {
stream.flushSync()
} catch (e) {
// it's too late to wait for the stream to be ready
// because this is a final tick scenario.
// in practice there shouldn't be a situation where it isn't
// however, swallow the error just in case (and for easier testing)
}
return handler(err, finalLogger, ...args)
}
}
function stringify (obj) {
try {
return JSON.stringify(obj)
} catch (_) {
return stringifySafe(obj)
}
}
function setMetadataProps (dest, that) {
if (dest[needsMetadataGsym] === true) {
dest.lastLevel = that.lastLevel
dest.lastMsg = that.lastMsg
dest.lastObj = that.lastObj
dest.lastTime = that.lastTime
dest.lastLogger = that.lastLogger
}
}
module.exports = {
noop,
buildSafeSonicBoom,
getPrettyStream,
asChindings,
asJson,
genLog,
createArgsNormalizer,
final,
stringify
}

node_modules/pino/package.json generated vendored Normal file
@ -0,0 +1,135 @@
{
"_from": "pino@5.17.0",
"_id": "pino@5.17.0",
"_inBundle": false,
"_integrity": "sha512-LqrqmRcJz8etUjyV0ddqB6OTUutCgQULPFg2b4dtijRHUsucaAdBgSUW58vY6RFSX+NT8963F+q0tM6lNwGShA==",
"_location": "/pino",
"_phantomChildren": {},
"_requested": {
"type": "version",
"registry": true,
"raw": "pino@5.17.0",
"name": "pino",
"escapedName": "pino",
"rawSpec": "5.17.0",
"saveSpec": null,
"fetchSpec": "5.17.0"
},
"_requiredBy": [
"/docker-hub-utils"
],
"_resolved": "https://registry.npmjs.org/pino/-/pino-5.17.0.tgz",
"_shasum": "b9def314e82402154f89a25d76a31f20ca84b4c8",
"_spec": "pino@5.17.0",
"_where": "/home/dawidd6/github/dawidd6/action-debian-package/node_modules/docker-hub-utils",
"author": {
"name": "Matteo Collina",
"email": "hello@matteocollina.com"
},
"bin": {
"pino": "bin.js"
},
"browser": "./browser.js",
"bugs": {
"url": "https://github.com/pinojs/pino/issues"
},
"bundleDependencies": false,
"contributors": [
{
"name": "David Mark Clements",
"email": "huperekchuno@googlemail.com"
},
{
"name": "James Sumners",
"email": "james.sumners@gmail.com"
},
{
"name": "Thomas Watson Steen",
"email": "w@tson.dk",
"url": "https://twitter.com/wa7son"
}
],
"dependencies": {
"fast-redact": "^2.0.0",
"fast-safe-stringify": "^2.0.7",
"flatstr": "^1.0.12",
"pino-std-serializers": "^2.4.2",
"quick-format-unescaped": "^3.0.3",
"sonic-boom": "^0.7.5"
},
"deprecated": false,
"description": "super fast, all natural json logger",
"devDependencies": {
"airtap": "2.0.2",
"benchmark": "^2.1.4",
"bole": "^3.0.2",
"bunyan": "^1.8.12",
"cross-env": "^5.2.1",
"docsify-cli": "^4.2.1",
"execa": "^1.0.0",
"fastbench": "^1.0.1",
"flush-write-stream": "^2.0.0",
"import-fresh": "^3.0.0",
"log": "^5.0.0",
"loglevel": "^1.6.4",
"pino-pretty": "^2.6.1",
"pre-commit": "^1.2.2",
"proxyquire": "^2.1.3",
"pump": "^3.0.0",
"qodaa": "^1.0.1",
"semver": "^6.3.0",
"snazzy": "^8.0.0",
"split2": "^3.1.1",
"standard": "^14.2.0",
"steed": "^1.1.3",
"tap": "^12.7.0",
"tape": "^4.11.0",
"through2": "^3.0.1",
"winston": "^3.2.1"
},
"files": [
"pino.js",
"bin.js",
"browser.js",
"pretty.js",
"usage.txt",
"test",
"docs",
"example.js",
"lib"
],
"homepage": "http://getpino.io",
"keywords": [
"fast",
"logger",
"stream",
"json"
],
"license": "MIT",
"main": "pino.js",
"name": "pino",
"precommit": "test",
"repository": {
"type": "git",
"url": "git+https://github.com/pinojs/pino.git"
},
"scripts": {
"bench": "node benchmarks/utils/runbench all",
"bench-basic": "node benchmarks/utils/runbench basic",
"bench-child": "node benchmarks/utils/runbench child",
"bench-child-child": "node benchmarks/utils/runbench child-child",
"bench-child-creation": "node benchmarks/utils/runbench child-creation",
"bench-deep-object": "node benchmarks/utils/runbench deep-object",
"bench-longs-tring": "node benchmarks/utils/runbench long-string",
"bench-multi-arg": "node benchmarks/utils/runbench multi-arg",
"bench-object": "node benchmarks/utils/runbench object",
"browser-test": "airtap --local 8080 test/browser*test.js",
"ci": "standard | snazzy && cross-env TAP_TIMEOUT=480000 NODE_OPTIONS=\"--no-warnings -r qodaa\" tap --no-esm -j 4 --100 test/*test.js",
"cov-ci": "cross-env TAP_TIMEOUT=480000 NODE_OPTIONS=\"--no-warnings -r qodaa\" tap --no-esm -j 4 --100 --coverage-report=lcov test/*test.js",
"cov-ui": "cross-env NODE_OPTIONS=\"--no-warnings -r qodaa\" tap --no-esm -j 4 --coverage-report=html test/*test.js",
"docs": "docsify serve",
"test": "standard | snazzy && cross-env NODE_OPTIONS=\"--no-warnings -r qodaa\" tap --no-esm -j 4 --no-cov test/*test.js",
"update-bench-doc": "node benchmarks/utils/generate-benchmark-doc > docs/benchmarks.md"
},
"version": "5.17.0"
}

node_modules/pino/pino.js generated vendored Normal file
@ -0,0 +1,144 @@
'use strict'
const os = require('os')
const stdSerializers = require('pino-std-serializers')
const redaction = require('./lib/redaction')
const time = require('./lib/time')
const proto = require('./lib/proto')
const symbols = require('./lib/symbols')
const { assertDefaultLevelFound, mappings, genLsCache } = require('./lib/levels')
const {
createArgsNormalizer,
asChindings,
final,
stringify,
buildSafeSonicBoom
} = require('./lib/tools')
const { version, LOG_VERSION } = require('./lib/meta')
const {
chindingsSym,
redactFmtSym,
serializersSym,
timeSym,
timeSliceIndexSym,
streamSym,
stringifySym,
stringifiersSym,
setLevelSym,
endSym,
formatOptsSym,
messageKeySym,
nestedKeySym,
useLevelLabelsSym,
levelKeySym,
mixinSym,
useOnlyCustomLevelsSym
} = symbols
const { epochTime, nullTime } = time
const { pid } = process
const hostname = os.hostname()
const defaultErrorSerializer = stdSerializers.err
const defaultOptions = {
level: 'info',
useLevelLabels: false,
messageKey: 'msg',
nestedKey: null,
enabled: true,
prettyPrint: false,
base: { pid, hostname },
serializers: Object.assign(Object.create(null), {
err: defaultErrorSerializer
}),
timestamp: epochTime,
name: undefined,
redact: null,
customLevels: null,
levelKey: 'level',
useOnlyCustomLevels: false
}
const normalize = createArgsNormalizer(defaultOptions)
const serializers = Object.assign(Object.create(null), stdSerializers)
function pino (...args) {
const instance = {}
const { opts, stream } = normalize(instance, ...args)
const {
redact,
crlf,
serializers,
timestamp,
messageKey,
nestedKey,
base,
name,
level,
customLevels,
useLevelLabels,
levelKey,
mixin,
useOnlyCustomLevels
} = opts
const stringifiers = redact ? redaction(redact, stringify) : {}
const formatOpts = redact
? { stringify: stringifiers[redactFmtSym] }
: { stringify }
const end = ',"v":' + LOG_VERSION + '}' + (crlf ? '\r\n' : '\n')
const coreChindings = asChindings.bind(null, {
[chindingsSym]: '',
[serializersSym]: serializers,
[stringifiersSym]: stringifiers,
[stringifySym]: stringify
})
const chindings = base === null ? '' : (name === undefined)
? coreChindings(base) : coreChindings(Object.assign({}, base, { name }))
const time = (timestamp instanceof Function)
? timestamp : (timestamp ? epochTime : nullTime)
const timeSliceIndex = time().indexOf(':') + 1
if (useOnlyCustomLevels && !customLevels) throw Error('customLevels is required if useOnlyCustomLevels is set true')
if (mixin && typeof mixin !== 'function') throw Error(`Unknown mixin type "${typeof mixin}" - expected "function"`)
assertDefaultLevelFound(level, customLevels, useOnlyCustomLevels)
const levels = mappings(customLevels, useOnlyCustomLevels)
Object.assign(instance, {
levels,
[useLevelLabelsSym]: useLevelLabels,
[levelKeySym]: levelKey,
[useOnlyCustomLevelsSym]: useOnlyCustomLevels,
[streamSym]: stream,
[timeSym]: time,
[timeSliceIndexSym]: timeSliceIndex,
[stringifySym]: stringify,
[stringifiersSym]: stringifiers,
[endSym]: end,
[formatOptsSym]: formatOpts,
[messageKeySym]: messageKey,
[nestedKeySym]: nestedKey,
[serializersSym]: serializers,
[mixinSym]: mixin,
[chindingsSym]: chindings
})
Object.setPrototypeOf(instance, proto)
if (customLevels || useLevelLabels || levelKey !== defaultOptions.levelKey) genLsCache(instance)
instance[setLevelSym](level)
return instance
}
pino.extreme = (dest = process.stdout.fd) => buildSafeSonicBoom(dest, 4096, false)
pino.destination = (dest = process.stdout.fd) => buildSafeSonicBoom(dest, 0, true)
pino.final = final
pino.levels = mappings()
pino.stdSerializers = serializers
pino.stdTimeFunctions = Object.assign({}, time)
pino.symbols = symbols
pino.version = version
pino.LOG_VERSION = LOG_VERSION
module.exports = pino

node_modules/pino/test/basic.test.js generated vendored Normal file
@ -0,0 +1,702 @@
'use strict'
const os = require('os')
const { join } = require('path')
const { readFileSync, existsSync, statSync } = require('fs')
const { test } = require('tap')
const { sink, check, once } = require('./helper')
const pino = require('../')
const { version } = require('../package.json')
const { pid } = process
const hostname = os.hostname()
const watchFileCreated = (filename) => new Promise((resolve, reject) => {
const TIMEOUT = 800
const INTERVAL = 100
const threshold = TIMEOUT / INTERVAL
let counter = 0
const interval = setInterval(() => {
// On some CI runs file is created but not filled
if (existsSync(filename) && statSync(filename).size !== 0) {
clearInterval(interval)
resolve()
} else if (counter <= threshold) {
counter++
} else {
clearInterval(interval)
reject(new Error(`${filename} was not created.`))
}
}, INTERVAL)
})
test('pino version is exposed on export', async ({ is }) => {
is(pino.version, version)
})
test('pino version is exposed on instance', async ({ is }) => {
const instance = pino()
is(instance.version, version)
})
test('child instance exposes pino version', async ({ is }) => {
const child = pino().child({ foo: 'bar' })
is(child.version, version)
})
test('bindings are exposed on every instance', async ({ same }) => {
const instance = pino()
same(instance.bindings(), {})
})
test('bindings contain the name and the child bindings', async ({ same }) => {
const instance = pino({ name: 'basicTest', level: 'info' }).child({ foo: 'bar' }).child({ a: 2 })
same(instance.bindings(), { name: 'basicTest', foo: 'bar', a: 2 })
})
test('set bindings on instance', async ({ same }) => {
const instance = pino({ name: 'basicTest', level: 'info' })
instance.setBindings({ foo: 'bar' })
same(instance.bindings(), { name: 'basicTest', foo: 'bar' })
})
test('newly set bindings overwrite old bindings', async ({ same }) => {
const instance = pino({ name: 'basicTest', level: 'info', base: { foo: 'bar' } })
instance.setBindings({ foo: 'baz' })
same(instance.bindings(), { name: 'basicTest', foo: 'baz' })
})
test('set bindings on child instance', async ({ same }) => {
const child = pino({ name: 'basicTest', level: 'info' }).child({})
child.setBindings({ foo: 'bar' })
same(child.bindings(), { name: 'basicTest', foo: 'bar' })
})
test('child should have bindings set by parent', async ({ same }) => {
const instance = pino({ name: 'basicTest', level: 'info' })
instance.setBindings({ foo: 'bar' })
const child = instance.child({})
same(child.bindings(), { name: 'basicTest', foo: 'bar' })
})
test('child should not share bindings of parent set after child creation', async ({ same }) => {
const instance = pino({ name: 'basicTest', level: 'info' })
const child = instance.child({})
instance.setBindings({ foo: 'bar' })
same(instance.bindings(), { name: 'basicTest', foo: 'bar' })
same(child.bindings(), { name: 'basicTest' })
})
function levelTest (name, level) {
test(`${name} logs as ${level}`, async ({ is }) => {
const stream = sink()
const instance = pino(stream)
instance.level = name
instance[name]('hello world')
check(is, await once(stream, 'data'), level, 'hello world')
})
test(`passing objects at level ${name}`, async ({ is, same }) => {
const stream = sink()
const instance = pino(stream)
instance.level = name
const obj = { hello: 'world' }
instance[name](obj)
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
is(result.pid, pid)
is(result.hostname, hostname)
is(result.level, level)
is(result.hello, 'world')
is(result.v, 1)
same(Object.keys(obj), ['hello'])
})
test(`passing an object and a string at level ${name}`, async ({ is, same }) => {
const stream = sink()
const instance = pino(stream)
instance.level = name
const obj = { hello: 'world' }
instance[name](obj, 'a string')
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
msg: 'a string',
hello: 'world',
v: 1
})
same(Object.keys(obj), ['hello'])
})
test(`overriding object key by string at level ${name}`, async ({ is, same }) => {
const stream = sink()
const instance = pino(stream)
instance.level = name
instance[name]({ hello: 'world', msg: 'object' }, 'string')
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
msg: 'string',
hello: 'world',
v: 1
})
})
test(`formatting logs as ${name}`, async ({ is }) => {
const stream = sink()
const instance = pino(stream)
instance.level = name
instance[name]('hello %d', 42)
const result = await once(stream, 'data')
check(is, result, level, 'hello 42')
})
test(`formatting a symbol at level ${name}`, async ({ is }) => {
const stream = sink()
const instance = pino(stream)
instance.level = name
const sym = Symbol('foo')
instance[name]('hello', sym)
const result = await once(stream, 'data')
check(is, result, level, 'hello Symbol(foo)')
})
test(`passing error with a serializer at level ${name}`, async ({ is, same }) => {
const stream = sink()
const err = new Error('myerror')
const instance = pino({
serializers: {
err: pino.stdSerializers.err
}
}, stream)
instance.level = name
instance[name]({ err })
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
err: {
type: 'Error',
message: err.message,
stack: err.stack
},
v: 1
})
})
test(`child logger for level ${name}`, async ({ is, same }) => {
const stream = sink()
const instance = pino(stream)
instance.level = name
const child = instance.child({ hello: 'world' })
child[name]('hello world')
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
msg: 'hello world',
hello: 'world',
v: 1
})
})
}
levelTest('fatal', 60)
levelTest('error', 50)
levelTest('warn', 40)
levelTest('info', 30)
levelTest('debug', 20)
levelTest('trace', 10)
test('serializers can return undefined to strip field', async ({ is }) => {
const stream = sink()
const instance = pino({
serializers: {
test () { return undefined }
}
}, stream)
instance.info({ test: 'sensitive info' })
const result = await once(stream, 'data')
is('test' in result, false)
})
test('does not explode with a circular ref', async ({ doesNotThrow }) => {
const stream = sink()
const instance = pino(stream)
const b = {}
const a = {
hello: b
}
b.a = a // circular ref
doesNotThrow(() => instance.info(a))
})
test('set the name', async ({ is, same }) => {
const stream = sink()
const instance = pino({
name: 'hello'
}, stream)
instance.fatal('this is fatal')
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 60,
name: 'hello',
msg: 'this is fatal',
v: 1
})
})
test('set the messageKey', async ({ is, same }) => {
const stream = sink()
const message = 'hello world'
const messageKey = 'fooMessage'
const instance = pino({
messageKey
}, stream)
instance.info(message)
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
fooMessage: message,
v: 1
})
})
test('set the nestedKey', async ({ is, same }) => {
const stream = sink()
const object = { hello: 'world' }
const nestedKey = 'stuff'
const instance = pino({
nestedKey
}, stream)
instance.info(object)
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
stuff: object,
v: 1
})
})
test('set undefined properties', async ({ is, same }) => {
const stream = sink()
const instance = pino(stream)
instance.info({ hello: 'world', property: undefined })
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
hello: 'world',
v: 1
})
})
test('prototype properties are not logged', async ({ is }) => {
const stream = sink()
const instance = pino(stream)
instance.info(Object.create({ hello: 'world' }))
const { hello } = await once(stream, 'data')
is(hello, undefined)
})
test('set the base', async ({ is, same }) => {
const stream = sink()
const instance = pino({
base: {
a: 'b'
}
}, stream)
instance.fatal('this is fatal')
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
a: 'b',
level: 60,
msg: 'this is fatal',
v: 1
})
})
test('set the base to null', async ({ is, same }) => {
const stream = sink()
const instance = pino({
base: null
}, stream)
instance.fatal('this is fatal')
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
level: 60,
msg: 'this is fatal',
v: 1
})
})
test('set the base to null and use a serializer', async ({ is, same }) => {
const stream = sink()
const instance = pino({
base: null,
serializers: {
[Symbol.for('pino.*')]: (input) => {
return Object.assign({}, input, { additionalMessage: 'using pino' })
}
}
}, stream)
instance.fatal('this is fatal too')
const result = await once(stream, 'data')
is(new Date(result.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete result.time
same(result, {
level: 60,
msg: 'this is fatal too',
additionalMessage: 'using pino',
v: 1
})
})
test('throw if creating child without bindings', async ({ throws }) => {
const stream = sink()
const instance = pino(stream)
throws(() => instance.child())
})
test('correctly escapes msg strings with stray double quote at end', async ({ same }) => {
const stream = sink()
const instance = pino({
name: 'hello'
}, stream)
instance.fatal('this contains "')
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 60,
name: 'hello',
msg: 'this contains "',
v: 1
})
})
test('correctly escape msg strings with unclosed double quote', async ({ same }) => {
const stream = sink()
const instance = pino({
name: 'hello'
}, stream)
instance.fatal('" this contains')
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 60,
name: 'hello',
msg: '" this contains',
v: 1
})
})
// https://github.com/pinojs/pino/issues/139
test('object and format string', async ({ same }) => {
const stream = sink()
const instance = pino(stream)
instance.info({}, 'foo %s', 'bar')
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'foo bar',
v: 1
})
})
test('object and format string property', async ({ same }) => {
const stream = sink()
const instance = pino(stream)
instance.info({ answer: 42 }, 'foo %s', 'bar')
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'foo bar',
answer: 42,
v: 1
})
})
test('correctly strip undefined when returned from toJSON', async ({ is }) => {
const stream = sink()
const instance = pino({
test: 'this'
}, stream)
instance.fatal({ test: { toJSON () { return undefined } } })
const result = await once(stream, 'data')
is('test' in result, false)
})
test('correctly supports stderr', async ({ same }) => {
// stderr inherits from Stream, rather than Writable
const dest = {
writable: true,
write (result) {
result = JSON.parse(result)
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 60,
msg: 'a message',
v: 1
})
}
}
const instance = pino(dest)
instance.fatal('a message')
})
test('normalize number to string', async ({ same }) => {
const stream = sink()
const instance = pino(stream)
instance.info(1)
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
msg: '1',
v: 1
})
})
test('normalize number to string with an object', async ({ same }) => {
const stream = sink()
const instance = pino(stream)
instance.info({ answer: 42 }, 1)
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
msg: '1',
answer: 42,
v: 1
})
})
test('handles objects with null prototype', async ({ same }) => {
const stream = sink()
const instance = pino(stream)
const o = Object.create(null)
o.test = 'test'
instance.info(o)
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
test: 'test',
v: 1
})
})
test('pino.destination', async ({ same }) => {
const tmp = join(
os.tmpdir(),
'_' + Math.random().toString(36).substr(2, 9)
)
const instance = pino(pino.destination(tmp))
instance.info('hello')
await watchFileCreated(tmp)
const result = JSON.parse(readFileSync(tmp).toString())
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'hello',
v: 1
})
})
test('auto pino.destination with a string', async ({ same }) => {
const tmp = join(
os.tmpdir(),
'_' + Math.random().toString(36).substr(2, 9)
)
const instance = pino(tmp)
instance.info('hello')
await watchFileCreated(tmp)
const result = JSON.parse(readFileSync(tmp).toString())
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'hello',
v: 1
})
})
test('auto pino.destination with a string as second argument', async ({ same }) => {
const tmp = join(
os.tmpdir(),
'_' + Math.random().toString(36).substr(2, 9)
)
const instance = pino(null, tmp)
instance.info('hello')
await watchFileCreated(tmp)
const result = JSON.parse(readFileSync(tmp).toString())
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'hello',
v: 1
})
})
test('does not override opts with a string as second argument', async ({ same }) => {
const tmp = join(
os.tmpdir(),
'_' + Math.random().toString(36).substr(2, 9)
)
const instance = pino({
timestamp: () => ',"time":"none"'
}, tmp)
instance.info('hello')
await watchFileCreated(tmp)
const result = JSON.parse(readFileSync(tmp).toString())
same(result, {
pid: pid,
hostname: hostname,
level: 30,
time: 'none',
msg: 'hello',
v: 1
})
})
// https://github.com/pinojs/pino/issues/222
test('children with same names render in correct order', async ({ is }) => {
const stream = sink()
const root = pino(stream)
root.child({ a: 1 }).child({ a: 2 }).info({ a: 3 })
const { a } = await once(stream, 'data')
is(a, 3, 'last logged object takes precedence')
})
// https://github.com/pinojs/pino/pull/251 - use this.stringify
test('use `fast-safe-stringify` to avoid circular dependencies', async ({ deepEqual }) => {
const stream = sink()
const root = pino(stream)
// circular depth
const obj = {}
obj.a = obj
root.info(obj)
const { a } = await once(stream, 'data')
deepEqual(a, { a: '[Circular]' })
})
test('fast-safe-stringify must be used when interpolating', async (t) => {
const stream = sink()
const instance = pino(stream)
const o = { a: { b: {} } }
o.a.b.c = o.a.b
instance.info('test', o)
const { msg } = await once(stream, 'data')
t.is(msg, 'test {"a":{"b":{"c":"[Circular]"}}}')
})
test('throws when setting useOnlyCustomLevels without customLevels', async ({ is, throws }) => {
throws(() => {
pino({
useOnlyCustomLevels: true
})
})
try {
pino({
useOnlyCustomLevels: true
})
} catch ({ message }) {
is(message, 'customLevels is required if useOnlyCustomLevels is set true')
}
})
test('correctly log Infinity', async (t) => {
const stream = sink()
const instance = pino(stream)
const o = { num: Infinity }
instance.info(o)
const { num } = await once(stream, 'data')
t.is(num, null)
})
test('correctly log -Infinity', async (t) => {
const stream = sink()
const instance = pino(stream)
const o = { num: -Infinity }
instance.info(o)
const { num } = await once(stream, 'data')
t.is(num, null)
})
test('correctly log NaN', async (t) => {
const stream = sink()
const instance = pino(stream)
const o = { num: NaN }
instance.info(o)
const { num } = await once(stream, 'data')
t.is(num, null)
})

node_modules/pino/test/broken-pipe.test.js generated vendored Normal file
@@ -0,0 +1,42 @@
'use strict'
const t = require('tap')
const { join } = require('path')
const { fork } = require('child_process')
const { once } = require('./helper')
const pino = require('..')
function test (file) {
file = join('fixtures', 'broken-pipe', file)
t.test(file, { parallel: true }, async ({ is }) => {
const child = fork(join(__dirname, file), { silent: true })
child.stdout.destroy()
child.stderr.pipe(process.stdout)
const res = await once(child, 'close')
is(res, 0) // process exits successfully
})
}
t.jobs = 42
test('basic.js')
test('destination.js')
test('extreme.js')
t.test('let error pass through', ({ is, plan }) => {
plan(3)
const stream = pino.destination()
// side effect of the pino constructor is that it will set an
// event handler for error
pino(stream)
process.nextTick(() => stream.emit('error', new Error('kaboom')))
process.nextTick(() => stream.emit('error', new Error('kaboom')))
stream.on('error', (err) => {
is(err.message, 'kaboom')
})
})

node_modules/pino/test/browser-levels.test.js generated vendored Normal file
@@ -0,0 +1,218 @@
'use strict'
const test = require('tape')
const pino = require('../browser')
test('set the level by string', ({ end, same, is }) => {
const expected = [
{
level: 50,
msg: 'this is an error'
},
{
level: 60,
msg: 'this is fatal'
}
]
const instance = pino({
browser: {
write (actual) {
checkLogObjects(is, same, actual, expected.shift())
}
}
})
instance.level = 'error'
instance.info('hello world')
instance.error('this is an error')
instance.fatal('this is fatal')
end()
})
test('set the level by string. init with silent', ({ end, same, is }) => {
const expected = [
{
level: 50,
msg: 'this is an error'
},
{
level: 60,
msg: 'this is fatal'
}
]
const instance = pino({
level: 'silent',
browser: {
write (actual) {
checkLogObjects(is, same, actual, expected.shift())
}
}
})
instance.level = 'error'
instance.info('hello world')
instance.error('this is an error')
instance.fatal('this is fatal')
end()
})
test('set the level by string. init with silent and transmit', ({ end, same, is }) => {
const expected = [
{
level: 50,
msg: 'this is an error'
},
{
level: 60,
msg: 'this is fatal'
}
]
const instance = pino({
level: 'silent',
browser: {
write (actual) {
checkLogObjects(is, same, actual, expected.shift())
}
},
transmit: {
send () {}
}
})
instance.level = 'error'
instance.info('hello world')
instance.error('this is an error')
instance.fatal('this is fatal')
end()
})
test('set the level via constructor', ({ end, same, is }) => {
const expected = [
{
level: 50,
msg: 'this is an error'
},
{
level: 60,
msg: 'this is fatal'
}
]
const instance = pino({
level: 'error',
browser: {
write (actual) {
checkLogObjects(is, same, actual, expected.shift())
}
}
})
instance.info('hello world')
instance.error('this is an error')
instance.fatal('this is fatal')
end()
})
test('the wrong level throws', ({ end, throws }) => {
const instance = pino()
throws(() => {
instance.level = 'kaboom'
})
end()
})
test('the wrong level by number throws', ({ end, throws }) => {
const instance = pino()
throws(() => {
instance.levelVal = 55
})
end()
})
test('exposes level string mappings', ({ end, is }) => {
is(pino.levels.values.error, 50)
end()
})
test('exposes level number mappings', ({ end, is }) => {
is(pino.levels.labels[50], 'error')
end()
})
test('returns level integer', ({ end, is }) => {
const instance = pino({ level: 'error' })
is(instance.levelVal, 50)
end()
})
test('silent level via constructor', ({ end, fail }) => {
const instance = pino({
level: 'silent',
browser: {
write () {
fail('no data should be logged')
}
}
})
Object.keys(pino.levels.values).forEach((level) => {
instance[level]('hello world')
})
end()
})
test('silent level by string', ({ end, fail }) => {
const instance = pino({
browser: {
write () {
fail('no data should be logged')
}
}
})
instance.level = 'silent'
Object.keys(pino.levels.values).forEach((level) => {
instance[level]('hello world')
})
end()
})
test('exposed levels', ({ end, same }) => {
same(Object.keys(pino.levels.values), [
'fatal',
'error',
'warn',
'info',
'debug',
'trace'
])
end()
})
test('exposed labels', ({ end, same }) => {
same(Object.keys(pino.levels.labels), [
'10',
'20',
'30',
'40',
'50',
'60'
])
end()
})
function checkLogObjects (is, same, actual, expected) {
is(actual.time <= Date.now(), true, 'time is less than or equal to Date.now()')
const actualCopy = Object.assign({}, actual)
const expectedCopy = Object.assign({}, expected)
delete actualCopy.time
delete expectedCopy.time
same(actualCopy, expectedCopy)
}

node_modules/pino/test/browser-serializers.test.js generated vendored Normal file
@@ -0,0 +1,327 @@
'use strict'
// eslint-disable-next-line
if (typeof $1 !== 'undefined') $1 = arguments.callee.caller.arguments[0]
const test = require('tape')
const fresh = require('import-fresh')
const pino = require('../browser')
const parentSerializers = {
test: () => 'parent'
}
const childSerializers = {
test: () => 'child'
}
test('serializers override values', ({ end, is }) => {
const parent = pino({
serializers: parentSerializers,
browser: {
serialize: true,
write (o) {
is(o.test, 'parent')
end()
}
}
})
parent.fatal({ test: 'test' })
})
test('without the serialize option, serializers do not override values', ({ end, is }) => {
const parent = pino({
serializers: parentSerializers,
browser: {
write (o) {
is(o.test, 'test')
end()
}
}
})
parent.fatal({ test: 'test' })
})
if (process.title !== 'browser') {
test('if serialize option is true, standard error serializer is auto enabled', ({ end, same }) => {
const err = Error('test')
err.code = 'test'
err.type = 'Error' // get that cov
const expect = pino.stdSerializers.err(err)
const consoleError = console.error
console.error = function (err) {
same(err, expect)
}
const logger = fresh('../browser')({
browser: { serialize: true }
})
console.error = consoleError
logger.fatal(err)
end()
})
test('if serialize option is array, standard error serializer is auto enabled', ({ end, same }) => {
const err = Error('test')
err.code = 'test'
const expect = pino.stdSerializers.err(err)
const consoleError = console.error
console.error = function (err) {
same(err, expect)
}
const logger = fresh('../browser', require)({
browser: { serialize: [] }
})
console.error = consoleError
logger.fatal(err)
end()
})
test('if serialize option is array containing !stdSerializers.err, standard error serializer is disabled', ({ end, is }) => {
const err = Error('test')
err.code = 'test'
const expect = err
const consoleError = console.error
console.error = function (err) {
is(err, expect)
}
const logger = fresh('../browser', require)({
browser: { serialize: ['!stdSerializers.err'] }
})
console.error = consoleError
logger.fatal(err)
end()
})
test('in browser, serializers apply to all objects', ({ end, is }) => {
const consoleError = console.error
console.error = function (test, test2, test3, test4, test5) {
is(test.key, 'serialized')
is(test2.key2, 'serialized2')
is(test5.key3, 'serialized3')
}
const logger = fresh('../browser', require)({
serializers: {
key: () => 'serialized',
key2: () => 'serialized2',
key3: () => 'serialized3'
},
browser: { serialize: true }
})
console.error = consoleError
logger.fatal({ key: 'test' }, { key2: 'test' }, 'str should skip', [{ foo: 'array should skip' }], { key3: 'test' })
end()
})
test('serialize can be an array of selected serializers', ({ end, is }) => {
const consoleError = console.error
console.error = function (test, test2, test3, test4, test5) {
is(test.key, 'test')
is(test2.key2, 'serialized2')
is(test5.key3, 'test')
}
const logger = fresh('../browser', require)({
serializers: {
key: () => 'serialized',
key2: () => 'serialized2',
key3: () => 'serialized3'
},
browser: { serialize: ['key2'] }
})
console.error = consoleError
logger.fatal({ key: 'test' }, { key2: 'test' }, 'str should skip', [{ foo: 'array should skip' }], { key3: 'test' })
end()
})
test('serialize filter applies to child loggers', ({ end, is }) => {
const consoleError = console.error
console.error = function (binding, test, test2, test3, test4, test5) {
is(test.key, 'test')
is(test2.key2, 'serialized2')
is(test5.key3, 'test')
}
const logger = fresh('../browser', require)({
browser: { serialize: ['key2'] }
})
console.error = consoleError
logger.child({
aBinding: 'test',
serializers: {
key: () => 'serialized',
key2: () => 'serialized2',
key3: () => 'serialized3'
}
}).fatal({ key: 'test' }, { key2: 'test' }, 'str should skip', [{ foo: 'array should skip' }], { key3: 'test' })
end()
})
test('parent serializers apply to child bindings', ({ end, is }) => {
const consoleError = console.error
console.error = function (binding) {
is(binding.key, 'serialized')
}
const logger = fresh('../browser', require)({
serializers: {
key: () => 'serialized'
},
browser: { serialize: true }
})
console.error = consoleError
logger.child({ key: 'test' }).fatal({ test: 'test' })
end()
})
test('child serializers apply to child bindings', ({ end, is }) => {
const consoleError = console.error
console.error = function (binding) {
is(binding.key, 'serialized')
}
const logger = fresh('../browser', require)({
browser: { serialize: true }
})
console.error = consoleError
logger.child({
key: 'test',
serializers: {
key: () => 'serialized'
}
}).fatal({ test: 'test' })
end()
})
}
test('child does not overwrite parent serializers', ({ end, is }) => {
var c = 0
const parent = pino({
serializers: parentSerializers,
browser: {
serialize: true,
write (o) {
c++
if (c === 1) is(o.test, 'parent')
if (c === 2) {
is(o.test, 'child')
end()
}
}
}
})
const child = parent.child({ serializers: childSerializers })
parent.fatal({ test: 'test' })
child.fatal({ test: 'test' })
})
test('children inherit parent serializers', ({ end, is }) => {
const parent = pino({
serializers: parentSerializers,
browser: {
serialize: true,
write (o) {
is(o.test, 'parent')
}
}
})
const child = parent.child({ a: 'property' })
child.fatal({ test: 'test' })
end()
})
test('children serializers get called', ({ end, is }) => {
const parent = pino({
test: 'this',
browser: {
serialize: true,
write (o) {
is(o.test, 'child')
}
}
})
const child = parent.child({ a: 'property', serializers: childSerializers })
child.fatal({ test: 'test' })
end()
})
test('children serializers get called when inherited from parent', ({ end, is }) => {
const parent = pino({
test: 'this',
serializers: parentSerializers,
browser: {
serialize: true,
write: (o) => {
is(o.test, 'pass')
}
}
})
const child = parent.child({ serializers: { test: () => 'pass' } })
child.fatal({ test: 'fail' })
end()
})
test('non-overridden serializers are available in the children', ({ end, is }) => {
const pSerializers = {
onlyParent: () => 'parent',
shared: () => 'parent'
}
const cSerializers = {
shared: () => 'child',
onlyChild: () => 'child'
}
var c = 0
const parent = pino({
serializers: pSerializers,
browser: {
serialize: true,
write (o) {
c++
if (c === 1) is(o.shared, 'child')
if (c === 2) is(o.onlyParent, 'parent')
if (c === 3) is(o.onlyChild, 'child')
if (c === 4) is(o.onlyChild, 'test')
}
}
})
const child = parent.child({ serializers: cSerializers })
child.fatal({ shared: 'test' })
child.fatal({ onlyParent: 'test' })
child.fatal({ onlyChild: 'test' })
parent.fatal({ onlyChild: 'test' })
end()
})

node_modules/pino/test/browser-transmit.test.js generated vendored Normal file
@@ -0,0 +1,349 @@
'use strict'
const test = require('tape')
const pino = require('../browser')
function noop () {}
test('throws if transmit object does not have send function', ({ end, throws }) => {
throws(() => {
pino({ browser: { transmit: {} } })
})
throws(() => {
pino({ browser: { transmit: { send: 'not a func' } } })
})
end()
})
test('calls send function after write', ({ end, is }) => {
var c = 0
const logger = pino({
browser: {
write: () => {
c++
},
transmit: {
send () { is(c, 1) }
}
}
})
logger.fatal({ test: 'test' })
end()
})
test('passes send function the logged level', ({ end, is }) => {
const logger = pino({
browser: {
write () {},
transmit: {
send (level) {
is(level, 'fatal')
}
}
}
})
logger.fatal({ test: 'test' })
end()
})
test('passes send function message strings in logEvent object when asObject is not set', ({ end, same, is }) => {
const logger = pino({
browser: {
write: noop,
transmit: {
send (level, { messages }) {
is(messages[0], 'test')
is(messages[1], 'another test')
}
}
}
})
logger.fatal('test', 'another test')
end()
})
test('passes send function message objects in logEvent object when asObject is not set', ({ end, same, is }) => {
const logger = pino({
browser: {
write: noop,
transmit: {
send (level, { messages }) {
same(messages[0], { test: 'test' })
is(messages[1], 'another test')
}
}
}
})
logger.fatal({ test: 'test' }, 'another test')
end()
})
test('passes send function message strings in logEvent object when asObject is set', ({ end, same, is }) => {
const logger = pino({
browser: {
asObject: true,
write: noop,
transmit: {
send (level, { messages }) {
is(messages[0], 'test')
is(messages[1], 'another test')
}
}
}
})
logger.fatal('test', 'another test')
end()
})
test('passes send function message objects in logEvent object when asObject is set', ({ end, same, is }) => {
const logger = pino({
browser: {
asObject: true,
write: noop,
transmit: {
send (level, { messages }) {
same(messages[0], { test: 'test' })
is(messages[1], 'another test')
}
}
}
})
logger.fatal({ test: 'test' }, 'another test')
end()
})
test('supplies a timestamp (ts) in logEvent object which is exactly the same as the `time` property in asObject mode', ({ end, is }) => {
var expected
const logger = pino({
browser: {
asObject: true, // implicit because of `write`, but just to be explicit
write (o) {
expected = o.time
},
transmit: {
send (level, logEvent) {
is(logEvent.ts, expected)
}
}
}
})
logger.fatal('test')
end()
})
test('passes send function child bindings via logEvent object', ({ end, same, is }) => {
const logger = pino({
browser: {
write: noop,
transmit: {
send (level, logEvent) {
const messages = logEvent.messages
const bindings = logEvent.bindings
same(bindings[0], { first: 'binding' })
same(bindings[1], { second: 'binding2' })
same(messages[0], { test: 'test' })
is(messages[1], 'another test')
}
}
}
})
logger
.child({ first: 'binding' })
.child({ second: 'binding2' })
.fatal({ test: 'test' }, 'another test')
end()
})
test('passes send function level:{label, value} via logEvent object', ({ end, is }) => {
const logger = pino({
browser: {
write: noop,
transmit: {
send (level, logEvent) {
const label = logEvent.level.label
const value = logEvent.level.value
is(label, 'fatal')
is(value, 60)
}
}
}
})
logger.fatal({ test: 'test' }, 'another test')
end()
})
test('calls send function according to transmit.level', ({ end, is }) => {
var c = 0
const logger = pino({
browser: {
write: noop,
transmit: {
level: 'error',
send (level) {
c++
if (c === 1) is(level, 'error')
if (c === 2) is(level, 'fatal')
}
}
}
})
logger.warn('ignored')
logger.error('test')
logger.fatal('test')
end()
})
test('transmit.level defaults to logger level', ({ end, is }) => {
var c = 0
const logger = pino({
level: 'error',
browser: {
write: noop,
transmit: {
send (level) {
c++
if (c === 1) is(level, 'error')
if (c === 2) is(level, 'fatal')
}
}
}
})
logger.warn('ignored')
logger.error('test')
logger.fatal('test')
end()
})
test('transmit.level is effective even if lower than logger level', ({ end, is }) => {
var c = 0
const logger = pino({
level: 'error',
browser: {
write: noop,
transmit: {
level: 'info',
send (level) {
c++
if (c === 1) is(level, 'warn')
if (c === 2) is(level, 'error')
if (c === 3) is(level, 'fatal')
}
}
}
})
logger.warn('ignored')
logger.error('test')
logger.fatal('test')
end()
})
test('applies all serializers to messages and bindings (serialize:false - default)', ({ end, same, is }) => {
const logger = pino({
serializers: {
first: () => 'first',
second: () => 'second',
test: () => 'serialize it'
},
browser: {
write: noop,
transmit: {
send (level, logEvent) {
const messages = logEvent.messages
const bindings = logEvent.bindings
same(bindings[0], { first: 'first' })
same(bindings[1], { second: 'second' })
same(messages[0], { test: 'serialize it' })
is(messages[1].type, 'Error')
}
}
}
})
logger
.child({ first: 'binding' })
.child({ second: 'binding2' })
.fatal({ test: 'test' }, Error())
end()
})
test('applies all serializers to messages and bindings (serialize:true)', ({ end, same, is }) => {
const logger = pino({
serializers: {
first: () => 'first',
second: () => 'second',
test: () => 'serialize it'
},
browser: {
serialize: true,
write: noop,
transmit: {
send (level, logEvent) {
const messages = logEvent.messages
const bindings = logEvent.bindings
same(bindings[0], { first: 'first' })
same(bindings[1], { second: 'second' })
same(messages[0], { test: 'serialize it' })
is(messages[1].type, 'Error')
}
}
}
})
logger
.child({ first: 'binding' })
.child({ second: 'binding2' })
.fatal({ test: 'test' }, Error())
end()
})
test('extracts correct bindings and raw messages over multiple transmits', ({ end, same, is }) => {
var messages = null
var bindings = null
const logger = pino({
browser: {
write: noop,
transmit: {
send (level, logEvent) {
messages = logEvent.messages
bindings = logEvent.bindings
}
}
}
})
const child = logger.child({ child: true })
const grandchild = child.child({ grandchild: true })
logger.fatal({ test: 'parent:test1' })
logger.fatal({ test: 'parent:test2' })
same([], bindings)
same([{ test: 'parent:test2' }], messages)
child.fatal({ test: 'child:test1' })
child.fatal({ test: 'child:test2' })
same([{ child: true }], bindings)
same([{ test: 'child:test2' }], messages)
grandchild.fatal({ test: 'grandchild:test1' })
grandchild.fatal({ test: 'grandchild:test2' })
same([{ child: true }, { grandchild: true }], bindings)
same([{ test: 'grandchild:test2' }], messages)
end()
})

553
node_modules/pino/test/browser.test.js generated vendored Normal file

@@ -0,0 +1,553 @@
'use strict'
const test = require('tape')
const fresh = require('import-fresh')
const pinoStdSerializers = require('pino-std-serializers')
const pino = require('../browser')
levelTest('fatal')
levelTest('error')
levelTest('warn')
levelTest('info')
levelTest('debug')
levelTest('trace')
test('silent level', ({ end, fail, pass }) => {
const instance = pino({
level: 'silent',
browser: { write: fail }
})
instance.info('test')
const child = instance.child({ test: 'test' })
child.info('msg-test')
// use setTimeout because setImmediate isn't supported in most browsers
setTimeout(() => {
pass()
end()
}, 0)
})
test('enabled false', ({ end, fail, pass }) => {
const instance = pino({
enabled: false,
browser: { write: fail }
})
instance.info('test')
const child = instance.child({ test: 'test' })
child.info('msg-test')
// use setTimeout because setImmediate isn't supported in most browsers
setTimeout(() => {
pass()
end()
}, 0)
})
test('throw if creating child without bindings', ({ end, throws }) => {
const instance = pino()
throws(() => instance.child())
end()
})
test('stubs write, flush and ee methods on instance', ({ end, ok, is }) => {
const instance = pino()
ok(isFunc(instance.setMaxListeners))
ok(isFunc(instance.getMaxListeners))
ok(isFunc(instance.emit))
ok(isFunc(instance.addListener))
ok(isFunc(instance.on))
ok(isFunc(instance.prependListener))
ok(isFunc(instance.once))
ok(isFunc(instance.prependOnceListener))
ok(isFunc(instance.removeListener))
ok(isFunc(instance.removeAllListeners))
ok(isFunc(instance.listeners))
ok(isFunc(instance.listenerCount))
ok(isFunc(instance.eventNames))
ok(isFunc(instance.write))
ok(isFunc(instance.flush))
is(instance.on(), undefined)
end()
})
test('exposes levels object', ({ end, same }) => {
same(pino.levels, {
values: {
fatal: 60,
error: 50,
warn: 40,
info: 30,
debug: 20,
trace: 10
},
labels: {
10: 'trace',
20: 'debug',
30: 'info',
40: 'warn',
50: 'error',
60: 'fatal'
}
})
end()
})
test('exposes LOG_VERSION', ({ end, is }) => {
is(pino.LOG_VERSION, 1)
end()
})
test('exposes faux stdSerializers', ({ end, ok, same }) => {
ok(pino.stdSerializers)
// make sure faux stdSerializers match pino-std-serializers
for (const serializer in pinoStdSerializers) {
ok(pino.stdSerializers[serializer], `pino.stdSerializers.${serializer}`)
}
// confirm faux methods return empty objects
same(pino.stdSerializers.req(), {})
same(pino.stdSerializers.mapHttpRequest(), {})
same(pino.stdSerializers.mapHttpResponse(), {})
same(pino.stdSerializers.res(), {})
// confirm wrapping function is a passthrough
const noChange = { foo: 'bar', fuz: 42 }
same(pino.stdSerializers.wrapRequestSerializer(noChange), noChange)
same(pino.stdSerializers.wrapResponseSerializer(noChange), noChange)
end()
})
test('exposes err stdSerializer', ({ end, ok }) => {
ok(pino.stdSerializers.err)
ok(pino.stdSerializers.err(Error()))
end()
})
consoleMethodTest('error')
consoleMethodTest('fatal', 'error')
consoleMethodTest('warn')
consoleMethodTest('info')
consoleMethodTest('debug')
consoleMethodTest('trace')
absentConsoleMethodTest('error', 'log')
absentConsoleMethodTest('warn', 'error')
absentConsoleMethodTest('info', 'log')
absentConsoleMethodTest('debug', 'log')
absentConsoleMethodTest('trace', 'log')
// do not run this with airtap
if (process.title !== 'browser') {
test('in absence of console, log methods become noops', ({ end, ok }) => {
var console = global.console
delete global.console
const instance = fresh('../browser')()
global.console = console
ok(fnName(instance.log).match(/noop/))
ok(fnName(instance.fatal).match(/noop/))
ok(fnName(instance.error).match(/noop/))
ok(fnName(instance.warn).match(/noop/))
ok(fnName(instance.info).match(/noop/))
ok(fnName(instance.debug).match(/noop/))
ok(fnName(instance.trace).match(/noop/))
end()
})
}
test('opts.browser.asObject logs pino-like object to console', ({ end, ok, is }) => {
var info = console.info
console.info = function (o) {
is(o.level, 30)
is(o.msg, 'test')
ok(o.time)
console.info = info
}
const instance = require('../browser')({
browser: {
asObject: true
}
})
instance.info('test')
end()
})
test('opts.browser.write func log single string', ({ end, ok, is }) => {
const instance = pino({
browser: {
write: function (o) {
is(o.level, 30)
is(o.msg, 'test')
ok(o.time)
}
}
})
instance.info('test')
end()
})
test('opts.browser.write func string joining', ({ end, ok, is }) => {
const instance = pino({
browser: {
write: function (o) {
is(o.level, 30)
is(o.msg, 'test test2 test3')
ok(o.time)
}
}
})
instance.info('test', 'test2', 'test3')
end()
})
test('opts.browser.write func string joining when asObject is true', ({ end, ok, is }) => {
const instance = pino({
browser: {
asObject: true,
write: function (o) {
is(o.level, 30)
is(o.msg, 'test test2 test3')
ok(o.time)
}
}
})
instance.info('test', 'test2', 'test3')
end()
})
test('opts.browser.write func string object joining', ({ end, ok, is }) => {
const instance = pino({
browser: {
write: function (o) {
is(o.level, 30)
is(o.msg, 'test {"test":"test2"} {"test":"test3"}')
ok(o.time)
}
}
})
instance.info('test', { test: 'test2' }, { test: 'test3' })
end()
})
test('opts.browser.write func string object joining when asObject is true', ({ end, ok, is }) => {
const instance = pino({
browser: {
asObject: true,
write: function (o) {
is(o.level, 30)
is(o.msg, 'test {"test":"test2"} {"test":"test3"}')
ok(o.time)
}
}
})
instance.info('test', { test: 'test2' }, { test: 'test3' })
end()
})
test('opts.browser.write func string interpolation', ({ end, ok, is }) => {
const instance = pino({
browser: {
write: function (o) {
is(o.level, 30)
is(o.msg, 'test2 test ({"test":"test3"})')
ok(o.time)
}
}
})
instance.info('%s test (%j)', 'test2', { test: 'test3' })
end()
})
test('opts.browser.write func number', ({ end, ok, is }) => {
const instance = pino({
browser: {
write: function (o) {
is(o.level, 30)
is(o.msg, 1)
ok(o.time)
}
}
})
instance.info(1)
end()
})
test('opts.browser.write func log single object', ({ end, ok, is }) => {
const instance = pino({
browser: {
write: function (o) {
is(o.level, 30)
is(o.test, 'test')
ok(o.time)
}
}
})
instance.info({ test: 'test' })
end()
})
test('opts.browser.write obj writes to methods corresponding to level', ({ end, ok, is }) => {
const instance = pino({
browser: {
write: {
error: function (o) {
is(o.level, 50)
is(o.test, 'test')
ok(o.time)
}
}
}
})
instance.error({ test: 'test' })
end()
})
test('opts.browser.asObject/write supports child loggers', ({ end, ok, is }) => {
const instance = pino({
browser: {
write (o) {
is(o.level, 30)
is(o.test, 'test')
is(o.msg, 'msg-test')
ok(o.time)
}
}
})
const child = instance.child({ test: 'test' })
child.info('msg-test')
end()
})
test('opts.browser.asObject/write supports child child loggers', ({ end, ok, is }) => {
const instance = pino({
browser: {
write (o) {
is(o.level, 30)
is(o.test, 'test')
is(o.foo, 'bar')
is(o.msg, 'msg-test')
ok(o.time)
}
}
})
const child = instance.child({ test: 'test' }).child({ foo: 'bar' })
child.info('msg-test')
end()
})
test('opts.browser.asObject/write supports child child child loggers', ({ end, ok, is }) => {
const instance = pino({
browser: {
write (o) {
is(o.level, 30)
is(o.test, 'test')
is(o.foo, 'bar')
is(o.baz, 'bop')
is(o.msg, 'msg-test')
ok(o.time)
}
}
})
const child = instance.child({ test: 'test' }).child({ foo: 'bar' }).child({ baz: 'bop' })
child.info('msg-test')
end()
})
test('opts.browser.asObject defensively mitigates naughty numbers', ({ end, pass }) => {
const instance = pino({
browser: { asObject: true, write: () => {} }
})
const child = instance.child({ test: 'test' })
child._childLevel = -10
child.info('test')
  pass() // if we reached here, there was no infinite loop, so we pass
end()
})
test('opts.browser.write obj falls back to console where a method is not supplied', ({ end, ok, is }) => {
var info = console.info
console.info = (o) => {
is(o.level, 30)
is(o.msg, 'test')
ok(o.time)
console.info = info
}
const instance = require('../browser')({
browser: {
write: {
error (o) {
is(o.level, 50)
is(o.test, 'test')
ok(o.time)
}
}
}
})
instance.error({ test: 'test' })
instance.info('test')
end()
})
function levelTest (name) {
test(name + ' logs', ({ end, is }) => {
var msg = 'hello world'
sink(name, (args) => {
is(args[0], msg)
end()
})
pino({ level: name })[name](msg)
})
test('passing objects at level ' + name, ({ end, is }) => {
var msg = { hello: 'world' }
sink(name, (args) => {
is(args[0], msg)
end()
})
pino({ level: name })[name](msg)
})
test('passing an object and a string at level ' + name, ({ end, is }) => {
var a = { hello: 'world' }
var b = 'a string'
sink(name, (args) => {
is(args[0], a)
is(args[1], b)
end()
})
pino({ level: name })[name](a, b)
})
test('formatting logs as ' + name, ({ end, is }) => {
sink(name, (args) => {
is(args[0], 'hello %d')
is(args[1], 42)
end()
})
pino({ level: name })[name]('hello %d', 42)
})
test('passing error at level ' + name, ({ end, is }) => {
var err = new Error('myerror')
sink(name, (args) => {
is(args[0], err)
end()
})
pino({ level: name })[name](err)
})
test('passing error with a serializer at level ' + name, ({ end, is }) => {
  // in the browser this should have no effect (and should not crash)
var err = new Error('myerror')
sink(name, (args) => {
is(args[0].err, err)
end()
})
const instance = pino({
level: name,
serializers: {
err: pino.stdSerializers.err
}
})
instance[name]({ err: err })
})
test('child logger for level ' + name, ({ end, is }) => {
var msg = 'hello world'
var parent = { hello: 'world' }
sink(name, (args) => {
is(args[0], parent)
is(args[1], msg)
end()
})
const instance = pino({ level: name })
const child = instance.child(parent)
child[name](msg)
})
test('child-child logger for level ' + name, ({ end, is }) => {
var msg = 'hello world'
var grandParent = { hello: 'world' }
var parent = { hello: 'you' }
sink(name, (args) => {
is(args[0], grandParent)
is(args[1], parent)
is(args[2], msg)
end()
})
const instance = pino({ level: name })
const child = instance.child(grandParent).child(parent)
child[name](msg)
})
}
function consoleMethodTest (level, method) {
if (!method) method = level
test('pino().' + level + ' uses console.' + method, ({ end, is }) => {
sink(method, (args) => {
is(args[0], 'test')
end()
})
const instance = require('../browser')({ level: level })
instance[level]('test')
})
}
function absentConsoleMethodTest (method, fallback) {
test('in absence of console.' + method + ', console.' + fallback + ' is used', ({ end, is }) => {
var fn = console[method]
console[method] = undefined
sink(fallback, function (args) {
is(args[0], 'test')
end()
console[method] = fn
})
const instance = require('../browser')({ level: method })
instance[method]('test')
})
}
function isFunc (fn) { return typeof fn === 'function' }
function fnName (fn) {
var rx = /^\s*function\s*([^(]*)/i
var match = rx.exec(fn)
return match && match[1]
}
function sink (method, fn) {
if (method === 'fatal') method = 'error'
var orig = console[method]
console[method] = function () {
console[method] = orig
fn(Array.prototype.slice.call(arguments))
}
}

32
node_modules/pino/test/crlf.test.js generated vendored Normal file

@@ -0,0 +1,32 @@
'use strict'
const { test } = require('tap')
const writer = require('flush-write-stream')
const pino = require('../')
function capture () {
const ws = writer((chunk, enc, cb) => {
ws.data += chunk.toString()
cb()
})
ws.data = ''
return ws
}
test('pino uses LF by default', async ({ ok }) => {
const stream = capture()
const logger = pino(stream)
logger.info('foo')
logger.error('bar')
ok(/foo[^\r\n]+\n[^\r\n]+bar[^\r\n]+\n/.test(stream.data))
})
test('pino can log CRLF', async ({ ok }) => {
const stream = capture()
const logger = pino({
crlf: true
}, stream)
logger.info('foo')
logger.error('bar')
ok(/foo[^\n]+\r\n[^\n]+bar[^\n]+\r\n/.test(stream.data))
})

308
node_modules/pino/test/custom-levels.test.js generated vendored Normal file

@@ -0,0 +1,308 @@
'use strict'
/* eslint no-prototype-builtins: 0 */
const { test } = require('tap')
const { sink, once } = require('./helper')
const pino = require('../')
test('adds additional levels', async ({ is }) => {
const stream = sink()
const logger = pino({
customLevels: {
foo: 35,
bar: 45
}
}, stream)
logger.foo('test')
const { level } = await once(stream, 'data')
is(level, 35)
})
test('custom levels do not override default levels', async ({ is }) => {
const stream = sink()
const logger = pino({
customLevels: {
foo: 35
}
}, stream)
logger.info('test')
const { level } = await once(stream, 'data')
is(level, 30)
})
test('default levels can be redefined using custom levels', async ({ is }) => {
const stream = sink()
const logger = pino({
customLevels: {
info: 35,
debug: 45
},
useOnlyCustomLevels: true
}, stream)
is(logger.hasOwnProperty('info'), true)
logger.info('test')
const { level } = await once(stream, 'data')
is(level, 35)
})
test('custom levels override the default level label if useOnlyCustomLevels is set', async ({ is }) => {
const stream = sink()
const logger = pino({
customLevels: {
foo: 35
},
useOnlyCustomLevels: true,
level: 'foo'
}, stream)
is(logger.hasOwnProperty('info'), false)
})
test('custom levels override the default level value if useOnlyCustomLevels is set', async ({ is }) => {
const stream = sink()
const logger = pino({
customLevels: {
foo: 35
},
useOnlyCustomLevels: true,
level: 35
}, stream)
is(logger.hasOwnProperty('info'), false)
})
test('custom levels are inherited by children', async ({ is }) => {
const stream = sink()
const logger = pino({
customLevels: {
foo: 35
}
}, stream)
logger.child({ childMsg: 'ok' }).foo('test')
const { msg, childMsg, level } = await once(stream, 'data')
is(level, 35)
is(childMsg, 'ok')
is(msg, 'test')
})
test('custom levels can be specified on child bindings', async ({ is }) => {
const stream = sink()
const logger = pino(stream).child({
customLevels: {
foo: 35
},
childMsg: 'ok'
})
logger.foo('test')
const { msg, childMsg, level } = await once(stream, 'data')
is(level, 35)
is(childMsg, 'ok')
is(msg, 'test')
})
test('customLevels property in child bindings does not get logged', async ({ is }) => {
const stream = sink()
const logger = pino(stream).child({
customLevels: {
foo: 35
},
childMsg: 'ok'
})
logger.foo('test')
const { customLevels } = await once(stream, 'data')
is(customLevels, undefined)
})
test('throws when specifying pre-existing parent labels via child bindings', async ({ is, throws }) => {
const stream = sink()
throws(() => pino({
customLevels: {
foo: 35
}
}, stream).child({
customLevels: {
foo: 45
}
})
)
try {
pino({
customLevels: {
foo: 35
}
}, stream).child({
customLevels: {
foo: 45
}
})
} catch ({ message }) {
is(message, 'levels cannot be overridden')
}
})
test('throws when specifying pre-existing parent values via child bindings', async ({ is, throws }) => {
const stream = sink()
throws(() => pino({
customLevels: {
foo: 35
}
}, stream).child({
customLevels: {
bar: 35
}
})
)
try {
pino({
customLevels: {
foo: 35
}
}, stream).child({
customLevels: {
bar: 35
}
})
} catch ({ message }) {
is(message, 'pre-existing level values cannot be used for new levels')
}
})
test('throws when specifying core values via child bindings', async ({ is, throws }) => {
const stream = sink()
throws(() => pino(stream).child({
customLevels: {
foo: 30
}
})
)
try {
pino(stream).child({
customLevels: {
foo: 30
}
})
} catch ({ message }) {
is(message, 'pre-existing level values cannot be used for new levels')
}
})
test('throws when useOnlyCustomLevels is set true without customLevels', async ({ is, throws }) => {
const stream = sink()
throws(() => pino({
useOnlyCustomLevels: true
}, stream)
)
try {
pino({
useOnlyCustomLevels: true
}, stream)
} catch ({ message }) {
is(message, 'customLevels is required if useOnlyCustomLevels is set true')
}
})
test('custom level on one instance does not affect other instances', async ({ is }) => {
pino({
customLevels: {
foo: 37
}
})
is(typeof pino().foo, 'undefined')
})
test('setting level below or at custom level will successfully log', async ({ is }) => {
const stream = sink()
const instance = pino({ customLevels: { foo: 35 } }, stream)
instance.level = 'foo'
instance.info('nope')
instance.foo('bar')
const { msg } = await once(stream, 'data')
is(msg, 'bar')
})
test('custom level below level threshold will not log', async ({ is }) => {
const stream = sink()
const instance = pino({ customLevels: { foo: 15 } }, stream)
instance.level = 'info'
instance.info('bar')
instance.foo('nope')
const { msg } = await once(stream, 'data')
is(msg, 'bar')
})
test('does not share custom level state across siblings', async ({ doesNotThrow }) => {
const stream = sink()
const logger = pino(stream)
logger.child({
customLevels: { foo: 35 }
})
doesNotThrow(() => {
logger.child({
customLevels: { foo: 35 }
})
})
})
test('custom level does not affect levelKey', async ({ is }) => {
const stream = sink()
const logger = pino({
customLevels: {
foo: 35,
bar: 45
},
levelKey: 'priority'
}, stream)
logger.foo('test')
const { priority } = await once(stream, 'data')
is(priority, 35)
})
test('custom levels accessible in prettifier function', async ({ plan, same }) => {
plan(1)
const logger = pino({
prettyPrint: true,
prettifier: function prettifierFactory () {
const instance = this
return function () {
same(instance.levels, {
labels: {
10: 'trace',
20: 'debug',
30: 'info',
35: 'foo',
40: 'warn',
45: 'bar',
50: 'error',
60: 'fatal'
},
values: {
trace: 10,
debug: 20,
info: 30,
warn: 40,
error: 50,
fatal: 60,
foo: 35,
bar: 45
}
})
}
},
customLevels: {
foo: 35,
bar: 45
},
changeLevelName: 'priority'
})
logger.foo('test')
})

179
node_modules/pino/test/error.test.js generated vendored Normal file

@@ -0,0 +1,179 @@
'use strict'
/* eslint no-prototype-builtins: 0 */
const os = require('os')
const { test } = require('tap')
const { sink, once } = require('./helper')
const pino = require('../')
const { pid } = process
const hostname = os.hostname()
const level = 50
const name = 'error'
test('err is serialized with additional properties set on the Error object', async ({ ok, same }) => {
const stream = sink()
const err = Object.assign(new Error('myerror'), { foo: 'bar' })
const instance = pino(stream)
instance.level = name
instance[name](err)
const result = await once(stream, 'data')
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
type: 'Error',
msg: err.message,
stack: err.stack,
foo: err.foo,
v: 1
})
})
test('type should be retained, even if type is a property', async ({ ok, same }) => {
const stream = sink()
const err = Object.assign(new Error('myerror'), { type: 'bar' })
const instance = pino(stream)
instance.level = name
instance[name](err)
const result = await once(stream, 'data')
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
type: 'bar',
msg: err.message,
stack: err.stack,
v: 1
})
})
test('type, message and stack should be first level properties', async ({ ok, same }) => {
const stream = sink()
const err = Object.assign(new Error('foo'), { foo: 'bar' })
const instance = pino(stream)
instance.level = name
instance[name](err)
const result = await once(stream, 'data')
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
type: 'Error',
msg: err.message,
stack: err.stack,
foo: err.foo,
v: 1
})
})
test('err serializer', async ({ ok, same }) => {
const stream = sink()
const err = Object.assign(new Error('myerror'), { foo: 'bar' })
const instance = pino({
serializers: {
err: pino.stdSerializers.err
}
}, stream)
instance.level = name
instance[name]({ err })
const result = await once(stream, 'data')
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
err: {
type: 'Error',
message: err.message,
stack: err.stack,
foo: err.foo
},
v: 1
})
})
test('an error with statusCode property is not confused for a http response', async ({ ok, same }) => {
const stream = sink()
const err = Object.assign(new Error('StatusCodeErr'), { statusCode: 500 })
const instance = pino(stream)
instance.level = name
instance[name](err)
const result = await once(stream, 'data')
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: level,
type: 'Error',
msg: err.message,
stack: err.stack,
statusCode: err.statusCode,
v: 1
})
})
test('stack is omitted if it is not set on err', t => {
t.plan(2)
var err = new Error('myerror')
delete err.stack
var instance = pino(sink(function (chunk, enc, cb) {
t.ok(new Date(chunk.time) <= new Date(), 'time is less than or equal to Date.now()')
delete chunk.time
t.equal(chunk.hasOwnProperty('stack'), false)
cb()
}))
instance.level = name
instance[name](err)
})
test('stack is rendered as any other property if it\'s not a string', t => {
t.plan(3)
var err = new Error('myerror')
err.stack = null
var instance = pino(sink(function (chunk, enc, cb) {
t.ok(new Date(chunk.time) <= new Date(), 'time is less than or equal to Date.now()')
delete chunk.time
t.equal(chunk.hasOwnProperty('stack'), true)
t.equal(chunk.stack, null)
cb()
}))
instance.level = name
instance[name](err)
})
test('correctly ignores toString on errors', async ({ same }) => {
const err = new Error('myerror')
err.toString = () => undefined
const stream = sink()
const instance = pino({
test: 'this'
}, stream)
instance.fatal(err)
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 60,
type: 'Error',
msg: err.message,
stack: err.stack,
v: 1
})
})

93
node_modules/pino/test/escaping.test.js generated vendored Normal file

@@ -0,0 +1,93 @@
'use strict'
const os = require('os')
const { test } = require('tap')
const { sink, once } = require('./helper')
const pino = require('../')
const { pid } = process
const hostname = os.hostname()
function testEscape (ch, key) {
test('correctly escape ' + ch, async ({ same }) => {
const stream = sink()
const instance = pino({
name: 'hello'
}, stream)
instance.fatal('this contains ' + key)
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 60,
name: 'hello',
msg: 'this contains ' + key,
v: 1
})
})
}
testEscape('\\n', '\n')
testEscape('\\/', '/')
testEscape('\\\\', '\\')
testEscape('\\r', '\r')
testEscape('\\t', '\t')
testEscape('\\b', '\b')
const toEscape = [
'\u0000', // NUL Null character
'\u0001', // SOH Start of Heading
'\u0002', // STX Start of Text
'\u0003', // ETX End-of-text character
'\u0004', // EOT End-of-transmission character
'\u0005', // ENQ Enquiry character
'\u0006', // ACK Acknowledge character
'\u0007', // BEL Bell character
'\u0008', // BS Backspace
'\u0009', // HT Horizontal tab
'\u000A', // LF Line feed
'\u000B', // VT Vertical tab
'\u000C', // FF Form feed
'\u000D', // CR Carriage return
'\u000E', // SO Shift Out
'\u000F', // SI Shift In
'\u0010', // DLE Data Link Escape
'\u0011', // DC1 Device Control 1
'\u0012', // DC2 Device Control 2
'\u0013', // DC3 Device Control 3
'\u0014', // DC4 Device Control 4
'\u0015', // NAK Negative-acknowledge character
'\u0016', // SYN Synchronous Idle
'\u0017', // ETB End of Transmission Block
'\u0018', // CAN Cancel character
'\u0019', // EM End of Medium
'\u001A', // SUB Substitute character
'\u001B', // ESC Escape character
'\u001C', // FS File Separator
'\u001D', // GS Group Separator
'\u001E', // RS Record Separator
'\u001F' // US Unit Separator
]
toEscape.forEach((key) => {
testEscape(JSON.stringify(key), key)
})
test('correctly escape `hello \\u001F world \\n \\u0022`', async ({ same }) => {
const stream = sink()
const instance = pino({
name: 'hello'
}, stream)
instance.fatal('hello \u001F world \n \u0022')
const result = await once(stream, 'data')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 60,
name: 'hello',
msg: 'hello \u001F world \n \u0022',
v: 1
})
})

53
node_modules/pino/test/exit.test.js generated vendored Normal file

@@ -0,0 +1,53 @@
'use strict'
const { test } = require('tap')
const { join } = require('path')
const execa = require('execa')
const writer = require('flush-write-stream')
const { once } = require('./helper')
// https://github.com/pinojs/pino/issues/542
test('pino.destination log everything when calling process.exit(0)', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'destination-exit.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/hello/), null)
isNot(actual.match(/world/), null)
})
test('pino.extreme does not log everything when calling process.exit(0)', async ({ is }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'extreme-exit.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
is(actual.match(/hello/), null)
is(actual.match(/world/), null)
})
test('pino.extreme logs everything when calling flushSync', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'extreme-flush-exit.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/hello/), null)
isNot(actual.match(/world/), null)
})

125
node_modules/pino/test/extreme.test.js generated vendored Normal file

@@ -0,0 +1,125 @@
'use strict'
const os = require('os')
const { createWriteStream } = require('fs')
const { join } = require('path')
const { test } = require('tap')
const { fork } = require('child_process')
const writer = require('flush-write-stream')
const { once, getPathToNull } = require('./helper')
test('extreme mode', async ({ is, teardown }) => {
const now = Date.now
const hostname = os.hostname
const proc = process
global.process = {
__proto__: process,
pid: 123456
}
Date.now = () => 1459875739796
os.hostname = () => 'abcdefghijklmnopqr'
delete require.cache[require.resolve('../')]
const pino = require('../')
var expected = ''
var actual = ''
const normal = pino(writer((s, enc, cb) => {
expected += s
cb()
}))
const dest = createWriteStream(getPathToNull())
dest.write = (s) => {
actual += s
}
const extreme = pino(dest)
var i = 44
while (i--) {
normal.info('h')
extreme.info('h')
}
var expected2 = expected.split('\n')[0]
var actual2 = ''
const child = fork(join(__dirname, '/fixtures/extreme.js'), { silent: true })
child.stdout.pipe(writer((s, enc, cb) => {
actual2 += s
cb()
}))
await once(child, 'close')
is(actual, expected)
is(actual2.trim(), expected2)
teardown(() => {
os.hostname = hostname
Date.now = now
global.process = proc
})
})
test('extreme mode with child', async ({ is, teardown }) => {
const now = Date.now
const hostname = os.hostname
const proc = process
global.process = {
__proto__: process,
pid: 123456
}
Date.now = function () {
return 1459875739796
}
os.hostname = function () {
return 'abcdefghijklmnopqr'
}
delete require.cache[require.resolve('../')]
const pino = require('../')
var expected = ''
var actual = ''
const normal = pino(writer((s, enc, cb) => {
expected += s
cb()
})).child({ hello: 'world' })
const dest = createWriteStream(getPathToNull())
dest.write = function (s) { actual += s }
const extreme = pino(dest).child({ hello: 'world' })
var i = 500
while (i--) {
normal.info('h')
extreme.info('h')
}
extreme.flush()
var expected2 = expected.split('\n')[0]
var actual2 = ''
const child = fork(join(__dirname, '/fixtures/extreme-child.js'), { silent: true })
child.stdout.pipe(writer((s, enc, cb) => {
actual2 += s
cb()
}))
await once(child, 'close')
is(actual, expected)
is(actual2.trim(), expected2)
teardown(() => {
os.hostname = hostname
Date.now = now
global.process = proc
})
})
test('throw an error if extreme is passed', async ({ throws }) => {
const pino = require('..')
throws(() => {
pino({ extreme: true })
})
})
test('flush does nothing without extreme mode', async () => {
var instance = require('..')()
instance.flush()
})

182
node_modules/pino/test/final.test.js generated vendored Normal file

@@ -0,0 +1,182 @@
'use strict'
const pino = require('..')
const fs = require('fs')
const { test } = require('tap')
const { sleep, getPathToNull } = require('./helper')
test('replaces onTerminated option', async ({ throws }) => {
throws(() => {
pino({
onTerminated: () => {}
})
}, Error('The onTerminated option has been removed, use pino.final instead'))
})
test('throws if not supplied a logger instance', async ({ throws }) => {
throws(() => {
pino.final()
}, Error('expected a pino logger instance'))
})
test('throws if the supplied handler is not a function', async ({ throws }) => {
throws(() => {
pino.final(pino(), 'dummy')
}, Error('if supplied, the handler parameter should be a function'))
})
test('throws if not supplied logger with pino.extreme instance', async ({ throws, doesNotThrow }) => {
throws(() => {
pino.final(pino(fs.createWriteStream(getPathToNull())), () => {})
}, Error('final requires a stream that has a flushSync method, such as pino.destination and pino.extreme'))
doesNotThrow(() => {
pino.final(pino(pino.extreme()), () => {})
})
doesNotThrow(() => {
pino.final(pino(pino.destination(getPathToNull())), () => {})
})
})
test('returns an exit listener function', async ({ is }) => {
is(typeof pino.final(pino(pino.extreme()), () => {}), 'function')
})
test('listener function immediately sync flushes when fired', async ({ pass, fail }) => {
const dest = pino.extreme(getPathToNull())
var passed = false
dest.flushSync = () => {
passed = true
pass('flushSync called')
}
pino.final(pino(dest), () => {})()
await sleep(10)
if (passed === false) fail('flushSync not called')
})
test('listener function immediately sync flushes when fired (pino.destination)', async ({ pass, fail }) => {
const dest = pino.destination(getPathToNull())
var passed = false
dest.flushSync = () => {
passed = true
pass('flushSync called')
}
pino.final(pino(dest), () => {})()
await sleep(10)
if (passed === false) fail('flushSync not called')
})
test('swallows the non-ready error', async ({ doesNotThrow }) => {
const dest = pino.extreme(getPathToNull())
doesNotThrow(() => {
pino.final(pino(dest), () => {})()
})
})
test('listener function triggers handler function parameter', async ({ pass, fail }) => {
const dest = pino.extreme(getPathToNull())
var passed = false
pino.final(pino(dest), () => {
passed = true
pass('handler function triggered')
})()
await sleep(10)
if (passed === false) fail('handler function not triggered')
})
test('passes any error to the handler', async ({ is }) => {
const dest = pino.extreme(getPathToNull())
pino.final(pino(dest), (err) => {
is(err.message, 'test')
})(Error('test'))
})
test('passes a specialized final logger instance', async ({ is, isNot, error }) => {
const dest = pino.extreme(getPathToNull())
const logger = pino(dest)
pino.final(logger, (err, finalLogger) => {
error(err)
is(typeof finalLogger.trace, 'function')
is(typeof finalLogger.debug, 'function')
is(typeof finalLogger.info, 'function')
is(typeof finalLogger.warn, 'function')
is(typeof finalLogger.error, 'function')
is(typeof finalLogger.fatal, 'function')
isNot(finalLogger.trace, logger.trace)
isNot(finalLogger.debug, logger.debug)
isNot(finalLogger.info, logger.info)
isNot(finalLogger.warn, logger.warn)
isNot(finalLogger.error, logger.error)
isNot(finalLogger.fatal, logger.fatal)
is(finalLogger.child, logger.child)
is(finalLogger.levels, logger.levels)
})()
})
test('returns a specialized final logger instance if no handler is passed', async ({ is, isNot }) => {
const dest = pino.extreme(getPathToNull())
const logger = pino(dest)
const finalLogger = pino.final(logger)
is(typeof finalLogger.trace, 'function')
is(typeof finalLogger.debug, 'function')
is(typeof finalLogger.info, 'function')
is(typeof finalLogger.warn, 'function')
is(typeof finalLogger.error, 'function')
is(typeof finalLogger.fatal, 'function')
isNot(finalLogger.trace, logger.trace)
isNot(finalLogger.debug, logger.debug)
isNot(finalLogger.info, logger.info)
isNot(finalLogger.warn, logger.warn)
isNot(finalLogger.error, logger.error)
isNot(finalLogger.fatal, logger.fatal)
is(finalLogger.child, logger.child)
is(finalLogger.levels, logger.levels)
})
test('final logger instances synchronously flush after a log method call', async ({ pass, fail, error }) => {
const dest = pino.extreme(getPathToNull())
const logger = pino(dest)
var passed = false
var count = 0
dest.flushSync = () => {
count++
if (count === 2) {
passed = true
pass('flushSync called')
}
}
pino.final(logger, (err, finalLogger) => {
error(err)
finalLogger.info('hello')
})()
await sleep(10)
if (passed === false) fail('flushSync not called')
})
test('also instruments custom log methods', async ({ pass, fail, error }) => {
const dest = pino.extreme(getPathToNull())
const logger = pino({
customLevels: {
foo: 35
}
}, dest)
var passed = false
var count = 0
dest.flushSync = () => {
count++
if (count === 2) {
passed = true
pass('flushSync called')
}
}
pino.final(logger, (err, finalLogger) => {
error(err)
finalLogger.foo('hello')
})()
await sleep(10)
if (passed === false) fail('flushSync not called')
})

node_modules/pino/test/fixtures/broken-pipe/basic.js generated vendored Normal file
@@ -0,0 +1,9 @@
'use strict'
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
const pino = require('../../..')()
pino.info('hello world')

@@ -0,0 +1,10 @@
'use strict'
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
const pino = require('../../..')
const logger = pino(pino.destination())
logger.info('hello world')

node_modules/pino/test/fixtures/broken-pipe/extreme.js generated vendored Normal file
@@ -0,0 +1,12 @@
'use strict'
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
const pino = require('../../..')
const logger = pino(pino.extreme())
for (let i = 0; i < 1000; i++) {
logger.info('hello world')
}

node_modules/pino/test/fixtures/destination-exit.js generated vendored Normal file
@@ -0,0 +1,8 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../'))
var logger = pino({}, pino.destination(1))
logger.info('hello')
logger.info('world')
process.exit(0)

node_modules/pino/test/fixtures/extreme-child.js generated vendored Normal file
@@ -0,0 +1,6 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../'))
var extreme = pino(pino.extreme()).child({ hello: 'world' })
pino.final(extreme, (_, logger) => logger.info('h'))()

node_modules/pino/test/fixtures/extreme-exit.js generated vendored Normal file
@@ -0,0 +1,9 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../'))
var dest = pino.extreme(1)
var logger = pino({}, dest)
logger.info('hello')
logger.info('world')
process.exit(0)

node_modules/pino/test/fixtures/extreme-flush-exit.js generated vendored Normal file
@@ -0,0 +1,10 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../'))
var dest = pino.extreme(1)
var logger = pino({}, dest)
logger.info('hello')
logger.info('world')
dest.flushSync()
process.exit(0)

node_modules/pino/test/fixtures/extreme.js generated vendored Normal file
@@ -0,0 +1,6 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../'))
var extreme = pino(pino.extreme())
pino.final(extreme, (_, logger) => logger.info('h'))()

node_modules/pino/test/fixtures/pretty/basic.js generated vendored Normal file
@@ -0,0 +1,6 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: true })
log.info('h')

node_modules/pino/test/fixtures/pretty/child.js generated vendored Normal file
@@ -0,0 +1,8 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: true }).child({ a: 1 })
log.info('h')
log.child({ b: 2 }).info('h3')
setTimeout(() => log.info('h2'), 200)

@@ -0,0 +1,9 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({
timestamp: () => ',"custom-time-label":"test"',
prettyPrint: true
})
log.info('h')

@@ -0,0 +1,9 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({
timestamp: () => ',"time":"test"',
prettyPrint: true
})
log.info('h')

node_modules/pino/test/fixtures/pretty/dateformat.js generated vendored Normal file
@@ -0,0 +1,10 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({
prettyPrint: {
translateTime: true
}
})
log.info('h')

@@ -0,0 +1,9 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({
prettyPrint: { errorProps: 'code,errno' }
})
var err = Object.assign(new Error('kaboom'), { code: 'ENOENT', errno: 1 })
log.error(err)

node_modules/pino/test/fixtures/pretty/error.js generated vendored Normal file
@@ -0,0 +1,7 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: true })
log.error(new Error('kaboom'))
log.error(new Error('kaboom'), 'with', 'a', 'message')

@@ -0,0 +1,8 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: true })
process.once('beforeExit', pino.final(log, (_, logger) => {
logger.info('beforeExit')
}))

@@ -0,0 +1,7 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: true })
log.info('h')
pino.final(log).info('after')

node_modules/pino/test/fixtures/pretty/final.js generated vendored Normal file
@@ -0,0 +1,9 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: true })
log.info('h')
process.once('beforeExit', pino.final(log, (_, logger) => {
logger.info('beforeExit')
}))

@@ -0,0 +1,6 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: { levelFirst: true } })
log.info('h')

node_modules/pino/test/fixtures/pretty/no-time.js generated vendored Normal file
@@ -0,0 +1,9 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({
timestamp: false,
prettyPrint: true
})
log.info('h')

@@ -0,0 +1,6 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: true })
log.info({ msg: 'hello' })

@@ -0,0 +1,6 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({ prettyPrint: { levelFirst: true }, prettifier: require('pino-pretty') })
log.info('h')

node_modules/pino/test/fixtures/pretty/redact.js generated vendored Normal file
@@ -0,0 +1,9 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({
prettyPrint: true,
redact: ['foo.an']
})
log.info({ foo: { an: 'object' } }, 'h')

node_modules/pino/test/fixtures/pretty/serializers.js generated vendored Normal file
@@ -0,0 +1,17 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({
prettyPrint: true,
serializers: {
foo (obj) {
if (obj.an !== 'object') {
throw new Error('kaboom')
}
return 'bar'
}
}
})
log.info({ foo: { an: 'object' } }, 'h')

@@ -0,0 +1,13 @@
global.process = { __proto__: process, pid: 123456 }
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('./../../../'))
var log = pino({
prettyPrint: true,
prettifier: function () {
return function () {
return undefined
}
}
})
log.info('h')

@@ -0,0 +1,11 @@
global.process = { __proto__: process, pid: 123456 }
const write = process.stdout.write.bind(process.stdout)
process.stdout.write = function (chunk) {
write('hack ' + chunk)
}
Date.now = function () { return 1459875739796 }
require('os').hostname = function () { return 'abcdefghijklmnopqr' }
var pino = require(require.resolve('../../'))()
pino.info('me')

node_modules/pino/test/helper.js generated vendored Normal file
@@ -0,0 +1,55 @@
'use strict'
const os = require('os')
const writer = require('flush-write-stream')
const split = require('split2')
const pid = process.pid
const hostname = os.hostname()
const v = 1
const isWin = process.platform === 'win32'
function getPathToNull () {
return isWin ? '\\\\.\\NUL' : '/dev/null'
}
function once (emitter, name) {
return new Promise((resolve, reject) => {
if (name !== 'error') emitter.once('error', reject)
emitter.once(name, (...args) => {
emitter.removeListener('error', reject)
resolve(...args)
})
})
}
function sink (func) {
const result = split((data) => {
try {
return JSON.parse(data)
} catch (err) {
console.log(err)
console.log(data)
}
})
if (func) result.pipe(writer.obj(func))
return result
}
function check (is, chunk, level, msg) {
is(new Date(chunk.time) <= new Date(), true, 'time is less than or equal to Date.now()')
delete chunk.time
is(chunk.pid, pid)
is(chunk.hostname, hostname)
is(chunk.level, level)
is(chunk.msg, msg)
is(chunk.v, v)
}
function sleep (ms) {
return new Promise((resolve) => {
setTimeout(resolve, ms)
})
}
module.exports = { getPathToNull, sink, check, once, sleep }

node_modules/pino/test/http.test.js generated vendored Normal file
@@ -0,0 +1,247 @@
'use strict'
const http = require('http')
const os = require('os')
const semver = require('semver')
const { test } = require('tap')
const { sink, once } = require('./helper')
const pino = require('../')
const { pid } = process
const hostname = os.hostname()
test('http request support', async ({ ok, same, error, teardown }) => {
var originalReq
const instance = pino(sink((chunk, enc) => {
ok(new Date(chunk.time) <= new Date(), 'time is less than or equal to Date.now()')
delete chunk.time
same(chunk, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'my request',
v: 1,
req: {
method: originalReq.method,
url: originalReq.url,
headers: originalReq.headers,
remoteAddress: originalReq.connection.remoteAddress,
remotePort: originalReq.connection.remotePort
}
})
}))
const server = http.createServer((req, res) => {
originalReq = req
instance.info(req, 'my request')
res.end('hello')
})
server.unref()
server.listen()
const err = await once(server, 'listening')
error(err)
const res = await once(http.get('http://localhost:' + server.address().port), 'response')
res.resume()
server.close()
})
test('http request support via serializer', async ({ ok, same, error, teardown }) => {
var originalReq
const instance = pino({
serializers: {
req: pino.stdSerializers.req
}
}, sink((chunk, enc) => {
ok(new Date(chunk.time) <= new Date(), 'time is less than or equal to Date.now()')
delete chunk.time
same(chunk, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'my request',
v: 1,
req: {
method: originalReq.method,
url: originalReq.url,
headers: originalReq.headers,
remoteAddress: originalReq.connection.remoteAddress,
remotePort: originalReq.connection.remotePort
}
})
}))
const server = http.createServer(function (req, res) {
originalReq = req
instance.info({ req: req }, 'my request')
res.end('hello')
})
server.unref()
server.listen()
const err = await once(server, 'listening')
error(err)
const res = await once(http.get('http://localhost:' + server.address().port), 'response')
res.resume()
server.close()
})
test('http request support via serializer without request connection', async ({ ok, same, error, teardown }) => {
var originalReq
const instance = pino({
serializers: {
req: pino.stdSerializers.req
}
}, sink((chunk, enc) => {
ok(new Date(chunk.time) <= new Date(), 'time is less than or equal to Date.now()')
delete chunk.time
const expected = {
pid: pid,
hostname: hostname,
level: 30,
msg: 'my request',
v: 1,
req: {
method: originalReq.method,
url: originalReq.url,
headers: originalReq.headers
}
}
if (semver.gte(process.version, '13.0.0')) {
expected.req.remoteAddress = originalReq.connection.remoteAddress
expected.req.remotePort = originalReq.connection.remotePort
}
same(chunk, expected)
}))
const server = http.createServer(function (req, res) {
originalReq = req
delete req.connection
instance.info({ req: req }, 'my request')
res.end('hello')
})
server.unref()
server.listen()
const err = await once(server, 'listening')
error(err)
const res = await once(http.get('http://localhost:' + server.address().port), 'response')
res.resume()
server.close()
})
test('http response support', async ({ ok, same, error, teardown }) => {
var originalRes
const instance = pino(sink((chunk, enc) => {
ok(new Date(chunk.time) <= new Date(), 'time is less than or equal to Date.now()')
delete chunk.time
same(chunk, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'my response',
v: 1,
res: {
statusCode: originalRes.statusCode,
headers: originalRes._headers
}
})
}))
const server = http.createServer(function (req, res) {
originalRes = res
res.end('hello')
instance.info(res, 'my response')
})
server.unref()
server.listen()
const err = await once(server, 'listening')
error(err)
const res = await once(http.get('http://localhost:' + server.address().port), 'response')
res.resume()
server.close()
})
test('http response support via a serializer', async ({ ok, same, error, teardown }) => {
const instance = pino({
serializers: {
res: pino.stdSerializers.res
}
}, sink((chunk, enc) => {
ok(new Date(chunk.time) <= new Date(), 'time is less than or equal to Date.now()')
delete chunk.time
same(chunk, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'my response',
v: 1,
res: {
statusCode: 200,
headers: {
'x-single': 'y',
'x-multi': [1, 2]
}
}
})
}))
const server = http.createServer(function (req, res) {
res.setHeader('x-single', 'y')
res.setHeader('x-multi', [1, 2])
res.end('hello')
instance.info({ res: res }, 'my response')
})
server.unref()
server.listen()
const err = await once(server, 'listening')
error(err)
const res = await once(http.get('http://localhost:' + server.address().port), 'response')
res.resume()
server.close()
})
test('http request support via serializer in a child', async ({ ok, same, error, teardown }) => {
var originalReq
const instance = pino({
serializers: {
req: pino.stdSerializers.req
}
}, sink((chunk, enc) => {
ok(new Date(chunk.time) <= new Date(), 'time is less than or equal to Date.now()')
delete chunk.time
same(chunk, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'my request',
v: 1,
req: {
method: originalReq.method,
url: originalReq.url,
headers: originalReq.headers,
remoteAddress: originalReq.connection.remoteAddress,
remotePort: originalReq.connection.remotePort
}
})
}))
const server = http.createServer(function (req, res) {
originalReq = req
const child = instance.child({ req: req })
child.info('my request')
res.end('hello')
})
server.unref()
server.listen()
const err = await once(server, 'listening')
error(err)
const res = await once(http.get('http://localhost:' + server.address().port), 'response')
res.resume()
server.close()
})

node_modules/pino/test/is-level-enabled.test.js generated vendored Normal file
@@ -0,0 +1,43 @@
'use strict'
const { test } = require('tap')
const pino = require('../')
test('can check if current level enabled', async ({ is }) => {
const log = pino({ level: 'debug' })
is(true, log.isLevelEnabled('debug'))
})
test('can check if level enabled after level set', async ({ is }) => {
const log = pino()
is(false, log.isLevelEnabled('debug'))
log.level = 'debug'
is(true, log.isLevelEnabled('debug'))
})
test('can check if higher level enabled', async ({ is }) => {
const log = pino({ level: 'debug' })
is(true, log.isLevelEnabled('error'))
})
test('can check if lower level is disabled', async ({ is }) => {
const log = pino({ level: 'error' })
is(false, log.isLevelEnabled('trace'))
})
test('can check if child has current level enabled', async ({ is }) => {
const log = pino().child({ level: 'debug' })
is(true, log.isLevelEnabled('debug'))
is(true, log.isLevelEnabled('error'))
is(false, log.isLevelEnabled('trace'))
})
test('can check if custom level is enabled', async ({ is }) => {
const log = pino({
customLevels: { foo: 35 },
level: 'debug'
})
is(true, log.isLevelEnabled('foo'))
is(true, log.isLevelEnabled('error'))
is(false, log.isLevelEnabled('trace'))
})

node_modules/pino/test/levels.test.js generated vendored Normal file
@@ -0,0 +1,441 @@
'use strict'
const { test } = require('tap')
const { sink, once, check } = require('./helper')
const pino = require('../')
test('set the level by string', async ({ is }) => {
const expected = [{
level: 50,
msg: 'this is an error'
}, {
level: 60,
msg: 'this is fatal'
}]
const stream = sink()
const instance = pino(stream)
instance.level = 'error'
instance.info('hello world')
instance.error('this is an error')
instance.fatal('this is fatal')
const result = await once(stream, 'data')
const current = expected.shift()
check(is, result, current.level, current.msg)
})
test('the wrong level throws', async ({ throws }) => {
const instance = pino()
throws(() => {
instance.level = 'kaboom'
})
})
test('set the level by number', async ({ is }) => {
const expected = [{
level: 50,
msg: 'this is an error'
}, {
level: 60,
msg: 'this is fatal'
}]
const stream = sink()
const instance = pino(stream)
instance.level = 50
instance.info('hello world')
instance.error('this is an error')
instance.fatal('this is fatal')
const result = await once(stream, 'data')
const current = expected.shift()
check(is, result, current.level, current.msg)
})
test('exposes level string mappings', async ({ is }) => {
is(pino.levels.values.error, 50)
})
test('exposes level number mappings', async ({ is }) => {
is(pino.levels.labels[50], 'error')
})
test('returns level integer', async ({ is }) => {
const instance = pino({ level: 'error' })
is(instance.levelVal, 50)
})
test('child returns level integer', async ({ is }) => {
const parent = pino({ level: 'error' })
const child = parent.child({ foo: 'bar' })
is(child.levelVal, 50)
})
test('set the level via exported pino function', async ({ is }) => {
const expected = [{
level: 50,
msg: 'this is an error'
}, {
level: 60,
msg: 'this is fatal'
}]
const stream = sink()
const instance = pino({ level: 'error' }, stream)
instance.info('hello world')
instance.error('this is an error')
instance.fatal('this is fatal')
const result = await once(stream, 'data')
const current = expected.shift()
check(is, result, current.level, current.msg)
})
test('level-change event', async ({ is }) => {
const instance = pino()
function handle (lvl, val, prevLvl, prevVal) {
is(lvl, 'trace')
is(val, 10)
is(prevLvl, 'info')
is(prevVal, 30)
}
instance.on('level-change', handle)
instance.level = 'trace'
instance.removeListener('level-change', handle)
instance.level = 'info'
var count = 0
const l1 = () => count++
const l2 = () => count++
const l3 = () => count++
instance.on('level-change', l1)
instance.on('level-change', l2)
instance.on('level-change', l3)
instance.level = 'trace'
instance.removeListener('level-change', l3)
instance.level = 'fatal'
instance.removeListener('level-change', l1)
instance.level = 'debug'
instance.removeListener('level-change', l2)
instance.level = 'info'
is(count, 6)
})
test('enable', async ({ fail }) => {
const instance = pino({
level: 'trace',
enabled: false
}, sink((result, enc) => {
fail('no data should be logged')
}))
Object.keys(pino.levels.values).forEach((level) => {
instance[level]('hello world')
})
})
test('silent level', async ({ fail }) => {
const instance = pino({
level: 'silent'
}, sink((result, enc) => {
fail('no data should be logged')
}))
Object.keys(pino.levels.values).forEach((level) => {
instance[level]('hello world')
})
})
test('set silent via Infinity', async ({ fail }) => {
const instance = pino({
level: Infinity
}, sink((result, enc) => {
fail('no data should be logged')
}))
Object.keys(pino.levels.values).forEach((level) => {
instance[level]('hello world')
})
})
test('exposed levels', async ({ same }) => {
same(Object.keys(pino.levels.values), [
'trace',
'debug',
'info',
'warn',
'error',
'fatal'
])
})
test('exposed labels', async ({ same }) => {
same(Object.keys(pino.levels.labels), [
'10',
'20',
'30',
'40',
'50',
'60'
])
})
test('setting level in child', async ({ is }) => {
const expected = [{
level: 50,
msg: 'this is an error'
}, {
level: 60,
msg: 'this is fatal'
}]
const instance = pino(sink((result, enc, cb) => {
const current = expected.shift()
check(is, result, current.level, current.msg)
cb()
})).child({ level: 30 })
instance.level = 'error'
instance.info('hello world')
instance.error('this is an error')
instance.fatal('this is fatal')
})
test('setting level by assigning a number to level', async ({ is }) => {
const instance = pino()
is(instance.levelVal, 30)
is(instance.level, 'info')
instance.level = 50
is(instance.levelVal, 50)
is(instance.level, 'error')
})
test('setting level by number to unknown value results in a throw', async ({ throws }) => {
const instance = pino()
throws(() => { instance.level = 973 })
})
test('setting level by assigning a known label to level', async ({ is }) => {
const instance = pino()
is(instance.levelVal, 30)
is(instance.level, 'info')
instance.level = 'error'
is(instance.levelVal, 50)
is(instance.level, 'error')
})
test('levelVal is read only', async ({ throws }) => {
const instance = pino()
throws(() => { instance.levelVal = 20 })
})
test('produces labels when told to', async ({ is }) => {
const expected = [{
level: 'info',
msg: 'hello world'
}]
const instance = pino({ useLevelLabels: true }, sink((result, enc, cb) => {
const current = expected.shift()
check(is, result, current.level, current.msg)
cb()
}))
instance.info('hello world')
})
test('resets levels from labels to numbers', async ({ is }) => {
const expected = [{
level: 30,
msg: 'hello world'
}]
pino({ useLevelLabels: true })
const instance = pino({ useLevelLabels: false }, sink((result, enc, cb) => {
const current = expected.shift()
check(is, result, current.level, current.msg)
cb()
}))
instance.info('hello world')
})
test('aliases changeLevelName to levelKey', async ({ is }) => {
const instance = pino({ changeLevelName: 'priority' }, sink((result, enc, cb) => {
is(result.priority, 30)
cb()
}))
instance.info('hello world')
})
test('changes label naming when told to', async ({ is }) => {
const expected = [{
priority: 30,
msg: 'hello world'
}]
const instance = pino({ levelKey: 'priority' }, sink((result, enc, cb) => {
const current = expected.shift()
is(result.priority, current.priority)
is(result.msg, current.msg)
cb()
}))
instance.info('hello world')
})
test('children produce labels when told to', async ({ is }) => {
const expected = [
{
level: 'info',
msg: 'child 1'
},
{
level: 'info',
msg: 'child 2'
}
]
const instance = pino({ useLevelLabels: true }, sink((result, enc, cb) => {
const current = expected.shift()
check(is, result, current.level, current.msg)
cb()
}))
const child1 = instance.child({ name: 'child1' })
const child2 = child1.child({ name: 'child2' })
child1.info('child 1')
child2.info('child 2')
})
test('produces labels for custom levels', async ({ is }) => {
const expected = [
{
level: 'info',
msg: 'hello world'
},
{
level: 'foo',
msg: 'foobar'
}
]
const opts = {
useLevelLabels: true,
customLevels: {
foo: 35
}
}
const instance = pino(opts, sink((result, enc, cb) => {
const current = expected.shift()
check(is, result, current.level, current.msg)
cb()
}))
instance.info('hello world')
instance.foo('foobar')
})
test('setting levelKey does not affect labels when told to', async ({ is }) => {
const instance = pino(
{
useLevelLabels: true,
levelKey: 'priority'
},
sink((result, enc, cb) => {
is(result.priority, 'info')
cb()
})
)
instance.info('hello world')
})
test('throws when creating a default label that does not exist in logger levels', async ({ is, throws }) => {
const defaultLevel = 'foo'
throws(() => {
pino({
customLevels: {
bar: 5
},
level: defaultLevel
})
})
try {
pino({
level: defaultLevel
})
} catch ({ message }) {
is(message, `default level:${defaultLevel} must be included in custom levels`)
}
})
test('throws when creating a default value that does not exist in logger levels', async ({ is, throws }) => {
const defaultLevel = 15
throws(() => {
pino({
customLevels: {
bar: 5
},
level: defaultLevel
})
})
try {
pino({
level: defaultLevel
})
} catch ({ message }) {
is(message, `default level:${defaultLevel} must be included in custom levels`)
}
})
test('throws when creating a default value that does not exist in logger levels', async ({ is, throws }) => {
throws(() => {
pino({
customLevels: {
foo: 5
},
useOnlyCustomLevels: true
})
})
try {
pino({
customLevels: {
foo: 5
},
useOnlyCustomLevels: true
})
} catch ({ message }) {
is(message, 'default level:info must be included in custom levels')
}
})
test('passes when creating a default value that exists in logger levels', async ({ is, throws }) => {
pino({
level: 30
})
})
test('fatal method sync-flushes the destination if sync flushing is available', async ({ pass, doesNotThrow, plan }) => {
plan(2)
const stream = sink()
stream.flushSync = () => {
pass('destination flushed')
}
const instance = pino(stream)
instance.fatal('this is fatal')
await once(stream, 'data')
doesNotThrow(() => {
stream.flushSync = undefined
instance.fatal('this is fatal')
})
})
test('fatal method should call async when sync-flushing fails', ({ equal, fail, doesNotThrow, plan }) => {
plan(2)
const messages = [
'this is fatal 1'
]
const stream = sink((result) => equal(result.msg, messages.shift()))
stream.flushSync = () => { throw new Error('Error') }
stream.flush = () => fail('flush should not be called')
const instance = pino(stream)
doesNotThrow(() => instance.fatal(messages[0]))
})

node_modules/pino/test/metadata.test.js generated vendored Normal file
@@ -0,0 +1,110 @@
'use strict'
const os = require('os')
const { test } = require('tap')
const pino = require('../')
const { pid } = process
const hostname = os.hostname()
test('metadata works', async ({ ok, same, is }) => {
const now = Date.now()
const instance = pino({}, {
[Symbol.for('pino.metadata')]: true,
write (chunk) {
is(instance, this.lastLogger)
is(30, this.lastLevel)
is('a msg', this.lastMsg)
ok(Number(this.lastTime) >= now)
same(this.lastObj, { hello: 'world', msg: 'a msg' })
const result = JSON.parse(chunk)
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
hello: 'world',
msg: 'a msg',
v: 1
})
}
})
instance.info({ hello: 'world' }, 'a msg')
})
test('child loggers works', async ({ ok, same, is }) => {
const instance = pino({}, {
[Symbol.for('pino.metadata')]: true,
write (chunk) {
is(child, this.lastLogger)
is(30, this.lastLevel)
is('a msg', this.lastMsg)
same(this.lastObj, { from: 'child', msg: 'a msg' })
const result = JSON.parse(chunk)
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
hello: 'world',
from: 'child',
msg: 'a msg',
v: 1
})
}
})
const child = instance.child({ hello: 'world' })
child.info({ from: 'child' }, 'a msg')
})
test('without object', async ({ ok, same, is }) => {
const instance = pino({}, {
[Symbol.for('pino.metadata')]: true,
write (chunk) {
is(instance, this.lastLogger)
is(30, this.lastLevel)
is('a msg', this.lastMsg)
same({ msg: 'a msg' }, this.lastObj)
const result = JSON.parse(chunk)
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
msg: 'a msg',
v: 1
})
}
})
instance.info('a msg')
})
test('without msg', async ({ ok, same, is }) => {
const instance = pino({}, {
[Symbol.for('pino.metadata')]: true,
write (chunk) {
is(instance, this.lastLogger)
is(30, this.lastLevel)
is(undefined, this.lastMsg)
same({ hello: 'world' }, this.lastObj)
const result = JSON.parse(chunk)
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid: pid,
hostname: hostname,
level: 30,
hello: 'world',
v: 1
})
}
})
instance.info({ hello: 'world' })
})

106 node_modules/pino/test/mixin.test.js generated vendored Normal file

@@ -0,0 +1,106 @@
'use strict'
const os = require('os')
const { test } = require('tap')
const { sink, once } = require('./helper')
const pino = require('../')
const { pid } = process
const hostname = os.hostname()
const level = 50
const name = 'error'
test('mixin object is included', async ({ ok, same }) => {
let n = 0
const stream = sink()
const instance = pino({
mixin () {
return { hello: ++n }
}
}, stream)
instance.level = name
instance[name]('test')
const result = await once(stream, 'data')
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid,
hostname,
level,
msg: 'test',
hello: 1,
v: 1
})
})
test('mixin object is new every time', async ({ plan, ok, same }) => {
plan(6)
let n = 0
const stream = sink()
const instance = pino({
mixin () {
return { hello: n }
}
}, stream)
instance.level = name
while (++n < 4) {
const msg = `test #${n}`
stream.pause()
instance[name](msg)
stream.resume()
const result = await once(stream, 'data')
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid,
hostname,
level,
msg,
hello: n,
v: 1
})
}
})
test('mixin object is not called if below log level', async ({ ok }) => {
const stream = sink()
const instance = pino({
mixin () {
ok(false, 'should not call mixin function')
}
}, stream)
instance.level = 'error'
instance.info('test')
})
test('mixin object + logged object', async ({ ok, same }) => {
const stream = sink()
const instance = pino({
mixin () {
return { foo: 1, bar: 2 }
}
}, stream)
instance.level = name
instance[name]({ bar: 3, baz: 4 })
const result = await once(stream, 'data')
ok(new Date(result.time) <= new Date(), 'time is less than or equal to Date.now()')
delete result.time
same(result, {
pid,
hostname,
level,
foo: 1,
bar: 3,
baz: 4,
v: 1
})
})
test('mixin not a function', async ({ throws }) => {
const stream = sink()
throws(function () {
pino({ mixin: 'not a function' }, stream)
})
})

312 node_modules/pino/test/pretty.test.js generated vendored Normal file

@@ -0,0 +1,312 @@
'use strict'
const { Writable } = require('stream')
const { test } = require('tap')
const { join } = require('path')
const execa = require('execa')
const writer = require('flush-write-stream')
const { once } = require('./helper')
const pino = require('../')
const tap = require('tap')
const isWin = process.platform === 'win32'
if (isWin) {
tap.comment('Skipping pretty printing tests on Windows as colour codes are different and tests fail')
process.exit(0)
}
test('can be enabled via exported pino function', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'basic.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): h/), null)
})
test('can be enabled via exported pino function with pretty configuration', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'level-first.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/^INFO.*h/), null)
})
test('can be enabled via exported pino function with prettifier', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'pretty-factory.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/^INFO.*h/), null)
})
test('does not throw error when enabled with stream specified', async ({ doesNotThrow }) => {
doesNotThrow(() => pino({ prettyPrint: true }, process.stdout))
})
test('throws when prettyPrint is true but pino-pretty module is not installed', async ({ throws, is }) => {
// pino pretty *is* installed, and probably also cached, so rather than
// messing with the filesystem the simplest way to generate a not found
// error is to simulate it:
const prettyFactory = require('pino-pretty')
require.cache[require.resolve('pino-pretty')].exports = () => {
throw Error('Cannot find module \'pino-pretty\'')
}
throws(() => pino({ prettyPrint: true }))
try { pino({ prettyPrint: true }) } catch ({ message }) {
is(message, 'Missing `pino-pretty` module: `pino-pretty` must be installed separately')
}
require.cache[require.resolve('pino-pretty')].exports = prettyFactory
})
test('can send pretty print to custom stream', async ({ is }) => {
const dest = new Writable({
objectMode: true,
write (formatted, enc) {
is(/^INFO.*foo\n$/.test(formatted), true)
}
})
const log = pino({
prettifier: require('pino-pretty'),
prettyPrint: {
levelFirst: true,
colorize: false
}
}, dest)
log.info('foo')
})
test('ignores `undefined` from prettifier', async ({ is }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'skipped-output.js')])
child.stdout.pipe(writer((s, enc) => {
actual += s
}))
await once(child, 'close')
is(actual, '')
})
test('parses and outputs chindings', async ({ is, isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'child.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): h/), null)
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): h2/), null)
isNot(actual.match(/a: 1/), null)
isNot(actual.match(/b: 2/), null)
is(actual.match(/a: 1/g).length, 3)
})
test('applies serializers', async ({ is, isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'serializers.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): h/), null)
isNot(actual.match(/foo: "bar"/), null)
})
test('applies redaction rules', async ({ is, isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'redact.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): h/), null)
isNot(actual.match(/\[Redacted\]/), null)
is(actual.match(/object/), null)
})
test('dateformat', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'dateformat.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): h/), null)
})
test('without timestamp', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'no-time.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.slice(2), '[]')
})
test('with custom timestamp', async ({ is }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'custom-time.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
is(actual.slice(0, 8), '["test"]')
})
test('with custom timestamp label', async ({ is }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'custom-time-label.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
is(actual.slice(0, 8), '["test"]')
})
test('errors', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'error.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): kaboom/), null)
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): with a message/), null)
isNot(actual.match(/.*error\.js.*/), null)
})
test('errors with props', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'error-props.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): kaboom/), null)
isNot(actual.match(/code: ENOENT/), null)
isNot(actual.match(/errno: 1/), null)
isNot(actual.match(/.*error-props\.js.*/), null)
})
test('final works with pretty', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'final.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/WARN\s+\(123456 on abcdefghijklmnopqr\): pino.final with prettyPrint does not support flushing/), null)
isNot(actual.match(/INFO\s+\(123456 on abcdefghijklmnopqr\): beforeExit/), null)
})
test('final works when returning a logger', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'final-return.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/WARN\s+\(123456 on abcdefghijklmnopqr\): pino.final with prettyPrint does not support flushing/), null)
isNot(actual.match(/INFO\s+\(123456 on abcdefghijklmnopqr\): after/), null)
})
test('final works without prior logging', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'final-no-log-before.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/WARN\s+: pino.final with prettyPrint does not support flushing/), null)
isNot(actual.match(/INFO\s+\(123456 on abcdefghijklmnopqr\): beforeExit/), null)
})
test('works as expected with an object with the msg prop', async ({ isNot }) => {
var actual = ''
const child = execa(process.argv[0], [join(__dirname, 'fixtures', 'pretty', 'obj-msg-prop.js')])
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/\(123456 on abcdefghijklmnopqr\): hello/), null)
})
test('should not lose stream metadata for streams with `needsMetadataGsym` flag', async ({ isNot }) => {
const dest = new Writable({
objectMode: true,
write () {
isNot(typeof this.lastLevel === 'undefined', true)
isNot(typeof this.lastMsg === 'undefined', true)
isNot(typeof this.lastObj === 'undefined', true)
isNot(typeof this.lastTime === 'undefined', true)
isNot(typeof this.lastLogger === 'undefined', true)
}
})
dest[pino.symbols.needsMetadataGsym] = true
const log = pino({
prettyPrint: true
}, dest)
log.info('foo')
})
test('should not add stream metadata for streams without `needsMetadataGsym` flag', async ({ is }) => {
const dest = new Writable({
objectMode: true,
write () {
is(typeof this.lastLevel === 'undefined', true)
is(typeof this.lastMsg === 'undefined', true)
is(typeof this.lastObj === 'undefined', true)
is(typeof this.lastTime === 'undefined', true)
is(typeof this.lastLogger === 'undefined', true)
}
})
const log = pino({
prettyPrint: true
}, dest)
log.info('foo')
})

713 node_modules/pino/test/redact.test.js generated vendored Normal file

@@ -0,0 +1,713 @@
'use strict'
const { test } = require('tap')
const { sink, once } = require('./helper')
const pino = require('../')
test('redact option throws if not array', async ({ throws }) => {
throws(() => {
pino({ redact: 'req.headers.cookie' })
})
})
test('redact option throws if array does not only contain strings', async ({ throws }) => {
throws(() => {
pino({ redact: ['req.headers.cookie', {}] })
})
})
test('redact option throws if array contains an invalid path', async ({ throws }) => {
throws(() => {
pino({ redact: ['req,headers.cookie'] })
})
})
test('redact.paths option throws if not array', async ({ throws }) => {
throws(() => {
pino({ redact: { paths: 'req.headers.cookie' } })
})
})
test('redact.paths option throws if array does not only contain strings', async ({ throws }) => {
throws(() => {
pino({ redact: { paths: ['req.headers.cookie', {}] } })
})
})
test('redact.paths option throws if array contains an invalid path', async ({ throws }) => {
throws(() => {
pino({ redact: { paths: ['req,headers.cookie'] } })
})
})
test('redact option top level key', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['key'] }, stream)
instance.info({
key: { redact: 'me' }
})
const { key } = await once(stream, 'data')
is(key, '[Redacted]')
})
test('redact option top level key next level key', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['key', 'key.foo'] }, stream)
instance.info({
key: { redact: 'me' }
})
const { key } = await once(stream, 'data')
is(key, '[Redacted]')
})
test('redact option next level key then top level key', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['key.foo', 'key'] }, stream)
instance.info({
key: { redact: 'me' }
})
const { key } = await once(stream, 'data')
is(key, '[Redacted]')
})
test('redact option object', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['req.headers.cookie'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
})
test('redact option child object', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['req.headers.cookie'] }, stream)
instance.child({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
}).info('message completed')
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
})
test('redact option interpolated object', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['req.headers.cookie'] }, stream)
instance.info('test', {
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { msg } = await once(stream, 'data')
is(JSON.parse(msg.replace(/test /, '')).req.headers.cookie, '[Redacted]')
})
test('redact.paths option object', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['req.headers.cookie'] } }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
})
test('redact.paths option child object', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['req.headers.cookie'] } }, stream)
instance.child({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
}).info('message completed')
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
})
test('redact.paths option interpolated object', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['req.headers.cookie'] } }, stream)
instance.info('test', {
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { msg } = await once(stream, 'data')
is(JSON.parse(msg.replace(/test /, '')).req.headers.cookie, '[Redacted]')
})
test('redact.censor option sets the redact value', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['req.headers.cookie'], censor: 'test' } }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers.cookie, 'test')
})
test('redact.remove option removes both key and value', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['req.headers.cookie'], remove: true } }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is('cookie' in req.headers, false)
})
test('redact.remove top level key - object value', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['key'], remove: true } }, stream)
instance.info({
key: { redact: 'me' }
})
const o = await once(stream, 'data')
is('key' in o, false)
})
test('redact.remove top level key - number value', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['key'], remove: true } }, stream)
instance.info({
key: 1
})
const o = await once(stream, 'data')
is('key' in o, false)
})
test('redact.remove top level key - boolean value', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['key'], remove: true } }, stream)
instance.info({
key: false
})
const o = await once(stream, 'data')
is('key' in o, false)
})
test('redact.remove top level key in child logger', async ({ is }) => {
const stream = sink()
const opts = { redact: { paths: ['key'], remove: true } }
const instance = pino(opts, stream).child({ key: { redact: 'me' } })
instance.info('test')
const o = await once(stream, 'data')
is('key' in o, false)
})
test('redact.paths preserves original object values after the log write', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['req.headers.cookie'] }, stream)
const obj = {
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.req.headers.cookie, '[Redacted]')
is(obj.req.headers.cookie, 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;')
})
test('redact.paths preserves original object values after the log write', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['req.headers.cookie'] } }, stream)
const obj = {
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.req.headers.cookie, '[Redacted]')
is(obj.req.headers.cookie, 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;')
})
test('redact.censor preserves original object values after the log write', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['req.headers.cookie'], censor: 'test' } }, stream)
const obj = {
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.req.headers.cookie, 'test')
is(obj.req.headers.cookie, 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;')
})
test('redact.remove preserves original object values after the log write', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: { paths: ['req.headers.cookie'], remove: true } }, stream)
const obj = {
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
}
instance.info(obj)
const o = await once(stream, 'data')
is('cookie' in o.req.headers, false)
is('cookie' in obj.req.headers, true)
})
test('redact supports last position wildcard paths', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['req.headers.*'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
is(req.headers.host, '[Redacted]')
is(req.headers.connection, '[Redacted]')
})
test('redact supports first position wildcard paths', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['*.headers'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers, '[Redacted]')
})
test('redact supports first position wildcards before other paths', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['*.headers.cookie', 'req.id'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
is(req.id, '[Redacted]')
})
test('redact supports first position wildcards after other paths', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['req.id', '*.headers.cookie'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
is(req.id, '[Redacted]')
})
test('redact supports first position wildcards after top level keys', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['key', '*.headers.cookie'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
})
test('redact supports top level wildcard', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['*'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req, '[Redacted]')
})
test('redact supports top level wildcard with a censor function', async ({ is }) => {
const stream = sink()
const instance = pino({
redact: {
paths: ['*'],
censor: () => '[Redacted]'
}
}, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req, '[Redacted]')
})
test('redact supports top level wildcard and leading wildcard', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['*', '*.req'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req, '[Redacted]')
})
test('redact supports intermediate wildcard paths', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['req.*.cookie'] }, stream)
instance.info({
req: {
id: 7915,
method: 'GET',
url: '/',
headers: {
host: 'localhost:3000',
connection: 'keep-alive',
cookie: 'SESSID=298zf09hf012fh2; csrftoken=u32t4o3tb3gg43; _gat=1;'
},
remoteAddress: '::ffff:127.0.0.1',
remotePort: 58022
}
})
const { req } = await once(stream, 'data')
is(req.headers.cookie, '[Redacted]')
})
test('redacts numbers at the top level', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['id'] }, stream)
const obj = {
id: 7915
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.id, '[Redacted]')
})
test('redacts booleans at the top level', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['maybe'] }, stream)
const obj = {
maybe: true
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.maybe, '[Redacted]')
})
test('redacts strings at the top level', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['s'] }, stream)
const obj = {
s: 's'
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.s, '[Redacted]')
})
test('does not redact primitives if not objects', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['a.b'] }, stream)
const obj = {
a: 42
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.a, 42)
})
test('redacts null at the top level', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['n'] }, stream)
const obj = {
n: null
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.n, '[Redacted]')
})
test('supports bracket notation', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['a["b.b"]'] }, stream)
const obj = {
a: { 'b.b': 'c' }
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.a['b.b'], '[Redacted]')
})
test('supports bracket notation with further nesting', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['a["b.b"].c'] }, stream)
const obj = {
a: { 'b.b': { c: 'd' } }
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.a['b.b'].c, '[Redacted]')
})
test('supports bracket notation with empty string as path segment', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['a[""].c'] }, stream)
const obj = {
a: { '': { c: 'd' } }
}
instance.info(obj)
const o = await once(stream, 'data')
is(o.a[''].c, '[Redacted]')
})
test('supports leading bracket notation (single quote)', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['[\'a.a\'].b'] }, stream)
const obj = {
'a.a': { b: 'c' }
}
instance.info(obj)
const o = await once(stream, 'data')
is(o['a.a'].b, '[Redacted]')
})
test('supports leading bracket notation (double quote)', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['["a.a"].b'] }, stream)
const obj = {
'a.a': { b: 'c' }
}
instance.info(obj)
const o = await once(stream, 'data')
is(o['a.a'].b, '[Redacted]')
})
test('supports leading bracket notation (backtick quote)', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['[`a.a`].b'] }, stream)
const obj = {
'a.a': { b: 'c' }
}
instance.info(obj)
const o = await once(stream, 'data')
is(o['a.a'].b, '[Redacted]')
})
test('supports leading bracket notation (single-segment path)', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['[`a.a`]'] }, stream)
const obj = {
'a.a': { b: 'c' }
}
instance.info(obj)
const o = await once(stream, 'data')
is(o['a.a'], '[Redacted]')
})
test('supports leading bracket notation (single-segment path, wildcard)', async ({ is }) => {
const stream = sink()
const instance = pino({ redact: ['[*]'] }, stream)
const obj = {
'a.a': { b: 'c' }
}
instance.info(obj)
const o = await once(stream, 'data')
is(o['a.a'], '[Redacted]')
})

245 node_modules/pino/test/serializers.test.js generated vendored Normal file

@@ -0,0 +1,245 @@
'use strict'
const { test } = require('tap')
const { sink, once } = require('./helper')
const pino = require('../')
const parentSerializers = {
test: () => 'parent'
}
const childSerializers = {
test: () => 'child'
}
test('default err namespace error serializer', async ({ is }) => {
const stream = sink()
const parent = pino(stream)
parent.info({ err: ReferenceError('test') })
const o = await once(stream, 'data')
is(typeof o.err, 'object')
is(o.err.type, 'ReferenceError')
is(o.err.message, 'test')
is(typeof o.err.stack, 'string')
})
test('custom serializer overrides default err namespace error serializer', async ({ is }) => {
const stream = sink()
const parent = pino({
serializers: {
err: (e) => ({
t: e.constructor.name,
m: e.message,
s: e.stack
})
}
}, stream)
parent.info({ err: ReferenceError('test') })
const o = await once(stream, 'data')
is(typeof o.err, 'object')
is(o.err.t, 'ReferenceError')
is(o.err.m, 'test')
is(typeof o.err.s, 'string')
})
test('null overrides default err namespace error serializer', async ({ is }) => {
const stream = sink()
const parent = pino({ serializers: { err: null } }, stream)
parent.info({ err: ReferenceError('test') })
const o = await once(stream, 'data')
is(typeof o.err, 'object')
is(typeof o.err.type, 'undefined')
is(typeof o.err.message, 'undefined')
is(typeof o.err.stack, 'undefined')
})
test('undefined overrides default err namespace error serializer', async ({ is }) => {
const stream = sink()
const parent = pino({ serializers: { err: undefined } }, stream)
parent.info({ err: ReferenceError('test') })
const o = await once(stream, 'data')
is(typeof o.err, 'object')
is(typeof o.err.type, 'undefined')
is(typeof o.err.message, 'undefined')
is(typeof o.err.stack, 'undefined')
})
test('serializers override values', async ({ is }) => {
const stream = sink()
const parent = pino({ serializers: parentSerializers }, stream)
parent.child({ serializers: childSerializers })
parent.fatal({ test: 'test' })
const o = await once(stream, 'data')
is(o.test, 'parent')
})
test('child does not overwrite parent serializers', async ({ is }) => {
const stream = sink()
const parent = pino({ serializers: parentSerializers }, stream)
const child = parent.child({ serializers: childSerializers })
parent.fatal({ test: 'test' })
const o = once(stream, 'data')
is((await o).test, 'parent')
const o2 = once(stream, 'data')
child.fatal({ test: 'test' })
is((await o2).test, 'child')
})
test('Symbol.for(\'pino.serializers\')', async ({ is, isNot }) => {
const stream = sink()
const parent = pino({ serializers: parentSerializers }, stream)
const child = parent.child({ a: 'property' })
is(parent[Symbol.for('pino.serializers')], parentSerializers)
is(child[Symbol.for('pino.serializers')], parentSerializers)
const child2 = parent.child({
serializers: {
a
}
})
function a () {
return 'hello'
}
isNot(child2[Symbol.for('pino.serializers')], parentSerializers)
is(child2[Symbol.for('pino.serializers')].a, a)
is(child2[Symbol.for('pino.serializers')].test, parentSerializers.test)
})
test('children inherit parent serializers', async ({ is }) => {
const stream = sink()
const parent = pino({ serializers: parentSerializers }, stream)
const child = parent.child({ a: 'property' })
child.fatal({ test: 'test' })
const o = await once(stream, 'data')
is(o.test, 'parent')
})
test('children inherit parent Symbol serializers', async ({ is, isNot }) => {
const stream = sink()
const symbolSerializers = {
[Symbol.for('pino.*')]: parentSerializers.test
}
const parent = pino({ serializers: symbolSerializers }, stream)
is(parent[Symbol.for('pino.serializers')], symbolSerializers)
const child = parent.child({
serializers: {
[Symbol.for('a')]: a,
a
}
})
function a () {
return 'hello'
}
isNot(child[Symbol.for('pino.serializers')], symbolSerializers)
is(child[Symbol.for('pino.serializers')].a, a)
is(child[Symbol.for('pino.serializers')][Symbol.for('a')], a)
is(child[Symbol.for('pino.serializers')][Symbol.for('pino.*')], parentSerializers.test)
})
test('children serializers get called', async ({ is }) => {
const stream = sink()
const parent = pino({
test: 'this'
}, stream)
const child = parent.child({ a: 'property', serializers: childSerializers })
child.fatal({ test: 'test' })
const o = await once(stream, 'data')
is(o.test, 'child')
})
test('children serializers get called when inherited from parent', async ({ is }) => {
const stream = sink()
const parent = pino({
test: 'this',
serializers: parentSerializers
}, stream)
const child = parent.child({ serializers: { test: function () { return 'pass' } } })
child.fatal({ test: 'fail' })
const o = await once(stream, 'data')
is(o.test, 'pass')
})
test('non-overridden serializers are available in the children', async ({ is }) => {
const stream = sink()
const pSerializers = {
onlyParent: function () { return 'parent' },
shared: function () { return 'parent' }
}
const cSerializers = {
shared: function () { return 'child' },
onlyChild: function () { return 'child' }
}
const parent = pino({ serializers: pSerializers }, stream)
const child = parent.child({ serializers: cSerializers })
const o = once(stream, 'data')
child.fatal({ shared: 'test' })
is((await o).shared, 'child')
const o2 = once(stream, 'data')
child.fatal({ onlyParent: 'test' })
is((await o2).onlyParent, 'parent')
const o3 = once(stream, 'data')
child.fatal({ onlyChild: 'test' })
is((await o3).onlyChild, 'child')
const o4 = once(stream, 'data')
parent.fatal({ onlyChild: 'test' })
is((await o4).onlyChild, 'test')
})
test('Symbol.for(\'pino.*\') serializer', async ({ notSame, is, isNot }) => {
const stream = sink()
const globalSerializer = {
[Symbol.for('pino.*')]: function (obj) {
if (obj.lionel === 'richie') {
return { hello: 'is', it: 'me', you: 'are', looking: 'for' }
}
return { lionel: 'richie' }
}
}
const logger = pino({ serializers: globalSerializer }, stream)
const o = once(stream, 'data')
logger.info({ hello: 'is', it: 'me', you: 'are', looking: 'for' })
is((await o).lionel, 'richie')
isNot((await o).hello, 'is')
isNot((await o).it, 'me')
isNot((await o).you, 'are')
isNot((await o).looking, 'for')
const o2 = once(stream, 'data')
logger.info({ lionel: 'richie' })
is((await o2).lionel, 'richie')
is((await o2).hello, 'is')
is((await o2).it, 'me')
is((await o2).you, 'are')
is((await o2).looking, 'for')
const o3 = once(stream, 'data')
logger.info('message')
is((await o3).lionel, 'richie')
is('pid' in (await o3), false)
is('hostname' in (await o3), false)
notSame(await o3, ['pid', 'hostname'])
})
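The inheritance behavior these serializer tests exercise can be sketched as a non-mutating merge of the child's serializers over the parent's. This is a simplification for illustration, not pino's internal implementation, and it ignores the `Symbol.for('pino.*')` catch-all covered above:

```javascript
// Sketch: child serializers shadow same-named parent serializers, parent
// entries without a child override stay visible, and the parent's own set
// is left untouched.
const parentSerializers = {
  onlyParent: () => 'parent',
  shared: () => 'parent'
}
const childSerializers = {
  shared: () => 'child',
  onlyChild: () => 'child'
}
const merged = Object.assign({}, parentSerializers, childSerializers)

merged.shared()            // 'child'  (child wins on collision)
merged.onlyParent()        // 'parent' (inherited from the parent)
merged.onlyChild()         // 'child'
parentSerializers.shared() // still 'parent' (parent set not mutated)
```

This matches the tests above: the child logger sees `shared` as `'child'` while the parent keeps logging `'parent'`.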

19
node_modules/pino/test/stdout-protection.test.js generated vendored Normal file

@@ -0,0 +1,19 @@
'use strict'
const { test } = require('tap')
const { join } = require('path')
const { fork } = require('child_process')
const { once } = require('./helper')
const writer = require('flush-write-stream')
test('do not use SonicBoom if someone tampered with process.stdout.write', async ({ isNot }) => {
var actual = ''
const child = fork(join(__dirname, 'fixtures', 'stdout-hack-protection.js'), { silent: true })
child.stdout.pipe(writer((s, enc, cb) => {
actual += s
cb()
}))
await once(child, 'close')
isNot(actual.match(/^hack/), null)
})

121
node_modules/pino/test/timestamp.test.js generated vendored Normal file

@@ -0,0 +1,121 @@
'use strict'
/* eslint no-prototype-builtins: 0 */
const { test } = require('tap')
const { sink, once } = require('./helper')
const pino = require('../')
test('pino exposes standard time functions', async ({ ok }) => {
ok(pino.stdTimeFunctions)
ok(pino.stdTimeFunctions.epochTime)
ok(pino.stdTimeFunctions.unixTime)
ok(pino.stdTimeFunctions.nullTime)
ok(pino.stdTimeFunctions.isoTime)
})
test('pino accepts external time functions', async ({ is }) => {
const opts = {
timestamp: () => ',"time":"none"'
}
const stream = sink()
const instance = pino(opts, stream)
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('time'), true)
is(result.time, 'none')
})
test('pino accepts external time functions with custom label', async ({ is }) => {
const opts = {
timestamp: () => ',"custom-time-label":"none"'
}
const stream = sink()
const instance = pino(opts, stream)
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('custom-time-label'), true)
is(result['custom-time-label'], 'none')
})
test('inserts timestamp by default', async ({ ok, is }) => {
const stream = sink()
const instance = pino(stream)
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('time'), true)
  ok(new Date(result.time) <= new Date(), 'time is less than or equal to the current time')
is(result.msg, 'foobar')
})
test('omits timestamp when timestamp option is false', async ({ is }) => {
const stream = sink()
const instance = pino({ timestamp: false }, stream)
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('time'), false)
is(result.msg, 'foobar')
})
test('inserts timestamp when timestamp option is true', async ({ ok, is }) => {
const stream = sink()
const instance = pino({ timestamp: true }, stream)
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('time'), true)
  ok(new Date(result.time) <= new Date(), 'time is less than or equal to the current time')
is(result.msg, 'foobar')
})
test('child inserts timestamp by default', async ({ ok, is }) => {
const stream = sink()
const logger = pino(stream)
const instance = logger.child({ component: 'child' })
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('time'), true)
  ok(new Date(result.time) <= new Date(), 'time is less than or equal to the current time')
is(result.msg, 'foobar')
})
test('child omits timestamp with option', async ({ is }) => {
const stream = sink()
const logger = pino({ timestamp: false }, stream)
const instance = logger.child({ component: 'child' })
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('time'), false)
is(result.msg, 'foobar')
})
test('pino.stdTimeFunctions.unixTime returns seconds based timestamps', async ({ is }) => {
const opts = {
timestamp: pino.stdTimeFunctions.unixTime
}
const stream = sink()
const instance = pino(opts, stream)
const now = Date.now
Date.now = () => 1531069919686
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('time'), true)
is(result.time, 1531069920)
Date.now = now
})
test('pino.stdTimeFunctions.isoTime returns ISO 8601 timestamps', async ({ is }) => {
const opts = {
timestamp: pino.stdTimeFunctions.isoTime
}
const stream = sink()
const instance = pino(opts, stream)
const ms = 1531069919686
const now = Date.now
Date.now = () => ms
const iso = new Date(ms).toISOString()
instance.info('foobar')
const result = await once(stream, 'data')
is(result.hasOwnProperty('time'), true)
is(result.time, iso)
Date.now = now
})
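As the tests above show, a timestamp function returns a raw JSON fragment — leading comma included — that pino splices into each serialized line. Minimal sketches consistent with those assertions (illustrative only; the real `stdTimeFunctions` ship inside pino itself):

```javascript
// Sketches of the standard time functions the tests assert against.
// Each returns a partial JSON string beginning with a comma, ready to be
// concatenated into the serialized log line.
const nullTime = () => ''
const epochTime = () => `,"time":${Date.now()}`
const unixTime = () => `,"time":${Math.round(Date.now() / 1000)}`
const isoTime = () => `,"time":"${new Date(Date.now()).toISOString()}"`

// Matches the unixTime test above: 1531069919686 ms rounds to 1531069920 s.
```

A custom label works the same way: any function returning a fragment like `,"custom-time-label":"none"` is spliced in verbatim, which is exactly what the custom-label test exercises.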