Logstash/Interface
Supported Interfaces
Systemd Journal (stdout/stderr)
Structured logs written to stdout/stderr will be picked up by journald and copied to rsyslog. From there, rsyslog decides whether or not to forward the log to Kafka and eventually to Logstash.
To indicate to rsyslog that the log message is JSON, a "cookie" is required. Prepending "@cee: " to the JSON blob is sufficient.[1]
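For example, a structured message written to stdout with the cookie could look like the line below (the field names are illustrative, not a required schema):

@cee: {"message": "user login succeeded", "level": "INFO", "name": "appname"}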
NOTE: Log messages longer than 2048 characters are broken across multiple lines. The fix in systemd is available in Debian Buster.
Python implementation
logger_demo.py
import logging
import logging.config

from pythonjsonlogger import jsonlogger


class CustomJsonFormatter(jsonlogger.JsonFormatter):
    def add_fields(self, log_record, record, message_dict):
        super(CustomJsonFormatter, self).add_fields(log_record, record, message_dict)
        # Expose the log level as a 'level' field in the JSON output
        log_record['level'] = record.levelname.upper()


class StructuredLoggingHandler(logging.StreamHandler):
    def __init__(self, rsyslog=False):
        super(StructuredLoggingHandler, self).__init__()
        # Prepend the '@cee: ' cookie so rsyslog parses the message as JSON
        if rsyslog:
            prefix = '@cee: '
        else:
            prefix = ''
        self.formatter = CustomJsonFormatter(prefix=prefix)


# Demo code below
if __name__ == '__main__':
    logging.config.dictConfig({
        'version': 1,
        'disable_existing_loggers': False,
        'root': {
            'handlers': ['demo'],
            'level': 'DEBUG'
        },
        'handlers': {
            'demo': {
                'class': 'logger_demo.StructuredLoggingHandler',
                'rsyslog': True
            }
        }
    })
    logging.info('It\'s log!')
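Running the demo with the python-json-logger package installed writes a single cookie-prefixed JSON record to stderr (StreamHandler's default stream). The output should look roughly like the line below; the exact set of fields depends on the format string given to the formatter:

@cee: {"message": "It's log!", "level": "INFO"}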
UNIX Socket (/dev/log)
tbd
Tailing Log Files
tbd
Configuring rsyslog to forward your logs
Rsyslog needs to know that your logs should be forwarded to Kafka. There are two configuration items that must be in place.
Your application may need to set the SyslogIdentifier option in the [Service] section of its systemd unit file. This is especially true for applications that run under a common runtime like Python or Java, where the identifier would otherwise default to the interpreter rather than the application. For example:
[Service]
SyslogIdentifier=appname
The application must also be listed in the rsyslog lookup table and configured to flag the log for sending to Kafka.
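The lookup table itself lives in the centrally managed rsyslog configuration, so the snippet below is only a sketch of the generic rsyslog lookup-table file format with a hypothetical entry for "appname"; the real table and its values are maintained as part of the logging pipeline's rsyslog config:

{
  "version": 1,
  "nomatch": "",
  "type": "string",
  "table": [
    { "index": "appname", "value": "kafka" }
  ]
}

The rsyslog ruleset can then use the lookup() function against the message's program name to decide whether the log should be shipped to Kafka.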