
8.3 Analytic Applications Log File Locations
Log files for a given service are located on the node(s) on which that service runs. These nodes can be identified using the urika-inventory command. For more information, see the urika-inventory man page.
Table 37. Analytics Applications Log File Locations

Application/Script    Log File Location

Mesos
    /var/log/mesos

Marathon
    /var/log/messages

HA Proxy
    /var/log/haproxy.log
Mesos frameworks (Marathon and Spark)
    /var/log/mesos/agent/slaves/. Within this directory, a framework's output is placed in files called stdout and stderr, in a directory of the form slave-X/fw-Y/Z, where X is the slave ID, Y is the framework ID, and a separate subdirectory Z is created for each attempt to run an executor for the framework. These files can also be accessed via the web UI of the slave daemon. The location of the Spark logs is determined by the cluster resource manager that Spark runs under, which is Mesos on Urika-GX.
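The slave-X/fw-Y/Z layout can be explored with ordinary shell tools. The sketch below builds a throwaway mock of that layout (the IDs S0 and F0 are invented for illustration; real Mesos IDs are long UUID-like strings) and then enumerates the stdout/stderr files the same way one would on a real node under /var/log/mesos/agent/slaves:

```shell
# Mock of the executor sandbox layout described above; on a real node,
# point root at /var/log/mesos/agent/slaves instead of a temp directory.
root=$(mktemp -d)

# slave-X/fw-Y/Z: invented IDs, one subdirectory per executor attempt.
mkdir -p "$root/slave-S0/fw-F0/runs/0" "$root/slave-S0/fw-F0/runs/1"
for d in "$root"/slave-S0/fw-F0/runs/*; do
    : > "$d/stdout"
    : > "$d/stderr"
done

# Enumerate every framework stdout/stderr file, as on a live agent.
found=$(find "$root" -type f \( -name stdout -o -name stderr \) | sort)
printf '%s\n' "$found"
```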
Grafana
    /var/log/grafana/grafana.log

InfluxDB
    /var/log/influxdb/influxd.log
collectl
    collectl does not produce any logging information; it uses logging only as a mechanism for storing metrics, which are exported to InfluxDB. If collectl fails at service start time, the cause can be identified by executing the collectl command on the command line and observing its output. Note that collectl will not complain if the InfluxDB socket is unavailable.
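Because collectl stays silent when the InfluxDB socket is unavailable, a separate reachability check can help when diagnosing missing metrics. This is only a sketch, not part of the product tooling: the host and port are assumptions (8086 is InfluxDB's default HTTP port), and the /dev/tcp redirection requires bash.

```shell
# collectl does not complain about a missing InfluxDB socket, so probe
# the endpoint separately. Host and port are site-specific assumptions.
INFLUX_HOST=${INFLUX_HOST:-localhost}
INFLUX_PORT=${INFLUX_PORT:-8086}

# bash's /dev/tcp pseudo-device: opening it attempts a TCP connection.
if (exec 3<>"/dev/tcp/$INFLUX_HOST/$INFLUX_PORT") 2>/dev/null; then
    status=reachable
else
    status=unreachable
fi
echo "InfluxDB endpoint $INFLUX_HOST:$INFLUX_PORT is $status"
```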
Hadoop
    The following daemon logs appear on the node they are running on:
    ● /var/log/hadoop/hdfs/hadoop-hdfs-namenode-nid.log
    ● /var/log/hadoop/hdfs/hadoop-hdfs-datanode-nid.log
    ● /var/log/hadoop/yarn/yarn-yarn-nodemanager-nid.log
    ● /var/log/hadoop/yarn/yarn-yarn-resourcemanager-nid.log
    In the above locations, nid is a placeholder for the node name. Application-specific logs reside in HDFS at /app-logs.
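The daemon log names above embed the node name after the last dash. As a small illustration (the file name and nid value below are invented; real nid values depend on the system), the daemon and node name can be separated with shell parameter expansion:

```shell
# Split a Hadoop daemon log name of the form
# hadoop-hdfs-namenode-<nid>.log into daemon and node name.
# The nid value here is invented for illustration.
name="hadoop-hdfs-namenode-nid00012.log"

base=${name%.log}        # drop the .log extension
nid=${base##*-}          # text after the last dash: the node name
daemon=${base%-"$nid"}   # remaining prefix: the daemon identifier

echo "$daemon runs on $nid"
```

The per-application logs aggregated under /app-logs are normally retrieved with the standard yarn logs -applicationId command rather than read from HDFS directly.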
Spark
    ● Spark event logs (used by the Spark History Server) reside at hdfs://user/spark/applicationHistory
    ● Spark executor logs (useful for debugging Spark applications) reside with the other Mesos framework logs on the individual compute nodes (see above) at /var/log/mesos/agent/slaves/
Jupyter Notebook
    /var/log/jupyterhub.log

Flex scripts
    /var/log/urika-yam.log
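When deciding which node to inspect, a quick sketch like the following reports which of the single-file log locations from the table exist on the current node; absence simply means the corresponding service does not run there. The paths are taken verbatim from the table above.

```shell
# Check which single-file log locations from the table exist locally.
nl='
'
report=""
for log in /var/log/messages /var/log/haproxy.log \
           /var/log/grafana/grafana.log /var/log/influxdb/influxd.log \
           /var/log/jupyterhub.log /var/log/urika-yam.log; do
    if [ -e "$log" ]; then
        state=present
    else
        state=absent
    fi
    report="${report}${log} ${state}${nl}"
done
printf '%s' "$report"
```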