
Figure 29. Filtering by Type
● User - Displays the name of the user who submitted the job.
● Start Time - Displays the time the job started executing. Jobs can be filtered based on starting time using the filter icon provided on the UI.
● End Time - Displays the time the job finished executing. This column will be empty for jobs that have not finished executing yet.
● Status - Displays the current status of the job, which depends on the job's underlying framework. To filter jobs based on status, click the filter icon in this column's header, make selections as needed, and then select the Filter button on the pop-up. Click the text Failed in the Status column to view logs for debugging failed jobs. For Spark jobs, this column contains a link titled Finished, which can be used to view and download logs that help identify whether or not the Spark job succeeded. Selecting this link presents a login screen, where users need to enter their LDAP credentials.
IMPORTANT: If the user logged in with the default user account (having admin/admin as the username and password), the system will require the user to log in again with their LDAP or system credentials to view Spark executor logs.
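The status filtering described above can be sketched as a simple selection over job records. This is a minimal illustrative sketch only; the field names and status values below are assumptions for demonstration, not the product's actual data model.

```python
# Illustrative sketch of status-based job filtering.
# The job-record fields ("user", "status") are hypothetical examples,
# not the actual schema used by the UI.
jobs = [
    {"user": "alice", "status": "Running"},
    {"user": "bob", "status": "Failed"},
    {"user": "alice", "status": "Finished"},
]

def filter_by_status(job_list, selected_statuses):
    """Return only the jobs whose status is among the selected set,
    mirroring the checkbox selections made in the filter pop-up."""
    return [job for job in job_list if job["status"] in selected_statuses]

# Selecting only "Failed" in the pop-up would leave bob's job visible.
failed_jobs = filter_by_status(jobs, {"Failed"})
```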
● Action - The kill button displayed in this column enables killing a running job. This column will be empty for jobs that have finished executing. Users can only kill jobs submitted by themselves; however, the system administrator can delete any job.
NOTE: If the user logged in with the default user account (having admin/admin as the username and password), the system will require the user to log in again with their LDAP or system credentials to kill jobs of type "CGE" or "MRUN".
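The kill-permission rule above (users may kill only their own jobs, while an administrator can delete any job) can be summarized in a short sketch. The function name and parameters are hypothetical, chosen only to illustrate the rule.

```python
def can_kill(requesting_user, job_owner, is_admin):
    """Illustrative permission check for the kill action:
    administrators may kill any job; other users may kill
    only jobs they submitted themselves."""
    return is_admin or requesting_user == job_owner
```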
Resource Management
S3016