On a Kerberized SDC instance where private classloaders are disabled, secure Hadoop pipelines intermittently fail with the following exception:
org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
This issue can occur when private classloaders are disabled and one of the Hadoop pipelines points to Hadoop configurations that lack the proper Kerberos settings. To check whether your SDC instance has private classloaders disabled (a non-default setting), look for the following runtime properties in SDC_JAVA_OPTS in the SDC environment file (sdc-env.sh/sdcd-env.sh):
-Dcom.streamsets.pipeline.stage.destination.hdfs.HdfsDTarget.no.private.classloader
-Dcom.streamsets.pipeline.stage.destination.hive.HiveMetastoreDTarget.no.private.classloader
-Dcom.streamsets.pipeline.stage.destination.hive.queryexecutor.HiveQueryDExecutor.no.private.classloader
-Dcom.streamsets.pipeline.stage.processor.hive.HiveMetadataDProcessor.no.private.classloader
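As a quick check, you can scan the SDC_JAVA_OPTS value for any of these flags. The sketch below uses a hypothetical sample string; in practice, paste in the value from your own sdc-env.sh/sdcd-env.sh:

```python
import re

# Hypothetical sample value; replace with the SDC_JAVA_OPTS line from your
# sdc-env.sh/sdcd-env.sh.
sdc_java_opts = (
    "-Xmx1024m "
    "-Dcom.streamsets.pipeline.stage.destination.hdfs.HdfsDTarget.no.private.classloader "
    "-Dcom.streamsets.pipeline.stage.processor.hive.HiveMetadataDProcessor.no.private.classloader"
)

# Any -D property ending in .no.private.classloader disables the private
# classloader for that stage.
disabled = re.findall(r"-D(\S*?\.no\.private\.classloader)", sdc_java_opts)
for stage_flag in disabled:
    print("private classloader disabled for:", stage_flag)
print("any private classloaders disabled:", bool(disabled))
```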
When private classloaders are disabled for Hadoop stages, those stages share the same JVM-wide state, so any pipeline pointing to the wrong set of Hadoop configurations can corrupt the static UGI (UserGroupInformation) configuration and cause other pipelines to fail.
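A simplified model of this failure mode (not actual SDC or Hadoop code) treats the UGI configuration as module-level shared state that every pipeline load overwrites:

```python
# Simplified model: with private classloaders disabled, all Hadoop stages see
# one JVM-wide UGI configuration, modeled here as a module-level dict.
SHARED_UGI_CONFIG = {"hadoop.security.authentication": "kerberos"}

def start_pipeline(name, hadoop_conf):
    # Each pipeline load overwrites the shared (static) configuration.
    SHARED_UGI_CONFIG.update(hadoop_conf)
    return SHARED_UGI_CONFIG["hadoop.security.authentication"]

# A correctly configured secure pipeline leaves Kerberos enabled:
start_pipeline("good_pipeline", {"hadoop.security.authentication": "kerberos"})

# A misconfigured pipeline flips the shared state to non-Kerberos auth...
start_pipeline("bad_pipeline", {"hadoop.security.authentication": "simple"})

# ...so other secure pipelines in the same JVM now see the corrupted state,
# even though their own stage configuration never changed.
print(SHARED_UGI_CONFIG["hadoop.security.authentication"])
```

This is why the failure appears intermittent: it depends on which pipeline last rewrote the shared state.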
To fix this issue, find the secure Hadoop pipelines that point to Hadoop configurations with non-Kerberos settings, then point those misconfigured pipelines to the proper set of Hadoop configurations.
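To verify whether a given Hadoop configuration directory is Kerberos-enabled, check the hadoop.security.authentication property in its core-site.xml (Hadoop defaults to "simple" when the property is absent). The sketch below parses a hypothetical inline core-site.xml; in practice, read the file from the configuration directory the pipeline's stage points to:

```python
import xml.etree.ElementTree as ET

# Hypothetical core-site.xml content; in practice, load the file from the
# Hadoop configuration directory configured on the stage.
core_site = """<?xml version="1.0"?>
<configuration>
  <property>
    <name>hadoop.security.authentication</name>
    <value>simple</value>
  </property>
</configuration>"""

root = ET.fromstring(core_site)
auth = "simple"  # Hadoop's default when the property is absent
for prop in root.findall("property"):
    if prop.findtext("name") == "hadoop.security.authentication":
        auth = (prop.findtext("value") or "").strip()

print("hadoop.security.authentication =", auth)
if auth != "kerberos":
    print("WARNING: this configuration directory is not Kerberos-enabled")
```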
One way to identify a problematic pipeline is to locate the first occurrence of the exception in the logs and check which secure Hadoop pipeline was started immediately before it.
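That search can be sketched as a single pass over the log: remember the most recently started pipeline, and stop at the first occurrence of the exception. The log lines below are a made-up excerpt; real sdc.log entries differ in format, but the approach is the same:

```python
# Hypothetical log excerpt; adapt the substring matches to your actual
# sdc.log message format.
log_lines = [
    "2023-01-01 10:00:01 INFO  Starting pipeline 'sales_ingest'",
    "2023-01-01 10:05:12 INFO  Starting pipeline 'hdfs_secure_load'",
    "2023-01-01 10:05:40 ERROR org.apache.hadoop.security.AccessControlException: "
    "Client cannot authenticate via:[TOKEN, KERBEROS]",
]

suspect = None
for line in log_lines:
    if "Starting pipeline" in line:
        suspect = line  # remember the most recently started pipeline
    if "AccessControlException" in line:
        break  # first occurrence of the exception; stop here

print("last pipeline started before the exception:")
print(suspect)
```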
Also, if you have no particular reason to keep private classloaders disabled, you can re-enable them by removing the relevant runtime properties from the SDC environment file. This helps isolate the authentication issue, since a misconfigured pipeline will then only affect itself.
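Removing those flags amounts to stripping every *.no.private.classloader property from SDC_JAVA_OPTS. The sketch below shows the transformation on a hypothetical sample value; in practice, edit sdc-env.sh/sdcd-env.sh directly and restart SDC:

```python
import re

# Hypothetical SDC_JAVA_OPTS value; in practice, edit the line in
# sdc-env.sh/sdcd-env.sh and restart SDC for the change to take effect.
opts = (
    "-Xmx1024m "
    "-Dcom.streamsets.pipeline.stage.destination.hdfs.HdfsDTarget.no.private.classloader "
    "-Dcom.streamsets.pipeline.stage.destination.hive.HiveMetastoreDTarget.no.private.classloader"
)

# Strip every *.no.private.classloader flag to re-enable private classloaders.
cleaned = re.sub(r"\s*-D\S*?\.no\.private\.classloader", "", opts).strip()
print(cleaned)
```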