Error output in Flume log "java.nio.channels.ClosedChannelException"

Issue

The following error may appear in the Flume log(s):

  XX Jan 20XX XX:XX:XX,175 WARN [<kafka_consumer_group>_<STREAM_NODE_FQDN>-1516733223919-769f069b-leader-finder-thread] (kafka.utils.Logging$class.warn:89) - Fetching topic metadata with correlation id <#> for topics [Set(<Kafka_datasource_event_topic>)] from broker [BrokerEndPoint(1001,<STREAM_NODE_FQDN>,6667)] failed
  java.nio.channels.ClosedChannelException
      at kafka.network.BlockingChannel.send(BlockingChannel.scala:122)
      at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:82)
      at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81)
      at kafka.producer.SyncProducer.send(SyncProducer.scala:126)
      at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
      at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:96)
      at kafka.consumer.ConsumerFetcherManager$LeaderFinderThread.doWork(ConsumerFetcherManager.scala:67)
      at kafka.utils.ShutdownableThread.run(ShutdownableThread.scala:63)

The error above appears in the following log(s):

  • flume-interset_<ds>_events_<did>_<tid>_csv_transform.log
  • flume-interset_<ds>_events_<did>_<tid>_es.log
  • flume-interset_<ds>_events_<did>_<tid>_hbase.log
  • flume-interset_<ds>_raw_<did>_<tid>_csv_multiline_extract.log

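To quickly confirm which Flume logs contain this error, you can grep across the log directory (a minimal sketch; sudo may be required to read the logs, and the /var/log/flume path matches the Resolution Steps below):

  # List the Flume logs that contain the ClosedChannelException error
  sudo grep -l "java.nio.channels.ClosedChannelException" /var/log/flume/flume-interset_*.log
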
Cause

This error occurs because the Kafka Broker is not running within Ambari.
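
The stack trace shows the consumer failing to fetch topic metadata from the broker endpoint on port 6667 (the broker port shown in the error above). As a quick check from the STREAM NODE, you can test whether anything is listening on that port (a minimal sketch, assuming nc is available; substitute your broker host):

  # Succeeds only if something is listening on the Kafka Broker port 6667
  nc -vz <STREAM_NODE_FQDN> 6667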

Resolution Steps

NOTE: This information applies only to CSV data ingest using Flume.

There are two steps to complete to resolve this issue:

  • Validate Kafka component is running
  • Validate Flume logs

Validate Kafka component is running

  1. Open a web browser and navigate to the Ambari UI URL.
  2. Log in to the Ambari UI as the Ambari admin. The default credentials for the Ambari admin user are as follows:
    • Username: admin
    • Password: admin
  3. Once logged in, click Kafka in the component list.
  4. Ensure the Kafka Broker shows Started under the Summary section (alternatively, check the service state via the REST API sketch after this list).
  5. If it is NOT running, click Service Actions > Start > Confirm Start.
  6. Once the Kafka Broker is started, continue to the Validate Flume logs section.
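
As an alternative to the UI, the Kafka service state can also be checked through the Ambari REST API (a minimal sketch, assuming the default Ambari port 8080; substitute your Ambari host, admin credentials, and cluster name):

  # A healthy Kafka service reports "state" : "STARTED" in the response
  curl -s -u admin:admin http://<AMBARI_HOST>:8080/api/v1/clusters/<CLUSTER_NAME>/services/KAFKA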

Validate Flume logs

  1. SSH to the STREAM NODE as the Interset User.
  2. Run the following command to change to the Flume log directory (/var/log/flume):
    • cd /var/log/flume
  3. Validate the following Flume logs to confirm the issue has been resolved and data is being ingested successfully:
    • flume-interset_<ds>_events_<did>_<tid>_csv_transform.log
    • flume-interset_<ds>_events_<did>_<tid>_es.log
    • flume-interset_<ds>_events_<did>_<tid>_hbase.log
    • flume-interset_<ds>_raw_<did>_<tid>_csv_multiline_extract.log
      • NOTE:
        • <ds> - denotes the data source type ingested (e.g. auth, repo, webproxy)
        • <did> - denotes the data source instance ID the data is ingested to
        • <tid> - denotes the tenant ID the data is ingested for
  4. To view a Flume log, open it with less; for example:
    • sudo less flume-interset_<log_name>.log
  5. In the log file, press the following key combination to jump to the end of the log:
    • Shift + G
  6. In the respective Flume log, look for “EventPutSuccessCount”. This value will keep incrementing until data ingestion completes (see the sketch after this list for a quick way to watch it).
    • NOTE: This is not applicable to streaming/real-time data
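
A quick way to watch this counter without paging through the file is to filter for it directly (a minimal sketch; <log_name> is a placeholder for the log you are validating):

  # Show the most recent EventPutSuccessCount entries; the value should keep rising during ingest
  sudo grep "EventPutSuccessCount" /var/log/flume/flume-interset_<log_name>.log | tail -n 5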

Applies To

  • Interset 5.4.x or higher 