Not able to push data in to kafka

This topic contains 13 replies, has 1 voice, and was last updated by  William Tarver 10 months, 1 week ago.

  • #24702

    Satya

    I am a newbie. I went over all the documentation provided and tried to set things up on my machine:

    Cloudera VirtualBox 5.5, Kafka 0.10, Scala 2.6, Kamanja 1.6

    I followed all the steps mentioned in the user guide, but at the final step, while pushing data into Kafka using the shell script at

    /inputs/SampleApplications/bin/PushSampleDataToKafka_Helloworld.sh

    I don't see anything happening after I run this shell script: there is no success message as mentioned in the guide, and nothing in the error logs either.

    P.S.: Also, when I first start the Kamanja engine using the command "kamanja start", it says it is started and running in the background. When I run the jps command, I can see the KamanjaManager process, but after a few seconds that process gets killed automatically. Is that a problem?

     

  • #24706

    William Tarver

    Hi Satya,

    There are a few things that could go wrong here. First, I'd like to ask you to run Kamanja with the command "kamanja start -v". This will cause it to run in the foreground. You may also check the logs in [KamanjaInstallDir]/logs. Please post whatever logs you have from either the console or the existing log files. Also post the output from pushing data to Kafka.

    An important thing to know about version 1.6.0 is that the PushSampleDataToKafka_*.sh scripts run against Kafka 0.10 by default, and the same default can be a problem when running Kamanja itself. If you're running an earlier version of Kafka, run the script with the version specified, like bash PushSampleDataToKafka_HelloWorld.sh --kafkaversion "[0.8|0.9]", depending on your Kafka version. If the Kafka version is to blame, it will also cause Kamanja to fail to connect properly and die. Please check the Adapters section of ClusterConfig.json for the version of Kafka you're using.
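    For example, if you are on Kafka 0.9 and running from the install directory, the invocation would look something like this (the path assumes the default sample-application layout; adjust it to your install):

    bash input/SampleApplications/bin/PushSampleDataToKafka_HelloWorld.sh --kafkaversion "0.9"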

  • #24712

    Satya

    Hi William,

    Thanks for the reply.

    When I start Kamanja in verbose mode, I see a class-not-found error, as shown below:

    [cloudera@quickstart Kamanja_1.6.0_2.11]$ sudo kamanja start -v

    /usr

    /usr

    Using the default file, /usr/lib/Kamanja_1.6.0_2.11/config/Engine1Config.properties

    WARN [main] – DATABASE_SCHEMA remains unset

    WARN [main] – DATABASE_LOCATION remains unset

    WARN [main] – MODEL_EXEC_LOG remains unset

    WARN [main] – DATABASE_HOST remains unset

    WARN [main] – ADAPTER_SPECIFIC_CONFIG remains unset

    WARN [main] – SSL_PASSWD remains unset

    WARN [main] – AUDIT_PARMS remains unset

    WARN [main] – GIT_ROOT remains unset

    WARN [main] – DATABASE remains unset

    JAVA_HOME => /usr

    SECURITY_IMPL_CLASS => com.ligadata.Security.SimpleApacheShiroAdapter

    JAR_PATHS => /usr/lib/Kamanja_1.6.0_2.11/lib/system,/usr/lib/Kamanja_1.6.0_2.11/lib/application

    CONTAINER_FILES_DIR => /usr/lib/Kamanja_1.6.0_2.11/input/SampleApplications/metadata/container/

    SERVICE_PORT => 8081

    SECURITY_IMPL_JAR => /usr/lib/Kamanja_1.6.0_2.11/lib/system/simpleapacheshiroadapter_2.11-1.0.jar

    ZOOKEEPER_CONNECT_STRING => localhost:2181

    SCALA_HOME => /usr

    SERVICE_HOST => localhost

    ZK_SESSION_TIMEOUT_MS => 3000

    TYPE_FILES_DIR => /usr/lib/Kamanja_1.6.0_2.11/input/SampleApplications/metadata/type/

    MODEL_EXEC_FLAG => false

    DO_AUTH => NO

    COMPILER_WORK_DIR => /usr/lib/Kamanja_1.6.0_2.11/workingdir

    SSL_CERTIFICATE => /usr/lib/Kamanja_1.6.0_2.11/config/keystore.jks

    API_LEADER_SELECTION_NODE_PATH => /kamanja

    CONCEPT_FILES_DIR => /usr/lib/Kamanja_1.6.0_2.11/input/SampleApplications/metadata/concept/

    ROOT_DIR => /usr/lib/Kamanja_1.6.0_2.11

    MODEL_FILES_DIR => /usr/lib/Kamanja_1.6.0_2.11/input/SampleApplications/metadata/model/

    MESSAGE_FILES_DIR => /usr/lib/Kamanja_1.6.0_2.11/input/SampleApplications/metadata/message/

    CONFIG_FILES_DIR => /usr/lib/Kamanja_1.6.0_2.11/input/SampleApplications/metadata/config/

    MANIFEST_PATH => /usr/lib/Kamanja_1.6.0_2.11/config/manifest.mf

    CLASSPATH => /usr/lib/Kamanja_1.6.0_2.11/lib/system/ExtDependencyLibs_2.11-1.6.0.jar:/usr/lib/Kamanja_1.6.0_2.11/lib/system/KamanjaInternalDeps_2.11-1.6.0.jar:/usr/lib/Kamanja_1.6.0_2.11/lib/system/ExtDependencyLibs2_2.11-1.6.0.jar

    NOTIFY_ENGINE => YES

    NODE_ID => 1

    ZNODE_PATH => /kamanja

    JAR_TARGET_DIR => /usr/lib/Kamanja_1.6.0_2.11/lib/application

    AUDIT_IMPL_CLASS => com.ligadata.audit.adapters.AuditCassandraAdapter

    ZK_CONNECTION_TIMEOUT_MS => 3000

    METADATA_DATASTORE => {"StoreType": "h2db","connectionMode": "embedded","SchemaName": "kamanja","Location": "/usr/lib/Kamanja_1.6.0_2.11/storage","portnumber": "9100","user": "test","password": "test"}

    DO_AUDIT => NO

    FUNCTION_FILES_DIR => /usr/lib/Kamanja_1.6.0_2.11/input/SampleApplications/metadata/function/

    API_LEADER_SELECTION_ZK_NODE => /ligadata

    AUDIT_IMPL_JAR => /usr/lib/Kamanja_1.6.0_2.11/lib/system/auditadapters_2.11-1.0.jar

    SLF4J: Class path contains multiple SLF4J bindings.

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/ExtDependencyLibs2_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/ExtDependencyLibs_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/KamanjaInternalDeps_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

    SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]

     

    ——————————————————————-

    GMS: address=quickstart-63159, cluster=EH_CACHE, physical address=192.168.80.128:7800

    ——————————————————————-

    ERROR [main] – Failed to load Status/Output Adapter testfailedevents_1 with class com.ligadata.OutputAdapters.KafkaProducer$

    java.lang.ClassNotFoundException: com.ligadata.OutputAdapters.KafkaProducer$

    at java.net.URLClassLoader$1.run(URLClassLoader.java:366) ~[?:1.7.0_67]

    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) ~[?:1.7.0_67]

    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_67]

    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) ~[?:1.7.0_67]

    at java.lang.ClassLoader.loadClass(ClassLoader.java:425) ~[?:1.7.0_67]

    at com.ligadata.Utils.KamanjaClassLoader.loadClass(ClassLoaderInfo.scala:55) ~[KamanjaInternalDeps_2.11-1.6.0.jar:1.6.0]

    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) ~[?:1.7.0_67]

    at com.ligadata.Utils.KamanjaClassLoader.loadClass(ClassLoaderInfo.scala:49) ~[KamanjaInternalDeps_2.11-1.6.0.jar:1.6.0]

    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) ~[?:1.7.0_67]

    at java.lang.Class.forName0(Native Method) ~[?:1.7.0_67]

    at java.lang.Class.forName(Class.java:270) ~[?:1.7.0_67]

    at com.ligadata.KamanjaManager.KamanjaMdCfg$.com$ligadata$KamanjaManager$KamanjaMdCfg$$CreateOutputAdapterFromConfig(KamanjaMdCfg.scala:539) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    at com.ligadata.KamanjaManager.KamanjaMdCfg$$anonfun$LoadOutputAdapsForCfg$1.apply(KamanjaMdCfg.scala:615) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    at com.ligadata.KamanjaManager.KamanjaMdCfg$$anonfun$LoadOutputAdapsForCfg$1.apply(KamanjaMdCfg.scala:595) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99) [ExtDependencyLibs2_2.11-1.6.0.jar:1.6.0]

    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99) [ExtDependencyLibs2_2.11-1.6.0.jar:1.6.0]

    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230) [ExtDependencyLibs2_2.11-1.6.0.jar:1.6.0]

    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40) [ExtDependencyLibs2_2.11-1.6.0.jar:1.6.0]

    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99) [ExtDependencyLibs2_2.11-1.6.0.jar:1.6.0]

    at com.ligadata.KamanjaManager.KamanjaMdCfg$.LoadOutputAdapsForCfg(KamanjaMdCfg.scala:595) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    at com.ligadata.KamanjaManager.KamanjaMdCfg$.LoadAdapters(KamanjaMdCfg.scala:452) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    at com.ligadata.KamanjaManager.KamanjaManager.initialize(KamanjaManager.scala:760) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    at com.ligadata.KamanjaManager.KamanjaManager.com$ligadata$KamanjaManager$KamanjaManager$$run(KamanjaManager.scala:895) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    at com.ligadata.KamanjaManager.KamanjaManager$.main(KamanjaManager.scala:1325) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    at com.ligadata.KamanjaManager.KamanjaManager.main(KamanjaManager.scala) [kamanjamanager_2.11-1.6.0.jar:1.6.0]

    ERROR [main] – Failed to get output adapter for (testfailedevents_1,com.ligadata.kamanja.metadata.AdapterInfo@6f3681bc)

    scala.runtime.NonLocalReturnControl$mcZ$sp

    ///////////////////////////dispose

    dispose

    ///////////////////////////dispose

    dispose

    ERROR [main] – KAMANJA-MANAGER: Kamanja shutdown with error code 1

    WARN [shutdownHook1] – KAMANJA-MANAGER: Received shutdown request

     

    Does the class name "com.ligadata.OutputAdapters.KafkaProducer$" contain the $ at the end in the Kamanja jars?

  • #24713

    William Tarver

    Hi Satya,

    I believe the documentation may need some revising as it appears to have led you down an incorrect path. In the config directory, there are 4 ClusterConfig* files. I’m guessing when you added your cluster configuration, you added ClusterConfig.json. This file should have been removed from the package entirely (I’ll open an issue regarding that). You should see three other files: ClusterConfig_kafka_v8.json, ClusterConfig_kafka_v9.json and ClusterConfig_kafka_v10.json. You should run “kamanja add cluster config <path to config with proper kafka version>”. Once you’ve done that, please run Kamanja again.
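    For example, assuming your install is under /usr/lib/Kamanja_1.6.0_2.11 (as your logs show) and you are running Kafka 0.10, the command would look something like:

    kamanja add cluster config /usr/lib/Kamanja_1.6.0_2.11/config/ClusterConfig_kafka_v10.json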

  • #24726

    Suresh

    Hi William,

    When I push the data into Kafka using PushSampleDataToKafka_HelloWorld.sh, it does not process or go forward. Below is the log:

    [cloudera@quickstart Kamanja_1.6.0_2.11]$ sudo input/SampleApplications/bin/PushSampleDataToKafka_HelloWorld.sh

    User selected:

    Running kafka client version 0.10

    SLF4J: Class path contains multiple SLF4J bindings.

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/ExtDependencyLibs2_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/ExtDependencyLibs_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/KamanjaInternalDeps_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

    SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]

    I don't see any errors in the Kamanja logs folder either.

    Please help.

  • #24731

    William Tarver

    Hey Satya,

    Please run "bash $KAFKA_HOME/bin/kafka-console-consumer.sh --zookeeper <zookeeperHost:Port> --topic helloworldinput --from-beginning" to verify messages have been properly pushed to the helloworldinput topic. You should see 50 messages there.

    If the messages appear, then run "bash $KAFKA_HOME/bin/kafka-console-consumer.sh --zookeeper <zookeeperHost:Port> --topic testfailedevents_1 --from-beginning". If you see messages there, that means Kamanja retrieved the messages but failed to process them. If there are messages in this topic, please post them here.

    Finally, if messages are in the helloworldinput topic but none are in the testfailedevents_1 topic, run the command "bash $KAFKA_HOME/bin/kafka-console-consumer.sh --zookeeper <zookeeperHost:Port> --topic testmessageevents_1 --from-beginning". If messages appear here, then Kamanja has retrieved and processed the messages (even if there are failures, this topic will be populated as long as Kamanja receives messages from Kafka).

    Please post the output of all the commands mentioned above.

  • #24732

    Satya

    Hi William,

    Now I'm able to start the Kamanja engine without any errors.

    But I'm still unable to push data into Kafka using the shell script. I don't see anything on the terminal when I run the below command.

    [cloudera@quickstart Kamanja_1.6.0_2.11]$ sudo input/SampleApplications/bin/PushSampleDataToKafka_HelloWorld.sh 

    User selected:

    Running kafka client version 0.10

    SLF4J: Class path contains multiple SLF4J bindings.

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/ExtDependencyLibs2_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/ExtDependencyLibs_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: Found binding in [jar:file:/usr/lib/Kamanja_1.6.0_2.11/lib/system/KamanjaInternalDeps_2.11-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

    SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]

     

    Nothing happens after the above SLF4J statements, and when I ran watchOutputQueue.sh nothing came up on the console.

    Please help me out.

    Thanks in advance.

  • #24733

    Satya

    Hi William, sorry that I didn't see your reply, hence I posted the same error again above.

    Here is the output I got.

    Messages are not even sent to the Kafka topic helloworldinput:

    [cloudera@quickstart kafka]$ sudo bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic helloworldinput --from-beginning

     

     

    Nothing comes up in the console, which means the messages are not even being pushed into the topic.

     

     

     

  • #24734

    Satya

    Also, it's not a problem with my Kafka setup, because I'm able to send and receive sample messages with the Kafka console producer and consumer for a topic named test.

    I started the producer and consumer simultaneously:

    [cloudera@quickstart kafka]$ sudo bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test

     

    hi from producer

     

    now the time is 06:30 PM IST

    and I got it in the consumer:

    [cloudera@quickstart kafka]$ sudo bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test

     

     

    hi from producer

     

    now the time is 06:30 PM IST

  • #24736

    William Tarver

    What version of Kafka do you have installed? The only reason all topics would be empty while the engine is running is that you aren't using Kafka 0.10. If you have version 0.8 or 0.9, you need to pass in the Kafka version you're using when you run the push scripts, as shown below.
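    For example, on Kafka 0.8 it would look something like this (path assumed from your earlier output):

    bash input/SampleApplications/bin/PushSampleDataToKafka_HelloWorld.sh --kafkaversion "0.8"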

  • #24741

    Satya

    Thanks, William.

    Now I can see the input file getting processed and data being pushed into the Kafka topic helloworldinput.

    But I don't see any output in the testout_1 topic, nor in the testfailedmessages_1 or testfailedevents_1 topics. All these topics are empty, and the logs folder in Kamanja doesn't have any errors either.

    [cloudera@quickstart kafka]$ sudo /usr/lib/kafka/bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic testfailedevents_1 --from-beginning

     

     

    ^CProcessed a total of 0 messages

    [cloudera@quickstart kafka]$ sudo /usr/lib/kafka/bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic testmessageevents_1 --from-beginning

     

    ^CProcessed a total of 0 messages

    [cloudera@quickstart kafka]$ sudo /usr/lib/kafka/bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic testout_1 --from-beginning

     

    ^CProcessed a total of 0 messages

    [cloudera@quickstart kafka]$ sudo /usr/lib/kafka/bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic teststatus_1 --from-beginning

     

    ^CProcessed a total of 0 messages

    [cloudera@quickstart kafka]$ sudo /usr/lib/kafka/bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic helloworldinput

     

    5,USA,3

    14,KSA,2

    25,China,2

    6,Jordan,3

    15,Qatar,2

    26,Finland,2

    37,Poland,3

    40,Serbia,5

    48,Libya,3

    3,Turkia,5

    12,Russia,4

    23,Germany,5

    34,Kuwait,2

    45,Algeria,3

    45,Bahrain,2

    7,India,4

    16,Romania,5

    27,Lebanon,4

    30,costa rica,2

    38,Paraguay,5

    41,Switzerland,5

    49,Morocco,4

    36,Hungaria,3

    47,Hong Kong,3

    2,Syria,2

    11,Italy,5

    19,Brazile,2

    22,Australia,3

    33,Oman,3

    44,Vietnam,5

    4,Greece,4

    13,Palestine,2

    24,UAE,3

    35,Netherlands,2

    46,France,5

    8,UK,5

    17,Maxico,3

    20,Portogal,4

    28,Panama,3

    31,Iraq,4

    39,Rwanda,4

    42,Sweden,2

    1,World,1

    9,Egypt,3

    10,Spain,2

    18,Argentina,5

    21,Canada,3

    29,Japan,4

    32,Iran,3

    43,Yemen,4

    50,Hello,1

     

  • #24743

    William Tarver

    If the issue you were having with the input data was the version of Kafka, that is likely the same issue with Kamanja. If you have one version in your cluster configuration and another version of Kafka running, then you'll need to update your configuration file with the proper version/jar names for Kafka and upload that cluster config again.

    What version of Kafka are you running?

  • #24744

    William Tarver

    Hi Satya,

    Ignore my earlier questions about the Kafka version; I see in your original post that you're using Kafka 0.10. If you've properly updated your cluster configuration with the proper version of Kafka and have all your metadata, then I suspect Kamanja must not actually be running. If you do see a process for KamanjaManager, there are two places we can look to see whether it's properly running.

    1. Check ZooKeeper.

    Run the command "bash $ZOOKEEPER_HOME/bin/zkCli.sh". This should start ZooKeeper's command line.

    Once in, type "ls /kamanja" to see all nodes under /kamanja.

    Then type "get /kamanja/monitor/engine/1" to get the heartbeat info for Kamanja node 1.
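    A session would look something like this (the zkCli prompt and output format may vary with your ZooKeeper version; node 1 is assumed from the NODE_ID => 1 in your engine output):

    bash $ZOOKEEPER_HOME/bin/zkCli.sh
    [zk: localhost:2181(CONNECTED) 0] ls /kamanja
    [zk: localhost:2181(CONNECTED) 1] get /kamanja/monitor/engine/1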

    2. The other place to look is the Kafka topic "teststatus_1". Regardless of whether messages are processed or not, this topic is always written to with some basic metrics from Kamanja. If it is not being written to, but you see a process for KamanjaManager, then only one thing can be going wrong that wouldn't produce obvious error logs: the version of Kafka in your cluster configuration is not the same as the version of Kafka you're running.
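    You can check it with the same consumer command as before, for example:

    bash $KAFKA_HOME/bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic teststatus_1 --from-beginning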

  • #24745

    William Tarver

    Additionally, before running Kamanja again, please go to $KAMANJA_HOME/config/log4j2.xml and change the log level from WARN to DEBUG.
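    The exact contents of log4j2.xml vary between installs, but the change is typically just the level attribute on the root logger, for example changing something like <Root level="WARN"> to <Root level="DEBUG">.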
