Adapter Changes in ClusterConfig.json

To add a new object to the cluster configuration file, create a new JSON file or modify an existing one. To add an object (in this case, an input adapter), add a new ADAPTER object to the configuration file and submit it to the metadata via the kamanja upload cluster config command.
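As a rough sketch, an ADAPTER entry in ClusterConfig.json might look like the following. The adapter name, class name, jar names, broker host, and topic are placeholders for illustration, not a definitive template; consult the adapter reference for the exact schema.

    {
      "Adapters": [
        {
          "Name": "TestIn_1",
          "TypeString": "Input",
          "ClassName": "com.ligadata.kafkaInputOutputAdapters_v9.KamanjaKafkaConsumer$",
          "JarName": "kamanjakafkaadapters_0_9_2.11-1.5.0.jar",
          "DependencyJars": ["kafka-clients-0.9.0.0.jar"],
          "AdapterSpecificCfg": {
            "HostList": "localhost:9092",
            "TopicName": "testin_1"
          }
        }
      ]
    }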

To update an existing object, simply update … Read More

Adapters

Adapters are plugins that allow Kamanja to exchange data with external, third-party systems. There are two general types of adapters: input/output adapters, which read and write messages (that is, record structures) with systems such as Kafka or MQ; and storage adapters, which save data to systems such as HBase, Cassandra, or SQL Server.

Audit Adapter

The MetadataAPIConfig.properties metadata configuration file can optionally name a file in which the adapter-specific initialization is defined. This is done via the AUDIT_PARMS parameter. Kamanja passes the name of this file to the audit adapter implementation in its init method. Note: The contents of this file are adapter-specific, and the parsing/processing of the values … Read More
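As a minimal sketch of this wiring, only the AUDIT_PARMS key itself comes from the description above; the path and the example contents of the referenced file are hypothetical, since those contents are defined by the specific audit adapter in use:

    # In MetadataAPIConfig.properties: point Kamanja at the audit adapter's config file
    AUDIT_PARMS=/opt/Kamanja/config/audit_adapter.properties

    # Contents of audit_adapter.properties are adapter-specific; a database-backed
    # audit adapter might, for example, expect connection details such as:
    # hostlist=localhost
    # schema=metadata
    # table=audit_information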

File Data Consumer

Kamanja has a utility called the file data consumer that allows users to populate Kafka message queues from files. This tool works as follows: an administrator sets up a directory that is accessible to the tool (the tool must have read/write permissions for this directory). The file data consumer monitors this directory and detects when … Read More
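The utility ships with Kamanja, but the loop it implements can be sketched in a few lines of Scala. The sketch below is illustrative only, not the tool's actual code: the watched directory, broker address, and topic name are assumptions, and it uses the standard java.nio WatchService together with the Kafka producer client.

    import java.nio.file.{FileSystems, Path, Paths, StandardWatchEventKinds}
    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import scala.collection.JavaConverters._
    import scala.io.Source

    object FileToKafkaSketch {
      def main(args: Array[String]): Unit = {
        val watchDir = Paths.get("/data/incoming")        // watched directory (assumption)
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092")  // broker address (assumption)
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        val producer = new KafkaProducer[String, String](props)

        // Watch for files being created in the directory.
        val watcher = FileSystems.getDefault.newWatchService()
        watchDir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE)

        while (true) {
          val key = watcher.take()                        // blocks until a new file appears
          key.pollEvents().asScala.foreach { event =>
            val file = watchDir.resolve(event.context().asInstanceOf[Path])
            val src = Source.fromFile(file.toFile)
            // Send each line of the new file to the Kafka topic.
            try src.getLines().foreach { line =>
              producer.send(new ProducerRecord[String, String]("testin_1", line))
            } finally src.close()
          }
          key.reset()                                     // re-arm so further events are delivered
        }
      }
    }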

H2DB

After a comparison of limitations and benchmarking capabilities, H2DB was chosen as the new single-installation storage adapter. There is no need to set up a separate component to store data. There is no need to set the MetadataDataStore field to h2db in the ClusterConfig.json, MetadataAPIConfig.properties, and Engine1Config.properties files. These files are already defined in the Kamanja … Read More
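For reference, a MetadataDataStore entry of the kind these files already contain might look like the sketch below; the exact keys and values (connectionMode, SchemaName, Location) are assumptions for the example and may differ in the shipped files:

    "MetadataDataStore": {
      "StoreType": "h2db",
      "connectionMode": "embedded",
      "SchemaName": "kamanja",
      "Location": "/opt/Kamanja/storage"
    }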

Input Adapters

An input adapter is a way of getting streams of data/events into the Kamanja server. Examples are Kafka messages and files. This section describes how to write custom input adapters, referred to here as consumers, that Kamanja can use to process messages. Kafka adapters are provided with the codebase, and therefore, … Read More
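To give a taste of what such a consumer involves, the Scala sketch below reads from a Kafka topic with the 0.9 consumer API and hands each record to a processing callback. It is a generic illustration, not the Kamanja input adapter interface itself; the topic, group id, and processMessage hook are hypothetical.

    import java.util.{Arrays, Properties}
    import org.apache.kafka.clients.consumer.KafkaConsumer
    import scala.collection.JavaConverters._

    object ConsumerSketch {
      // Stand-in for the engine's message-processing entry point (hypothetical).
      def processMessage(msg: String): Unit = println(msg)

      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092") // assumed broker
        props.put("group.id", "kamanja-consumer")        // assumed consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")

        val consumer = new KafkaConsumer[String, String](props)
        consumer.subscribe(Arrays.asList("testin_1"))    // assumed topic

        // Poll loop: fetch batches of records and feed each one to the engine.
        while (true) {
          val records = consumer.poll(1000L)
          records.asScala.foreach(r => processMessage(r.value()))
        }
      }
    }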

Kafka Security

Important: Kamanja 1.5 requires Kafka 0.9.x. (Kamanja 1.5 cannot work with Kafka 0.8.x because of Kafka backward-compatibility issues.) Kamanja 1.5 uses the Kafka 0.9.x API with Kerberos enabled. Details: the Kafka simple consumer and producer now use the Kafka 0.9.x API. SSL client-to-broker security is now enabled. The Kafka configuration was modified to use SSL/Kerberos. This section explains … Read More
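To make the setup concrete, client-side security settings of the kind this section covers look roughly like the following standard Kafka 0.9 client properties. The file paths, passwords, and service name are placeholders, and which block applies depends on whether SSL or Kerberos is in use:

    # SSL client-to-broker encryption
    security.protocol=SSL
    ssl.truststore.location=/etc/kafka/client.truststore.jks
    ssl.truststore.password=changeit
    ssl.keystore.location=/etc/kafka/client.keystore.jks
    ssl.keystore.password=changeit

    # Kerberos (SASL) authentication
    security.protocol=SASL_PLAINTEXT
    sasl.kerberos.service.name=kafka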

Logical Partitions

The logical partitions feature is not visible to the user, but it improves the performance and scalability of Kamanja. To increase throughput and the amount of CPU actually used, Kamanja complements physical partitions with logical partitions. To execute Kamanja, an input adapter and an output adapter need to be created. If Kafka is chosen as … Read More