A full-featured web-based Apache Kafka consumer. Kafka WebView presents an easy-to-use web based interface for reading data out of kafka topics and providing basic filtering and searching capabilities.

Setup

1. Setup users

You first need to configure who has access to Kafka WebView. Kafka WebView provides two roles for users: Admin and User.

Admin users can manage and configure every aspect of WebView, including defining Kafka Clusters, adding and removing users, and defining Views.
Users with the User role can view Cluster information and consume Views.

If you've logged in with the Default Admin account, you'll want to create your own Administrator user account and remove the default one.

Setup Users

2. Connect Kafka clusters.

You'll need to tell WebView which Kafka clusters you want to view data from.

Connecting with SSL
WebView supports connecting to Clusters using SSL. You'll need to follow the standard Kafka consumer client directions to create a Java Key Store (JKS) for your Trusted CA (TrustStore), and a JKS for your Consumer Key (KeyStore).
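The TrustStore and KeyStore you create map onto the standard Kafka client SSL properties. As a sketch, the equivalent consumer configuration looks like the following; the file paths and passwords are placeholders for illustration only:

```java
import java.util.Properties;

public class SslConsumerConfig {
    public static Properties build() {
        Properties props = new Properties();
        // Standard Kafka client SSL settings; paths and passwords
        // below are placeholders, not real values.
        props.put("security.protocol", "SSL");
        props.put("ssl.truststore.location", "/path/to/kafka.truststore.jks");
        props.put("ssl.truststore.password", "truststore-password");
        props.put("ssl.keystore.location", "/path/to/kafka.keystore.jks");
        props.put("ssl.keystore.password", "keystore-password");
        props.put("ssl.key.password", "key-password");
        return props;
    }
}
```

WebView collects these same pieces of information (the two JKS files and their passwords) through its cluster setup screen.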

Setup Clusters

3. Configure custom Message Formats. (Optional)

Kafka allows you to store data in the cluster in any format and provides a Deserializer interface for converting those bytes back into objects. Out of the box, Kafka WebView supports the following Deserializers, which can be used for both Keys and Values:

  • ByteArray
  • Bytes
  • Double
  • Float
  • Integer
  • Long
  • Short
  • String

Often, data is stored in a custom format such as Avro or Protocol Buffers. Admin users can upload a JAR containing custom Deserializer implementations to extend WebView's support.

Read more about implementing Deserializers.

Setup Message Formats

4. Configure Filters. (Optional)

Filters are a construct unique to WebView. A Filter implements an Interface that runs on the server side to filter messages coming from Kafka. Filtering on the server has several benefits: you can use Filters as a simple search and avoid sending large amounts of data to the client web browser when you're looking for a small subset of messages, and you can use them to enforce a restricted view of the data in a Topic.

Read more about implementing Filters.

Setup Filters

5. Define Views.

Views are the last step where you put everything together. Views let you configure what Topic you want to consume from, configure which Message Formats the Topic uses, and apply any Filters.

Setup Views

Writing Custom Deserializers

The Deserializer Interface is provided by Kafka; WebView requires nothing beyond implementing this interface. If you already have a Deserializer implementation for consuming from Kafka, you can use it as-is.

If you don't already have an implementation, the interface looks as follows:

    /**
     * An interface for converting bytes to objects.
     *
     * A class that implements this interface is expected to have a constructor with no parameters.
     * <p>
     * Implement {@link org.apache.kafka.common.ClusterResourceListener} to receive cluster metadata once it's available. Please see the class documentation for ClusterResourceListener for more information.
     *
     * @param <T> Type to be deserialized into.
     */
    public interface Deserializer<T> extends Closeable {
        /**
         * Configure this class.
         * @param configs configs in key/value pairs
         * @param isKey whether is for key or value
         */
        void configure(Map<String, ?> configs, boolean isKey);

        /**
         * Deserialize a record value from a byte array into a value or object.
         * @param topic topic associated with the data
         * @param data serialized bytes; may be null; implementations are recommended to handle null by returning a value or null rather than throwing an exception.
         * @return deserialized typed data; may be null
         */
        T deserialize(String topic, byte[] data);

        @Override
        void close();
    }
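As a sketch of what an implementation might look like, here is a hypothetical deserializer that decodes record bytes as UTF-8 text. The interface is reproduced in compact form so the example is self-contained; in a real project you would implement `org.apache.kafka.common.serialization.Deserializer` from the kafka-clients dependency instead, and `Utf8StringDeserializer` is an illustrative name:

```java
import java.io.Closeable;
import java.nio.charset.StandardCharsets;
import java.util.Map;

// Compact copy of the Deserializer interface shown above, included only
// so this sketch compiles on its own.
interface Deserializer<T> extends Closeable {
    void configure(Map<String, ?> configs, boolean isKey);
    T deserialize(String topic, byte[] data);
    void close();
}

// Hypothetical example: decode each record's bytes as UTF-8 text.
public class Utf8StringDeserializer implements Deserializer<String> {
    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // No configuration needed for this example.
    }

    @Override
    public String deserialize(String topic, byte[] data) {
        // Return null for null input rather than throwing,
        // as the interface documentation recommends.
        return (data == null) ? null : new String(data, StandardCharsets.UTF_8);
    }

    @Override
    public void close() {
        // No resources to release.
    }
}
```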

Example Project

To get up and running quickly, the Kafka-WebView-Examples project on GitHub can be cloned and used as a template. This Maven-based example project is configured with all of the correct dependencies and has a few example implementations.


Packaging a Jar

If you're using the Kafka-WebView-Examples project, it should be as simple as issuing the command `mvn package` and retrieving the compiled Jar from the target/ directory.

If you're building from your own project, you'll need to package a Jar that contains your implementation along with any of its required dependencies.


Writing Custom Filters

Filters run on the server side to filter messages coming from Kafka before they reach the browser, as described in section 4 above. The RecordFilter Interface is provided by Kafka WebView and is NOT part of the standard Kafka library. The interface looks as follows:

/**
 * Interface that defines a Record Filter.
 */
public interface RecordFilter {
    /**
     * Define names of configurable options.
     * These names will be passed up to the User Interface and allow the user to define them.
     * When configure() is called below, these same names will be returned, along with the user defined values,
     * in the filterOptions argument.
     *
     * Since the UI provides no validation on these user defined values, best practices dictate that your implementation
     * should gracefully handle when these are not set.
     *
     * @return Set of option names.
     */
    default Set<String> getOptionNames() {
        return new HashSet<>();
    }

    /**
     * Configure this class.
     * @param consumerConfigs Consumer configuration in key/value pairs
     * @param filterOptions User defined filter options.
     */
    void configure(final Map<String, ?> consumerConfigs, final Map<String, String> filterOptions);

    /**
     * Define the filter behavior.
     * A return value of TRUE means the record WILL be shown.
     * A return value of FALSE means the record will NOT be shown.
     *
     * @param topic Name of topic the message came from.
     * @param partition Partition the message came from.
     * @param offset Offset the message came from.
     * @param key Deserialized Key object.
     * @param value Deserialized Value object.
     * @return True means the record WILL be shown.  False means the record will NOT be shown.
     */
    boolean includeRecord(final String topic, final int partition, final long offset, final Object key, final Object value);

    /**
     * Called on closing.
     */
    void close();
}
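As a sketch of what an implementation might look like, here is a hypothetical filter that only includes records whose value contains a user-supplied search term. The interface is reproduced in compact form so the example is self-contained; in a real JAR you would implement WebView's own RecordFilter, and `ContainsStringFilter` and its `searchTerm` option are illustrative names:

```java
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Compact copy of the RecordFilter interface shown above, included only
// so this sketch compiles on its own.
interface RecordFilter {
    default Set<String> getOptionNames() { return new HashSet<>(); }
    void configure(Map<String, ?> consumerConfigs, Map<String, String> filterOptions);
    boolean includeRecord(String topic, int partition, long offset, Object key, Object value);
    void close();
}

// Hypothetical example: only show records whose value contains a
// user-defined "searchTerm" option.
public class ContainsStringFilter implements RecordFilter {
    private String searchTerm = "";

    @Override
    public Set<String> getOptionNames() {
        // Exposed to the UI so a user can supply a value at view time.
        Set<String> names = new HashSet<>();
        names.add("searchTerm");
        return names;
    }

    @Override
    public void configure(Map<String, ?> consumerConfigs, Map<String, String> filterOptions) {
        // Handle a missing option gracefully, as the docs recommend.
        String term = filterOptions.get("searchTerm");
        searchTerm = (term == null) ? "" : term;
    }

    @Override
    public boolean includeRecord(String topic, int partition, long offset, Object key, Object value) {
        // TRUE means the record WILL be shown.
        return value != null && value.toString().contains(searchTerm);
    }

    @Override
    public void close() {
        // No resources to release.
    }
}
```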

Example Project

To get up and running quickly, the Kafka-WebView-Examples project on GitHub can be cloned and used as a template. This Maven-based example project is configured with all of the correct dependencies and has a few example implementations.


Packaging a Jar

If you're using the Kafka-WebView-Examples project, it should be as simple as issuing the command `mvn package` and retrieving the compiled Jar from the target/ directory.

If you're building from your own project, you'll need to package a Jar that contains your implementation along with any of its required dependencies.





