Custom Node Development & Deployment¶
Fire Insights follows an open and extensible architecture allowing developers to add custom nodes that can be exposed in Fire UI and embedded into workflows.
The details for building new nodes are available at the URL below:
Examples of more complex nodes are at the URL below:
Start by cloning the GitHub repo: writing-new-node¶
The easiest way to start writing a new node or processor is by cloning the writing-new-node repo using the command below:
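Assuming the repo lives under the sparkflows GitHub organization (the exact URL is an assumption; check the Fire Insights docs for the canonical location), the clone command would look like:

```shell
# Clone the starter repo for writing a new node
# (repo URL is an assumption -- verify against the official docs)
git clone https://github.com/sparkflows/writing-new-node.git
cd writing-new-node
```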
Code the new custom node¶
- Start by creating a new class that extends the Fire node base class.
- Override the execute() method to write custom code. The execute() method transforms the incoming DataFrame and then passes on the resulting DataFrame to the output node(s).
- In case this new node creates a new DataFrame by reading data from a Data Source, the incoming DataFrame would be null. The new node will create a new DataFrame from the data directory of the Data Source. Examples of data sources include:
- Files on HDFS
- HIVE tables
- HBase tables
- Salesforce / Marketo
- If the node updates the incoming schema, also override the corresponding schema method.
Examples of Custom Nodes:
There is a minor difference between the code for Apache Spark 1.6.X and Apache Spark 2.X: DataFrame is used for Apache Spark 1.6.X, while Dataset<Row> is used for Apache Spark 2.X.
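The steps above can be sketched in Scala for Spark 2.X. This is a minimal illustration only: the base class name (NodeDataset), the JobContext type, the getDataFrame/passDataFrameToNextNodes helpers, and the field name are all assumptions — match them against the actual classes in the writing-new-node repo.

```scala
import org.apache.spark.sql.{Dataset, Row}
import org.apache.spark.sql.functions.upper

// Hypothetical custom node that uppercases the values of one column.
// NodeDataset and JobContext are assumed names for the Fire base class
// and execution context -- verify them in the writing-new-node repo.
class UppercaseColumnNode extends NodeDataset {

  // Field populated from the node's JSON dialog box (name is an assumption)
  var inputColName: String = _

  override def execute(jobContext: JobContext): Unit = {
    // Get the incoming Dataset<Row>, transform it ...
    val in: Dataset[Row] = jobContext.getDataFrame(this)
    val out = in.withColumn(inputColName, upper(in(inputColName)))
    // ... and pass the resulting Dataset on to the output node(s)
    passDataFrameToNextNodes(jobContext, out)
  }
}
```

For a source-type node (no incoming DataFrame), execute() would instead build a new Dataset with the SparkSession and pass it on in the same way.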
Create the node JSON file¶
Create the JSON file for the new node. The JSON file is used for displaying the new node in the Workflow Editor and capturing the user inputs for the various fields of the node through a dialog box. The JSON for the node also captures the name of the Java/Scala class which has the implementation code for the node.
Fire supports various widget types for capturing the details of the fields from the user through the Node Dialog Box.
The details of the various widget types are available at the hyperlink below:
Examples of Node JSON:
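A node JSON file might look like the sketch below. Every key and widget name here is an assumption made for illustration (only the general shape — node metadata, implementing class, and a list of fields with widgets — is described above); compare against the real examples in the repo.

```json
{
  "name": "UppercaseColumn",
  "description": "Uppercases the values of a column",
  "nodeClass": "fire.nodes.custom.UppercaseColumnNode",
  "fields": [
    {
      "name": "inputColName",
      "value": "",
      "widget": "textfield",
      "title": "Input Column",
      "description": "Column whose values will be uppercased"
    }
  ]
}
```

The nodeClass entry is what ties the dialog box back to the Java/Scala implementation, and each entry in fields becomes one widget in the Node Dialog Box.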
Deploy the Custom Node in the Fire Server¶
Now that you have created a new node, follow the steps below to deploy it:
- Create a jar file with mvn clean package
- Copy the jar file created above into the fire-user-lib directory of sparkflows
- Place the JSON file for the new node under the appropriate directory.
- Restart the Fire Server.
- The new node will be picked up by the Fire Server and become visible in the Workflow Editor.
- Check that the new node is available as expected in the node list.
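Taken together, the deployment steps might look like the shell sketch below. The install path, jar name, and JSON directory are assumptions for illustration; substitute the paths of your sparkflows installation.

```shell
# Build the jar containing the custom node
mvn clean package

# Copy the jar into fire-user-lib (install path is an assumption)
cp target/my-custom-nodes-1.0.jar /opt/sparkflows/fire-user-lib/

# Place the node JSON under the node-definitions directory (path is an assumption)
cp uppercase-column-node.json /opt/sparkflows/nodes/

# Restart the Fire Server so the new node is picked up
```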
Use the custom node in spark-submit when running on the Spark cluster¶
- Include the custom node jar with --jars <...> when running the workflow on the cluster
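As an illustration, only the --jars flag comes from the step above; the jar path, runner jar, and workflow options are placeholders to show where the custom node jar fits into a spark-submit invocation:

```shell
# Ship the custom node jar to the cluster alongside the workflow run
spark-submit \
  --jars /opt/sparkflows/fire-user-lib/my-custom-nodes-1.0.jar \
  <fire-workflow-runner.jar> <workflow-options...>
```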