Spark connector for Salesforce Wave!

We are delighted and excited to share that we have published the Spark connector for Salesforce Wave! This connector makes it easy to run machine learning algorithms in Spark and push the results to Salesforce Wave.

Highlights of the connector:

  1. A packaged library that you can deploy in your Spark clusters.
  2. Java, Scala, and Python APIs for pushing the output of a DataFrame to Salesforce Wave (see the sketch after this list).
  3. Read files from AWS S3 and push them to Salesforce Wave.
  4. Push the output of MLlib models into Salesforce Wave; the same applies to SparkR, so the result of an R job can be pushed to Wave just as easily.
  5. Uses the Spark API and can run concurrently to push large files; batching and data sequencing are built in.
  6. Easily import the library into Databricks Cloud.
  7. Automated metadata JSON generation so datasets can be pushed easily.
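
To make items 2 and 3 concrete, here is a minimal sketch that reads a file from AWS S3 into a DataFrame and pushes it to a Wave dataset. The data source name, option keys, S3 path, and dataset name are illustrative assumptions; the exact values are in the README on spark-packages.org.

```scala
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)  // sc: the existing SparkContext

// Read a CSV file staged on AWS S3 (any DataFrame can be pushed the same way).
// The path is hypothetical, and the CSV reader needs the spark-csv package.
val leads = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .load("s3n://my-bucket/leads.csv")

// Push the DataFrame into a Salesforce Wave dataset. The data source name and
// option keys are assumptions; the metadata JSON for the dataset is generated
// by the connector automatically.
leads.write
  .format("com.springml.spark.salesforce")
  .option("username", sys.env("SF_USERNAME"))
  .option("password", sys.env("SF_PASSWORD"))  // password followed by the security token
  .option("datasetName", "LeadsFromS3")
  .save()
```

Once the library is attached to a cluster in Databricks Cloud, the same snippet works there unchanged.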

If you have read this far, thank you for your time! You may be wondering what the value is for a business analyst.

  • Using these models, we can easily perform predictive analytics, e.g. lead scoring, customer segmentation, and propensity to buy (a lead-scoring sketch follows this list).
  • Leverage the vast set of libraries built by the growing Spark community and by R users. Several models, such as Random Forest, regression, GLM, ARIMA, and RFM, already come with packaged R code that we can reuse quickly.
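
As a hedged illustration of the lead-scoring use case above, the sketch below trains a Spark ML logistic regression model on historical leads and pushes the scored output to Wave. The DataFrames (historicalLeads, newLeads), the column names, and the Wave options are assumptions made for the example, not part of the connector.

```scala
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler

// Assemble numeric lead attributes into a single feature vector column.
// historicalLeads / newLeads and their columns are hypothetical.
val assembler = new VectorAssembler()
  .setInputCols(Array("emailOpens", "webVisits", "dealSize"))
  .setOutputCol("features")

// Train on leads whose outcome is known ("converted" is a 0/1 label column).
val model = new LogisticRegression()
  .setLabelCol("converted")
  .setFeaturesCol("features")
  .fit(assembler.transform(historicalLeads))

// Score fresh leads and keep only the columns the analyst needs.
val scored = model.transform(assembler.transform(newLeads))
  .select("leadId", "prediction")

// Push the scores straight into a Wave dataset (same assumed options as above).
scored.write
  .format("com.springml.spark.salesforce")
  .option("username", sys.env("SF_USERNAME"))
  .option("password", sys.env("SF_PASSWORD"))
  .option("datasetName", "LeadScores")
  .save()
```

The same pattern applies to any MLlib or SparkR output: as long as the result lands in a DataFrame, it can be written to Wave with the same handful of options.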

More details can be found here

Spark-packages.org – Wave, with README instructions. We will be posting a usage video soon.

We have open-sourced the connector so that others can also benefit from this package. Feel free to reach out to us if you have any questions about usage or would like to see a demo of the connector.

More to come in the next iteration:

  1. Ability to execute Salesforce SAQL directly in Spark.
  2. Ability to execute Salesforce SOQL directly in Spark.