Automate Einstein Analytics Timezone Conversion

Einstein Analytics operates in the GMT time zone, which is considered a common standard across the world. But a common standard in analytics does not always deliver a customer-specific experience. Consider a user in a time zone other than GMT, for example in San Francisco, which is on Pacific Standard Time (PST). This user sees erroneous results in every visualization that displays Today or Current Day numbers. The reason is the lag between the time zones: since the user resides in PST while Einstein Analytics runs on GMT, GMT rolls over to the next day well before the local date changes in PST (it is only 5 pm in San Francisco when it is 12 am in GMT, the start of the next day).

This time zone lag causes major discrepancies in dashboards: either the data for daily loads is not yet in the system, or a date field such as the CloseDate of an Opportunity is inaccurate (the date is still in the future in local time zones west of GMT; for anything east of GMT it is already in the past). The objective of this blog post is to tackle these Current Date issues caused by time zone differences.

Date Field Limitations

To illustrate these limitations: when a date filter is set to Today or Current Day to check closed Opportunity deals, all KPIs and visualizations using that filter return null after 5 pm PST (4 pm when daylight saving is not in effect) because of the time zone difference.

[Image: date field limitations]

SpringML’s solution to this issue uses the Einstein Analytics Cloud Dataflow to automate the adjustment, or conversion, of today’s date. Addressing the issue in the Cloud Dataflow has the advantage of working at the dataset level, so the fix carries through to every individual lens and dashboard built on the dataset. It also avoids the extra load of endless SAQL lines at each step level, saving on dashboard performance.

Use Case Overview

The solution described here addresses the use case of revenue data loaded into Einstein Analytics every day. Company A has revenue data in an external system such as an SAP HANA database, and either EA (Einstein Analytics) Connectors or an ETL tool (such as MuleSoft or Dell Boomi) pushes that data to Einstein Analytics or Salesforce objects every day. Senior executives at Company A want to monitor revenue performance every day on an EA dashboard, but because of the time zone difference, every day after 5 pm the KPIs and other lenses on the dashboard show no results. Similar steps can be followed to address the CloseDate field on Opportunity for today’s closed deals, with a follow-up filter added to the dashboards or lenses (discussed at the end of this blog post).

This Cloud Dataflow solution is easy to deploy and involves creating just a handful of computed fields and date transformations. End users are not required to perform any additional tasks at the dashboard level; that is the benefit of working at the data level in the Cloud Dataflow. Note that this dataflow needs to run every hour, or at least twice a day: once any time before 5 pm PST and once at 5 pm.

Building the Solution in Cloud Dataflow

Four nodes need to be created in the Cloud Dataflow to take care of the data transformations. The starting node depends on where your data lives: use an edgemart node if it is an external/CSV source or an already registered dataset, and an sfdcDigest node if it is a Salesforce object, for example Opportunity.

Follow the four steps listed below even if your Cloud Dataflow has more than four nodes; just make sure you perform these transformations at the grain level of the dataset that contains the date field:

Step 1: Edgemart the revenue dataset (or digest the Opportunity object or any other object), making sure to include the date field that needs to be adjusted for the current date in Einstein Analytics. In this case, we are aiming to adjust the ‘Load Date’ of the revenue dataset.
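For reference, here is a minimal sketch of this starting node in the dataflow definition JSON, assuming an already registered dataset with the alias Revenue (node and alias names are illustrative; a Salesforce object would use an sfdcDigest node instead):

"Extract_Revenue": {
  "action": "edgemart",
  "parameters": {
    "alias": "Revenue"
  }
}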

Step 2: Create a computeExpression node with a computed date field called ‘Current_Date’ that uses the date function now() as its SAQL expression. Make sure the date format is yyyy-MM-ddTHH:mm:ss.SSSZ.

This generates a new computed date field called ‘Current_Date’ carrying the current timestamp in the GMT standard of Einstein Analytics.
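A minimal sketch of this node in the dataflow definition JSON, assuming the Step 1 node is named Extract_Revenue (names are illustrative):

"Compute_Current_Date": {
  "action": "computeExpression",
  "parameters": {
    "source": "Extract_Revenue",
    "mergeWithSource": true,
    "computedFields": [
      {
        "name": "Current_Date",
        "type": "Date",
        "format": "yyyy-MM-ddTHH:mm:ss.SSSZ",
        "saqlExpression": "now()"
      }
    ]
  }
}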

Step 3: Create another computeExpression node with the following computed fields (a consolidated sketch of the whole node follows the field definitions below).

Current_Schedule: A text field that carries the hour timestamp from the ‘Current_Date’ field. We extract the HH (hour) portion using the string function toString().

SAQL Expression: toString(toDate('Current_Date', "yyyy-MM-ddTHH:mm:ss.SSSZ"), "HH")

Time_Change: A computed numeric field that determines whether the time is past midnight (12 am) in GMT. This tells us whether the date has moved to the next day in the system.

SAQL Expression: case when 'Current_Schedule' in ["00","01","02","03","04","05","06"] then 1 else 0 end

LoadDate_Moved: A computed numeric field that gives the numeric value of LoadDate + 1, i.e., the LoadDate moved to tomorrow. We attain this by adding 86400 (24 × 60 × 60 seconds, one day) to LoadDate_sec_epoch.

Note: _sec_epoch captures the number of seconds elapsed since January 1, 1970. This gives us the numeric equivalent of a date.

SAQL Expression: 'LoadDate_sec_epoch' + 86400

Adjusted_LoadDate: Logic that determines whether the Current_Date is past midnight in GMT (so technically the next day relative to the user’s time zone) and whether the difference between the current date in GMT and the LoadDate in PST is exactly 1. When both conditions hold, we move the LoadDate to the next day; otherwise it remains intact.

SAQL Expression: case when 'Time_Change' == 1 && 'Current_Date_day_epoch' - 'LoadDate_day_epoch' == 1 then 'LoadDate_Moved' else 'LoadDate_sec_epoch' end

Note: _day_epoch captures the number of days passed since January 1, 1970. The difference between the day_epochs of two dates is the number of days between them.

LoadDate_New: Finally, we create a new LoadDate field that is adjusted for the time zone difference, i.e., after 5 pm PST every day this date field is pushed to tomorrow (if LoadDate is Aug 6, 2019, after 5 pm this new field transforms the date to Aug 7, 2019).

SAQL Expression: toDate('Adjusted_LoadDate')
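Putting Step 3 together, the computeExpression node might look like the sketch below. The node and source names are illustrative, and the computed fields reference one another in the order they are defined:

"Compute_Adjusted_LoadDate": {
  "action": "computeExpression",
  "parameters": {
    "source": "Compute_Current_Date",
    "mergeWithSource": true,
    "computedFields": [
      { "name": "Current_Schedule", "type": "Text",
        "saqlExpression": "toString(toDate('Current_Date', \"yyyy-MM-ddTHH:mm:ss.SSSZ\"), \"HH\")" },
      { "name": "Time_Change", "type": "Numeric", "precision": 1, "scale": 0,
        "saqlExpression": "case when 'Current_Schedule' in [\"00\",\"01\",\"02\",\"03\",\"04\",\"05\",\"06\"] then 1 else 0 end" },
      { "name": "LoadDate_Moved", "type": "Numeric", "precision": 18, "scale": 0,
        "saqlExpression": "'LoadDate_sec_epoch' + 86400" },
      { "name": "Adjusted_LoadDate", "type": "Numeric", "precision": 18, "scale": 0,
        "saqlExpression": "case when 'Time_Change' == 1 && 'Current_Date_day_epoch' - 'LoadDate_day_epoch' == 1 then 'LoadDate_Moved' else 'LoadDate_sec_epoch' end" },
      { "name": "LoadDate_New", "type": "Date", "format": "yyyy-MM-ddTHH:mm:ss.SSSZ",
        "saqlExpression": "toDate('Adjusted_LoadDate')" }
    ]
  }
}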

Step 4: Register the new dataset with the new date field ‘LoadDate_New’. We can also add a sliceDataset node to the Cloud Dataflow to drop all the other computed fields, since we won’t be needing them anymore; this also gives end users a cleaner dataset.
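A sketch of these final nodes, again with illustrative node names and dataset alias:

"Drop_Helper_Fields": {
  "action": "sliceDataset",
  "parameters": {
    "source": "Compute_Adjusted_LoadDate",
    "mode": "drop",
    "fields": [
      { "name": "Current_Date" },
      { "name": "Current_Schedule" },
      { "name": "Time_Change" },
      { "name": "LoadDate_Moved" },
      { "name": "Adjusted_LoadDate" }
    ]
  }
},
"Register_Revenue_Adjusted": {
  "action": "sfdcRegister",
  "parameters": {
    "source": "Drop_Helper_Fields",
    "alias": "Revenue_Adjusted",
    "name": "Revenue Adjusted"
  }
}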

Use this new date field in all lenses and steps that need to be filtered on the date field set to ‘Current Day’ or Today. It behaves like the original date until the time zone rollover occurs, then adjusts itself one day into the future so these visualizations show results instead of nulls, as sketched below.
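A minimal SAQL sketch of such a step, assuming the registered dataset alias is Revenue_Adjusted and the measure is Amount (both illustrative):

q = load "Revenue_Adjusted";
-- filter on the adjusted date field instead of the original LoadDate
q = filter q by date('LoadDate_New_Year', 'LoadDate_New_Month', 'LoadDate_New_Day') in ["current day".."current day"];
q = group q by all;
q = foreach q generate sum('Amount') as 'Total_Revenue';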

Additional Use Cases (Adding Filters for Opportunity CloseDate)

We can use this logic without any further updates for date fields that do not carry future dates (if today is Aug 6, 2019, the field should not have any dates after Aug 6), such as the ‘LoadDate’ we used in the use case above, which deals with the daily update of data.

If you are using the CloseDate field of Opportunity, which also has future dates, make sure you add filter logic to your lenses and dashboards so that we don’t double count the Opportunities whose CloseDate actually is tomorrow (when the CloseDate really is Aug 7, 2019). For this we need additional filter logic where CloseDate_New equals today and CloseDate does not equal today, as shown below:
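A sketch of that filter in SAQL, assuming the adjusted field is named CloseDate_New (dataset and field names are illustrative):

q = load "Opportunities_Adjusted";
-- keep rows whose adjusted CloseDate is today but whose original CloseDate is not,
-- so deals genuinely closing tomorrow are not double counted after the rollover
q = filter q by date('CloseDate_New_Year', 'CloseDate_New_Month', 'CloseDate_New_Day') in ["current day".."current day"]
    && !(date('CloseDate_Year', 'CloseDate_Month', 'CloseDate_Day') in ["current day".."current day"]);
q = group q by all;
q = foreach q generate count() as 'Closed_Today';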

Want More Information?

If you are looking for additional Salesforce resources, check out our other blog posts.

If you still have questions, drop us a line at info@SpringML.com or tweet us @springmlinc.