Author: Lucas Wood

I am unable to explain to my family and friends what I do for a living. DataShopTalk, editor. Exchange.ai, admin. Ruths.ai, data analyst.

Guest Blog: Limiting by a Property Control for Multiple Axes

Avi Levy is a Business Support Analyst for Roche Products Ltd. up in the UK! Today Avi will be sharing some lessons learned regarding Spotfire Controls and Document Properties.

If you’ve mastered property controls to the point where you can manipulate the $esc and $map functions with ease, look away now; this post is for someone like me who struggled with them initially. There are several forums and websites that give great snippets of information about doing X or Y in Spotfire, but I sometimes find them really hard to follow, as they can be brief to the point of mystifying, which makes it frustrating to learn how to do one very specific thing. And that’s what I’m going to show you: selecting multiple items in a list box property control and having them all show up on the axis you want.
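
As a taste of the pattern, here is a minimal sketch; the property name myCols is hypothetical, and the setup assumes a multiple-select list box property control whose values are column names. On the axis, use the custom expression:

$map("$esc(${myCols})", ",")

$esc() turns each selected value into a safely bracketed column reference, and $map() joins the results with commas, so selecting two columns puts something like [Col A],[Col B] on the axis.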

Read More

Gah, Encoding in Spotfire!

Are you building Windows paths in Calculated Columns? Are backslashes messing you up? Consider this workaround for using the backslash character in your calculated expressions:

"\\\\servername\\folder"&NameDecode("%5C")&[Filename Column]

Let’s take this piece by piece.

  • We need two backslashes for a server path, \\servername, but the backslash happens to be the escape character. This is a problem in almost every programming language, and it comes up here because Windows paths use that particular character. In a Spotfire expression you need \\\\, because each backslash must itself be escaped.
  • For regular folder separators, you’ll just need two backslashes for \\folder.
  • Finally, you’ll notice that if you try to put a backslash at the very end of your string, you’ll inadvertently escape your closing quote. In Excel this is a case for the CHAR() function, but Spotfire expressions have no equivalent. A workaround is the NameDecode() function; a worked example follows the quoted Help text below. This is the guidance from Spotfire Help:

    Replaces all substring codes with decoded characters.

    Column names in TIBCO Spotfire are stored as UTF-16 encoded strings, while variable names in TIBCO Spotfire Statistics Services are built from 8-bit ASCII characters matching [.0-9a-zA-Z] or ASCII strings enclosed in grave accents. Therefore the column names that are sent to TIBCO Spotfire Statistics Services must be encoded. Column names received from TIBCO Spotfire Statistics Services are automatically decoded by the built-in data functions output handlers. This function can be used to decode results that have not been automatically decoded.
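
Putting it together: assuming NameDecode() understands the same %XX hex codes that NameEncode() produces, "%5C" decodes to a single backslash. So with a hypothetical [Filename Column] value of report.txt, the expression

"\\\\servername\\folder"&NameDecode("%5C")&[Filename Column]

evaluates to \\servername\folder\report.txt.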

Hope that helps, enjoy!

Connecting to APIs from Spotfire

I just wanted to share the development journey of trying to build a template. I wanted to create a WordPress connector for DataShopTalk so that I could download all the articles and get a good idea of the content of the blog. Connecting directly to the MySQL database was not really ideal, so I wanted to set up an API from WordPress that a data function could connect to.
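
To give you a flavor of where this ends up, here is a minimal TERR sketch, assuming the site exposes the standard WordPress REST API (/wp-json/wp/v2/posts) and that the RCurl and jsonlite packages load in your TERR engine; the URL below is illustrative:

library(RCurl)     # HTTP requests
library(jsonlite)  # JSON parsing

# Pull one page of posts from the WordPress REST API (URL is an example)
raw <- getURL("http://datashoptalk.com/wp-json/wp/v2/posts?per_page=100")
posts <- fromJSON(raw)  # parse the JSON array into a data frame

# Output table for Spotfire: one row per article
articles <- data.frame(title = posts$title$rendered,
                       date  = posts$date,
                       link  = posts$link,
                       stringsAsFactors = FALSE)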

Read More

Guest Blog: Creating Average Lines for a Data Set

Anna Smith is an Engineering Technician at Continental Resources up in Oklahoma. Today she will be sharing her journey creating average lines using TERR.

I had often been asked for average lines on line charts: showing the average of a data set alongside each individual line in that data set. I kept trying to pull it off with calculated columns and formatting tricks, but eventually concluded that Spotfire just doesn’t give us an easy or clean way to do this. So the idea of using TERR came into play. In my example, we wanted to compare production over time to the average over time for a chosen well set, and we wanted it to be dynamic, i.e., if we change the selected well set, then our calculated average line needs to change too. Our TERR code, then, needed to subset the data by day, calculate an average for that day, and spit out a new value. An important note: the function given at the end of the article takes days or months as input, which means that if your data set has just dates and production numbers, you need to normalize all those dates back to time zero.
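
To make that concrete, here is a minimal TERR sketch of the idea, not the actual function from the article; the input table wells and its columns WellID, Date and Production are assumptions:

# Input: a "wells" table already limited to the selected well set,
# with columns WellID, Date and Production
wells$Date <- as.Date(wells$Date)

# Normalize each well back to time zero: days since that well's first date
wells$NormDay <- ave(as.numeric(wells$Date), wells$WellID,
                     FUN = function(d) d - min(d))

# Subset by normalized day and average production across the wells
avg.line <- aggregate(Production ~ NormDay, data = wells, FUN = mean)
names(avg.line)[2] <- "AvgProduction"  # returned to Spotfire as the average line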

Read More