Author: Lucas Wood

I am unable to explain to my family and friends what I do for a living. ---DataShopTalk editor, admin, and data analyst.

Updating your Desktop

If you're having trouble logging in, that's a good problem! We've updated the server for security, and you'll now need to log in using a new address. Follow along and we'll get you set back up.

This is the first error you would have encountered; we’ll update the server name so that you can log in again. Press Manage Servers to update the server location.

Click Edit.

Change the server address to match the one shown above.

Once you log in, the client will update, and then you should see this screen:

Congrats! You made it through.

Turning Daily Data into Monthly

For the DCA Wrangler, you'll need your data organized into monthly values, which is tricky if you're working with daily production data. The conversion, however, is simple in Spotfire. Follow these steps to convert daily to monthly:

  1. Select File > Add Data Tables…
  2. Select Source and choose your existing Daily production data from the current analysis

Before you press OK, we need to add some transformations:

  1. First, add a Calculate new column transformation. We need something to aggregate by, and in this case it's ProductionMonth. Use the expression: Date(Year([Date]),Month([Date]),1)
  2. Now add the Pivot:
    1. For your Row identifiers, choose your newly created ProductionMonth.
    2. For your Value columns and aggregation methods, select Sum(BOPD).
    3. Set your Column naming pattern to %V.
    4. If you have metadata to pass through, such as well name or API, add it to your Transfer columns and aggregation methods.
  3. Review the data and press OK.
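If it helps to see the logic outside of Spotfire, here is a minimal Python sketch of the same aggregation. The sample rows are made up; the key step mirrors the ProductionMonth expression, Date(Year([Date]),Month([Date]),1), which truncates each date to the first day of its month before summing.

```python
from collections import defaultdict
from datetime import date

# Hypothetical daily production rows: (date, BOPD) pairs.
daily = [
    (date(2023, 1, 1), 100.0),
    (date(2023, 1, 2), 98.0),
    (date(2023, 2, 1), 95.0),
]

# Equivalent of the ProductionMonth calculated column:
# truncate each date to the first day of its month.
monthly = defaultdict(float)
for d, bopd in daily:
    production_month = date(d.year, d.month, 1)
    monthly[production_month] += bopd  # Sum(BOPD) per month, like the Pivot

print(sorted(monthly.items()))
# [(datetime.date(2023, 1, 1), 198.0), (datetime.date(2023, 2, 1), 95.0)]
```

The pivot's Row identifiers play the role of the dictionary key here, and Sum(BOPD) is the running addition per key.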

You’re all set! Repeat the process to include GAS or NGLs.


Guest Blog: Limiting by a Property Control for Multiple Axes

Avi Levy is a Business Support Analyst for Roche Products Ltd. up in the UK! Today Avi will be sharing some lessons learned regarding Spotfire Controls and Document Properties.

If you’ve mastered property controls to the point where you’re able to manipulate the $esc and $map functions with ease, look away now, as this post is for someone like me who struggled with this initially. There are several different forums and websites which give great snippets of information about doing X or Y in Spotfire, but personally I sometimes find them really hard to understand as they can be brief to the point of mystifying, and so it can be really frustrating to learn how to do a very specific thing. And that’s what I’m going to show you – selecting multiple items in a list box filter and them all showing up on the axis you want.


Gah, Encoding in Spotfire!

Are you building Windows paths in Calculated Columns? Are slashes in general messing you up? Consider this workaround for using the slash character in your calculated expressions:

"\\\\servername\\folder"&NameDecode("%2F")&[Filename Column]

Let’s take this piece by piece.

  • We need two backslashes for a server path, \\servername, but the backslash happens to be an escape character. This comes up in most programming languages, and it bites here because Windows paths use backslashes. In a Spotfire expression, you need \\\\ to escape each escape character.
  • For regular folder separators, you'll just need two backslashes: \\folder.
  • Finally, you'll notice that if you try to put a backslash at the end of your String, you'll inadvertently escape your closing quote. In Excel, this is a case for the CHAR() function, but in Spotfire we don't deal in those data types. A workaround is the NameDecode() function. This is the guidance from Spotfire Help:

    Replaces all substring codes with decoded characters.

    Column names in TIBCO Spotfire are stored as UTF-16 encoded strings, while variable names in TIBCO Spotfire Statistics Services are built from 8-bit ASCII characters matching [.0-9a-zA-Z] or ASCII strings enclosed in grave accents. Therefore the column names that are sent to TIBCO Spotfire Statistics Services must be encoded. Column names received from TIBCO Spotfire Statistics Services are automatically decoded by the built-in data functions output handlers. This function can be used to decode results that have not been automatically decoded.
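The same escape-character behavior shows up in Python string literals, so a quick sketch may make the bullets above concrete. The server, folder, and filename below are made up:

```python
# Doubled backslashes collapse to single ones in a string literal,
# just as in a Spotfire expression.
server_path = "\\\\servername\\folder"
print(server_path)  # \\servername\folder

# A literal can't end in a lone unescaped backslash ("...\" would
# escape the closing quote), so append the separator as data instead,
# which is the role NameDecode() plays in the Spotfire workaround.
filename = "well_data.csv"  # hypothetical [Filename Column] value
full_path = server_path + "\\" + filename
print(full_path)  # \\servername\folder\well_data.csv
```

In Python you could also sidestep this with a raw string like r"\\servername\folder"; Spotfire expressions have no raw-string syntax, hence the NameDecode() trick.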

Hope that helps, enjoy!