Tag: terr

TERR Errors with tidyr Package

  • Have you had problems with TERR since the rollout of TERR 4.4 in Spotfire 7.11?
  • Are you running into compatibility problems between package versions?
  • Would you like to be able to remove the default package installation and install a specific version of a package?
  • Are you getting the error "Error in .BuiltIn("on.exit") : the S language function 'on.exit' is not stored as a variable value, and can only be accessed by evaluating source expressions containing 'on.exit' statements" when trying to use the tidyr R package?

Read More

Guest Spotfire blogger residing in Whitefish, MT.  Working for SM Energy’s Advanced Analytics and Emerging Technology team!

Spatial Objects Using TERR

A key part of analytics in the oil and gas industry is evaluating opportunities at different locations. Space is always present when looking for profitable development projects. We usually look at the wells already in production and try to find spatial trends. To stay competitive, we need better ways to access the data for different areas and their wells. For instance, we can transform the spatial information into compact objects that store the location and shape of each well and lease. These objects can then be fed into different calculations and analyses as geometries. This also has advantages in Spotfire: you can use the feature layers of the map chart to visualize the leases as polygons and the wells as lines.
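To make the idea concrete, here is a minimal base-R sketch of turning well and lease coordinates into compact geometry strings (WKT). The column names and coordinates are illustrative, not from the article; the actual template may build geometries differently.

```r
# A deviated well as an ordered set of surface-to-bottom coordinates
# (illustrative lon/lat values).
well <- data.frame(X = c(-105.10, -105.11, -105.12),
                   Y = c(44.20, 44.21, 44.22))

# A rectangular lease described by its corner points (a closed ring:
# the first point is repeated at the end).
lease <- data.frame(X = c(-105.2, -105.0, -105.0, -105.2, -105.2),
                    Y = c(44.1, 44.1, 44.3, 44.3, 44.1))

# Join "X Y" pairs into a comma-separated coordinate list.
coords_to_wkt <- function(df) paste(df$X, df$Y, collapse = ", ")

well_wkt  <- sprintf("LINESTRING (%s)", coords_to_wkt(well))   # well as a line
lease_wkt <- sprintf("POLYGON ((%s))", coords_to_wkt(lease))   # lease as a polygon
```

Each well or lease collapses to a single value, so a whole field becomes one compact geometry column that a map chart feature layer can render.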

Read More

TERR Scripts to Read/Write to MS Access

Ruths.ai recently published a free template on the Ruths.ai Exchange that reads and writes data from/to MS Access.  Under the covers, you’ll find two property controls and two data functions working with the RODBC package.  Now, we know that templates are good, but being able to replicate the work is better.  Users want to be able to recreate that functionality in their own files, which is why I am writing this post to explain the code and how everything fits together so you can recreate it in your own DXP files.  Before reading any further, use this link to download a copy of the template and familiarize yourself with how it works.
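The core RODBC pattern looks roughly like the sketch below. The function names, file path, and table name here are illustrative stand-ins, not the template's actual property-control values; the template wires these pieces into Spotfire data functions.

```r
# Read rows from an Access database via an SQL query.
read_access <- function(db_path, query) {
  ch <- RODBC::odbcConnectAccess2007(db_path)  # open the .accdb via the Access ODBC driver
  on.exit(RODBC::odbcClose(ch))                # always release the connection
  RODBC::sqlQuery(ch, query)                   # returns a data.frame
}

# Append a data.frame to an existing Access table.
write_access <- function(db_path, df, table) {
  ch <- RODBC::odbcConnectAccess2007(db_path)
  on.exit(RODBC::odbcClose(ch))
  RODBC::sqlSave(ch, df, tablename = table, rownames = FALSE, append = TRUE)
}
```

In the template, the database path and SQL text would come in through the property controls, and each function body maps to one of the two data functions.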

Read More


Compatible Versions of TERR, RStudio and R

  • Do you know how to check which version of TERR is installed on your Spotfire installation?
  • Are you unsure of which versions of R or RStudio to download so that you know it lines up with TERR?
  • Did you have it all figured out but now you’ve upgraded and lost all the answers???

Read More


Linear Regression, the Simplest Machine Learning Model

Linear regression models are the simplest linear models in the statistical literature. While the assumptions of linearity and normality seem to restrict its practical use, the model is surprisingly successful at capturing basic relationships and making predictions in many scenarios. The idea behind the model is to fit a line that mimics the relationship between a target variable and a combination of predictors (called independent variables). Multiple regression refers to the case of one target variable and multiple predictors. These models are popular not only for prediction but also as model selection tools, allowing analysts to find the most important predictors and eliminate redundant variables from the analysis.
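In base R (the API that TERR emulates), a multiple regression is a one-liner. The data below is synthetic with a known relationship, purely to illustrate the calls.

```r
# Synthetic data with a known relationship: y = 1 + 2*x1 + 3*x2 + small noise.
set.seed(1)
df <- data.frame(x1 = 1:20, x2 = rnorm(20))
df$y <- 1 + 2 * df$x1 + 3 * df$x2 + rnorm(20, sd = 0.01)

fit <- lm(y ~ x1 + x2, data = df)  # fit target ~ predictors
coef(fit)                          # close to c(1, 2, 3), the true parameters
```

`summary(fit)` then reports a t-test per predictor, which is what makes the model useful for selection: predictors with insignificant coefficients are candidates for removal.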

Read More

Spotfire Troubleshooting: Fixing a Bug in Spotfire Logistic Regression Modeling

Two weeks ago, I published a Linear and Logistic Regression template on Exchange.ai that can be found here.  When I built the template, my process was as follows:

  1. Add test and training data sets
  2. Build model on training data set
  3. Insert predicted column based on model in test data set

When following this process for the logistic regression model (a classification model), it inserts two columns of data: ClassPrediction, which gives the predicted class, and ProbPrediction, which gives the associated probability.  I noticed that some records contained a value for ClassPrediction but not ProbPrediction, which seemed odd.  This happened in records where one or more of my predictor columns were null, in which case neither column should have been populated.

It turns out that this is a bug that can be fixed with the steps below.

  1. Go to the Tools menu and select TERR Tools
  2. Click the Launch TERR Console button
  3. Type getOption("repos")
  4. Type install.packages("SpotfireStats")
  5. Type q() to exit the console
  6. Close Spotfire and relaunch

See below for a screen shot of the console.

After I relaunched Spotfire and reran the model, I saw consistent population of the ProbPrediction and ClassPrediction columns.  If you have any questions, feel free to contact me at julie@bigmountainanalytics.com.


Guest Blog: Creating Average Lines for a Data Set

Anna Smith is an Engineering Technician at Continental Resources up in Oklahoma. Today she will be sharing her journey creating average lines using TERR.

I had often been asked for average lines on line graphs – seeing the average of a data set compared to each individual line in that data set. I kept trying to figure it out with just calculated columns and formatting, but eventually came to the conclusion that Spotfire just doesn’t give us an easy or clean way to do this. So the idea of using TERR came into play. In my example, we wanted to compare production over time to the average over time for a certain well set – and we want this to be dynamic, i.e., if we change the selected well set, then our calculated average line needs to change. Our TERR code, then, needed to subset each day, calculate an average for that day, and output a new value. An important note: the function we used, given at the end of the article, requires the input in days or months, which means that if you have a data set with just dates and production numbers, you need to normalize all those dates back to time zero.
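The averaging step itself can be sketched in a few lines of base R. The column names (WellID, DaysOn, Production) are illustrative, and the data assumes dates have already been normalized to days since first production; in a data function, the input table would be the currently selected well set, which is what makes the line dynamic.

```r
# Two wells, each with three days of production, already normalized to time zero.
prod <- data.frame(
  WellID     = rep(c("A", "B"), each = 3),
  DaysOn     = rep(1:3, times = 2),
  Production = c(100, 90, 80,
                  60, 50, 40)
)

# For each normalized day, average production across the selected wells.
avg_line <- aggregate(Production ~ DaysOn, data = prod, FUN = mean)
avg_line
#   DaysOn Production
# 1      1         80
# 2      2         70
# 3      3         60
```

The resulting table is one row per day, which Spotfire can overlay on the individual well curves as the average line.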

Read More

Technical Director at Ruths.ai