Portals for Tableau New Feature Spotlight: Portal Backups


Major League Tornadoes

© InterWorks 2019 – All Rights Reserved, Modified "Tornado Icon" (https://game-icons.net/1x1/lorc/tornado.html) by Lorc is licensed under CC BY 3.0

The new season of Major League Tornadoes is getting kicked off around the InterWorks headquarters, so disaster recovery is on our minds. Portals for Tableau has had the ability to export various pieces of its data for quite some time. However, it could not provide a proper backup since it lacked features such as the ability to export files, logos, etc.

The new full portal backup system rectifies that issue. These backups will export the database structure and data, the portal code and all uploaded files.

Creating a New Portal Backup

To take a new backup, navigate to Backend > Settings > Import/Export > Full Backup tab, and click on the Take New Backup button. When you do, you will be able to watch the status as the new backup is built. The portal will even show you up-to-date stats on how much free space you have available on your portal server to ensure you have room. These stats will be refreshed periodically while the backup is being built if you like to follow along and keep score in the stands.

When your new backup is complete, you can click on it to download the zip archive and store it for safe keeping. You also have the ability to remove old backups that are no longer needed to free up space for new ones.

Scheduling Backups for Your Portals for Tableau

You even have the option to schedule backups to make life easier. To avoid scheduled backups piling up and maxing out your server’s storage, you can also configure how many backups to retain. As new backups are created, old ones will be purged. By default, a weekly backup will take place and it will retain the two latest ones.
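The retention behavior described above (keep the N latest backups, purge the rest) can be sketched in plain Python. This is an illustrative sketch, not the portal's actual implementation; the function name and directory layout are assumptions.

```python
from pathlib import Path

def prune_backups(backup_dir, keep=2):
    """Delete all but the `keep` most recent backup archives.

    Sorts .zip archives by modification time, newest first,
    and removes everything past the retention limit.
    """
    backups = sorted(
        Path(backup_dir).glob("*.zip"),
        key=lambda p: p.stat().st_mtime,
        reverse=True,
    )
    for old in backups[keep:]:
        old.unlink()
    return [p.name for p in backups[:keep]]
```

With `keep=2`, this mirrors the portal's default of retaining the two latest backups as new ones are created.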

With this new backup system, you can rest a little easier the next time the Twisters come to your data center’s town and the game is a total blowout.


The post Portals for Tableau New Feature Spotlight: Portal Backups appeared first on InterWorks.


Understanding Tableau Prep and Conductor: Facilitating a Join with Aggregation


In the previous three posts in this series, we used Tableau Prep Builder to create a workflow to join six tables from two data sources. Our flow contains 11 steps. The challenge comes from the different levels of detail (LOD) in the two sources: our sales data contains specific transactions for every day of the last four years, while the Census data includes population figures aggregated at the state level for the same four years. The Census data is much less detailed.

This is a common issue with financial data, for example, because budget data tends to be less detailed than the transaction-level details you can obtain from business systems. So, we need to use an aggregate step to reduce the LOD of the Superstore sales data so that it’s the same as the Census data.

Using the Aggregate Step in Tableau Prep

The Census data provides population data by State and Year. We’ll use an aggregate step to reduce the detail in the Superstore data to match the Census data.

After aggregating the Superstore data, we get a result set that combines sales and population data for every state. The Census data actually includes information for Alaska and Hawaii, for which we don't have any sales. That's fine. Delaware is also mismatched in 2018 because there were no sales in the Superstore data that year. By using the aggregate step, we've brought both data sources together and achieved a consistent LOD.
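The aggregate-then-join pattern can be sketched in plain Python with hypothetical rows; the state names and figures here are made up for illustration, not taken from the actual datasets.

```python
from collections import defaultdict

# Hypothetical transaction-level sales rows: (state, year, amount).
sales_rows = [
    ("California", 2018, 120.0),
    ("California", 2018, 80.0),
    ("Vermont", 2018, 50.0),
]

# Census-style population figures, already at the state/year level.
population = {
    ("California", 2018): 39_000_000,
    ("Vermont", 2018): 600_000,
    ("Alaska", 2018): 730_000,  # no matching sales, so dropped by the join
}

# Aggregate step: roll transaction-level rows up to state/year totals.
totals = defaultdict(float)
for state, year, amount in sales_rows:
    totals[(state, year)] += amount

# Join step: now that both sides share the same LOD, join on (state, year).
joined = {
    key: (totals[key], population[key])
    for key in totals
    if key in population
}
```

As in the flow, the Alaska row survives only on the Census side: without matching sales, it drops out of an inner join.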

Creating a Per-Capita Sales Calculation

The primary reason I wanted to bring in the Census data was to normalize the sales by state for population. Superstore is anonymized retail sales information. While it’s useful to know that California has the most sales in this data, how does Vermont compare to California when you take into account the population differences between states?
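The normalization itself is a simple ratio: divide each state's aggregated sales by its population. A minimal sketch with invented numbers (not the real Superstore or Census figures):

```python
# Hypothetical state totals: aggregated sales plus Census population.
state_totals = {
    "California": {"sales": 450_000.0, "population": 39_000_000},
    "Vermont": {"sales": 9_000.0, "population": 600_000},
}

# Per-capita sales normalizes raw totals by population,
# making small and large states directly comparable.
per_capita = {
    state: row["sales"] / row["population"]
    for state, row in state_totals.items()
}
```

With these made-up numbers, Vermont's per-capita sales come out ahead of California's even though its raw total is far smaller, which is exactly the kind of comparison the normalization enables.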

Saving Your Tableau Prep Workflow

At the end of the last video, I showed you how to add an output step, save a workflow and then run the workflow to generate a Hyper output file that we’ll use to build a visualization of the data. Saving the workflow enables you to reuse it later. Running the workflow generates the Hyper data.

Cleaning up the Workflow Visualization

We now have a workflow that consists of 18 steps. Let’s step back for a moment and enhance the information in our workflow so that other people can more easily follow its logic:

The video shows how you can change step colors and add comments, making each step easier for others to follow and understand. I didn't rename any of the steps. You can also experiment with color, descriptions and step renaming to make your flows easier for other people to follow.

In the next post in this series, I’m going to build a visualization using the dataset we created in the workflow.



Advance with Assist: REST Error in Tableau and Snowflake Connection


Question: I’m getting an error when I try to log into Snowflake. It says REST error and no such file or directory. How can I get connected to our new database?

Snowflake login error


Looking at the images above, you may already know what’s causing the issue, but this is a common question that’s been popping up recently. Here’s some background and how to resolve this error.

Connecting to Snowflake from Tableau

Connecting to Snowflake is pretty straightforward. You need only a couple of pieces of information for Tableau to connect to your repository:

  • The Server name
  • Authentication – Username/Password, SAML or OAuth


You will need a driver to connect to Snowflake as well, so if you haven’t done that or have an error related to drivers, sign in to your Snowflake instance. Then click Help > Download > ODBC Driver in the top-right corner of your screen.

Tableau Connection REST Error

In the Tableau connection error above, you may have noticed that the first line reads a bit strangely: https://https:// appears in the connection string, even though you typed the scheme only once in the Server field in Tableau Desktop. Connections to Snowflake always use HTTPS, so Tableau already prepends https:// to the server string for you. You only need the server name in order to establish the connection.
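The fix amounts to stripping any scheme the user typed so only the host name is passed along. A small sketch of that idea (a hypothetical helper, not part of Tableau or Snowflake):

```python
def normalize_server(server: str) -> str:
    """Strip any leading scheme(s) so only the host name remains.

    Loops in case a scheme was pasted more than once, which is
    what produces the https://https:// connection string.
    """
    server = server.strip()
    changed = True
    while changed:
        changed = False
        for scheme in ("https://", "http://"):
            if server.lower().startswith(scheme):
                server = server[len(scheme):]
                changed = True
    return server
```

Entering only the bare host name in the Server field accomplishes the same thing, since Tableau adds the scheme itself.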

This quick fix allows the client to reap all of the benefits of Snowflake as their data source. If you haven’t heard of Snowflake, it’s definitely one to put on your short list of data platforms to learn about.



Understanding Tableau Prep and Conductor: Clean and Join Steps


The last post in this series showed you how to use the Wildcard union and the Manual union to join four years of sales data that was stored in four different worksheets. In this post, you’ll see how the cleaning step can be used to do the following:

  • Change field-data roles
  • Group and replace members of a specific field set
  • Change a field's data type and rename fields

All this can be done by using the Profile Cards within the Profile Pane in Tableau Prep Builder.
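The group-and-replace and data-type operations listed above can be sketched in plain Python. The variant spellings and values here are invented for illustration; in Prep Builder you would do this interactively through the Profile Cards.

```python
# Hypothetical clean-up: group variant spellings of a field's members
# under one canonical value, then cast a text field to a number.
replacements = {
    "Calif.": "California",
    "CA": "California",
    "Vt.": "Vermont",
}

rows = [
    {"state": "CA", "sales": "120.5"},
    {"state": "Calif.", "sales": "80"},
    {"state": "Vermont", "sales": "50"},
]

for row in rows:
    # Group and replace: map variants to the canonical member.
    row["state"] = replacements.get(row["state"], row["state"])
    # Change the data type: sales arrived as text, cast to float.
    row["sales"] = float(row["sales"])
```

Values with no entry in the mapping (like "Vermont") pass through unchanged, just as ungrouped members do in the cleaning step.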

Using the Clean Step in Tableau Prep

The cleaning step behaves a lot like the data interpreter in Tableau Desktop, but it adds additional visualizations in the Profile Pane and the field Profile Cards that help you understand the shape and contents of each field. In addition, clicking on elements within the Profile Cards highlights related details in the other field cards. I find myself using the cleaning step frequently just to examine the results of prior steps.

Adding a Join Step

The Superstore dataset also includes another worksheet containing the names of Regional Managers. Regional Managers are assigned groups of States. We’ll use a join to add that information to the flow:

Now that the sales data from the Superstore tables has been cleaned up, we will bring in some public data from the Census Bureau so that we can enhance our sales data by normalizing sales by population for each state.

Adding the Census Data into the Flow

I’ve been using Census data for many years. It’s useful when you want to account for the population density in analyses. In the example we’re building, this data will ultimately be used to express the sales by state in a way that accounts for the population density of each geography.

Using the Pivot Step and Tableau Prep Builder

The Census data isn’t perfectly formatted. We’ll use Prep Builder to fix the structural problems in that dataset:

In the video, I chose to do most of the field clean-up in the pivot step. I could have performed the same cleaning operations in the cleaning step that I added after the pivot. If the work you're doing is going to be used only by you, fewer steps may save you time. If you work with a team of people who are new to Prep Builder, adding more steps to segregate individual cleaning operations may make your flow easier for others to understand. There are no hard and fast rules.
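The pivot itself reshapes the Census data from one column per year into one row per state and year. A minimal sketch of that wide-to-long pivot, with invented population figures:

```python
# Hypothetical wide Census rows: one population column per year.
wide = [
    {"state": "California", "2015": 38_900_000, "2016": 39_100_000},
    {"state": "Vermont", "2015": 626_000, "2016": 624_000},
]

# Pivot columns to rows: each (state, year) pair becomes its own record,
# which matches the shape of the transactional sales data.
long_rows = [
    {"state": row["state"], "year": int(year), "population": pop}
    for row in wide
    for year, pop in row.items()
    if year != "state"
]
```

After the pivot, two wide rows become four long rows, one per state/year combination.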

This workflow now includes two different data sources and six tables. You've seen two different ways to create a union; you've seen a join step and a pivot step; and you've learned about different ways you can use the cleaning step to improve the formatting and consistency of the data in the workflow. My colleague Katie wrote a blog post that takes a closer look at splitting and pivoting your data, so read it if you need more in-depth insights into those steps. For further information on cleansing your data, look at my colleague Spencer's blog on the topic.

In the next post in this series, we’re going to join the Superstore data to the Census data. Because these two data sources are not aggregated in the same way, we’ll be presented with a challenge that we’ll address with an aggregate step.



Portals for Tableau 101: Containerizing Your Portal


While it isn’t our recommended approach, Portals for Tableau can be hosted in a container environment, such as Docker. There be dragons with this process, so consider this your dragon-slaying guide.

Persistent Database and Filesystem

Portals for Tableau uses a small database to house your portal's configuration. This database and its data need to persist between rebuilds of your container. While the portal code itself can be rebuilt within a container, the portal stores uploaded files in the storage/app/ folder in your portal's web root directory, and this folder will need to be retained between rebuilds of the container.

Portal Configuration and Upgrades

The portal’s configuration files don’t need to be persistent; however, it is important to retain the database encryption key from the config/app.php configuration file across rebuilds. If not, you will lose the ability to decrypt existing data from the database.

There are three parts to a portal upgrade that happen during a normal 1-click upgrade. These will need to be accounted for in your container environment to prevent rebuilds from reverting to the previous version of your portal. In fact, when hosting Portals for Tableau in a containerized manner, it’s recommended to disable all upgrades through the interface for this reason.

  • Kernel Updates

Portals for Tableau uses an underlying kernel, called October, to provide a framework upon which everything is built. To update the kernel, download the following zip archive, extract it and copy its contents over your portal's web root directory:


  • Portal Updates

The core portal functionality uses the kernel's framework to add all the functionality you know and love. To update the portal code, use the manual update link found in your portal to download the zip archive. Extract it and copy everything inside the core/ folder of the extract over your portal's web root directory:


  • Database Updates

As features are added to Portals for Tableau, small tweaks to the database need to be made. These database updates are included as migration scripts in the above code updates. Once the kernel and core portal have been updated, the database migration scripts can be executed by running the following command from the portal’s web root folder:

php artisan october:up
