Test Prep: Become a Tableau Desktop Certified Professional

Do you like being challenged? Is passing the Tableau Desktop Certified Professional (CP) exam on your to-do list? Then this blog might be handy. I recently became a Tableau Desktop Certified Professional and want to share a few tips to help you prepare.

Getting Started

Your best source of information is the official Tableau exam guide. There, you can find an exam prep PDF with a practice test and a link to a sample workbook that will give you a very good benchmark. To take the exam, you must hold a current Desktop Certified Associate (CA) certification. The details of the CA test can be found here, with specifics on how to pass that particular exam here.

The exam environment is very similar to the Desktop Certified Associate exam: you will be using a virtual machine, and a proctor can interrupt you at any moment (which happened twice during my exam). Unlike the Certified Associate, the CP exam isn’t multiple choice. You need to build visualisations to answer the questions, and your answers are entered as free text. The exam lasts three hours, and you can’t leave your seat during it, so I recommend having water, coffee or another drink on hand to stay hydrated.

Tableau Courses to Help You Prepare

From my perspective, taking the Desktop III and Visual Analytics courses lays a solid foundation for the CP exam. I found some of the examples and wording in these courses to be very similar to the CP test questions. It can be beneficial to take one of these courses a few days before the test. These courses not only deepen your Tableau data visualisation knowledge, but they also help you become a better data analyst. Both will teach you different methods to visualise data and help you with best practices.

In addition to the links mentioned above, there are many blogs and free resources available to aid you in your test preparation.

What to Expect and Helpful Resources

Check out this Tableau page dedicated to data visualisation. Here, you can find explanations of poor and great visualisations, as well as guidance surrounding typical data questions, including the most suitable type of analysis and chart to answer them. For example, scatter plots are great for identifying correlations, and when comparing KPI performance against target, a bullet chart is effective. During the exam, it’s handy to quickly apply this knowledge. You won’t have much time to explore different chart options.

You can expect advanced table calculations; there are several blogs out there that I found very helpful for refreshing my knowledge and practising.
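
As a rough illustration of what two common table calculations actually compute (sketched in Python with made-up monthly figures, since the exam itself is all Tableau), a running total and a percent-of-total look like this:

```python
from itertools import accumulate

# Hypothetical monthly sales figures
sales = [120, 90, 150, 80]

# Analogue of Tableau's RUNNING_SUM(SUM([Sales])) along the months
running = list(accumulate(sales))

# Analogue of SUM([Sales]) / TOTAL(SUM([Sales])) (percent of total)
total = sum(sales)
pct_of_total = [round(s / total, 3) for s in sales]

print(running)       # [120, 210, 360, 440]
print(pct_of_total)  # [0.273, 0.205, 0.341, 0.182]
```

In Tableau, the equivalent work is choosing the right "compute using" direction; the arithmetic itself is the easy part.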

There will be Level of Detail (LOD) expressions and at least one example with nested LODs; again, a number of blog posts cover these well.
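
To make the mechanics concrete (a Python sketch with invented numbers, not Tableau itself): a FIXED LOD is a per-dimension aggregate, and a nested LOD aggregates one LOD inside another, as in { FIXED [Region] : AVG({ FIXED [State] : SUM([Sales]) }) }:

```python
# Hypothetical transaction rows: (region, state, sales)
rows = [
    ("East", "NY", 100), ("East", "NY", 50),
    ("East", "MA", 30),
    ("West", "CA", 200), ("West", "WA", 70),
]

# Inner LOD: { FIXED [State] : SUM([Sales]) }
state_sales = {}
for region, state, sales in rows:
    state_sales[state] = state_sales.get(state, 0) + sales

# Outer LOD averages the inner result per region
region_states = {}
for region, state, _ in rows:
    region_states.setdefault(region, set()).add(state)

region_avg = {
    r: sum(state_sales[s] for s in states) / len(states)
    for r, states in region_states.items()
}

print(state_sales)  # {'NY': 150, 'MA': 30, 'CA': 200, 'WA': 70}
print(region_avg)   # {'East': 90.0, 'West': 135.0}
```

Note that the inner sum happens at the state grain before the outer average happens at the region grain, which is exactly why nesting order matters in the exam questions.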

You will most likely need to use parameters in several exercises, so be sure to have a good grip on joins, unions and blends as well. I needed to create a join calculation in my test.
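
A join calculation simply means joining on a computed key rather than a raw field. Sketched in Python with hypothetical data, the equivalent of joining on a Tableau calculation like UPPER(TRIM([State])) is:

```python
# Hypothetical order rows with inconsistently formatted state codes
orders = [("ny", 100), ("CA ", 200)]
managers = {"NY": "Ann", "CA": "Ben"}

joined = []
for state, sales in orders:
    key = state.strip().upper()  # the "join calculation"
    joined.append((key, sales, managers[key]))

print(joined)  # [('NY', 100, 'Ann'), ('CA', 200, 'Ben')]
```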

Formatting is a very important part of the exam. Work on titles and tooltips as you go. Another good tip is to set up formatting at the workbook level at the start of the exam. In addition to this article, which provides guidance, I recommend checking out the sample workbooks that come with your Tableau installation (World Indicators, Regional and Superstore). In these workbooks, you can see which charts Tableau uses to answer different questions.


The exam is three hours long, which may seem like a long time, but trust me, it is not! That time goes incredibly fast, especially when you’re focusing so hard. Keep track of the time, and give yourself strict guidelines when doing each section. You will likely leave the exam thinking, “I could have done this part better and that question in a different way.” Everyone I know feels the same. I don’t think I know anyone who has left the exam feeling 100% happy with their final work. You’re on a very strict time schedule, and Tableau isn’t expecting a perfect outcome.

To prepare, I’d recommend reviewing some existing visualisations and transforming/building them in a different way while adhering to a strict time limit. Give yourself some time at the end to review everything and then at least five minutes to submit your work (sometimes the upload can take a little while).

Exam Structure

The structure of the exam is very well explained in the exam guide, but here are the three main components:

  1. Module 1 – Visual Best Practices: You’ll be given a chart and asked to rebuild it to better answer the question.
  2. Module 2 – Advanced Technical Skills: Application of table calculations, LODs, parameters, etc.
  3. Module 3 – Storytelling: You will be given a use case and asked questions for which you will need to build several visualisations. The end result of this part will be a Dashboard and a Story. You will need to apply all of the above. It’s not mentioned in the latest exam preparation guide, but Storytelling used to give the most points. Therefore, I recommend starting with Module 3 and reviewing it again at the end if you have time.

You can find great suggestions on how to prepare for the Storytelling module in this blog post. Try to work on different datasets and use cases. Participate in Makeover Monday, Workout Wednesday, Storytelling with Data challenge and similar projects to sharpen your skills.

Good luck with your preparation, and I hope to see the same certificate with your name on it. Please let the community know if you find more useful tips. Special thanks to Mavis Liu for her contribution to this blog post.

The post Test Prep: Become a Tableau Desktop Certified Professional appeared first on InterWorks.


Understanding Tableau Prep and Conductor: Building a Viz with Flow Data


In the last four blog posts in this series, we created an 18-step workflow that produces a Hyper extract file. I’m going to test the dataset by building a view in Tableau Desktop. While I could publish the flow and use an output file as my starting point, I’m going to launch Tableau directly from the last cleaning step in the flow.

Of course, the whole point of building the workflow that combines the Superstore data with the Census data was to normalize sales by state for the population in each state. Now that we’ve confirmed the dataset is complete and contains the data we need, we can build views in Tableau Desktop.

In the next post in this series, I’ll show you how to publish your workflow to Tableau Server. We’ll also discuss how you can use the Tableau Data Management add-on to Tableau Server to automate workflow runs to refresh data sources that you are curating to your Tableau Server users.



Understanding Tableau Prep and Conductor: Facilitating a Join with Aggregation


In the previous three posts in this series, we used Tableau Prep Builder to create a workflow that joins six tables from two data sources. Our flow contains 11 steps. The challenge with the Census data comes from the different levels of detail (LOD) in the two sources: the sales data records specific transactions every day for the last four years, while the Census data contains population figures aggregated at the state level for the same four years. The Census data is much less detailed.

This is a common issue with financial data, for example, because budget data tends to be less detailed than the transaction-level details you can obtain from business systems. So, we need to use an aggregate step to reduce the LOD of the Superstore sales data so that it’s the same as the Census data.

Using the Aggregate Step in Tableau Prep

The Census data provides population data by State and Year. We’ll use an aggregate step to reduce the detail in the Superstore data to match the Census data.

After aggregating the Superstore data, we get a resulting set that combines sales and population data for every state. The Census data actually includes figures for Alaska and Hawaii, for which we don’t have any sales. That’s fine. Delaware is also mismatched in 2018 because the Superstore data contains no sales for that state that year. By using the aggregate step, we’ve brought both data sources together at a consistent LOD.
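
The aggregate-then-join pattern can be pictured like this (a minimal Python sketch with invented numbers): roll the transaction-level sales up to the state/year grain, then join to the Census figures at that same grain:

```python
# Hypothetical daily transactions: (state, year, sales)
transactions = [
    ("CA", 2018, 10.0), ("CA", 2018, 5.0),
    ("VT", 2018, 2.0),
]
# Census data already at the state/year grain
population = {("CA", 2018): 39_000_000, ("VT", 2018): 626_000}

# Aggregate step: reduce the sales data to the state/year LOD
agg = {}
for state, year, sales in transactions:
    agg[(state, year)] = agg.get((state, year), 0.0) + sales

# With both sources at the same LOD, the join is straightforward
joined = {key: (agg[key], population[key]) for key in agg if key in population}

print(joined[("CA", 2018)])  # (15.0, 39000000)
```

Joining without the aggregation would instead duplicate the population value onto every transaction row, inflating any subsequent sums.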

Creating a Per-Capita Sales Calculation

The primary reason I wanted to bring in the Census data was to normalize the sales by state for population. Superstore is anonymized retail sales information. While it’s useful to know that California has the most sales in this data, how does Vermont compare to California when you take into account the population differences between states?
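The normalization itself is just sales divided by population at the state level. With invented totals (a Python sketch, not the actual Superstore numbers):

```python
# Hypothetical state totals after the aggregate step
state_sales = {"CA": 457_688.0, "VT": 8_929.0}
state_pop = {"CA": 39_000_000, "VT": 626_000}

# Per-capita sales: SUM([Sales]) / SUM([Population]) per state
per_capita = {s: state_sales[s] / state_pop[s] for s in state_sales}

# Once population is accounted for, Vermont can outrank California
print(per_capita["VT"] > per_capita["CA"])  # True
```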

Saving Your Tableau Prep Workflow

At the end of the last video, I showed you how to add an output step, save a workflow and then run the workflow to generate a Hyper output file that we’ll use to build a visualization of the data. Saving the workflow enables you to reuse it later. Running the workflow generates the Hyper data.

Cleaning up the Workflow Visualization

We now have a workflow that consists of 18 steps. Let’s step back for a moment and enhance the information in our workflow so that other people can more easily follow its logic:

The video shows how you can change step colors and add comments, making each step easier for others to follow and understand. I didn’t rename any of the steps, but you can also experiment with combining color, descriptions and step renaming to make your flows easier for other people to follow.

In the next post in this series, I’m going to build a visualization using the dataset we created in the workflow.



Understanding Tableau Prep and Conductor: Clean and Join Steps


The last post in this series showed you how to use the Wildcard union and the Manual union to combine four years of sales data stored in four different worksheets. In this post, you’ll see how the cleaning step can be used to do the following:

  • Change field-data roles
  • Group and replace members of a specific field set
  • Change the data types of a field and rename fields

All this can be done by using the Profile Cards within the Profile Pane in Tableau Prep Builder.
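The three operations above have simple analogues in any language. As a hedged Python sketch (the field names and alias values are invented), group-and-replace, a field rename and a data-type change look like this:

```python
# Hypothetical raw rows: inconsistent state spellings, numeric IDs
rows = [
    {"St": "Calif.", "Row ID": 1},
    {"St": "california", "Row ID": 2},
    {"St": "CA", "Row ID": 3},
]

# Group and replace: map variant members onto one canonical value
aliases = {"calif.": "California", "california": "California", "ca": "California"}

cleaned = []
for row in rows:
    cleaned.append({
        "State": aliases.get(row["St"].lower(), row["St"]),  # rename St -> State
        "Row ID": str(row["Row ID"]),  # change data type: number -> string
    })

print(cleaned[0])  # {'State': 'California', 'Row ID': '1'}
```

In Prep Builder, each of these is a couple of clicks on a Profile Card rather than code, but the underlying transformation is the same.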

Using the Clean Step in Tableau Prep

The cleaning step behaves a lot like the data interpreter in Tableau Desktop, but it adds additional visualizations in the Profile Pane and the field Profile Cards that help you understand the shape and contents of each field. In addition, clicking on elements within the Profile Cards highlights related details in the other field cards. I find myself using the cleaning step frequently just to examine the results of prior steps.

Adding a Join Step

The Superstore dataset also includes another worksheet containing the names of Regional Managers. Regional Managers are assigned groups of States. We’ll use a join to add that information to the flow:

Now that the sales data from the Superstore tables has been cleaned up, we will bring in some public data from the Census Bureau so that we can enhance our sales data by normalizing sales for population by state.

Adding the Census Data into the Flow

I’ve been using Census data for many years. It’s useful when you want to account for the population density in analyses. In the example we’re building, this data will ultimately be used to express the sales by state in a way that accounts for the population density of each geography.

Using the Pivot Step and Tableau Prep Builder

The Census data isn’t perfectly formatted. We’ll use Prep Builder to fix the structural problems in that dataset:

In the video, I chose to do most of the field clean-up in the pivot step. I could have performed the same cleaning operations in the cleaning step that I added after the pivot. If your work will be used only by you, fewer steps may save you time. If you work with a team of people who are new to Prep Builder, adding more steps to segregate individual cleaning operations may make your flow easier for others to understand. There are no hard and fast rules.
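
A columns-to-rows pivot like the one applied to the Census data can be sketched as follows (Python, with invented population figures): each year column becomes its own row of (state, year, population):

```python
# Hypothetical wide-format Census rows: one population column per year
wide = [
    ("CA", {"2015": 38_900_000, "2016": 39_100_000}),
    ("VT", {"2015": 625_000, "2016": 626_000}),
]

# Pivot step (columns to rows): one output row per state/year pair
long_rows = [
    (state, year, pop)
    for state, years in wide
    for year, pop in years.items()
]

print(len(long_rows))  # 4
```

The long format is what makes the later state/year join to the sales data possible.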

This workflow now includes two different data sources and six tables. You’ve seen two different ways to create a union; you’ve seen a join step and a pivot step; and you’ve learned about different ways you can use the cleaning step to improve the formatting and consistency of the data in the workflow. My colleague Katie wrote a blog post that takes a closer look at splitting and pivoting your data, so read it if you need more in-depth insights into those steps. For further information on cleansing your data, look at my colleague Spencer’s blog on the topic.

In the next post in this series, we’re going to join the Superstore data to the Census data. Because these two data sources are not aggregated in the same way, we’ll be presented with a challenge that we’ll address with an aggregate step.



Understanding Tableau Prep and Conductor: Connecting to a Data Source with Builder


Before you begin building a workflow using Tableau Prep, it’s helpful to know a little bit about the data source(s) you need to connect to. Consider the following:

  • What kind of source are you connecting to?
  • How large are the files? Record counts? Rows/columns?
  • Are you familiar with the data structures?
  • Do you know of any data quality issues?
  • How frequently will you be updating the data?

Understanding the Basics of Your Data Sources

Understanding the basic structure, size and granularity of the different sources helps you plan the flow in your mind before you get into the detailed challenges posed by transforming the raw data.

In the example I’m drawing upon for this series, I’m using a version of Superstore data I created, along with public data from the Census Bureau. I’m going to create a workflow that will combine four tables containing annual sales data and a single dimension table that will be joined to provide regional manager names. These will be joined with a population dataset from the Census, enabling us to normalize sales for population for each state that had sales.

Connecting to the Sales Data

The data used in this example comes from two spreadsheets: one containing four sales worksheets and one Regional Manager worksheet, and another containing the Census data.

Experienced Tableau Desktop users should be familiar with the Superstore dataset. In this spreadsheet, I’ve separated each year’s sales into its own worksheet. This data could just as easily have come from a text file or a database:

Superstore data for Tableau Prep

The Census data provides population estimates for each state for corresponding years:

Census data for Tableau Prep

Because the world isn’t perfect, we will have to deal with data quality issues in these files and with different aggregations of the data; we will union files, join files, pivot the data and re-aggregate it. There are also inconsistencies within specific fields that will have to be cleaned. The datasets are small, but every kind of data transformation step that Tableau Prep provides will have to be utilized to prepare the data for analysis in Tableau Desktop. We will also create a calculation in Prep to normalize sales by state and year for the population of each state.

That data is not perfectly clean, and some of the structures aren’t right. That’s the real world. We’ll use Tableau Prep to address all of the issues and create a clean dataset for other people to use.

Connecting to the Superstore Sales Data

In this first video, you’ll see how to make a data connection to an initial data source and then add other files to that data source. We’ll make the following connections:

  1. Connect to the four sales tables
  2. Demonstrate a wildcard union
  3. Demonstrate a manual union

Using the Wildcard Union in Tableau Prep

Wildcard unions offer an efficient way to bring together many different tables with similar naming conventions that also have consistent column and row structures. If you’re working with unfamiliar datasets that may have data inconsistencies, I believe creating a union manually gives you more direct control and may make it easier for you to deal with data quality issues that emerge as you add each table.

Using the Manual Union in Tableau Prep

I like using manual unions when I’m working with a new dataset because it’s easier to identify mismatched field-naming conventions. The inconsistent field names (Sales vs. Revenue) didn’t appear until I brought in the 2018 sales data. The visual cues that Tableau Prep provided, and the filtering in the data grid for mismatched fields, made it very easy to find and merge two different fields that were actually both holding sales data.
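
Merging the mismatched fields amounts to folding one column into the other during the union. Here's a minimal Python sketch with hypothetical rows (in Prep Builder the real fix is a couple of clicks, not code):

```python
# Hypothetical yearly sheets; the 2018 sheet calls the measure "Revenue"
y2017 = [{"Order ID": 1, "Sales": 100.0}]
y2018 = [{"Order ID": 2, "Revenue": 250.0}]

unioned = []
for sheet in (y2017, y2018):
    for row in sheet:
        row = dict(row)
        if "Revenue" in row:
            row["Sales"] = row.pop("Revenue")  # fold Revenue into Sales
        unioned.append(row)

print(unioned[1])  # {'Order ID': 2, 'Sales': 250.0}
```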

It was also easy to make minor changes using the Profile Cards for specific fields. I used that to remove the Table Names field, which Builder adds automatically when you union data. I don’t want to see that information in my output from this workflow, so I removed it. In addition, because Row ID is a number, Builder treated it as a number in the Profile Pane and generated a histogram in that field’s Profile Card. I wanted to validate that the Row ID field is a unique key for this dataset, so I changed the field to a string, and the Profile Card changed in a way that made it easy to see that every Row ID does, in fact, identify a single row of the data.
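
The uniqueness check itself is easy to state: a field is a candidate key when it has as many distinct values as the dataset has rows. In Python terms (toy values):

```python
# Hypothetical Row ID values after converting the field to a string
row_ids = ["1", "2", "3", "4"]

# Row ID is a unique key iff the distinct count equals the row count
is_unique_key = len(set(row_ids)) == len(row_ids)

print(is_unique_key)  # True
```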

In the next post in this series, I’ll show you how to add a cleaning step to make additional modifications to fields.
