
Showing posts from March, 2023

2023: Week 13 - Rolling stock price trends

Created by: Carl Allchin

The final challenge of Advanced Month involves working out moving calculations. At the time of writing, these are not natively available in Tableau Prep, so I am intrigued by the solutions you might come up with. Conducting complex analysis can be a challenge, but simplifying the data set can make life much easier for end users. This is the logic we're trying to apply this week.

Inputs
This week's challenge will use the same inputs as the 2023 Week 8 challenge. You can download it here.

Requirements
- Input all the months of data available
- Create a Trade Order field showing the order each trade has happened in for each Sector, using the File Name and ID as the order of each trade within each monthly file
- Remove all data fields except: Trade Order, Sector, Purchase Price
- Create a Parameter to allow the user to select the rolling number of trades to be incorporated into the moving average. I've set a default of 3 trades in my moving average
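
Since moving calculations aren't native to Prep, here's a rough sketch in Python of the trailing average the requirements describe (a hypothetical illustration, not an official solution; the `window` argument plays the role of the user parameter, defaulting to 3 trades):

```python
from collections import deque

def rolling_average(prices, window=3):
    """Trailing moving average over the last `window` trades.

    Returns None until a full window of trades is available,
    mirroring a rolling calculation that needs `window` rows.
    """
    buf = deque(maxlen=window)
    out = []
    for p in prices:
        buf.append(p)
        out.append(round(sum(buf) / len(buf), 2) if len(buf) == window else None)
    return out

# Hypothetical Purchase Price values for one Sector, in Trade Order
purchase_prices = [10.0, 12.0, 11.0, 13.0, 14.0]
print(rolling_average(purchase_prices, window=3))
# [None, None, 11.0, 12.0, 12.67]
```

In Prep the same effect would come from self-joins or multi-row logic; the sketch just makes the windowing explicit.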

2023: Week 12 - Solution

Solution by Tom Prowse and you can download the workflow here.

This week's challenge is a long one, so hopefully you stayed with it and chipped away at the different techniques that were required to solve it.

Step 1 - UK Bank Holidays
First we want to input the UK Bank Holidays table, where we can remove the Date and Bank Holiday fields and remove any null values from the Year. After this we can join back to the original table using the Source Row Number (note: the Source Row Number is a new feature in Tableau Prep 2023.1). Within the join we want to create an inner join where Source Row Number >= Source Row Number: this will allow us to fill down the Year for each row and bank holiday date. From here we can tidy the table by keeping only the relevant information. First we want to find the max Source Row Number for each year (Max). Then we can filter to keep only rows where Max = Source Row Number-1, and finally remove the Max, Source Row Number-1, Source Row Number, and
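
The row-number self-join is effectively a "fill down" of the Year column. As a hedged sketch outside of Prep (a hypothetical helper, not Tom's actual workflow), the same effect looks like this in Python:

```python
def fill_down(values):
    """Carry the last non-null value forward, which is what the
    'Source Row Number >= Source Row Number' self-join achieves."""
    last = None
    out = []
    for v in values:
        if v is not None:
            last = v
        out.append(last)
    return out

# Toy Year column: the year only appears on its header row
years = [2023, None, None, 2024, None]
print(fill_down(years))  # [2023, 2023, 2023, 2024, 2024]
```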

2023: Week 12 - Regulatory Reporting Alignment

Challenge by: Jenny Martin

The challenge this week is heavily inspired by a real-life scenario, and I'm sure many organisations will be able to relate to the quirky rules they have to follow when doing regulatory reporting, often with the reasoning behind them being "because that's the way it's always been done!"

Data Source Bank must assign new customers to the next working day, even if they join at the weekend, or online on a public holiday. What's more, they need to report their total new customers for the month on the last working day of the month. This means any customers joining on that last working day will actually be counted in the following month. For example, 31st December 2021 was a Friday. The total number of customers for December would be reported on this day. Any customers joining on 31st December 2021 itself will be counted in January's count of new customers.

What makes this even more confusing is trying to align with branches i
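
The "next working day" assignment rule can be sketched as a small Python helper (a hypothetical illustration, assuming a set of bank-holiday dates is supplied; the month-end reporting quirk described above would be a separate step):

```python
from datetime import date, timedelta

def next_working_day(d, bank_holidays=frozenset()):
    """Roll a joining date forward to the next working day,
    skipping weekends and any supplied bank holidays.
    A date already on a working day is kept as-is."""
    while d.weekday() >= 5 or d in bank_holidays:
        d += timedelta(days=1)
    return d

# 1st Jan 2022 was a Saturday, so the customer is assigned to Monday
print(next_working_day(date(2022, 1, 1)))  # 2022-01-03
# 31st Dec 2021 was a Friday (a working day), so it is unchanged
print(next_working_day(date(2021, 12, 31)))  # 2021-12-31
```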

2023: Week 11 - Solution

Solution by Tom Prowse and you can download the workflow here.

This week we are looking at calculating the distance to our closest customers by utilising the Haversine formula for our spatial calculations: we take latitude and longitude and then calculate the distance with some trigonometry.

Step 1 - Combine Data
First we want to input both of our data sources and then combine them so that all of the customer locations can be compared to all of the branch locations. To join both of these tables we want to create a cartesian join using a dummy field from both tables. For the dummy field we have created a new calculated field containing the number 1, but you can use anything in this field as long as it matches on both tables. From here we can use an inner join on the dummy field from each side. After this our number of rows will massively increase, as we now have a row for each combination.

Step 2 - Transform Lat & Long
The next step is to transform each of the lat &
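
The dummy-field cartesian join can be sketched in Python (toy rows with hypothetical field names) as a cross product of the two tables:

```python
from itertools import product

customers = [{"Customer": "C1"}, {"Customer": "C2"}]
branches = [{"Branch": "B1"}, {"Branch": "B2"}, {"Branch": "B3"}]

# Joining every customer row to every branch row is equivalent to
# Prep's inner join on a constant dummy field (value 1 on both sides).
pairs = [{**c, **b} for c, b in product(customers, branches)]
print(len(pairs))  # 2 customers x 3 branches = 6 rows
```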

2023: Week 11 - Which customers are closest?

Created by: Jenny Martin

Data Source Bank want some quick and dirty analysis. They know where their customers are and they know where their branches are, but they don't know which customers are closest to which branches. Which customers should they be prioritising based on proximity to their branches?

Now, Tableau Prep doesn't natively contain spatial functions, so we'll need to use some fancy Maths! Namely, we'll be making use of the Haversine formula. There are 2 steps we'll need:
1. Transform the latitudes and longitudes from decimal degrees to radians by dividing them by 180/pi (it was Pi Day yesterday, after all ;))
2. The distance (in miles) can then be calculated as: 3963 * acos((sin(lat1) * sin(lat2)) + cos(lat1) * cos(lat2) * cos(long2 - long1))

Inputs
We have 2 inputs for our analysis:
- Customer Locations
- DSB Branches

Requirements
- Input the data
- Append the Branch information to the Customer information
- Transform the latitude and longitude into radians
- Find th
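
Putting the two steps together, here's a minimal Python sketch of the distance calculation (hypothetical function name; the example coordinates are illustrative and not from the challenge inputs):

```python
from math import acos, cos, sin, radians

def distance_miles(lat1, long1, lat2, long2):
    """Great-circle distance using the formula from the challenge:
    3963 * acos(sin(lat1)*sin(lat2) + cos(lat1)*cos(lat2)*cos(long2-long1)),
    with inputs converted from decimal degrees to radians first."""
    lat1, long1, lat2, long2 = map(radians, (lat1, long1, lat2, long2))
    return 3963 * acos(sin(lat1) * sin(lat2)
                       + cos(lat1) * cos(lat2) * cos(long2 - long1))

# Illustrative coordinates: central London to central Birmingham,
# which is roughly 100 miles as the crow flies
print(round(distance_miles(51.5074, -0.1278, 52.4862, -1.8904)))
```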

2023: Week 10 - Solution

Solution by Tom Prowse and you can download the workflow here.

Building on our challenge from last week, we want to take this a step further and allow users to find out their balance even on days when they don't make a transaction.

Step 1 - Balance Each Day
First, we need to calculate the order in which the transactions were made and the total value for each day. To calculate the total transaction value we can use an LOD where we group by Account Number and Balance Date and then sum the Transaction Value. From here we can order the transactions using a Rank calculation where we group by Account Number and then rank by Balance Date ascending and Transaction Value descending. We then only want to take a single transaction from each day, so we can identify the latest transaction (max Order) for each Account Number and Balance Date combination. Using this max Order, we can filter where Max Order = Order, so that we only have a single, latest transaction
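
The Rank-then-filter logic for keeping only the latest transaction per day can be sketched in Python (toy rows; field names assumed from the challenge, not Tom's workflow):

```python
# Keep only the latest transaction per (Account Number, Balance Date),
# mirroring the Rank + Max Order filter described above.
rows = [
    {"account": "A1", "date": "2023-01-01", "order": 1, "value": 50},
    {"account": "A1", "date": "2023-01-01", "order": 2, "value": -20},
    {"account": "A1", "date": "2023-01-02", "order": 3, "value": 10},
]

latest = {}
for r in rows:
    key = (r["account"], r["date"])
    # Later order wins, like filtering where Max Order = Order
    if key not in latest or r["order"] > latest[key]["order"]:
        latest[key] = r

print(sorted(l["order"] for l in latest.values()))  # [2, 3]
```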