Hello, everyone and welcome to the session. Thank you for taking the time and we are excited to talk to you about data and how it can drive digital transformation for your company.
Today we're going to follow along with Northern Trail Outfitters as they unleash the power of their data. NTO's sales took a nosedive during the COVID lockdowns because they were dependent on sales from their brick-and-mortar stores. At that point, they found themselves incredibly desperate, and their IT team quickly stood up an e-commerce side of the business. NTO is now facing a compounding set of challenges: they need to operate more efficiently and modernize their customer experience. They need to keep up. Recognizing that NTO's ability to continue to adapt and innovate is essential to the success of the business, we see NTO's CEO, Lisa Dawson, ask her executive team on their Slack channel to chime in with their departments' biggest challenges. It's clear, as we see here, that the sales, marketing, service, and IT departments are all facing issues around accessing the data from numerous sources that each department needs in order to be successful.
For today, we're going to take a closer look at the numerous challenges facing NTO's sales department. It's imperative to the success of their team that they have visibility into real-time inventory and accurate sales forecasting. At the moment, their forecasting and inventory data are siloed and very inaccurate. In order to thrive in this modern digital world, NTO needs to break down the data silos within their organization so they can make informed decisions around historically separated parts of their business. As you see here, NTO has a legacy SAP system and a point-of-sale system for their storefront, as well as Shopify. The question is, how does NTO go about unlocking data from their existing systems to be analyzed in Tableau? The answer: MuleSoft's Anypoint Platform. So let's take a look at how this single unified platform enables NTO to unlock and unify data from anywhere using connectors, or API building blocks.
You see here MuleSoft's Anypoint platform, the number one platform for integrations and APIs. Using this single platform, NTO developers are able to support the entire API life cycle from designing and validating to deploying and maintaining APIs all with one skill set under one single pane of glass. In Anypoint Exchange, a marketplace of APIs, assets, and connectors as well as templates, NTO developers can find pre-built connectors and templates provided by MuleSoft and also all of the APIs prebuilt by NTO. Let's start to see if there are some prebuilt connectors for the systems that we're looking to unlock.
So, it looks like we have an SAP connector that exposes business data so that end users don't need to concern themselves with BAPIs or IDocs. Shopify, let's see about that. Well, our Shopify connector handles everything programmatically, so the IT team doesn't have to learn Shopify natively. In addition to that, let's see about Excel. Our Excel connector lets you create, edit, and collaboratively manage spreadsheets with others without additional integration. So now that we've unlocked all the data with these pre-built connectors, NTO can begin to tackle the sales department's business problems.
So as the department unlocks those systems of record, NTO can pump that data into a database such as Snowflake, or into any other database that supports SQL. So in this case, let's see, do we have a Snowflake connector? Looks like we do. But even more importantly, for the analytics team to move directly into Tableau, we can actually make use of MuleSoft's new Tableau connector. This way we can go straight through MuleSoft without the help of something like Snowflake. But what if the API the NTO developers needed is not located in Exchange? What if it hasn't been created yet?
In this case, the developer can begin creating an API spec here simply by going into Anypoint platform's Design Center where new APIs can be created using a no code interface to connect to any application, system, or database. Simply by clicking create new and new API spec, you're able to create one using this visual guide. Now let's take a look at one that has already been created by the NTO IT team.
NTO's IT team designed this in Design Center and then published it in Exchange so that it's available for other developers to use without them having to unlock the data from those systems themselves. On this API page, we can see a multitude of things, from the different endpoints that are exposed to detailed information on how a developer can interact with the point-of-sale system.
Additionally, MuleSoft also offers a mocking service that makes it possible for developers to test against this API spec while it's still being developed. Here, we can get detailed information on how the [inaudible] or other endpoints would return data. MuleSoft allows NTO developers to do all this work before the API is even implemented, and there's a lot more we could go into from here. But the key thing to highlight is that NTO developers are able to quickly unlock data from numerous systems of record with one skill set and this single pane of glass, using the MuleSoft Anypoint Platform.
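For readers following along, here is a hedged sketch of what such a spec might look like in RAML, the API modeling language Design Center uses. The resource and field names are illustrative, not NTO's actual spec:

```yaml
#%RAML 1.0
title: NTO Inventory API
version: v1
/inventory:
  get:
    description: Return current inventory across all stores.
    responses:
      200:
        body:
          application/json:
            example: |
              [{ "sku": "SKI-M-01", "quantity": 42 }]
  /{storeId}:
    get:
      description: Return current inventory for a single store.
```

The mocking service can serve those `example` payloads back to callers before any implementation exists, which is what lets the two teams agree on data shapes up front.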
So at this point, we've confirmed that all the necessary connectors and APIs have been designed, vetted, and are ready to be deployed. Next, I'm going to ask Matt in IT, do you have everything you need to implement and deploy what I've covered so far?
I believe so, Nicole. I really appreciate all that. We've got the API specs, we've got some data types. I believe we're ready to get started.
So the story continues. Grant's team, again, to recap from the top, wanted to solve these two business problems of inventory and sales forecasting. So our team here in the IT world, we want to solve those business problems by accessing silos of data and making that data available to Tableau. You can see we have three silos we want to talk about here on the left side, our inputs, if you will: SAP ECC, Shopify, and a store point-of-sale application that really relies on Excel CSVs under the covers. And what we're going to talk about now is how we can use MuleSoft's unified platform, and Anypoint Studio in particular, to implement these APIs. Now before we get started, Nicole's team has done a great job of speccing out and mocking up the data, and Grant, I just want to make sure this data is what you need and we're all in agreement.
Yes, it's looking great. I think if you could add some weather data in there as well, it'd be very helpful in terms of forecasting sales and inventory.
All right. Well, we'll see if we can get to that. And so we're actually going to start from the left hand side top here with the inventory problem. So let's switch over to Anypoint Studio and talk about how we can start moving forward and solving the inventory problem and unlocking that data from the SAP world.
So normally, when you start a new API in MuleSoft, we're just going to do a new Mule project. I just want to take you all from top to bottom here. We've got the API spec defined, and we're going to work on NTO inventory. Since we have the spec already defined and agreed upon by all parties, we can just pull it inside the tool itself from Exchange; we'll type in NTO here to bring up the candidates. We're going to start with the inventory API and add that right here. You can see it's the newest version, as part of version control. Normally we would click finish and let Anypoint Studio scaffold it out, but in the interest of time, we're going to cancel out of this and look at an already scaffolded API implementation.
So from the top, we've got some built-in error handling, which is always great, because who wants to spend all that time building routine error-handling code when it can be built for you automatically? We can scroll down here and see some more drudge work that's been done for us. We can see the methods we're looking for, matching Grant's needs: get inventory, get inventory sold, get inventory by store ID, and get inventory sold by store ID. So you can see we've got a lot of legwork done for us by the tool.
And so, speaking of getting the legwork done a lot faster, we can look at our SAP connector, and this is how we're going to unlock that silo of data a lot faster than what we're used to. If we look on the right side here, we have our palette, and in it our SAP connector. In the SAP connector, we've got a lot of methods to pick from, things you'd expect like transaction management, getting information from IDocs, sending IDocs, and doing remote function calls and asynchronous remote function calls. And so Grant, I believe you're probably not that interested in how IDocs or BAPIs work. Would that be true?
That is a fact. I'd rather not know what it is.
I understand that completely, and I'm pretty sure the rest of the business would not be that interested either. Part of the beauty of MuleSoft and our API-led connectivity approach, and all of our 200-plus built-in connectors, is that you can pick connectors that are out of the box, part of the product, already tested and ready for use. So we can use our SAP connector here to encapsulate all of the data objects that Grant's interested in, and abstract away how those are stored in SAP.
And so the actual call to SAP, for those that are technically inclined, is pretty straightforward. We just have an SAP call here, but to make any call, we have to do our preparatory work. So we're going to set up our message with the store ID that was passed in as part of the payload. We'll prepare that in a transform message right here, and it'll make the call to SAP. Then when it returns, we'll unpack it here in this transform message, and we're able to return it as JSON to Grant's team. So what we've done, again, is we've abstracted out all of the under-the-covers SAP logic and turned it into data objects that Grant can work with. In this case, we're getting our quantity out of that call. And so that is a great example of shortening your development life cycle by using our pre-built connectors.
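To make that concrete, here's a hedged sketch of what the second transform message might look like in DataWeave, MuleSoft's transformation language. The SAP field names are placeholders for illustration, not NTO's actual BAPI structure:

```
%dw 2.0
output application/json
---
// Unpack the SAP response into a plain JSON object for the sales team.
// "MATERIAL_STOCK" and "QUANTITY" are illustrative field names.
{
    storeId: vars.storeId,
    quantity: payload.MATERIAL_STOCK.QUANTITY as Number
}
```

The first transform does the reverse: it maps the incoming store ID into whatever request document the chosen BAPI expects, so neither side of the flow ever sees raw SAP structures.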
So that was our first problem for Grant: getting the inventory data. Now we need to get some order data. We're going to take the same approach, because we know that our order lists are stored in Shopify, and MuleSoft, with its many connectors, happens to have a Shopify connector here. You can see we've got a great number of methods available to us for taking care of customers, orders, products, and transactions. Now, what we want to do for Grant and his business need is return an order list. So the simplest thing for me is to grab this order-list item, drag it over, and drop it onto our flow. And in the development life cycle, we can see we've got our store ID being passed in here again as part of our payload, and we just request those orders. I chose to save those off for future use in a variable, and then we'll return those to Grant.
Now, for the sake of illustration, having retrieved this information from Shopify, we'd like to place it into a data lake using Snowflake. Same model: we just choose our Snowflake connector over here from our palette. We can see we've got all the operations we need, the classic create, read, update, delete. We just used the insert method right here; we dragged it over and dropped it in, and that's all there was to it. The insert statement looks like this, for those that are technically inclined. In this case, we just dropped the store ID, description, and sale price into our insert statement, and that got run inside of Snowflake.
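The shape of that statement, with placeholder table and column names, would be roughly:

```sql
-- Illustrative shape of the insert the Snowflake connector runs;
-- the table and column names are placeholders, not NTO's real schema.
INSERT INTO nto_sales (store_id, description, sale_price)
VALUES (:storeId, :description, :salePrice);
```

The named parameters are bound from the Mule payload, which keeps the statement parameterized rather than string-concatenated.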
So to recap what we've been trying to solve from a business level, let me just take us back to our presentation here. The goal was to get our three areas or silos of information unlocked. So we've taken care of SAP. We've got Shopify. We've moved them through MuleSoft, Snowflake, and are ready for consumption by Grant's analytics team in Tableau. So now we've come through the third and final part, which is getting our information out of our Excel world, which is the underlying part of the point of sale, and we're going to use a new piece of technology called the MuleSoft Tableau connector. And so we're going to bypass Snowflake and any other intermediate databases or data warehouses, data lakes, and just go straight from Mule through the connector into Tableau.
Now of course, I just want you to take away the fact that with MuleSoft, you can park that data through any one of our connectors in a data lake, or wherever you want to place it on the output side of the equation, but you can also go straight through the Tableau connector and place it in Tableau. So what I want to do now is take you back into Studio and show you how we did that in this flow right here. Let me scroll over to the top. And so Grant, I believe you're familiar with hyper files, right?
Oh, certainly. It's an in-memory technology.
Yeah. Grant just said that's in-memory technology. That is one of the workhorses of the Tableau world. And so what we're going to do here using the new MuleSoft Tableau connector is create a hyper file, and I've split all these steps out by the way to make them a little easier to process in your mind, so create a hyper file. We're going to take that Excel CSV data and put it into the hyper file, just using a single call from the Tableau connector. And then we're going to add some random, or not random, but additional point of sale data into it at the end. And then here's our flow to actually take that data, and it's kind of stepwise to make it easy to understand, it's not very efficient, but it shows you the proof of concept that we're going to take that hyper file, we're going to move it into Java, set some input parameters and publish it to our Tableau world, online.tableau.com.
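The connector handles the hyper file itself, but the data-shaping step, turning the raw point-of-sale CSV export into typed rows ready for insertion, can be sketched in plain Python. The column names here are illustrative, not NTO's actual export format:

```python
import csv
import io

def parse_pos_csv(raw: str) -> list[tuple[str, str, int, float]]:
    """Parse a point-of-sale CSV export into typed rows, the shape
    you'd hand to a Hyper-file insert. Column names are illustrative."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw)):
        rows.append((
            rec["store_id"],
            rec["sku"],
            int(rec["quantity"]),
            float(rec["sale_price"]),
        ))
    return rows

sample = """store_id,sku,quantity,sale_price
ASPEN01,SKI-M-01,2,499.99
BRECK02,HELMET-W-03,1,89.50
"""
print(parse_pos_csv(sample))
```

In the demo flow this shaping happens inside a transform message, and the resulting rows go into the hyper file with a single Tableau connector call before being published to online.tableau.com.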
And then I decided to take this a little bit further and do a little bit of inside of MuleSoft processing and do a little bit of integration with Slack. So if we find that there's any sort of unique data or out of boundary type of data, we're going to kick off a Slack message to the data team that Grant's a member of so they can take whatever actions necessary. So to show you this in real time, our whole soup to nuts of taking information out of a CSV file and pushing it into Tableau, we'll just switch over here, and here is Tableau online. So I'm just going to refresh that and show you the counters. We have 220 in this collection and 380 in this collection, and we'll just slip over here and we'll do a simulation of adding some point of sale data. So we'll just click enter here and we'll drop an item in. Let me move this out of the way on my screen and we'll go here and we'll go ahead and publish this data source. We'll let that run.
And you can see, we triggered a Slack notification because something about the data was out of alignment and might need some analysis from Grant's team. So let me take us back over here to our summary screen inside of Tableau and we'll refresh that data. Again, this is live from online.tableau.com and we've added 38 to this left-hand collection and a few more to the right-hand collection. So we've definitely got some data coming through from our point of sale system.
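As a sketch of the logic behind that trigger, here's what an out-of-boundary check and the resulting Slack message body might look like. The threshold and the webhook-style JSON body are assumptions for illustration; in the demo this is handled inside the Mule flow with the Slack connector:

```python
import json

# Threshold is illustrative; NTO's real boundary rule isn't shown in the demo.
MAX_EXPECTED_QUANTITY = 100

def out_of_bounds(rows):
    """Return POS rows whose quantity falls outside the expected range."""
    return [r for r in rows if not (0 <= r["quantity"] <= MAX_EXPECTED_QUANTITY)]

def slack_payload(bad_rows):
    """Build the JSON body a Slack message would carry to the data team.
    (Actually sending it over HTTP is omitted here.)"""
    lines = [f"{r['sku']} at {r['store_id']}: qty {r['quantity']}" for r in bad_rows]
    return json.dumps({"text": "Out-of-boundary POS data:\n" + "\n".join(lines)})

rows = [
    {"store_id": "ASPEN01", "sku": "SKI-M-01", "quantity": 2},
    {"store_id": "BRECK02", "sku": "HELMET-W-03", "quantity": 450},
]
bad = out_of_bounds(rows)
print(slack_payload(bad))
```

Only the rows that trip the check generate a notification, which is why the publish step in the demo fires exactly one Slack message.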
So let me just take us back and recap where we've gotten to here. So we've been able to finally complete the third step of our puzzle, or our business problem, in getting the data from CSV through MuleSoft to the Tableau connector, into Tableau directly. So I think that might be what you need, Grant. How does that look to you?
Yeah, that looks great. So tell me about the Slack notification that I got from NTO sales. What's that all about?
Oh, that's a great question, Grant. I know it was a little bit of an extra we put into our work effort, but the idea is that MuleSoft adds a lot of value through integration, and you're seeing the integration efforts right here, which we completed. We also take care of automation and API management. So I took the liberty of adding a little bit of automation and acting from inside of MuleSoft. In case you're curious how we did that, if we just click over here: making that Slack call was very straightforward. It's the same model we've been using throughout. We take the Slack connector, pick from all of the methods available here, and we wound up using the send-message one and just dropped it into our flow. We were able to do the send message with very low code, actually just dragging and dropping, and it took care of sending you that Slack message.
So my whole goal behind that was to make you and your team's life a little bit easier by using MuleSoft's capability to act. And just a little bit more to answer your question. The idea is that we can use automation of these Slack notifications, and this would give your NTO team as a whole, including the executives and your data team, the ability to look at this stuff in real time and act faster, which just shows how Mule can handle integration, automation, and taking action in real time. I hope that answered your question.
Yes, it did. That's great. It'll also prompt me and my team to investigate this brand new data set that you just built for us.
Well, great. I'm glad that helped out.
Yeah. And also what's great about this data set that you created for us is it was from historically siloed data sources. Not only are they now integrated data sets, but they're also streamlined and up to date so my team can spend way less time managing data and way more time analyzing the data. I can also trust that any dashboard built from this data asset will stay up to date. So, dashboards will stay fresh.
Yeah, and what's great about now putting this inside of Tableau is that business users like myself are empowered to make data-centric decisions that powerfully impact the business, based on these historical data silos. What I plan to show in this following section is how Tableau makes it easy, with clicks, not code, to ask intelligent questions of your data at every stage of the analytics journey, from the very beginning all the way to the end dashboard. Because Matt and his team have done the hard work of actually stitching together the data sets and putting them inside of Tableau, now I get the fun part, which is to analyze the data and ask questions. And now that Tableau and MuleSoft have come together, it's easier than ever. Thanks to MuleSoft, you can see here, I now have my point-of-sale data, inventory data, store-level data, even weather data, all within this one data asset for me to really dive into.
I can also see that this data source has been certified, which means that I can trust this data set when I'm asking questions of it. Tableau makes it easier than ever to see and understand this new data set. If you want to, you can count how many times I say the word data throughout this section. So let's dive into some ad hoc data analytics. You can start counting; that might be one.
So with natural language processing, Tableau makes it easy to slice and dice data on the fly and ask whatever question might come to mind. So I see we have collections, which is similar to our department. I see we also have some sales data. I may want to understand, what are my sales by collection and year? So just like that with natural language, Tableau goes and queries my brand new data set and retrieves for me a visual response. It's clear to me that for most of our categories, we're up year to year, eco conscious especially, but our winter collection seems to be down. That's pretty interesting. I wonder if our sales team are aware of this issue and because now we brought together our inventory data set, I can ask questions about how sales relate to inventory.
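Under the hood, the kind of aggregation that natural-language question resolves to is a simple group-by with a year-over-year comparison. Here's a minimal Python sketch with made-up numbers (the real figures come from the data set MuleSoft assembled):

```python
from collections import defaultdict

# Illustrative sales records; amounts and collections are invented.
sales = [
    {"collection": "Eco Conscious", "year": 2020, "amount": 120.0},
    {"collection": "Eco Conscious", "year": 2021, "amount": 180.0},
    {"collection": "Winter",        "year": 2020, "amount": 300.0},
    {"collection": "Winter",        "year": 2021, "amount": 210.0},
]

def sales_by_collection_and_year(records):
    """Group-by: total sales keyed on (collection, year)."""
    totals = defaultdict(float)
    for r in records:
        totals[(r["collection"], r["year"])] += r["amount"]
    return dict(totals)

def year_over_year(totals, collection, prev, curr):
    """Fractional change from prev year to curr year for one collection."""
    before, after = totals[(collection, prev)], totals[(collection, curr)]
    return (after - before) / before

totals = sales_by_collection_and_year(sales)
print(year_over_year(totals, "Winter", 2020, 2021))  # negative: Winter is down
```

Tableau renders the same comparison visually, which is how the winter collection's decline jumps out while the other collections trend up.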
So as I was saying, with the Salesforce platform and with Tableau, I'm empowered to dig deeper into this data set on my own without having to rely on IT, or someone like Matt, to answer my business questions for me. If I want to, I can get curious and see what my inventory levels are like across the states in which we operate. California has quite a bit of inventory, so does Arizona, but it looks like Illinois has the lion's share of inventory across all of our products. If I want to, I can drill into a collection level or product level, which I won't do for now, but I wonder if our merchandising team is aware of this particular outlier. I can see this across here, if I'm interested. Really, Tableau makes it easy so that there's no back and forth between the people creating the dashboards and the people actually making business decisions, like myself.
But for the sake of time, I'm going to fast forward to the end of this analysis as I found some fascinating insights worth sharing with others.
So, looking at the state of my business, it looks like we're fairly healthy overall. Sales dollars are up, inventory is surprisingly high, and our profit margin is looking healthy. And if we think back to that initial insight we found, that our winter collection is down year over year, if I'm interested, I can drill into this particular collection. At a product level, I see some fascinating points. With visual cues, Tableau is making it very clear to me that we have some problematic inventory. With smart AI and ML built into this data set, I can see that our men's alpine skis are at risk with too much inventory, as are our women's alpine skis and helmets, which is pretty interesting. And this isn't where our AI and ML capabilities stop.
Not only is that intelligence embedded in the data set, but because it's woven throughout the Tableau platform, I, as an end user, can get curious and ask Tableau to explain this particular data point for me. Tableau can build statistical models and propose possible explanations for the selected mark on the fly. In our case, Tableau is helping me better understand why this particular data point has such high inventory levels. If I drill into this high inventory, I can continue to get curious and see our average inventory levels, as well as two particularly extreme values, which I wouldn't have known about, as they weren't in our view in the first place.
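Tableau's actual statistical models aren't shown in the demo, but a crude stand-in for surfacing those extreme values is a standard-deviation check. This sketch uses invented inventory numbers and a threshold chosen purely for the example:

```python
import statistics

def extreme_values(values, z=1.5):
    """Flag points more than z standard deviations from the mean --
    a rough stand-in for the kind of statistical check Tableau runs
    when proposing explanations for a selected mark."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > z * sd]

# Illustrative store inventory levels; two stores carry far more stock.
inventory = [40, 35, 38, 42, 37, 41, 39, 400, 380]
print(extreme_values(inventory))  # flags the two outlier stores
```

The point of the feature is that the analyst never writes this check; the platform runs it behind a click and explains the mark in plain language.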
So this is interesting, and if I wanted to, I could continue down this rabbit hole, but for the sake of the demo, I'm just going to continue with the story. What I'm interested in doing is taking a look at these problematic products in our winter collection, because this is certainly something worth investigating. As I multi-select these red products, I can see at a store level how we're performing for these four products. I see quite a bit of red, or just gray, on our map, which isn't good, and these two stores in particular, our Aspen and Breckenridge locations, are underperforming. And what's even cooler than being able to drill into a store level: we've also incorporated weather data into this data set, so I can see that for our ski-town locations, we're expecting quite a bit of snowfall this upcoming winter. That's great. Personally, I may want to take my family on a ski trip, but from a business perspective, I know there's going to be a lot of demand for inventory. Maybe I can begin to reallocate inventory to these stores.
Well, let's take a look at product performance just at these two stores. And what's interesting, if we look year over year for our helmets and our men's and women's skis, is that we're performing terribly year over year. This could be due to a number of reasons, but the first hypothesis I'd like to pose is: does this have to do with new rental programs in the area? Maybe people going to Aspen and Breckenridge aren't buying equipment, but would instead just like to rent it. And could we, in our two stores, create a rental strategy of our own with this excess inventory?
I know there's a lot going on here, so it's probably time for me to loop in my leadership. [inaudible] native integration with Slack, it's easier than ever to have a data-centered conversation with my colleagues all around the world. Let me share my findings with my boss for him to discuss with his executive team. Oops. As I drill into these two stores, I can create a snapshot of this view with our brand new data set and this brand new insight, tag him in it, and include that snapshot for it to arrive in his Slack inbox. So just like that, I can kickstart a data-centric conversation with my colleagues.
As we just saw, Grant tagged Harry Boone, who, you might remember, is Head of Sales here at NTO. Here we are back with Harry Boone: he's on LinkedIn looking for some new leads, and we can see that Slack notification from Tableau coming directly to him. Clicking on it brings us straight into the Tableau Online app within Slack, where we can see Grant sharing that snapshot view of the report and his commentary on ways to utilize these high levels of inventory.
Harry's obviously able to take that insight, that data, that snapshot, and share it back within the exec team channel with his colleagues, to start some conversation and brainstorm with the other execs in that NTO exec channel in Slack about ways of utilizing that inventory. So getting creative and perhaps looking at new revenue streams for how NTO can use these high levels of inventory, with all that information we saw coming through, like the weather forecasts in certain regions as well.
Again, speaking to the reusability aspect of MuleSoft, they come to the decision to act fast and add a rental capability onto their recently created e-commerce website. And as Lee Hao from IT here suggests, they can reuse a lot of those APIs they already built with MuleSoft to help get that up and running and realize that value much faster.
So if we stop for a second and look at what we just watched, the NTO exec team was really able to collaborate and quickly come to that decision using Slack with MuleSoft and Tableau. NTO was able to solve those business problems. For today, they were the very current ones we saw at the start: Harry Boone's sales problems of getting a full view of inventory and sales data. But NTO is also able to create and compose new experiences, products, and services as they look to move forward, and to move faster in order to keep up with trends and changes in the market. They're able to solve even those larger problems much faster and much more efficiently, and they become much more agile as they leverage those valuable, reusable capabilities and APIs and the packaged business building blocks available to them, which we just discussed.
NTO can continue to securely scale, which is of course a huge priority in their operations, in all sorts of new ways. We just saw an accelerated delivery of their rental program, for instance, and this could extend to a future mobile application if they want, as well as the integration of systems like Salesforce Sales Cloud or Marketing Cloud, or the replacement of old systems with new ones. They're very much able to swap out SAP ECC for, perhaps, SAP S/4HANA down the road; it's really easy to unplug ECC and plug S/4HANA in.
So to summarize: MuleSoft really unlocks the power of the data by securely automating those business processes without any code, while Tableau, as we mentioned, lets you analyze that data easily and, of course, efficiently.