INTERVIEW

Pritom Dey

On automated decision-making and how to use bots to grow your business


When it comes to Google Data Studio resources, we try to leave no stone unturned. But, as with any kind of content, sometimes it’s hard to find information that’s valuable and does not repeat itself. 

Luckily, there are also plenty of instrumental resources for the Data Studio community. One of those is Pritom Dey’s blog, Sumified. Pritom is also the creative mind behind this time-saving Google Data Studio RegEx Generator. He writes about Google Data Studio in such a clear, to-the-point way that we had to ask him to join us for an interview.

“I love automation. In every possible case, I create a bot and let it do the analysis, reporting, forecasting, and alerting for me.” Our topic today is automation and how it impacts decision-making, because this is what Pritom Dey does on a daily basis.

Pritom, I really love your description on LinkedIn: “Making data talk”. How can we make data talk? And what about listening to it (as marketers, as analysts, as decision-makers)?

Thank you! My LinkedIn headline is my mantra. It serves a purpose: to make sure that everything I do with data leads to some kind of action. For example, if I present data to you, my intention is to let the data talk for itself so that, in the end, you find it useful for making better decisions. If you are creating dashboards, you are basically trying to make your data talk and tell a story, so that even when you are not present, your end users can understand what the data is trying to say.

You can even take it to the next level by creating a recommendation system within a dashboard, or perhaps even better with alerts, so that the business can take action on time rather than finding out about something too late from the dashboards. When you take it to the recommendation level, the voice of your data becomes even louder.

The listening part should be the default goal if you are trying to make data talk. By the time you start working on a dashboard, or on any platform where you will present your data, you already know who your audience will be, what questions they might be asking, and what answers your dashboard will have to provide. Your target audience is basically ready to listen to the data whenever you send it to them or whenever they access it.

When you are done creating a dashboard or a report, the first person who will be listening to the data will be you. You will be the first critic of your data. You will test whether the intention is served. You will check if the data is saying what it is supposed to say. When you are satisfied, you will share it with marketers, decision-makers, or other business stakeholders.

In the process, you have already decided to share it with other people, because you have already tested it by listening to the data. Once the final product is reviewed and improved where required, it becomes a great medium for everyone in the business to come back and listen to what the data is trying to say.

Automation and how to use bots to grow your business: these are topics you talk about in your content. What are the tools that help you do this, and how is the automation landscape evolving, in your opinion?

These bots are basically my reporting assistants. They are not fancy bots that you ask questions and get answers from. I create them as needed to save time for myself, for my colleagues, or sometimes even for my family. In the process, this creates more time for other important work. These bots just do the repetitive work for me, day after day.

I probably have 15-20 of them running continuously and reporting on my behalf. They are simply powered by Google Sheets and Apps Script to prepare the insights or send alerts to multiple communication platforms.

In terms of tools, I mainly use Google Sheets to pull data from various sources, including APIs, BigQuery, Google Analytics, etc., then use Apps Script to set up the trigger for distribution. There are also some tools on the market for anomaly detection, automated reporting, and so on, but they may fail to help with your custom needs. Also, they are expensive.

The theory behind this bot reporting idea is that if you have repetitive tasks for data flow, analysis, or reporting, you should automate them. Google Data Studio is also a great tool for that purpose, as long as email delivery is all you need. But if you need something for alerting, Google Data Studio is not there yet. However, you can do it easily with Google Sheets and Google Apps Script.
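To make the idea concrete, here is a minimal sketch of such a reporting bot in Google Apps Script. This is not Pritom's actual code: it assumes a script bound to a Google Sheet with a hypothetical "Daily KPIs" tab, hypothetical cell references, and a placeholder recipient address.

function sendDailySummary() {
  // Read yesterday's headline metric and its target from the (hypothetical) sheet.
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Daily KPIs');
  var sessions = sheet.getRange('B2').getValue();
  var target = sheet.getRange('C2').getValue();

  var status = sessions >= target ? 'on track' : 'below target';
  var body = 'Sessions yesterday: ' + sessions + ' (target: ' + target + ') - ' + status + '.';

  // A time-driven trigger (e.g. every morning) set on this function makes it run
  // without any manual step; MailApp delivers the summary by email.
  MailApp.sendEmail('team@example.com', 'Daily KPI summary', body);
}

Swapping MailApp for a chat webhook, or the fixed cells for a query against BigQuery or Google Analytics, follows the same pattern.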

There are a lot of new companies helping businesses automate their reporting or anomaly detection: Anodot, Dynatrace, and Statsbot, to name just a few. Any business would love to buy back time, and tools like these serve that purpose well. The competition will grow in this space for sure.

Let me see if I can convert this to some numbers. Let’s say you use tool A and tool B to retrieve data every day and then decide on the next action. By the time you do this, you have already spent 2 to 5 minutes. If you repeat this a few times a week, you will probably be spending 15 minutes per week, or 60 minutes per month. The bot process can easily get you back that hour per month. Instead of repeating your work, the bot will deliver the summary that you care about. And if you have multiple team members pulling the same metrics, the time savings are even greater.

Sumified Data Studio Connectors Library
Pritom has put together the most extensive list of Google Data Studio Connectors

Processes that involve judgement-based decision-making are usually in the hands of business managers.
How can a company automate decision-making? What makes it ready for that?

Let me just start with an example. Let’s say you use a weather app. Your weather app says it will rain in the next hour or so, so you should take an umbrella if you are going outside. The app is using data, predicting what may happen, and recommending a certain action. It’s up to you now whether you take an umbrella or not.

The automation I apply to decision-making in my personal or work life is similar. The situation is known; you already know what actions you will have to take in certain types of scenarios. When you have this background, it’s easy to make recommendations with data for better and faster decision-making.

Some decisions are repetitive and predictable. For example, if decision-makers are out of budget for a marketing campaign, they will have two choices: either top up the budget or pause the campaign. Another example: if you are a content manager and you were asked to publish 100 pages for 10 different markets or for 10 different clients within a certain time frame, you will have to allocate your resources accordingly. If you know the logic behind the resource allocation, using automation you can basically say, “Hey, vertical X is falling behind, invest more in it”.

You can apply the same idea while actioning SEO items. As soon as your posts go live, you can have a process that reports back to you about potential issues. The report will pick up the issues and tag you, flagging what should be fixed or what went missing, so SEO can be improved sooner rather than later.

With the automation process, you are analyzing data and then flagging what action your team members should take, based on a predetermined action plan. The automation just eliminates a few manual steps in the process; the final decisions, of course, will still be taken by the business managers.
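As a rough illustration of this kind of predetermined, rule-based flagging (not the exact setup described above), a short Apps Script function could read pacing figures from a sheet and email a recommendation when a vertical falls behind. The sheet name, column layout, 80% threshold, and recipient below are all assumptions.

function flagLaggingVerticals() {
  // Hypothetical sheet: one row per vertical with columns [vertical, published, planned].
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Pacing');
  var rows = sheet.getDataRange().getValues();
  var recommendations = [];

  for (var i = 1; i < rows.length; i++) { // skip the header row
    var vertical = rows[i][0];
    var published = rows[i][1];
    var planned = rows[i][2];
    // Assumed rule: flag anything pacing below 80% of plan.
    if (planned > 0 && published / planned < 0.8) {
      recommendations.push(vertical + ' is falling behind (' + published + '/' + planned + ' pages), invest more in it.');
    }
  }

  // Only notify when there is something to act on; the humans still make the call.
  if (recommendations.length > 0) {
    MailApp.sendEmail('content-team@example.com', 'Pacing recommendations', recommendations.join('\n'));
  }
}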

It’s a simple differentiator, like a wall calendar versus a digital calendar. A wall calendar can’t give you a reminder for a specific action; you have to go to the wall calendar. A digital calendar can easily remind you about the action: you don’t have to go to the calendar, the reminder comes to you.

How can we make sure the automated decision-making process doesn't go wrong?

You might remember this example from 2014, when Amazon wanted to automate résumé selection using AI. A year later, Amazon realized that the AI was biased, having learned from biased information gathered over a 10-year period in which most applicants were men. The AI ended up screening out valuable female applicants; the system had simply learned that male candidates were preferable. How can situations of this kind be avoided?

This is tough. It happens at the human level as well: people are biased even when the data says something else. I am not an expert on AI, but I believe automation processes using AI will improve over time. You have to launch and learn from the mistakes; otherwise, there won’t be any progress, right?

Think about autonomous cars. Cars are now making decisions on your behalf. Does that mean nothing will go wrong from here? Of course not. Things have gone wrong and things will go wrong. As long as we learn and adjust, the future is bright.

The automated decision-making processes that I have worked on personally are based on predetermined action items. Actionable options are sent to the decision-makers (humans), who make the final decisions. No step is eliminated in the process, from data gathering to analysis to proposing possible actions.

That doesn’t mean the proposed actions will always be right. Things may go wrong anywhere in the process. For example, if your data tracking system or the data pipeline breaks, the proposed action items will end up being incorrect.

People and bots are sometimes in an “it’s complicated” kind of relationship. Are there ways in which we can move towards a better symbiotic relationship?

I think everything has its own merits; it could be good or bad. You run a business, and your end goal is to stay profitable. If a bot can answer at least a few questions on your behalf to free up some time for you, wouldn’t you use a bot? Wouldn’t you use that time somewhere else for better productivity?

You cannot make a relationship work on day one; it takes time for a relationship to get better. I think it’s the same here. Things will get better and better with technological advancement. In terms of small alerts or reporting, at the beginning a bot may send false alarms or report incorrectly, but once you spot those mistakes you can improve the whole process of capturing data, analyzing it, and alerting.

If a bot can answer at least a few questions on your behalf to free up some time for you, wouldn’t you use a bot?

Bear with me, Pritom, I have a cascade of questions for you on this next one. For me, there’s no doubt that bots are taking over some tasks and they’ll continue to do that at a fast pace.

  • Do you agree?
  • Do you think that’s dangerous?
  • Do you think a distinction between tasks and jobs is necessary? (I’m referring here to the classic “Robots are taking over our jobs”)

Yes, I agree. For better returns, I think it’s ok to let this trend grow. Anything that is teachable to a machine, we should teach, experiment with, and see how it goes. That opens up so many doors: not only better outcomes, but you can also use the human brain on something more productive than monotonous tasks.

Whether it is dangerous or not probably depends on the type of job or task. If a task requires a doctor to make the final decision, then of course it’s dangerous, or at least at this point it’s still dangerous. Technology is not going to stop here. We will see a lot of advancements in more and more complicated decision-making processes.

We are already seeing AI-assisted reagent selection. It helps eliminate inefficiencies and errors in the reagent selection process that cause costly experimental failures for scientists, which is of course good for the decision-making process. It helps you make a decision in 30 seconds, whereas the same decision would probably take 8-12 weeks with human interaction involved.

The distinction between tasks and jobs depends on the situation. If someone has an autonomous car and wants to drive for Uber with it, he or she still has to drive the car physically; you cannot let an autonomous car drive your passengers around and make money for you. Driving is a task here; it becomes a job depending on how you use it. So the question of distinction will be determined by the respective authorities.

However, the advancement we have seen will continue, for a better future. Robots taking over our jobs is actually not a bad thing, I think. Instead of worrying about losing jobs, we should start thinking about upskilling and reskilling our workforce for more productive outcomes.

Can you share with us some of your favorite automations that you’re using in your personal life?

Google Data Studio dashboards are great examples of automation. Think about the days before Google Data Studio, when multiple people used to go to Google Analytics and pull the same metrics for a business. Everyone had to repeat the same process, which you can now automate for all of them just by using Google Data Studio. This is my universal favorite.

The Google Data Studio RegEx Generator is one of my favorites. I often go back to the tool to find RegEx solutions. The beauty of it is that once it solves a problem for one person, it becomes capable of solving similar problems for thousands of others. It is simply capable of saving time for an unlimited number of people in the Google Data Studio community. It has the potential to help the Google Sheets, BigQuery, and even the Apps Script communities in the future.

Another favorite of mine is a Slack bot that sends me alerts as soon as it detects product updates and new releases from Google Data Studio and BigQuery. The process basically keeps an eye on those pages 24/7.
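A page-watcher like that can be sketched in a few lines of Apps Script. This is an illustrative assumption rather than Pritom's actual bot: the watched URL, the Slack incoming-webhook address, and the stored property name are placeholders.

function checkReleaseNotes() {
  // Placeholder URL for the page being watched (e.g. a product release-notes page).
  var url = 'https://example.com/data-studio-release-notes';
  var page = UrlFetchApp.fetch(url).getContentText();
  var digest = Utilities.computeDigest(Utilities.DigestAlgorithm.MD5, page);
  var fingerprint = Utilities.base64Encode(digest);

  // Compare against the fingerprint stored on the previous run; alert only on change.
  var props = PropertiesService.getScriptProperties();
  if (props.getProperty('lastFingerprint') !== fingerprint) {
    props.setProperty('lastFingerprint', fingerprint);
    // Post the alert to Slack via an incoming webhook (placeholder address).
    UrlFetchApp.fetch('https://hooks.slack.com/services/XXX/YYY/ZZZ', {
      method: 'post',
      contentType: 'application/json',
      payload: JSON.stringify({ text: 'Product update detected: ' + url })
    });
  }
}

Run on a time-driven trigger every few minutes, this is the "keeps an eye on those pages 24/7" behavior described above.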

Pritom, one last question: what is something you don’t see yourself ever (or at least in the near future) automating?

Of course, there will be lots of things that I won’t be automating. If a final decision is being made for the first time, without any established decision-making process, and it has important consequences, then the whole analysis will have to take place differently.

In my personal life, I won’t feel comfortable getting medical advice from a robot just yet. I will stick to a real doctor for now, haha!

Want to keep up with Pritom? Find him on LinkedIn.

Find more Interviews on our blog. Have a suggestion for us? Contact us.

Pritom Dey

Pritom is a lead data analyst by profession and a blogger by hobby. He holds a Master's degree in Computer and Information Systems from Central Queensland University, Australia, and currently lives in Toronto, Canada.

He has been in the technology and marketing space for over 10 years. He has been working remotely since 2012 and loves the flexibility of working from anywhere. When he is not thinking about data, he can be found playing with his son or out running.


This is Pritom.

Pritom knows the value of data.

Pritom is smart.

Be like Pritom.