People talk a lot about the democratization of technology. This notion broadly centers on the idea that anybody in any company (in any role, in any department, at any skill level) should be able to use relatively sophisticated software tools that might normally be the sole preserve of highly trained engineers.
Where we once only had software engineers and trained computer programmers, we now have ‘citizen developers’ using low-code drag-and-drop software platforms to design templated apps that other businesspeople will ultimately use. Where we once had highly specialized data analysts and data science professionals, we now have ‘citizen analysts’ who are able to use powerful number-crunching tools (where the complexity of the underlying technology is abstracted away behind a higher-level user interface) to predict business outcomes, market moves and so on.
The jury is probably still out on whether ‘citizen professionals’ using democratized and simplified technology is a good thing or a bad thing. The reality remains that this is a core trend for the IT industry and many more of us are coming into contact with complex technology (often made simple through Artificial Intelligence – AI) every day.
Talking to data
Data analytics visualization and Business Intelligence (BI) company Tableau Software is following this market trend with its latest product release. As already noted on Forbes by contributor David A. Teich, Tableau’s latest 2019 product release features its Ask Data function, which uses Natural Language Processing (NLP) to enable people to ask questions of their data in plain language and instantly get a visual response in Tableau itself.
The company insists that this capability makes it easy for anyone, regardless of skill set, to engage deeply with data and produce analytical insights they can share with others without having to do any setup or programming. This is a logical progression for Tableau in some senses; the firm is already known for putting drag-and-drop functionality into data analytics.
Tableau chief product officer Francois Ajenstat explains that the company has engineered a conversational approach into the speech recognition behind Ask Data. This (claims Ajenstat) allows users to ask questions of their firm’s datasets in the way that they naturally think.
“With Ask Data, customers can simply type a question such as, ‘what were my sales this month?’ and Tableau will return an interactive data visualization that they can continue to explore, refine the question and drill into further detail. There is no need to have a deep understanding of the data structure or programming skills. Whether they are a product manager, a manufacturing supervisor, a doctor, or a pizza shop owner, Ask Data enables anyone to have a conversation with their data. It uses sophisticated algorithms that are driven by an understanding of the person’s intent, not keywords, which helps Tableau return more relevant results,” said the company, in a press statement.
Behind the scenes, Ask Data uses algorithms to automatically profile and index data sources. Ask Data knows that when someone types ‘American furniture’ with their sales data, they need ‘product category’ filtered to ‘furniture’ and ‘country’ set to ‘United States’, if indeed they are in the USA and their business is solely focused on the domestic market. In this way, it combines statistical knowledge about a data source with contextual knowledge about real-world concepts.
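To make that mapping idea concrete, the sketch below (in Python, using entirely hypothetical field names, synonyms and values, not anything from Tableau’s actual engine) shows one way a profiled-and-indexed data source could resolve a phrase like ‘American furniture’ into structured filters.

```python
# Hypothetical sketch only: resolve plain-language tokens into field/value
# filters using a small 'index' built by profiling the data source.
# Field names, synonyms and data are invented for illustration.

# For each field, the distinct values it contains plus common synonyms.
FIELD_INDEX = {
    "product category": {"furniture": {"furniture"}},
    "country": {"united states": {"united states", "usa", "american", "america"}},
}

def resolve_filters(question: str) -> dict:
    """Turn free-text tokens into field/value filters via the index."""
    tokens = question.lower().split()
    filters = {}
    for field, values in FIELD_INDEX.items():
        for canonical, synonyms in values.items():
            if any(tok in synonyms for tok in tokens):
                filters[field] = canonical
    return filters

print(resolve_filters("American furniture sales this month"))
# -> {'product category': 'furniture', 'country': 'united states'}
```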
Is all this safe?
Now that you can talk to a database, a dataset or perhaps even a live dataflow as real-time information traverses your company’s systems, is that all good news? Largely, initially and for the most part, the sensible answer is ‘mostly yes’.
But AI is not without its downfalls and there are plenty of examples where chatbots have gone rogue and bots have started to edit information for us that they shouldn’t have. Tableau says that its software is built with an intelligent parser (a function to break data apart, separate it and classify it) to automatically cut through ambiguous language and allow users to ask sophisticated questions in a natural, colloquial way.
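As a rough illustration of what ‘cutting through ambiguous language’ can mean in practice, here is a toy parser sketch that classifies question words into measures, dimensions and modifiers, and falls back to a default measure when a word like ‘best’ is ambiguous; the vocabulary and rules are invented for this example and are not Tableau’s.

```python
# Hypothetical sketch of how a parser might resolve an ambiguous word
# such as "best" by classifying question parts and applying a sensible
# default measure. All names and rules are invented for illustration.
MEASURES = {"sales", "profit", "quantity"}
DIMENSIONS = {"region", "product", "customer"}
DEFAULT_MEASURE = "sales"

def parse(question: str) -> dict:
    tokens = question.lower().replace("?", "").split()
    parsed = {"measure": None, "dimension": None, "modifier": None}
    for tok in tokens:
        if tok in MEASURES:
            parsed["measure"] = tok
        elif tok in DIMENSIONS:
            parsed["dimension"] = tok
        elif tok in {"best", "top", "highest"}:
            parsed["modifier"] = "top"  # ambiguous: read as 'highest ranked'
    if parsed["modifier"] and not parsed["measure"]:
        parsed["measure"] = DEFAULT_MEASURE  # 'best region' -> best by sales
    return parsed

print(parse("Which is the best region?"))
# -> {'measure': 'sales', 'dimension': 'region', 'modifier': 'top'}
```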
“Often the devil is in the detail when it comes to data; the ability to cut through the plethora of signals and identify causality behind a result takes experience, expertise and skill. Whilst easier access to insight is clearly of benefit to business, granting anyone access to any data can introduce risk, especially when those individuals are not able to identify statistical significance, correlation relevance and the further analysis required before determined decisions can be formed. As well as misinterpretation, data access is a major security consideration that needs to be thoroughly tested in any automated system, especially when a more dynamic method of access is introduced such as in Tableau’s case,” said Andrew Gorry, director of digital at UK-based free online mortgage broker company Mojo Mortgages, a firm currently using AI and NLU to expand its product offering.
Even if the NLP is super safe, the parsers are sharpened and the AI has built-in safeguards to stop anything too untoward happening, perhaps the danger still lies in us human beings and our propensity to start talking nonsense to databases, with the machines finally getting sick of us, becoming self-aware and starting to run the world on their own.
AI-driven interpretation of contextual language semantics is still a good thing, for now at least.