
The Democratization of Artificial Intelligence


The democratization of artificial intelligence (AI) appears among the top five technology trends identified by most major research organizations. The concept, going by its definition, is simple to grasp and envision. Its complexities, however, come to light when you get down to the process, the implications, the challenges, the forces behind it, and the specific calls to action. That is the focus of this article, starting with the basics.

What is the democratization of Artificial Intelligence?

To answer that, let’s look at the meaning of the word “democratization”. The dictionary definition is “the action of making something accessible to everyone”. Extending that to the question at hand, it means making AI accessible to the masses. In practice, it means that AI is breaking out of the exclusive realm of technology experts and of enterprises and platforms with deep pockets, i.e., the New Age unicorns.

How is Artificial Intelligence making its way into the mainstream?  

To enable the democratization of AI in the true sense of the phrase, organizations must facilitate a few other democratization initiatives within themselves.

Foremost amongst these is the democratization of access to data. Unless everyone in the organization has access to clean, curated datasets to run their Machine Learning (ML) algorithms against, the democratization of AI is not possible. A data catalogue plays a vital role here, as it makes curated datasets discoverable. This, of course, assumes that the right checks, governance, and access-related privileges are in place.
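As a rough sketch of the two jobs a data catalogue performs — discovery for everyone, governed access for the entitled — consider the toy example below. The catalog class, dataset names, and roles are all invented for illustration; real catalog products expose far richer metadata and policy engines.

```python
from dataclasses import dataclass, field


@dataclass
class DatasetEntry:
    """Minimal metadata a catalogue might hold for one curated dataset."""
    name: str
    owner: str
    description: str
    tags: set = field(default_factory=set)
    allowed_roles: set = field(default_factory=set)


class DataCatalog:
    """Toy catalogue: datasets are discoverable by all, readable by few."""

    def __init__(self):
        self._entries = {}

    def register(self, entry: DatasetEntry):
        self._entries[entry.name] = entry

    def search(self, tag: str):
        # Discovery: anyone can find curated datasets by tag...
        return [e.name for e in self._entries.values() if tag in e.tags]

    def request_access(self, name: str, role: str) -> bool:
        # ...but governance decides who may actually read them.
        entry = self._entries.get(name)
        return entry is not None and role in entry.allowed_roles


catalog = DataCatalog()
catalog.register(DatasetEntry(
    name="sales_2023_curated", owner="data-eng",
    description="Cleaned monthly sales figures",
    tags={"sales", "curated"},
    allowed_roles={"analyst", "data_scientist"}))

print(catalog.search("curated"))                                # discoverable
print(catalog.request_access("sales_2023_curated", "analyst"))  # True
print(catalog.request_access("sales_2023_curated", "intern"))   # False
```

The point of the sketch is the separation of concerns: search is open to the whole organization, while access checks enforce the privileges mentioned above.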

The second democratization initiative must focus on improving access to the right tools and technologies for data preparation. This includes data curation, cleaning, merging, and so on, with the intent of making the data consumable by ML models and algorithms. Various commercial and open-source tools enable this functionality.
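To make the cleaning-and-merging step concrete, here is a small sketch using pandas, one common open-source option. The tables, column names, and imputation choice are made up purely for illustration:

```python
import pandas as pd

# Two hypothetical raw extracts that must be cleaned and merged.
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "customer": ["Ann", "ann ", "Bob", None],
    "amount": [100.0, None, 250.0, 80.0],
})
regions = pd.DataFrame({
    "customer": ["ann", "bob"],
    "region": ["East", "West"],
})

# Cleaning: normalize the join key, drop rows missing it,
# and impute missing amounts with the column median.
orders["customer"] = orders["customer"].str.strip().str.lower()
orders = orders.dropna(subset=["customer"])
orders["amount"] = orders["amount"].fillna(orders["amount"].median())

# Merging: enrich orders with region info for downstream ML features.
prepared = orders.merge(regions, on="customer", how="left")
print(prepared)
```

Self-service tools automate exactly these steps behind a visual interface, so that a business user never has to write the code above by hand.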

The last one on this list looks at access to predefined ML algorithms, so that an individual does not have to code ML algorithms from scratch in languages such as Python or R. Various companies offer drag-and-drop interfaces for building ML models, along with marketplaces of prebuilt components. This is a key development in the movement towards the democratization of ML.
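What "predefined algorithms" means in practice can be shown with scikit-learn, where the algorithm is a ready-made component rather than something the user implements. The synthetic dataset below is purely illustrative:

```python
# Using a prebuilt algorithm (logistic regression) without writing
# any of the underlying ML mathematics ourselves.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A small synthetic classification dataset stands in for real data.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression()   # the predefined algorithm, ready to use
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

Drag-and-drop platforms wrap this same pattern — choose an algorithm, point it at data, evaluate — behind a visual interface, removing even the need for these few lines of code.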

The combination of highly scalable cloud computing platforms and specialized chipsets, which enables the running of advanced ML models, is a significant catalyst in accelerating this process.

What are the implications?

Two advantages stand out: bridging the shortage of costly data scientists and increasing the adoption of ML models. The first will lower training and recruiting costs. Developers and advanced business users will be able to build models to validate their hypotheses around ML. Models built by the people who know the business and the underlying data well also stand a higher chance of solving the right business problems and reaching production deployment. This will feature prominently in vertical-focused use cases. ML adoption will increase significantly, resulting in a very interesting mix of software powering our world.

What are the challenges?

One needs to watch out for the “garbage in, garbage out” phenomenon. An enterprise needs to consider the pitfalls before opening huge datasets to non-experts, who may not be well trained in the collection and processing of data, or in the use of the right statistical and modelling tools to build models on these datasets. Training, governance, and the proper checks and balances will need to be in place to get this off the ground.
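One form such checks and balances can take is automated validation run before a dataset is published for self-service use. The function, field names, and threshold below are hypothetical, meant only to suggest the shape of such a guardrail:

```python
def validate_dataset(rows, required_fields, max_missing_ratio=0.1):
    """Return a list of problems; an empty list means the checks pass.

    A made-up guardrail: flag any required field whose share of
    missing values exceeds max_missing_ratio.
    """
    if not rows:
        return ["dataset is empty"]
    problems = []
    for field in required_fields:
        missing = sum(1 for r in rows if r.get(field) is None)
        if missing / len(rows) > max_missing_ratio:
            problems.append(f"{field}: {missing}/{len(rows)} values missing")
    return problems


rows = [
    {"customer": "ann", "amount": 100.0},
    {"customer": "bob", "amount": None},
    {"customer": None, "amount": 250.0},
]
print(validate_dataset(rows, ["customer", "amount"]))
```

Blocking publication until checks like these pass keeps “garbage in” from ever reaching the non-expert model builders downstream.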

What is the call to action for enterprises?

What should enterprises do to unlock the true potential of ML? Here is a quick list of steps:

  1. Invest in data governance, data cataloguing and data sharing platforms. This will allow curated data sets to be published and discovered by anyone interested in analyzing that data.
  2. Invest in tools, technologies and platforms geared towards self-service data preparation. This will let developers and business users make data consumable by ML models without depending on specialist teams.
  3. Invest in platforms which will allow users to stitch together ML algorithms. This can be achieved through easy drag-and-drop mechanisms or through API-based access.
  4. Launch training and awareness programs. These will help get your developers and power business users aligned with the concepts and principles of ML.
  5. Ensure that you have a team of trained data scientists on board. They should join forces with the developers or business users who are working on ML applications, coordinating to ensure that they are deployed in production.
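The “stitching” in step 3 can be pictured as composing predefined blocks into a pipeline. The sketch below uses scikit-learn’s Pipeline purely as an illustration of the pattern a drag-and-drop or API-based platform would expose; the blocks and data are arbitrary:

```python
# Composing predefined building blocks into one reusable workflow,
# as a stand-in for what a visual ML platform does behind the scenes.
from sklearn.datasets import make_classification
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=150, n_features=4, random_state=1)

pipeline = Pipeline([
    ("impute", SimpleImputer(strategy="median")),   # data preparation block
    ("scale", StandardScaler()),                    # feature scaling block
    ("model", DecisionTreeClassifier(max_depth=3, random_state=1)),
])
pipeline.fit(X, y)
train_accuracy = pipeline.score(X, y)
print(f"training accuracy: {train_accuracy:.2f}")
```

Each named step corresponds to one “box” a user would drag onto a canvas; the platform handles wiring the output of one block into the next.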

Today, there’s a lot of talk about enterprises opting for “no code” or “low code” platforms for their applications. The objective there, again, is to make continuous transformation possible for many different users, so that new technology can be adopted at scale. The democratization of ML will bring this phenomenon to the AI/ML world as well, allowing technological progress to advance in big leaps.

By Sameer Dixit, General Manager – Data, Analytics and AI/ML, Persistent Systems
