David A. Teich Senior Contributor | Forbes
A regular message in this column is that artificial intelligence (AI) won't spread widely until it's easier to use, rather than requiring programmers who can work at the model level. That challenge won't be solved instantly, but it is slowly changing. While technical knowledge is still too often required, there are ways to shorten development time. One that's been happening is the increased availability of pre-built models.
A few years back, a tech CEO loved to talk about the “Cambrian Explosion” of deep learning models, as if a large number of models meant real progress in the business world. It doesn't. What matters is the availability of models useful for business. In the usual meaning of the cliché, the 80/20 rule still applies: while a large number of models might be of interest to academics, a much smaller subset will provide significant value to people attempting to gain insight in the real world.
In an attempt to help companies avoid reinventing the wheel, ElectrifAi has built a body of AI models that can be called by applications. The models are identified by use case, so developers can quickly narrow down the options and choose models close to their intended use. I first became aware of the company when it issued a press release about entering the marketplace on Amazon SageMaker. It is also on the Google Cloud Marketplace.
Having worked with other companies using major vendors' marketplaces, I was curious. While there are still long-term questions about such marketplaces, including how acquisitions of companies might affect partner applications listed on them, it was important to find out more.
One key issue with buying models is that privacy is increasingly important. Yet another company seeing data could be a compliance weak spot. “We build and support models for our customers,” said Luming Wang, CTO, ElectrifAi. “However, our business model is that we don’t see their data and they don’t see our code. While we pre-structure and partially train models, we provide support and services that help customers tune models to their own use with their own data without us needing to see any information.” Outside of those marketplaces, the company also works with systems integrators and other partners who help their clients with implementation.
As mentioned earlier, customers can choose appropriate models. That choice also extends to technique: when AI is mentioned, we’re not only discussing deep learning. The models are built with a variety of AI techniques, including rules engines, XGBoost, and neural networks (deep learning). “Different domains require different techniques,” said Mr. Wang. “Still, rules engines can work seamlessly with neural networks for complex problems. Over one hundred rules, a neural network has advantages. In between, depending on context and data, either technique or other technologies can be used.”
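To make Mr. Wang's point concrete, a hybrid design can run a rules engine first and fall back to a learned model for cases the rules don't decide. The following is a minimal illustrative sketch only; the function names, rules, and thresholds are assumptions for the example, not ElectrifAi's code.

```python
# Illustrative hybrid dispatcher: deterministic business rules first,
# a stand-in "model" for everything the rules don't cover.
# All names and thresholds here are hypothetical.

RULES = [
    (lambda txn: txn["amount"] > 10_000, "flag"),    # hard business rule
    (lambda txn: txn["country"] == "XX", "flag"),    # sanctioned region
]

def model_score(txn: dict) -> str:
    """Stand-in for a trained model handling the ambiguous middle ground."""
    return "flag" if txn["amount"] > 5_000 and txn["new_vendor"] else "ok"

def classify(txn: dict) -> str:
    # Rules run first: cheap, auditable, deterministic.
    for predicate, outcome in RULES:
        if predicate(txn):
            return outcome
    # Fall through to the model for cases the rules don't decide.
    return model_score(txn)

print(classify({"amount": 12_000, "country": "US", "new_vendor": False}))  # flag (rule)
print(classify({"amount": 6_000, "country": "US", "new_vendor": True}))    # flag (model)
print(classify({"amount": 100, "country": "US", "new_vendor": False}))     # ok
```

The design choice mirrors the quote: a handful of rules stays transparent and easy to audit, while the model absorbs the complexity that would otherwise require an unmanageable rule set.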
Given the focus on building a library of models for business, it is no surprise that very few of the models have a UI to present their own data. Instead, the models are accessed as function calls by the controlling applications. This is a key step in the evolution of AI accessibility. Some AI knowledge is still needed to evaluate and choose the appropriate models, but once that decision has been made, non-AI programmers need only understand the calls and can then use the results in the wrapper application to address the business problem.
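The calling pattern described above can be sketched in a few lines. This is a hypothetical illustration, not ElectrifAi's actual API: the `invoke_model` function, the `"spend-classification"` use-case name, and the payload shape are all assumptions, and the in-process registry stands in for what would really be an HTTP or SDK call to a deployed marketplace endpoint.

```python
import json

# Stand-in for a deployed marketplace model; a real application would make
# a network call here instead. The name and logic are purely illustrative.
_REGISTRY = {
    "spend-classification": lambda payload: {
        "category": "office-supplies"
        if "paper" in payload["description"].lower()
        else "uncategorized",
        "confidence": 0.9,
    }
}

def invoke_model(use_case: str, payload: dict) -> dict:
    """Call a pre-built model by use case; the caller never sees model internals."""
    model = _REGISTRY[use_case]
    return model(payload)

# The wrapper application treats the model as just another function call.
result = invoke_model("spend-classification", {"description": "A4 paper, 10 reams"})
print(json.dumps(result))
```

From the application programmer's perspective, everything behind `invoke_model` is a black box: swapping in a different model, or a retrained one, requires no change to the calling code.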
This attitude is excellent for the current state of AI in business. It presents AI not as something scary, or something requiring expensive and specialized personnel, but as another easy-to-call function that existing programmers can access quickly while solving a problem. The more programmers can access AI through calls, without having to know the details of a neural network or random forest, the faster AI will spread through the corporate technology infrastructure.