Imagine a world where anyone, regardless of technical expertise, can easily harness the power of artificial intelligence (AI) to gain insights from their data. Where building and deploying machine learning models is as intuitive as querying a database. This is the future promised by MindsDB, a revolutionary open-source platform that’s democratizing AI for everyone.
MindsDB = SQL + AI
MindsDB combines AI and SQL functions in one place: users can create, train, optimize, and deploy ML models without external tools, and data analysts can create and visualize forecasts without navigating the complexities of ML pipelines. MindsDB is open source and works with well-known databases such as MySQL, PostgreSQL, Redis, and Snowflake.
How Does MindsDB Work?
MindsDB boasts a unique architecture designed to bridge the gap between developers and AI through its AI SQL Server approach. Here’s a breakdown of its key components:
1. AI Building Blocks:
These are MindsDB’s core functionalities represented as virtual tables accessible via SQL queries. They include:
- Models: Trained machine learning models, allowing predictions and inferences.
- Agents: Pre-built AI functions for tasks like sentiment analysis or chatbots.
- Knowledge Bases: Structured knowledge graphs for entity recognition and relationship extraction.
2. SQL Interface:
MindsDB leverages your existing SQL knowledge, allowing you to:
- Fine-tune Models: Improve existing models using your data within familiar SQL syntax.
- Create AI Tables: Define new AI components like agents or knowledge bases using CREATE statements.
- Query AI Results: Integrate AI insights into your applications and pipelines using JOINs, filtering, and other familiar operations.
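As a minimal sketch of what this looks like in practice (the project, datasource, and column names below are hypothetical), creating and querying an AI table follows standard MindsDB SQL syntax:

-- Create an AI table (model) trained on data from a connected datasource
CREATE MODEL my_project.churn_predictor
FROM my_datasource (SELECT * FROM customers)
PREDICT churned;

-- Query AI results by joining the model with fresh data
SELECT c.customer_id, p.churned
FROM my_datasource.customers AS c
JOIN my_project.churn_predictor AS p;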
3. Datasources:
MindsDB can access data from various sources to fuel your AI models:
- Databases: Connect directly to popular relational and NoSQL databases.
- Applications: Integrate with APIs and web services for real-time data streams.
- Vector Stores: Utilize dedicated data structures for efficient retrieval of high-dimensional data.
4. Jobs and Triggers:
Automate AI-powered tasks and workflows:
- Jobs: Schedule queries or model training at specific times or intervals.
- Triggers: React to data changes automatically by invoking AI operations.
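For example, a hedged sketch of a scheduled job (the job and model names are hypothetical, and the exact scheduling options may vary by MindsDB version) could look like this:

-- Retrain a model on a schedule so predictions stay fresh
CREATE JOB retrain_churn_predictor (
    RETRAIN my_project.churn_predictor
)
EVERY 1 day;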
5. Underlying Engine:
MindsDB utilizes various open-source AI frameworks under the hood, such as PyTorch, TensorFlow, and ONNX, to handle model training and inference. These frameworks are abstracted away from the user, who interacts only through the SQL interface.
Why Do You Need the MindsDB Docker Extension?
For data scientists, setting up MindsDB in a Docker container can be tricky. With the MindsDB Docker Extension, they don’t need to understand Docker concepts; they can install MindsDB with a single click.
The MindsDB Docker Extension empowers developers to:
- Integrate AI seamlessly into containerized workflows and ensure consistent environments from development to production.
- Share and deploy AI models with ease.
- Optimize resource usage and security.
- Streamline deployment to production.
Getting Started
Pre-requisite
- Install Docker Desktop (Mac/Linux/Windows)
Install MindsDB Docker Extension
Open Docker Desktop > Dashboard > Extensions and search for “MindsDB” > Click on “Install”
You will need to wait about a minute for it to fully install on your machine.
Don’t panic if you see a blank screen for the next 20–30 seconds; be patient and wait until the app is fully up and running.
You will soon see the MindsDB dashboard, as shown below:
Predicting Text Sentiment with OpenAI GPT
You can find this tutorial in the MindsDB “Learning Hub” section. In this example, you will create a predictive model to infer the emotion behind a text, a task also known as sentiment analysis. The model, or AI table, leverages OpenAI’s large language models to complete this task.
Connect a Database
Start by connecting a demo database using the CREATE DATABASE statement:
CREATE DATABASE mysql_reviews_db
WITH ENGINE = "mysql",
PARAMETERS = {
    "user": "user",
    "password": "MindsDBUser123!",
    "host": "db-demo-data.cwoyhfn6bzs0.us-east-1.rds.amazonaws.com",
    "port": "3306",
    "database": "public"
};
Let’s look at the data stored in the amazon_reviews table, which contains the input columns for the model:
SELECT *
FROM mysql_reviews_db.amazon_reviews;
Configure an OpenAI Engine in MindsDB
Please note that using OpenAI models requires an OpenAI API key. Therefore, before creating a model, you need to configure an engine by providing your OpenAI API key, as shown below.
CREATE ML_ENGINE openai_engine
FROM openai
USING
    api_key = 'your-openai-api-key';
Create an OpenAI GPT Model
The OpenAI integration automatically manages requests to pre-trained large language models from OpenAI. For example, we can create a model for text sentiment analysis like this:
CREATE MODEL sentiment_classifier_gpt
PREDICT sentiment
USING
    engine = 'openai_engine',
    prompt_template = 'describe the sentiment of the reviews
strictly as "positive", "neutral", or "negative".
"I love the product":positive
"It is a scam":negative
"{{review}}.":';
In practice, the CREATE MODEL statement triggers MindsDB to generate an AI table called sentiment_classifier_gpt that uses the OpenAI integration to predict a column named sentiment.
The USING clause specifies the parameters that this handler requires. You can check the MindsDB documentation for detailed explanations of all parameters that can be used when creating models with the OpenAI engine.
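For instance, a hedged variant of the statement above (the model name and exact parameter availability depend on the handler version) might pin a specific OpenAI model and cap the response length:

CREATE MODEL sentiment_classifier_gpt4
PREDICT sentiment
USING
    engine = 'openai_engine',
    model_name = 'gpt-4',
    max_tokens = 10,
    prompt_template = 'describe the sentiment of the reviews
strictly as "positive", "neutral", or "negative".
"{{review}}":';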
Once the CREATE MODEL statement has started executing, we can check the status of the creation process with the following query:
SELECT * FROM models
WHERE name = 'sentiment_classifier_gpt';
It may take a while for the status to register as complete, depending on your internet connection.
Make a Single Prediction
Once the creation is complete, the behavior is the same as with any other AI table – you can query it either by specifying synthetic data in the actual query:
SELECT review, sentiment
FROM sentiment_classifier_gpt
WHERE review = 'It is ok.';
Make Batch Predictions
Or by joining with another table for batch predictions:
SELECT input.review, output.sentiment
FROM mysql_reviews_db.amazon_reviews AS input
JOIN sentiment_classifier_gpt AS output
LIMIT 5;
The amazon_reviews table is used to make batch predictions. Upon joining the sentiment_classifier_gpt model with the amazon_reviews table, the model uses all values from the review column.
MindsDB enables in-database ML with your favorite frameworks; all you need is a handler for MindsDB to integrate the ML framework of your choice.
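As a hedged illustration of this flexibility (the handler name and parameters below follow the Hugging Face integration and may differ across MindsDB versions), a comparable sentiment model could be created without OpenAI at all:

-- Sentiment analysis using a Hugging Face text-classification model instead of OpenAI
CREATE MODEL sentiment_classifier_hf
PREDICT sentiment
USING
    engine = 'huggingface',
    task = 'text-classification',
    model_name = 'cardiffnlp/twitter-roberta-base-sentiment',
    input_column = 'review';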
Conclusion
In essence, MindsDB aims to make AI accessible and approachable for developers by leveraging familiar SQL tools and integrating seamlessly with existing workflows. This enables developers to build AI applications without extensive machine learning expertise, paving the way for wider adoption and democratization of AI.