
MuleSoft AI Connectors, Supercharged: Unleash Next-Gen AI with Inference and Vector Connectors
Connect and Orchestrate Applications, Data, and AI Technologies
Back in September 2024, MuleSoft introduced the open-source AI Chain project: our step toward empowering MuleSoft customers and developers to design, build, and manage AI technologies directly within the Anypoint Platform. With the success of the open-source project and the growing need to orchestrate AI systems across LLMs, vector databases, and traditional enterprise applications, we created the MuleSoft AI Connectors to simplify AI technology integration and orchestration for the enterprise.
With AI Connectors, MuleSoft users don’t have to leave the ecosystem they know and trust to harness the power of AI. Instead, they can use their existing skills and the full potential of the Anypoint Platform to drive the next wave of innovation.
Today, we’re excited to add two powerful new connectors to our portfolio of AI Connectors: the Inference Connector and the Vector Connector. They unlock key capabilities for building sophisticated AI-powered applications, and bring us one step closer to making agent development accessible to every enterprise team.
Supercharge Agents with New AI Connectors
The Inference and Vector connectors are designed to handle the heavy lifting of AI integration, allowing developers to focus on business logic and innovation instead of wrestling with complex AI plumbing. Together, these connectors are the evolution of the original AI Chain connector, built from the ground up to expand capabilities and improve performance. They are the newest additions to MuleSoft's portfolio of AI Connectors, which also includes the MuleSoft AI Chain Connector, Agentforce Connector, and Einstein AI Connector. The Inference Connector is lightweight and offers out-of-the-box integration with many of the top LLMs on the market, while the Vector Connector is a specialized offering that lets customers integrate seamlessly with industry-leading vector databases. Both connectors are also available in the MuleSoft Government Cloud.
MuleSoft Inference Connector: The Universal Translator for LLMs
The Inference Connector acts as a single, unified entry point to the world of large language models. Use one simple connector to interact with 30+ models from providers and inference microservices like OpenAI and NVIDIA NIM, eliminating the need for custom code for each API.
- Integrate LLMs with Ease: Connect to any supported LLM with a standard configuration, allowing you to easily swap models to find the best fit for your needs and budget.
- Automate Business Processes: Empower agents to take action by using function calling to invoke any MuleSoft API, turning text-based requests into automated workflows (see the sketch after this list).
- Govern AI with Confidence: Use built-in toxicity detection to filter harmful content and apply token governance to monitor and control your AI spending.
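To make that concrete, here is a minimal sketch in Python of the plumbing the Inference Connector takes off your plate. This is not the connector's API: it simply sends one request shape to any OpenAI-compatible chat completions endpoint (a format both OpenAI and NVIDIA NIM expose), with a tools array for function calling. The endpoint URLs, model names, and the create_return tool are illustrative assumptions.

```python
# Illustrative only: a sketch of the plumbing the Inference Connector replaces.
# It sends one request shape to any OpenAI-compatible chat completions endpoint.
# Endpoints, model names, and the example tool are assumptions, not MuleSoft APIs.
import json
import os
import urllib.request

PROVIDERS = {
    # Swapping models is a configuration change, not a code change.
    "openai": {"url": "https://api.openai.com/v1/chat/completions", "model": "gpt-4o-mini"},
    "nim": {"url": "http://localhost:8000/v1/chat/completions", "model": "meta/llama-3.1-8b-instruct"},
}

TOOLS = [{
    "type": "function",
    "function": {
        "name": "create_return",  # hypothetical MuleSoft API exposed as a tool
        "description": "Create a product return for a customer order",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

def chat(provider: str, messages: list[dict]) -> dict:
    """POST a chat completions request to the configured provider."""
    cfg = PROVIDERS[provider]
    body = json.dumps({"model": cfg["model"], "messages": messages, "tools": TOOLS}).encode()
    req = urllib.request.Request(
        cfg["url"],
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('LLM_API_KEY', '')}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    reply = chat("openai", [{"role": "user", "content": "Please return order 42."}])
    choice = reply["choices"][0]["message"]
    # If the model chose to call a tool, this is where a flow would invoke the API.
    print(choice.get("tool_calls") or choice.get("content"))
```

In a Mule flow, this entire round trip becomes a single connector operation, and swapping OpenAI for NVIDIA NIM or another supported provider is a configuration change rather than new integration code.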
MuleSoft Vector Connector: Give Your Agents a Powerful Memory
The Vector Connector simplifies how your AI agents connect to and retrieve information from vector databases, which provide the essential long-term memory for intelligent responses. Use the Vector connector to connect to popular vector stores and tools like Milvus, Pinecone, PGVector, and more.
- Build Smarter Agents, Faster: Easily perform key vector operations like upsert and query to implement powerful Retrieval-Augmented Generation (RAG) without being a database expert (see the sketch after this list).
- Get More Accurate Answers: Improve agent performance and reliability by grounding them in your private company data, ensuring they provide relevant and trustworthy answers.
- Scale with Flexibility: A standardized connector supports multiple vector database providers, giving you the flexibility to choose the best vector technology for your needs.
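As a rough mental model, here is a toy, in-memory sketch of the upsert and query operations mentioned above. It is purely illustrative: the embedding function is a stand-in, and a real deployment would point the Vector Connector at a store like Milvus, Pinecone, or PGVector with embeddings from a real model.

```python
# Toy illustration of the upsert/query pattern the Vector Connector standardizes.
# The "embedding" function is a stand-in; production systems use a real embedding
# model and a vector database such as Milvus, Pinecone, or PGVector.
import math
from dataclasses import dataclass, field

def embed(text: str) -> list[float]:
    """Stand-in embedding: a normalized character-frequency vector (illustrative only)."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

@dataclass
class VectorStore:
    rows: dict = field(default_factory=dict)  # id -> (vector, original text)

    def upsert(self, doc_id: str, text: str) -> None:
        self.rows[doc_id] = (embed(text), text)

    def query(self, text: str, top_k: int = 2) -> list[str]:
        q = embed(text)
        scored = sorted(
            self.rows.values(),
            key=lambda row: -sum(a * b for a, b in zip(q, row[0])),  # cosine similarity
        )
        return [doc for _, doc in scored[:top_k]]

store = VectorStore()
store.upsert("kb-1", "To reset the projector, hold the power button for ten seconds.")
store.upsert("kb-2", "Returns are accepted within 30 days with proof of purchase.")
print(store.query("How do I return a faulty unit?"))
```

Because the connector exposes the same upsert/query surface across providers, retrieval logic built against one vector store can move to another without a rewrite.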
Real-World AI Orchestration
With the new Inference and Vector connectors, organizations can unlock powerful new levels of automation. Consider a reimagined customer support experience built with these new offerings.
When a customer submits a ticket, the workflow is orchestrated seamlessly:
1. Contextual Knowledge Retrieval: The Vector Connector instantly queries a knowledge base (containing product documentation and previously resolved tickets) to find the most relevant information related to the issue.
2. Intelligent Response Generation: The retrieved documents and the original ticket are passed to the Inference Connector. The connector is configured to send a carefully crafted prompt to an LLM, asking it to generate a step-by-step solution based on the provided enterprise context.
3. Autonomous Action: If the solution requires an action, like processing a return, the Inference Connector uses function calling to invoke a "Create Return" API, automating the next step in the business process (sketched in the example below).
This entire intelligent workflow is built faster and managed more reliably within the Anypoint Platform, with the new connectors handling the complex AI interactions.
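For illustration only, here is how the three steps might compose, reusing the hypothetical chat() helper and toy VectorStore from the sketches above; create_return is a stand-in for the "Create Return" API. In an actual Mule flow, each step would be a Vector or Inference Connector operation rather than hand-written code.

```python
# Conceptual sketch of the support-ticket flow described above. chat() and
# VectorStore are the hypothetical helpers from the earlier sketches; in a Mule
# flow each step would be a Vector or Inference Connector operation instead.
import json

def create_return(order_id: str) -> None:
    # Hypothetical stand-in for invoking a "Create Return" MuleSoft API.
    print(f"POST /returns {{'order_id': '{order_id}'}}")

def handle_ticket(ticket_text: str, knowledge_base: "VectorStore") -> str:
    # 1. Contextual knowledge retrieval: ground the model in enterprise data.
    context_block = "\n".join(knowledge_base.query(ticket_text, top_k=3))

    # 2. Intelligent response generation: crafted prompt plus retrieved documents.
    messages = [
        {"role": "system", "content": "Answer using only the provided context. "
                                      "Call create_return if the customer needs a return."},
        {"role": "user", "content": f"Context:\n{context_block}\n\nTicket: {ticket_text}"},
    ]
    reply = chat("openai", messages)["choices"][0]["message"]

    # 3. Autonomous action: if the model requested a tool, invoke the business API.
    for call in reply.get("tool_calls") or []:
        if call["function"]["name"] == "create_return":
            args = json.loads(call["function"]["arguments"])
            create_return(order_id=args["order_id"])
            return f"Return created for order {args['order_id']}."

    return reply.get("content", "")
```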
A Foundation for the AI-Powered Enterprise
By integrating AI capabilities directly into the fabric of your application network with the Inference and Vector Connectors, every organization can innovate with greater speed and confidence.
The MuleSoft AI Chain connectors have been a game-changer for BARCO. They abstract away low-level AI complexities and provide the necessary modularity, allowing us to focus on what truly matters — building composable agentic AI solutions that deliver value — fast!
Joris, Integration Architect, BARCO
What’s Next?
As the world moves toward more intelligent, autonomous systems, MuleSoft is committed to providing our customers with the tools they need to lead the charge, turning the promise of AI into tangible business value. We’re excited to continue to add new connectors to our MuleSoft AI Connectors portfolio.
To learn more about the overall vision for AI on Anypoint Platform, check out our MuleSoft AI Connectors page.
Extend your AI capabilities with MuleSoft.