ARDI MCP

MCP, or the Model Context Protocol, is a standard that allows services to integrate with Large Language Models such as Copilot, Gemini, Claude and ChatGPT.

It provides a range of tools that the AI can call to get information about assets and data.
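As a rough sketch of how this works under the hood: MCP is built on JSON-RPC 2.0. The LLM client first asks the server which tools exist, then calls one by name. A tool call from the client to an MCP server looks something like the message below (the tool name get_asset and its arguments are illustrative placeholders, not actual ARDI tool names):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "get_asset",
    "arguments": { "asset": "Paint Line Oven" }
  }
}
```

The server replies with a result payload (typically text content), which the LLM then folds into its answer to the user.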

Try It Out

We have a public MCP server for our Paint Line demo.

You can try MCP with Claude.

Server-Side Setup

First, you'll need to install and enable the MCP Addon. This gives your ARDI server the dedicated functions needed to serve data to AI models.

If you are planning on using a web-based MCP server, you'll need to install the MCP service. This is available as a Python module. See the service install instructions.

Local Servers

Most ARDI servers run on local networks and aren't exposed to the Internet. Because of this, you won't be able to use a web-based chat interface to talk to your ARDI system, as there's no way for the provider (OpenAI, Microsoft, Anthropic etc.) to reach your server and ask for information.

In most cases, you'll use a desktop application such as Claude Desktop. This allows your computer to work as a 'bridge' between the two networks - able to reach out to the Internet to talk to the LLM, but also able to reach the ARDI server to ask for data.

The method to connect varies between applications.

Claude Desktop Setup

We will include additional instructions as products become available. If you'd like instructions for your preferred LLM tool - or would like to share instructions you've created - please contact us.
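As an illustrative sketch in the meantime, Claude Desktop discovers local MCP servers through its claude_desktop_config.json file, under an mcpServers key. The entry below is an assumption for illustration only: the module name ardi_mcp, the --server flag and the server address are placeholders, not an official ARDI configuration; check the service install instructions for the real command.

```json
{
  "mcpServers": {
    "ardi": {
      "command": "python",
      "args": ["-m", "ardi_mcp", "--server", "http://my-ardi-server:8080"]
    }
  }
}
```

After saving the file, restart Claude Desktop; the configured server's tools should then appear in the chat's tool list.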