
Vol. 3 No. 03 (2026): AI-Powered Library Management System Using Model Context Protocol and Large Language Models
This paper presents an AI-powered Library Management System (LMS) that combines the Model Context Protocol (MCP) with a cloud-based large language model, surfaced to users through a smart virtual library assistant called LibraBot. Instead of rigid keyword search, the system offers a natural-language chat interface built on the Ollama Cloud LLM and the SmolAgents platform, enabling users to query the library conversationally. Nine dedicated MCP tools encapsulate core library operations, including checking item availability, locating items on shelves, managing reservations, and recommending books based on their content, all connected to a catalog of over one hundred B.Tech engineering titles stored in SQLite and Excel and served via a multi-page Flask web backend. Personalized recommendations are generated with a TF–IDF cosine-similarity model that performs purely content-based filtering and requires no historical user-interaction data. Across 70 test scenarios, the system achieved a 100% task success rate, an average response time of 1.2 seconds, and 99.5% uptime under concurrent loads of up to 150 users, indicating that MCP-based tool orchestration with LLM reasoning can provide a scalable, always-available alternative to traditional library management workflows.
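To make the recommendation approach concrete, the content-based filtering described above can be sketched as follows. This is a minimal illustration using scikit-learn's TF–IDF vectorizer and cosine similarity, not the paper's actual implementation; the four-book catalog and the `recommend_books` helper are hypothetical stand-ins for the system's hundred-title B.Tech catalog.

```python
# Sketch of content-based book recommendation with TF-IDF and cosine
# similarity. The catalog entries and recommend_books() are illustrative
# assumptions, not code from the paper.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical catalog: title -> content description (keywords/summary).
catalog = {
    "Data Structures in C": "arrays linked lists trees graphs algorithms",
    "Digital Signal Processing": "fourier transform filters sampling signals",
    "Machine Learning Basics": "regression classification models training data",
    "Signals and Systems": "fourier transform convolution sampling signals",
}

titles = list(catalog)
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(catalog.values())  # one TF-IDF row per book

def recommend_books(title: str, k: int = 2) -> list[str]:
    """Return the k catalog titles most similar to `title` by content alone."""
    idx = titles.index(title)
    sims = cosine_similarity(tfidf[idx], tfidf).ravel()
    sims[idx] = -1.0  # exclude the query book itself from the ranking
    top = sims.argsort()[::-1][:k]
    return [titles[i] for i in top]

print(recommend_books("Digital Signal Processing"))
```

Because the model compares only book descriptions, it can recommend titles for a brand-new user with no borrowing history, which is the "no historical user interaction data" property the abstract highlights.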