Chat Assistant
The AI-Powered Professional Profile - My Personal MCP Server
Bridging My Profile with the Future of Information Retrieval
In an era where Large Language Models (LLMs) are rapidly becoming a primary interface for information discovery, I began to think about how professionals like me could present their skills and experience in a way that's not just human-readable, but also optimally structured for AI consumption. Traditional resumes and online profiles, while valuable, are static: they require manual updates and lack the dynamic, queryable depth that modern AI systems are capable of leveraging.
My goal was to create a "living resume": a dynamic, always up-to-date, interactive source of truth for my professional journey. I envisioned a system where an AI could directly and reliably access detailed information about my skills, projects, experiences, and even recent contributions, going beyond what's typically available on a standard LinkedIn page or PDF resume. This led me to explore the Model Context Protocol (MCP).
What Is It? My Personal Professional MCP Server
This project is a custom-built MCP Server designed to provide comprehensive, structured, and real-time information about my professional profile. Instead of an LLM relying solely on its potentially outdated training data or general web scraping to "know" about me, it can (with the right setup) directly connect to my personal MCP server.
This server exposes various aspects of my career (skills, projects, experience, recent contributions) as "tools" and "resources" that an AI can utilize.
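As a taste of what this looks like at the protocol level, here is a minimal sketch of exposing one such tool with the mark3labs/mcp-go server package. The tool name, parameters, and handler body are hypothetical; the real server fetches its data from the Encore services described below.

```go
package main

import (
	"context"
	"log"

	"github.com/mark3labs/mcp-go/mcp"
	"github.com/mark3labs/mcp-go/server"
)

func main() {
	s := server.NewMCPServer("professional-profile", "1.0.0")

	// Hypothetical tool: lets the LLM query my projects by technology.
	tool := mcp.NewTool("list_projects",
		mcp.WithDescription("List projects, optionally filtered by technology"),
		mcp.WithString("technology",
			mcp.Description("Only return projects using this technology, e.g. Go")),
	)

	s.AddTool(tool, func(ctx context.Context, req mcp.CallToolRequest) (*mcp.CallToolResult, error) {
		// The real handler queries the projects backend service;
		// here we return canned JSON for illustration.
		return mcp.NewToolResultText(`[{"name":"Personal MCP Server","tech":["Go","Encore"]}]`), nil
	})

	// Serve over stdio for local testing; the deployed server speaks SSE.
	if err := server.ServeStdio(s); err != nil {
		log.Fatal(err)
	}
}
```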
On this page, I've integrated a chat interface that allows visitors to "talk" to an LLM that, in turn, uses my MCP server as its primary source of information about me. This offers an interactive way to explore my profile in depth.
You can also directly connect to the MCP server using this address:
https://staging-selfmpc-hf9i.encr.app/mcp/sse
Technology & Architecture
Building this system required a modern, scalable, and developer-friendly backend architecture. Here is a high-level overview of the technology stack:
Backend Framework: Encore.dev with Go.
I wanted to get familiar with Encore, and this project was a good fit for experimenting with it.
Its strong opinions on infrastructure, automatic boilerplate generation, built-in observability, and typed service-to-service
calls allowed me to focus on the business logic of each data domain (profile, experience, projects, etc.) without
getting bogged down in complex infrastructure setup. Each aspect of my professional profile is managed by a distinct
Encore microservice.
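To give a concrete feel for this, here is a minimal sketch of what one such Encore service can look like. The service name, types, and data are illustrative, not my actual schema:

```go
// Service profile is one of the Encore microservices behind the MCP server.
package profile

import "context"

type Profile struct {
	Name   string   `json:"name"`
	Title  string   `json:"title"`
	Skills []string `json:"skills"`
}

// Get returns the core profile data (illustrative values).
//
//encore:api public method=GET path=/profile
func Get(ctx context.Context) (*Profile, error) {
	return &Profile{
		Name:   "...",
		Title:  "Software Engineer",
		Skills: []string{"Go", "Encore", "MCP"},
	}, nil
}
```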
MCP Server Implementation: A dedicated Encore service acts as the MCP Server. It receives requests from MCP clients, interprets them according to the Model Context Protocol, communicates with the various backend Encore data services to fetch the required information, and formats the responses according to the MCP specification.
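One plausible way to expose this over the SSE endpoint above is an Encore raw endpoint that delegates to mcp-go's SSE transport. This is a hedged sketch; the routing details (base path, wildcard, methods) are assumptions rather than my actual wiring:

```go
package mcpserver

import (
	"net/http"

	"github.com/mark3labs/mcp-go/server"
)

var mcpSrv = server.NewMCPServer("professional-profile", "1.0.0")

// Wrap the MCP server with mcp-go's SSE transport, which implements
// http.Handler. The base path matches the public /mcp/sse URL.
var sseSrv = server.NewSSEServer(mcpSrv, server.WithBasePath("/mcp"))

// Hand all /mcp/* traffic straight to the SSE transport.
//
//encore:api public raw method=GET,POST path=/mcp/*rest
func MCP(w http.ResponseWriter, req *http.Request) {
	sseSrv.ServeHTTP(w, req)
}
```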
MCP Host Implementation: To facilitate the interaction between an LLM and my MCP Server for
the portfolio chat, I've developed an MCP Host component. This host leverages the Go MCP library from github.com/mark3labs/mcp-go, which provides the
necessary MCP Client functionality.
This MCP Host is responsible for managing communication with both the LLM and my MCP Server. A simplified sketch of its client side follows.
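The sketch below assumes mcp-go's SSE client; the handshake follows the library's API, while the client name and version are made up:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/mark3labs/mcp-go/client"
	"github.com/mark3labs/mcp-go/mcp"
)

func main() {
	ctx := context.Background()

	// Connect to the MCP server's SSE endpoint.
	c, err := client.NewSSEMCPClient("https://staging-selfmpc-hf9i.encr.app/mcp/sse")
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	if err := c.Start(ctx); err != nil {
		log.Fatal(err)
	}

	// MCP initialize handshake.
	var initReq mcp.InitializeRequest
	initReq.Params.ProtocolVersion = mcp.LATEST_PROTOCOL_VERSION
	initReq.Params.ClientInfo = mcp.Implementation{Name: "portfolio-host", Version: "0.1.0"}
	if _, err := c.Initialize(ctx, initReq); err != nil {
		log.Fatal(err)
	}

	// Discover the tools the server exposes, so they can be handed
	// to the LLM as callable functions.
	tools, err := c.ListTools(ctx, mcp.ListToolsRequest{})
	if err != nil {
		log.Fatal(err)
	}
	for _, t := range tools.Tools {
		fmt.Println(t.Name, "-", t.Description)
	}
}
```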
WebSocket Server (Chat Backend): A WebSocket server, also written in Go, provides the
real-time communication layer for the chat interface on this portfolio.
When a user sends a message through the chat, the WebSocket server receives it and forwards the query to the
MCP Host.
The MCP Host, using its MCP Client, interacts with an LLM. The LLM, in turn, can decide to use tools or resources
by making requests to my MCP Server via the MCP Host.
The final response from the LLM (enriched with data from my MCP Server) is then relayed back through the WebSocket
server to the user's chat interface.
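A minimal sketch of that relay loop is shown below. The choice of gorilla/websocket and the askHost helper are assumptions for illustration, not the exact implementation:

```go
package main

import (
	"log"
	"net/http"

	"github.com/gorilla/websocket"
)

var upgrader = websocket.Upgrader{
	// Allow cross-origin connections from the portfolio page (tighten in production).
	CheckOrigin: func(r *http.Request) bool { return true },
}

// askHost is a placeholder for the MCP Host call described above: it
// sends the user's query to the LLM (with my MCP server's tools
// available) and returns the final answer.
func askHost(query string) (string, error) {
	return "(answer from the LLM, enriched via the MCP server)", nil
}

func chatHandler(w http.ResponseWriter, r *http.Request) {
	conn, err := upgrader.Upgrade(w, r, nil)
	if err != nil {
		log.Println("upgrade:", err)
		return
	}
	defer conn.Close()

	for {
		// Read the user's chat message.
		_, msg, err := conn.ReadMessage()
		if err != nil {
			return // client disconnected
		}
		// Forward it to the MCP Host and relay the LLM's answer back.
		answer, err := askHost(string(msg))
		if err != nil {
			answer = "Sorry, something went wrong."
		}
		if err := conn.WriteMessage(websocket.TextMessage, []byte(answer)); err != nil {
			return
		}
	}
}

func main() {
	http.HandleFunc("/chat", chatHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```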
Data Storage: While some initial data is statically defined within the services for rapid prototyping, the architecture is designed for PostgreSQL databases managed by Encore, allowing for dynamic updates and scalability.
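With Encore, moving from static data to PostgreSQL is mostly declarative. A sketch with an illustrative database name, schema, and query:

```go
package projects

import (
	"context"

	"encore.dev/storage/sqldb"
)

// Declaring the database is enough for Encore to provision it;
// schema changes live in the ./migrations directory.
var db = sqldb.NewDatabase("projects", sqldb.DatabaseConfig{
	Migrations: "./migrations",
})

type Project struct {
	Name string
	URL  string
}

// Recent returns the most recently added projects (illustrative query).
func Recent(ctx context.Context) ([]Project, error) {
	rows, err := db.Query(ctx, `SELECT name, url FROM project ORDER BY created_at DESC LIMIT 5`)
	if err != nil {
		return nil, err
	}
	defer rows.Close()

	var out []Project
	for rows.Next() {
		var p Project
		if err := rows.Scan(&p.Name, &p.URL); err != nil {
			return nil, err
		}
		out = append(out, p)
	}
	return out, rows.Err()
}
```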
External API Integration: For features like listing GitHub contributions, the system securely connects to the GitHub REST API.
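A hedged sketch of such an integration, assuming Encore's secrets mechanism for the token and picking the public-events endpoint as a representative call:

```go
package github

import (
	"context"
	"encoding/json"
	"fmt"
	"net/http"
)

// Encore injects secret values at runtime; GitHubToken is assumed
// to be configured via `encore secret set`.
var secrets struct {
	GitHubToken string
}

type Event struct {
	Type string `json:"type"`
	Repo struct {
		Name string `json:"name"`
	} `json:"repo"`
}

// RecentEvents lists a user's recent public GitHub events.
func RecentEvents(ctx context.Context, user string) ([]Event, error) {
	url := fmt.Sprintf("https://api.github.com/users/%s/events/public", user)
	req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+secrets.GitHubToken)
	req.Header.Set("Accept", "application/vnd.github+json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return nil, fmt.Errorf("github: unexpected status %s", resp.Status)
	}

	var events []Event
	if err := json.NewDecoder(resp.Body).Decode(&events); err != nil {
		return nil, err
	}
	return events, nil
}
```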
Frontend (This page): The interactive chat on my portfolio uses standard web technologies. It establishes a WebSocket connection to the chat backend, sending user queries and displaying the LLM's responses.