init0's comments

TCP/IP took nine years to deploy. MCP moved to the Linux Foundation in one. That contrast explains everything about how protocol development has changed.


I've been tracking the explosion of AI agent protocols over the last 18 months. The contrast with history is staggering:

- TCP/IP: 9 years from paper to "Flag Day."
- OAuth 2.1: 5+ years and still counting.
- Model Context Protocol (MCP): <1 year from launch to Linux Foundation.

It’s not just MCP. In 2025 alone, we saw all of these ship:

- Google's Agent2Agent (50+ partners)
- Universal Commerce Protocol (20+ retailers)
- AP2 (payments)
- Agent Protocol, UTCP, and a few more.

We are entering an era of "Room Consensus," where a few giants agree on a spec and ship it to billions, bypassing the slow deliberation of the RFC era. Is this efficiency? Or fragility?

I break down the landscape of the new agent protocols and what this means for developers in my latest post.


```
from piragi import Ragi

kb = Ragi(["./docs", "s3://bucket/data/*/*.pdf", "https://api.example.com/docs"])

answer = kb.ask("How do I deploy this?")
```

that's it! with https://pypi.org/project/piragi/


I built a lib for myself https://pypi.org/project/piragi/


That looks great! Is there a way to store / cache the embeddings?


npm fund is that to a certain extent -> https://docs.npmjs.com/cli/v11/commands/npm-fund



Problem: Every AI app wants you to paste your OpenAI/Anthropic key. Keys spread across dozens of apps with zero visibility, and you can only revoke by rotating the key itself.

Proposal: OKAP (Open Key Access Protocol) - like OAuth, but for API keys.

How it works:

1. Keys stay in YOUR vault (self-host or hosted)
2. Apps request access via a token (scoped to provider, models, expiry)
3. The vault proxies requests; apps never see your actual key
4. Revoke any app instantly without touching your master key

Not to be confused with LiteLLM/OpenRouter (those are proxies you pay for). OKAP is a protocol for user-owned key management - your keys, your vault, your control.

Working implementation:

- Hosted vault: https://vault.okap.dev
- Python SDK: pip install okap
- Spec: https://okap.dev
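
To make the flow concrete, here's a minimal sketch of the vault-proxy idea in plain Python. This is illustrative only: `Vault`, `issue_token`, and `proxy` are hypothetical names for this sketch, not the actual okap SDK API.

```python
# Sketch of the vault-proxy idea: the vault holds the provider key;
# apps hold only scoped, revocable tokens.
import secrets
import time

class Vault:
    def __init__(self, provider_key):
        self._provider_key = provider_key   # never leaves the vault
        self._grants = {}                   # token -> grant metadata

    def issue_token(self, app, models, ttl_s=3600):
        token = secrets.token_urlsafe(16)
        self._grants[token] = {
            "app": app,
            "models": set(models),
            "expires": time.time() + ttl_s,
        }
        return token

    def revoke(self, token):
        # Instant revocation; the master key is untouched.
        self._grants.pop(token, None)

    def proxy(self, token, model, payload):
        grant = self._grants.get(token)
        if grant is None or time.time() > grant["expires"]:
            raise PermissionError("token revoked or expired")
        if model not in grant["models"]:
            raise PermissionError(f"model {model!r} not in scope")
        # Here the vault would forward to the provider using the real key;
        # this sketch just returns a stub response.
        return {"model": model, "auth": "vault-signed", "payload": payload}

vault = Vault(provider_key="sk-master-example")
tok = vault.issue_token("my-chat-app", models=["gpt-4o"])
print(vault.proxy(tok, "gpt-4o", {"prompt": "hi"})["auth"])  # vault-signed
vault.revoke(tok)
```

The key design point is that revocation is per-token, so cutting off one app never forces a key rotation that breaks every other app.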

Looking for feedback. Would you use this for your AI tools? What's missing?


Hey folks! I just launched https://mcphost.link/, a web-based MCP host that lets you connect to multiple remote MCP servers and interact with them through a simple chat-style interface.

Key features:

- Multi-server support: connect to several MCP servers at once
- OAuth 2.0 & Bearer Token auth (with PKCE)
- Persistent sessions: servers + credentials saved locally
- Full MCP features: tools, resources, prompts
- LLM support: bring your own inference backend

The goal is to make exploring and working with the Model Context Protocol much more approachable.

Happy to answer questions, take feedback, or hear feature requests!


SEP-1865, the MCP Apps Extension, is still in draft, but it will change how AI agents deliver interactive experiences.

The idea: MCP tools return HTML/CSS/JS directly. The client renders it in a sandboxed iframe. That's it.

Your AI agent calls a tool, gets back a full interactive UI. Dashboard, form, chart - whatever you need.

How it works:

- Tool returns a text/html+mcp resource
- Client renders it in an iframe with CSP
- UI talks back via JSON-RPC 2.0 postMessage
- Fully sandboxed, secure by default
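
Roughly, a tool result carrying an HTML resource could look like this. A hedged sketch: the mimeType matches the draft, but the exact field names and the `ui://` URI here are illustrative, not quoted from SEP-1865.

```python
# Sketch of an MCP tool returning an interactive HTML resource.
def dashboard_tool():
    html = """<!doctype html>
<html><body>
  <h1>Usage dashboard</h1>
  <script>
    // The embedded UI talks back to the host via
    // JSON-RPC 2.0 over postMessage.
    parent.postMessage(JSON.stringify(
      {jsonrpc: "2.0", method: "ui/ready", id: 1}), "*");
  </script>
</body></html>"""
    return {
        "content": [{
            "type": "resource",
            "resource": {
                "uri": "ui://dashboard",
                "mimeType": "text/html+mcp",
                "text": html,
            },
        }]
    }

result = dashboard_tool()
print(result["content"][0]["resource"]["mimeType"])  # text/html+mcp
```

The client never executes this HTML in its own origin; it renders it inside a CSP-restricted iframe, which is what makes the "secure by default" claim plausible.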

I built a sample implementation with vanilla Web Components. This is where MCP is heading.


I got tired of complex agent frameworks with their orchestrators and YAML configs, so I built something simpler.

  AgentU uses two operators for workflows: >> chains steps, & runs parallel. That's it.
```
  from agentu import Agent, serve
  import asyncio

  def search(topic: str) -> str:
      return f"Results for {topic}"

  # Agent auto-detects available model, connects to authenticated MCP server
  agent = Agent("researcher").with_tools([search]).with_mcp([
      {"url": "http://localhost:3000", "headers": {"Authorization": "Bearer token123"}}
  ])

  # Memory
  agent.remember("User wants technical depth", importance=0.9)

  # Parallel then sequential: & runs parallel, >> chains
  workflow = (
      agent("AI") & agent("ML") & agent("LLMs")
      >> agent(lambda prev: f"Compare: {prev}")
  )

  # Execute workflow
  result = asyncio.run(workflow.run())

  # REST API with auto-generated Swagger docs
  serve(agent, port=8000)
```

  Features:
  - Auto-detects Ollama models (also works with OpenAI, vLLM, LM Studio)
  - Memory with importance weights, SQLite backend
  - MCP integration with auth support
  - One-line REST API with Swagger docs
  - Python functions are tools, no decorators needed

  Using it for automated code review, parallel data enrichment, research synthesis.

  pip install agentu

  GitHub: https://github.com/hemanth/agentu

  Open to feedback.

