• 0 Posts
  • 16 Comments
Joined 1 year ago
Cake day: November 28th, 2024

  • The author leans heavily on their feelings to describe their distaste for MCP. It doesn’t read like a well-informed article, in my opinion.

    MCP servers are like proxies that can adapt the tools and resources they expose based on external conditions. They’re far from static and can provide an entry point to otherwise hidden or secured functionality: some actions may only be available via an MCP server, file resources may only be reachable behind one, and tools may be relevant to one agent but not others, or become unavailable altogether (see the sketch at the end of this comment).

    That, and regular APIs often don’t expose data in the streaming fashion that LLMs benefit from, which is why you see MCP servers serving streamable HTTP or SSE.

    Static files make all of this inflexible, and for what, simplicity?

    At least we now have a standard for this kind of thing. Static files would be a lazy, half-arsed solution at best.
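
    To make the dynamic part concrete, here’s a minimal sketch of an MCP server that only exposes a restart tool when it’s started in a privileged mode. It uses the official TypeScript SDK (`@modelcontextprotocol/sdk`) as I understand it; the tool names and the `OPERATOR_MODE` environment variable are made up for illustration, and it uses the stdio transport for brevity rather than streamable HTTP/SSE.

    ```typescript
    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import { z } from "zod";

    // Advertised server identity (illustrative names).
    const server = new McpServer({ name: "homelab-tools", version: "0.1.0" });

    // A tool every client gets to see.
    server.tool("list_services", async () => ({
      content: [{ type: "text", text: "immich, nextcloud, plex" }],
    }));

    // A tool that is only registered, and therefore only visible, when the
    // server is launched in operator mode (hypothetical condition).
    if (process.env.OPERATOR_MODE === "1") {
      server.tool("restart_service", { name: z.string() }, async ({ name }) => {
        // A real implementation would shell out or call an API here.
        return { content: [{ type: "text", text: `restarted ${name}` }] };
      });
    }

    await server.connect(new StdioServerTransport());
    ```

    A static file can’t express that kind of conditional surface area; every client would see every tool all the time, or none of them.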

  • Get everything migrated across to my new k3s cluster. I’ve been using larger boxes (Unraid) and a couple of 1L mini PCs with Proxmox to run my homelab until now… but I work with Kubernetes and Terraform daily and wanted something declarative.

    I’ve now got k3s set up with a handful of services migrated (Immich, Tailscale, Nextcloud, etc.), but there’s still a ton to go (the arr suite, various databases, Plex, Tautulli, etc.). It’s another job entirely.

    I love it but sometimes I wonder why I do this to myself 😅

  • I appreciate the sentiment here, though I would agree that it is certainly paranoid 😅. I think if you’re careful with what you self-host, where you install it from, how you install it and then what you expose, you can keep things sensible and reasonably secure without the need for strong isolation.

    I keep all of my services in my k3s cluster. It spans 4 PCs and sits in its own VLAN. There aren’t any particular security precautions I take here. I’m a developer and can do a reasonable job of verifying each application I install, but I of course accept the risk of running someone else’s software in my homelab.

    I don’t expose anything publicly except Plex; everything else goes over Tailscale. I practise 3-2-1 backups with local disks and media as well as offsite to Backblaze, and I occasionally take physical media backups offsite as well.

    I’d be interested to see what others think about this… most hosting solutions leave it all open by default. I think there are a lot of small and easy ways one can practise good lab hygiene without air-gapping.