Hacker News | monteslu's comments

AI agents want your API keys. Don't give them your API keys.

I built agentgate - a proxy that lets AI read my GitHub, calendar, Bluesky, etc. But any writes get queued for me to approve first. I review them over coffee.
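The core idea can be sketched in a few lines. This is illustrative only, not agentgate's actual API: the names `gateRequest`, `reviewQueue`, and `pendingWrites` are made up here.

```javascript
// Sketch of an approve-before-write gate: reads pass through immediately,
// writes wait in a queue until a human approves them.
const READ_METHODS = new Set(['GET', 'HEAD', 'OPTIONS']);

const pendingWrites = [];

// Forward read-only requests; queue anything that would mutate state.
function gateRequest(req, forward) {
  if (READ_METHODS.has(req.method.toUpperCase())) {
    return { status: 'forwarded', result: forward(req) };
  }
  pendingWrites.push(req);
  return { status: 'queued', position: pendingWrites.length };
}

// Later, over coffee: approve (forward) or drop each queued write.
function reviewQueue(approve, forward) {
  const results = [];
  while (pendingWrites.length) {
    const req = pendingWrites.shift();
    if (approve(req)) results.push(forward(req));
  }
  return results;
}
```

The point of the design is that the agent only ever holds a token for the proxy; the real API keys stay on the proxy side of the gate.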


Made an open source MCP (Model Context Protocol) server that lets an AI coding agent like Claude Code see its canvas output and also send back errors and debug output, for quick iteration on a codebase. Video here: https://www.youtube.com/watch?v=z2on3KelaH4



The source for pagenodes is here: https://github.com/monteslu/pagenodes and you can look at src/editor/nodeDefs/core/espeak.js as an example.

Also MDN has some pretty good documentation on the WebSpeech API https://developer.mozilla.org/en-US/docs/Web/API/Web_Speech_...


Pure MQTT runs over TCP, which browsers don't support without an extension. I'm trying to keep this purely web based as long as I can :)

Some MQTT servers tunnel messages via WebSockets, server-sent events, and REST calls, which are supported by PageNodes.
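For example, many brokers expose a WebSocket listener, so from the browser you connect to a `ws://` or `wss://` URL instead of opening a raw TCP socket. A tiny sketch of picking the transport (the `brokerUrl` helper and the `/mqtt` path are illustrative assumptions, not PageNodes code):

```javascript
// Build a browser-usable MQTT broker URL. Browsers can only speak
// WebSockets, never raw TCP, so the scheme is always ws:// or wss://.
function brokerUrl({ host, port, secure }) {
  // wss:// is required when the page itself is served over https.
  const scheme = secure ? 'wss' : 'ws';
  return `${scheme}://${host}:${port}/mqtt`;
}
```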


HA does have a REST API, but with the way PageNodes works you'd have to hardcode the HA password right into the PN workflow. Have you considered adding the equivalent of environment variables, which can be set in a PN account and used as placeholders in workflows?


That's a good question. Our storage is in local IndexedDB, and the site is served over HTTPS, so no one should see your flow unless you share it. That said, there's nothing stopping you from reaching out to another secure service or plugin before making requests.
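The suggested placeholders could be as simple as a substitution pass over the flow before it runs. This is a hypothetical sketch, not an existing PageNodes feature: the `${NAME}` syntax and the `expandFlow` helper are made up, and the idea is that secrets live in a separate per-account vars object so the shared flow JSON never contains the password.

```javascript
// Hypothetical placeholder expansion: replace ${NAME} tokens in a flow's
// string fields with values from a vars map kept outside the flow itself.
function expandFlow(node, vars) {
  if (typeof node === 'string') {
    return node.replace(/\$\{(\w+)\}/g, (match, name) =>
      name in vars ? vars[name] : match // leave unknown tokens untouched
    );
  }
  if (Array.isArray(node)) return node.map(n => expandFlow(n, vars));
  if (node && typeof node === 'object') {
    return Object.fromEntries(
      Object.entries(node).map(([k, v]) => [k, expandFlow(v, vars)])
    );
  }
  return node;
}
```

Because the expansion happens only at run time, exporting or sharing the flow exports the `${HA_TOKEN}`-style placeholder, not the secret.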


Am I interpreting it right that Pagenodes basically aims to be node-red, but in the browser?


The goals are similar. PageNodes does its best to leverage newer browser capabilities: WebRTC, WebUSB, ServiceWorker, offline support, etc.


Ok, so less "prototype in the browser, then offload to a server once it works", but more for local "app" type things? Interesting idea, has some limitations but also opens up tons of interactions that are harder for a server-based solution (webcam, ...)


Bingo! It's always evolving too. We use a lot of experimental browser flags; this helps us work with up-and-coming features and makes learning new APIs much easier.


Of course, this was more of a fun example of something you can do with the PageNodes platform than an actual Echo replacement, and a way to get started connecting services.

Definitely move on to some dedicated hardware if you're serious about this sort of thing.


Need to be able to profile frames/second for HTML5 gaming. The frames part of the timeline tab in Chrome is extremely useful.


I'd also love other performance hints with regards to optimizing for games or other things that require heavy animation.

- Which layers/elements are being hardware-accelerated, and which aren't? Just as importantly, why or why not?

- How much VRAM is being used by which element/etc, and how much total RAM/VRAM am I using at a given moment?
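Until the tooling exposes frame rate directly, one rough workaround is to derive FPS yourself from `requestAnimationFrame` timestamps. A minimal sketch, where the `makeFpsMeter` name and the one-second sampling window are arbitrary choices:

```javascript
// Compute average FPS over a sliding window of frame timestamps
// (milliseconds, as requestAnimationFrame delivers them).
function makeFpsMeter(windowMs = 1000) {
  const times = [];
  return function sample(now) {
    times.push(now);
    // Drop timestamps that have fallen out of the window.
    while (times[0] <= now - windowMs) times.shift();
    if (times.length < 2) return 0;
    const elapsed = times[times.length - 1] - times[0];
    return ((times.length - 1) * 1000) / elapsed;
  };
}

// In a page you'd feed it from the rAF loop, e.g.:
// const fps = makeFpsMeter();
// requestAnimationFrame(function loop(t) {
//   overlay.textContent = fps(t).toFixed(1);
//   requestAnimationFrame(loop);
// });
```

This only measures how often your callback runs, not compositing or GPU time, which is exactly why built-in frame profiling in the browser tooling is so valuable.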


Frame-by-frame debugging. Not the first time it's been asked. Thanks.



Awesome.

You took one of my suggestions (ease of use), mixed and matched a few great tools with your awesome talent, and built something really useful.

This is exactly what I thought. And I won :) (and so did most JS lovers).


The Chrome Beta on ICS is great, but it's only accessible to <4% of all Android devices. ICS has been out for months, and the providers are completely dragging their feet on it.

Chrome Beta needs to be backported to at least 2.3


AFAIK it also doesn't make any difference to apps that use webviews. I mean, it's just as well; if you changed the underlying engine without notifying devs, insanity would ensue. But it would be good to be able to set a flag that says "use the decent engine".


You're correct. But there's no reason Google couldn't add a ChromeWebView lib as part of the install of the new browser for PhoneGap to wrap.


Or Google could integrate Chrome web apps/NaCl into Android, as first-class apps.


"ICS has been out for months, and the providers are completely dragging their feet on it."

Can anyone explain why this is such a problem? Does Android not have a proper hardware abstraction layer or driver model that would allow OS updates immediately, as long as these layers remained compatible and a build existed for your processor family?

Windows has supported disparate hardware configurations for many, many years. Yet, generally, you can run out and buy the new version and install it. Why can't Android do this (albeit with a version for each processor)?


Many providers have customized the OS's look and feel and added their own custom apps and settings, including locking out tethering. One of the big advantages of 3.0 was that it was supposed to provide an easier way to separate the look and feel from the OS so the providers didn't need to create a custom OS. This customization makes pushing out updates very difficult.

For example, I worked on an app to be included with the device by the OEM. The OEM had to roll a custom OS and kernel to handle their hardware and to replace/update the mail and calendar apps, because those are very tied to Google by default. We needed hooks into the mail and calendar apps in ways that Google doesn't support, so the OEM also had to add those hooks for us. At this point we were deep into territory unsupported by Google, but supported by the OEM. These things change unexpectedly with minor OS updates from Google, which makes them brittle to maintain. An update to a newer OS would be a hell of a lot of work.

All that said, if we could have based on 3.0 instead of 2.3, many of these issues would not exist because of some abstraction layers added just for this purpose. I expect 4.0 to make that even nicer.


Thanks. At least there's some indication that it's getting better.

Too bad the providers have to degrade the experience with their crap, instead of just providing additional apps if they feel it's necessary.


For one thing, last I looked, Chrome Beta requires the GL_OES_egl_image extension for off-thread texture upload, and had support for the ICS-only SurfaceTexture (although judging by the WebKit source that's ifdef'd out). GL_OES_egl_image support is very spotty on pre-ICS devices.


If you're calling i18n and a11y "plugins", you're missing the point. These types of considerations are applied throughout the framework in a consistent manner.

