Why did the OP comment about the project as if he were someone else?
https://news.ycombinator.com/item?id=47560380#47560381
When it's clear he is one of the major contributors to the project?
https://github.com/openyak/desktop/graphs/contributors
Because it’s just silly AI generated spam, don’t read too much into it.
I have a strong feeling that AI models and agents require a different operating system (OS) paradigm, one that is data-centric rather than file-system-centric, for more efficient, effective, and trustworthy operation. This new OS should work natively with data across different processors, for example CPUs, GPUs, TPUs, NPUs, accelerators, etc.
For a working example, see TabulaROSA (Tabular Operating System Architecture), proposed by an MIT team. Instead of normal OS system calls, it uses data-based operations with D4M, which can operate mathematically via associative arrays on structured or unstructured data [1],[2].
With the advent of CPU acceleration for fully homomorphic encryption, as demonstrated by Intel, an AI model or agent can even analyze data without decrypting it [3],[4].
[1] TabulaROSA: Tabular Operating System Architecture for Massively Parallel Heterogeneous Compute Engines:
https://dspace.mit.edu/handle/1721.1/126114
[2] D4M: Dynamic Distributed Dimensional Data Model:
https://d4m.mit.edu/
[3] Intel Demos Chip to Compute with Encrypted Data (121 comments):
https://news.ycombinator.com/item?id=47322815
[4] Intel Demos Chip to Compute With Encrypted Data: Fully homomorphic encryption chip speeds operations 5,000-fold:
https://spectrum.ieee.org/fhe-intel
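The D4M idea of replacing file-oriented system calls with mathematical operations on associative arrays can be sketched in plain Python. This is a toy dict-of-dicts stand-in for D4M's associative arrays, not the actual library; the function names and the example data are mine:

```python
# Toy associative array: rows and columns are strings, values are numbers.
# D4M represents data this way so that database-style queries become
# linear-algebra-style operations (here: element-wise add and "matrix" multiply).

from collections import defaultdict

def assoc_add(a, b):
    """Element-wise sum of two associative arrays (union of keys)."""
    out = defaultdict(dict)
    for src in (a, b):
        for r, cols in src.items():
            for c, v in cols.items():
                out[r][c] = out[r].get(c, 0) + v
    return dict(out)

def assoc_matmul(a, b):
    """Associative-array product: (a @ b)[r][c] = sum_k a[r][k] * b[k][c]."""
    out = defaultdict(dict)
    for r, cols in a.items():
        for k, v in cols.items():
            for c, w in b.get(k, {}).items():
                out[r][c] = out[r].get(c, 0) + v * w
    return dict(out)

# Sparse "documents x words" counts, addressed by string keys.
docs = {"doc1": {"cat": 2, "dog": 1}, "doc2": {"dog": 3}}
words = {"cat": {"animal": 1}, "dog": {"animal": 1, "pet": 1}}

print(assoc_matmul(docs, words))  # word counts rolled up by category
```

The point of the paradigm is that the same multiply works whether the keys name documents, packets, or processes, so "system calls" become algebra over sparse data.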
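The compute-on-encrypted-data claim in [3],[4] can be illustrated with a deliberately simplified, additively homomorphic toy scheme of my own: a server sums ciphertexts it cannot read, and only the key holder decrypts the result. Real FHE is vastly more general and this toy has none of its security properties:

```python
# Toy additively homomorphic encryption: Enc(m) = (m + key) mod N.
# Adding two ciphertexts yields an encryption of m1 + m2 under 2*key,
# so a server can sum values it never sees in the clear. This illustrates
# the *idea* of computing on encrypted data, not a secure or fully
# homomorphic scheme.

import secrets

N = 2**64

def encrypt(m, key):
    return (m + key) % N

def decrypt(c, key, uses=1):
    # 'uses' = how many ciphertexts were summed, i.e. how many keys to strip.
    return (c - uses * key) % N

key = secrets.randbelow(N)
c1, c2 = encrypt(30, key), encrypt(12, key)

# Server side: add ciphertexts without ever seeing 30 or 12.
c_sum = (c1 + c2) % N

print(decrypt(c_sum, key, uses=2))  # → 42
```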
I don’t get it. It says nothing leaves your computer, but it’s sending things to OpenRouter, not running models locally. Perhaps I’m dumb (and I always feel dumb after reading an AI-generated README for yet another AI tool, tbf).
Yes, it appears your personal data IS being sent to OpenRouter and the model provider here. The problem, I think, is that a lot of people (especially in the OpenClaw community) take “I run it on my Mac mini” to mean their data is private. Meanwhile all their data is being shipped off for training to Anthropic via OpenRouter, and both of those parties see everything.
I guess you could theoretically plug in a local model here, but the README should be more precise when it talks about privacy.
I read it as "everything controlled by us is local-first, and we do not collect any data about you."
I agree that someone may misunderstand their phrasing, though.
> Yes. OpenYak is local-first. Your conversations and files are stored only on your machine. When using cloud models, only API calls to LLM providers leave your computer.
So it's local-first, yet it still uploads your files to cloud models if you configure it that way.
>only API calls
Given the software's broad appeal, I'd rephrase to make it clearer that every word and file you send leaves your computer.
> It says nothing leaves your computer
Where does it say that?
It sends data to OpenRouter if you choose to use OpenRouter. You can use Ollama instead. I don't know how to get more local than that. Any tool will be non-local when you do something explicitly non-local.
> run locally via Ollama
Are you saying this part is a lie?
You're reading it correctly: it's a thin OpenRouter wrapper calling itself local while your prompts still leave the machine.
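For what it's worth, keeping everything local here would mean pointing the app at Ollama's local HTTP API instead of OpenRouter. A minimal sketch, assuming Ollama is running on its default port; the model name "llama3" is a placeholder for whatever you've pulled:

```python
# Minimal sketch of a fully local chat call against Ollama's HTTP API
# (default endpoint http://localhost:11434). The model name "llama3" is an
# assumption -- substitute whatever you've fetched with `ollama pull`.
import json
import urllib.request

def build_request(prompt, model="llama3"):
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, host="http://localhost:11434"):
    body = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate", data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a local Ollama server; nothing here leaves the machine.
    print(ask_ollama("Say hi in five words."))
```

With this setup the only network traffic is to localhost, which is the configuration the "local-first" claim actually holds for.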
I still strongly believe every developer should be vibecoding their own cowork/openclaw/devin
Here are the prompts I use for my AI environment, though they've changed a bunch since the last snapshot:
https://github.com/rbren/personal-ai-devbox
Neat! I might give it a try.
What do you mean by interfaces in "These interfaces can do literally anything on the host machine. You're responsible for your own security"?
Also, your backdooring image links to a 404.
Thanks for the link. You mention security; is the _average_ developer safer going with OpenClaw?
How does this differ from Open Code Desktop?
This doesn't support Linux, whereas Open Code does.
Nice! The macOS download link is a 404, though.
What's the difference between this and OpenWork which has existed for a while?
OpenWork supports Linux, whereas this does not.
Anyone else getting a 404 when trying to download?
What does “owns your filesystem” mean? That sounds dangerous.
It's your filesystem, which is now, also, owned.
Not to be too conspiratorial here, but since the founder of OpenClaw was snatched up, there seems to be a rush of “open source” AI projects desperately bidding to be alternatives, which could generate huge returns if one of the major players decides that “they also need a Cowork-style product.”
So it's uniquely viable to be a sellout here and attempt to clone a major lab's product on the off chance you get acquired later.
> owns your filesystem
Just when I thought it couldn't get worse than OpenClaw, someone proposes this, in all seriousness. I see a stellar future for them at OpenAI.
A simpler version of OpenClaw?
So it's like OpenClaw, but you have to pay for it?
It looks free / open source to me?
I have used Cowork so much over the last couple of months and I have no reason to switch. But I’ll definitely give this a try.