Hacker News
Show HN: Duple – Private cloud at home (duple.io)
78 points by louisknows 56 days ago | 110 comments

I'm really into home Cloud stuff. This answers none of my questions.

1) Where will my files be stored?

2) Does Duple store a copy?

3) What does Duple do with my metadata? What do they do with usage stats?

4) Does Duple have any third party contracts or contractors with access to my data?

5) How does putting Duple software on an RPI make my files available from anywhere? How do you enable access to my device from anywhere?

These guys are asking for an awful lot of faith from users who are looking to potentially store their entire lives on their service. We deserve to know how it really works.

A lot of those questions are answered in their FAQ.

The Duple application on your 'server' will simply expose a directory on that system (or a remote SFTP server) to other Duple applications you connect to it. All data will be on the machine you maintain. So no, they are not storing your data for you.

To enable access to your device from outside your local network, you have to either give the machine running Duple a public IP and open the port, or set up port forwarding if you are behind a NAT firewall (which is the most common situation for home cloud stuff).
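If you go the port-forwarding route, it's easy to sanity-check the result. A minimal sketch in Python (the helper name is mine, nothing to do with Duple); run it from outside your LAN, e.g. from a VPS or a friend's machine, since testing from inside won't exercise the forward unless your router supports NAT loopback:

```python
import socket

# Hypothetical helper: returns True if a TCP connection to host:port
# succeeds within the timeout, False otherwise.
def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Point it at your public IP and the forwarded port; if it returns False, the forward (or the firewall rule) isn't doing its job.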

So this concept is similar to BitTorrent Sync? (Or whatever it’s called now)

Yup, and Syncthing as well, and probably others.

What is the purpose of having "the cloud" in your home at all? To me "the cloud's" primary value is that it is running off-site, in an environment that capably and regularly deals with broken equipment. If my work is backed up in Google/Dropbox's data center, I don't have to worry about it if my house gets flooded.

Having the cloud in your house, running on whatever stuff is lying around, just seems one small step removed from keeping everything on your laptop and hoping it doesn't break.

You're not wrong about the benefits of the Cloud. Some people make bomb shelters in their basement, some people stockpile canned goods. I stockpile data. I even have an old text-only offline copy of the English Wikipedia.

My main issue is privacy. I don't want Google peeking into everything I upload for marketing purposes. I also don't want to trust a third-party company with my kids' photos. They can change their privacy policy in the blink of an eye, or disable my account by accident, without reason or warning.

Then there's the security aspect, and I'm settling in for a fight here. Google, AWS, Azure... These are all consolidated attack vectors. When WWIII breaks out it's not starting in the physical world. It's starting on the internet, and the first casualties are going to be major infrastructure providers followed immediately by FB, Google, Microsoft, and AWS.

Take out those 4 companies and our world is in chaos. Your company won't get its emails, its apps will be offline, and half our economy will be out of business by morning.

But not me. My RAID arrays are still spinning away, and if I want to I can reach out and touch them. Who knows where everyone else's data is. That's not my problem.

Great dialogue here, you should turn it into a video with some epic music in the background, and fading images in and out, rock flag and eagle.

And host the video on a Raspberry Pi, at your home, on PeerTube

I have been enjoying learning about self-hosting. Matrix.org is the next project I am interested in, once I get bored with my current hobby projects: hosting my own node. That, and I am waiting for the Google Photos clone to get to version 1.0 so I can self-host all my images too.

> waiting for the Google Photos clone to get to version 1.0

What is the closest open source alternative to Google photos?

It's a pretty narrow window of global chaos where this approach is advantageous. You're optimizing for a situation where cloud infrastructure is demolished, but the power grid and/or fuel supply chain required to keep your specific RAID arrays spinning is operational.

One is dependent on the other, no?

If I don't have electricity I don't have ANY storage. Mine or Google's.

But I can still have mine without Google. It doesn't work the other way around. And Google's got a much larger target on their back than I do.

Well, all good points, except no one really cares about your data on the cloud. You might see relevant ads though, i.e. your particular docs and photos are just slightly above pink noise.

And "take those 4 companies out" is close to impossible. In case of WWIII, you might have more things to worry about than your kids' photos.

I'm fine with Apple having a copy of my notes, docs and pics. I do have a local backup. I'm not going through the trouble of trying to host it on my 5Mbit upload stream.

I like to have control over my data, and I don't have a lot of trust in most cloud storage services (especially not Google).

That said, in the end it's personal preference; it's not for everyone. I'm also a hobby admin who enjoys it.

I mean, what you're looking for is... an external hard drive, which is what this essentially appears to be (but maybe with some nice versioning built in).

I don't know if it's possible to distinguish this product from a USB stick with git.

The thing you're missing is the ability to access it from anywhere, without having to carry around devices. The whole reason people use the Cloud is because they leave USB sticks and drives behind all the time. Home Cloud solves the trust issue of using a third party to handle private data and also the problem of leaving your storage devices all over the globe (where they can be stolen).

Ah, I found their FAQ - sorry I'd been trying to figure stuff out via the marketing material. So it looks like it's basically a multi-transport supporting SFTP-like server that is mostly based around intelligently syncing and ordering your file versions.

The application provides no discoverability solutions (so to use it remotely you've got to know how to acquire a static IP, maybe a name - and then configure firewalls as needed) and from what I can see there is no data replication other than the optimistic statement "The data will probably be on at least one local server and the central file store".

Home Cloud is just a ridiculous phrase. Cloud should mean RAIDed hard drives inside clustered servers, with failover switches and routers, possibly even geo-redundancy. A single point of failure in almost every regard, even if accessible remotely, is the antithesis of abstracting away failure points. Cloud should not be restricted to abstracting away hardware and only surfacing a provisioning layer; it needs to include invisible redundancies.

Really? Because anyone with a PC and an internet connection can assume their own hosting duties without all that.

And if they really wanted to make up for the geographic shortcomings, they could image and encrypt their files and still use Google Drive like everyone else. There's your redundancy with privacy.

Cloud, to me, just means "someone else's computer."

Cloud is supposed to be where downtime isn't a thing, or isn't really possible (minus DNS, of course).

Someone else's computer, without failovers, will never be "always up."

Someone else's computer is just "hosting." A cloud is a datacenter, not a server.

Your own computer available from anywhere on the internet is just "remote access."

Cloud has always meant either distributed computing or complexity abstracted away because it's moot and only the resulting service matters.

Hum, my memory of the early days of "cloud" as a term paints a different picture.

You had highly resilient, scalable, backed up, distributed servers before the "cloud": cloud provided all that with an API and instant UI, and made it affordable.

In that sense, "putting my data in the cloud" always makes me giggle, even though I accept how the term has come to be used. What people usually mean with that is that they are using an online file storage service that may or may not be implemented using a cloud infrastructure, but they don't really care.

I mean, why is OpenStack "a private cloud" (or Eucalyptus if you remember that one)? And why wouldn't you be able to do DNS on cloud infrastructure as well (in fact, many do)? FWIW, DNS itself is a prime example of a resilient (virtually no downtime), scalable, distributed service that predates "cloud" by decades.

What decade are you calling the early days? Early 90s, mid 2000s?

Mid-2000s: the term "cloud" was highly popularized by Amazon, and went on to have this particular connotation for a while.

In the last decade, it has turned to simply mean "on the internet", which, to me, is unnecessary since you can just say "on the internet" instead. E.g. how is an internet file storage service different from a cloud storage service?

Late 80s

the "cloud provided all that with an API and instant UI, and made it affordable" came in the 80s?

Semantics aside, are you saying that someone with extra compute resources shouldn't put those resources to work to store their personal data to reduce reliance on for-profit third parties?

What is your argument here? That people shouldn't be encouraged to do something on their own? That people shouldn't bother reducing their reliance on "freemium" or paid PaaS companies, even if they have the skills/resources to do so?

You seem to want nothing more than for me to admit that "the online place where I store my personal files" is not the same as a "Cloud" and in reality it doesn't matter what it's called. Its purpose is the same, and it fulfills that purpose well.

It's a completely semantic argument. I said the phrase was ridiculous.

I didn't once say anything about the technology or whether it was better to store things yourself. All I said was those other things have their own names and calling them cloud is either misleading or lazy.

And yes, that's exactly what I want you to say, that your home server, despite being remotely accessible, isn't covered under the definition of cloud computing.

> No server needed. No expensive hardware required... You can use your Router, NAS, Raspberry Pi, Smart TV.

So... it turns those things into servers then (many of which already run servers by default)? Maybe I'm misunderstanding, but a device running server software is still a server, even if it isn't dedicated to only that task.

The FAQ (linked from the very bottom of the page) is more interesting. The server part is just a file server. The client in this case handles all the file syncing, including locking. The client is the interesting part.

Cloud = servers someone else has and mostly maintains for you

Cloud...at home? This is a dumb headline. The definition of "server" has gone from big bulky slabs of bare metal to include smaller things like a RPi.

> Cloud = servers someone else has and mostly maintains for you

No, ”cloud = on-demand provisioned server resources”; that was what distinguished cloud computing from pre-existing remote-leased or on-premises resources. “Private cloud” (usually, on-premises of the user) has been a thing since very early in cloud computing, and “at home” has been plausible for even fairly casual users at least since Ubuntu 9.04 bundled Eucalyptus.

But this service is just abusing “cloud” to mean “SaaS”, I think (in that it's a demo of a paid SaaS product that incorporates your hardware.)

No, cloud does not mean that. Generally the cloud means that you can access data stored on a cloud as long as you are connected to the internet (or the same network). I don't know where you got that definition of cloud from.

I think the biggest issue in this thread is the multiple definitions of what a cloud is. To most of the people in this thread, the cloud isn't possible at home. To consumers, the cloud is just somewhere else to store their files. This addresses the latter, giving consumers an easy way to store their files on a "server", i.e. their computers, while having it in their home.

Reminds me of the ridiculous "no software" campaign by Salesforce.

LMAO, now I need to hear about it.

Every computer can be a server and every server is a computer. The only difference is on a layman's mind. Just because you're interfacing with the computer through a screen instead of a network connection doesn't mean the backend service/OS calls changed a bit.

From what I can gather from the FAQ, this won't work for routers behind a NAT, which is the case for me and I assume many others too. This should probably be mentioned somewhere.

A bit worrying that it'll be forever closed source, especially since it needs to expose your router to the internet, it's written in C, and there's this sentence from the FAQ:

> Everything was built from scratch

Long live ZeroTier

Or wireguard with your own DNS (unbound) blackhole-ing ad networks: https://www.ckn.io/blog/2017/11/14/wireguard-vpn-typical-set...
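The "blackhole" part of that setup is conceptually simple: if a queried name, or any parent of it, is on a blocklist, refuse to resolve it. A toy sketch of just that decision (unbound does this with `local-zone` rules, if I recall correctly; the function name and sample domains here are made up for illustration):

```python
# Toy sketch of the decision an ad-blocking resolver makes:
# a domain is blocked if it, or any parent domain, is blocklisted.
def is_blocked(domain: str, blocklist: set) -> bool:
    labels = domain.lower().rstrip(".").split(".")
    # Check every suffix: "a.b.c" -> "a.b.c", "b.c", "c"
    return any(".".join(labels[i:]) in blocklist for i in range(len(labels)))

blocklist = {"doubleclick.net", "ads.example.com"}
is_blocked("stats.g.doubleclick.net", blocklist)  # True: parent is listed
is_blocked("example.com", blocklist)              # False
```

The parent-domain check is what lets one blocklist entry cover an ad network's whole subdomain zoo.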

Or IPv6. Personally, I like running a Mikrotik Hex Lite as my home router with Ubiquiti wireless APs behind it. I've definitely learned a lot about networking trying to get everything set up nice and secure. That'd make a good blog post, actually...

EDIT: Zerotier is a great service. I kind of dream of some collaboration between Zerotier and Wireguard. Zerotier is Software Defined Networking, I guess(?), and it'd be cool to have Wireguard be part of that stack. Though maybe that's not necessary. I haven't dug into Zerotier recently enough to know if that's redundant or at all useful.

Thanks for the tip. I'd never heard of it. For others like me: https://www.zerotier.com/manual/

Here's their getting started page (not terribly mobile friendly though)


This is the offer:

"The beta version of Duple is now available. You can download it and use it for free. However if you’d like to participate in the Duple beta program, and get a lifetime discount as a reward, click here."

But I don't understand what we are paying for going forward. Presumably you pay for the software once. Is there a recurring charge of some sort? What for? If the software is on premise, you can't really turn it off or stop it from working.

Does duple, the company, ever touch the data?

If privacy is the killer feature, it seems that an open source version will ultimately displace this.

Better to use Syncthing: a simple equivalent, free and open source.

Syncthing is great (I use it for all my syncing needs), but its scope is limited. It doesn't really solve file sharing in a user-friendly way.

While I agree Syncthing is about seamlessly syncing files with devices you own, not file sharing as such, it does a good 98% of what this program offers with a more open source background and is completely free.

I share files, what, once a year? Syncthing works great for that use case. If I want to share files too big to email, I can stand to upload to Onedrive/Dropbox/an open directory on my web server for that one-time use. For more frequent setups, sure an Owncloud/Nextcloud instance wouldn't be amiss.

Oh I wasn't arguing in favor of using this. I doubt I'll ever use non open software for my data again. Just saying I've run into problems trying to use syncthing for sharing files.

No offense intended or taken; you and I are in complete agreement re: what software should be touching our data.

I believe you when you say you've had problems performing file shares to arbitrary people; my comments were directed only at the fact that my use case is somewhat different.


Agreed. This just feels like closed-source syncthing.

It's free now, but you have to enter a magic code. The code expires in a month or so and then you'll need a new code, which will presumably cost money. So it's a free trial.

I found this in the FAQs:

> We will have in the future our own dynamic DNS name service that will be included in the app price, so people don't have to pay for it with another company if they don't want to.
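Dynamic DNS is simple at heart: the client watches its (changing) public IP and pushes an update to the name service only when it changes. A toy sketch of that loop's core, with made-up function names (`get_ip` and `push_update` are injected stand-ins; real providers expose an HTTP update endpoint):

```python
# One tick of a hypothetical dynamic-DNS client: fetch the current
# public IP, push an update only if it differs from the last one seen,
# and return it so the caller can remember it for the next tick.
def ddns_tick(last_ip, get_ip, push_update):
    ip = get_ip()
    if ip != last_ip:
        push_update(ip)
    return ip
```

In a real client this would run on a timer, with `get_ip` hitting an echo service and `push_update` calling the provider's API with your credentials.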

They have some fairly hefty competition:



I'm pretty sure that most NAS vendors have some variation on "Home Cloud."

I think most folks that want to set something like this up will want a turnkey solution.

Looking at their "about us" page removed my confidence in their product. They should fix that asap.

For my data, I want reliability and security. I want something that has been battle-tested, and that I am confident will be around in a few years. This is why I went with Synology. Two guys in a "garage/bootstrapping" does not instill confidence.

I recently went from a DIY NAS (Linux PC with lots of HDDs shared on my LAN) to a Synology and am now absolutely in love with it.

All my devices now back themselves up to the NAS overnight. Cameras dump direct into the NAS via a card reader plugged into it. Family can all browse the family photo library from their phones or any device they want or cast slideshows to the TVs. Download torrents with my phone, saving directly to the NAS, then immediately watch it on my TV transcoded in real-time with automatic pulling of subtitles and metadata. I want to marry it.

Sure, I'd prefer if all this was open source etc., but I played with FreeNAS and various other packages and they don't even come close in usability or ease of administration.

https://doc.duple.io/duplecli-documentation/ https://doc.duple.io/faq/

So basically Dropbox but with an SFTP server or network/local filesystem for hosting.

When someone says "modern day snake oil", this is what I think of. A "cloud" that isn't the cloud at all, pretends to be more reliable and secure than it is, and over-sells basic features like replication. "This USB stick will solve all your problems for a low, low price!"

I'm sorry but Nextcloud is getting a bit of an unfair shake here. I recently got a raspberry pi 4 and installing nextcloud on there wasn't rocket science: https://smalldata.tech/blog/2019/07/12/setting-up-a-raspberr...

In fact the biggest issue that I faced was that my router did not support NAT loopback which led me to using the pi for DNS in order to be able to use my "private" cloud.

Nextcloud is an open-source Dropbox alternative written in PHP. It can be very easily installed via Docker and is quite mature at this point, with a rich ecosystem of 3rd-party apps for functionality other than file sync. Big props to the folks working on it!

It's not open source and it's not for sale, even though it's a business, so beware about what its future may be.

However it might be reusable with an alternate core:

> Q: Do you plan to open-source it later? A: We'll open source everything (server, interface, etc...), except from the C Library. Reason being that the library is what gives us our technical competitive advantage (being that you get the full private cloud experience with no need for a server). It's also important to note that you can't patent your code/algorithm in Europe, so there's no other way to protect it. But everything else expect from the library will be open-source.

How is this different from Syncthing (https://github.com/syncthing/syncthing/)? I am still trying to figure it out

Syncthing is peer-to-peer, Duple is one master + many slaves.

First time I have seen Syncthing. I am currently using Resilio Sync to sync my "Sites" directory across my iMac and MacBook Pro. Resilio seems to work well, but it has not been updated in a long time (running v2.6.3) and sometimes it uses nearly 100% of a single CPU core for no apparent reason for extended periods of time.

How does Syncthing stack up against Resilio?

"military-grade encryption" in marketing materials always scares me. Just tell me what the algos are

ROT13 of course, designed to Roman military specifications.

Caesar ciphers mostly shifted by 1 or 3. Their alphabet had only 23 letters (no J, U, or W), so ROT-13 wouldn't have had the cool property of being its own inverse.

ROT13 is 4 rounds of Caesar-3 and one round of Caesar-1, and multiple rounds of encryption and mixing different modes makes it that much more secure, right?

My mistake, ROT-13 is actually beyond military grade :D
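For anyone who wants to check the self-inverse claim, a toy Caesar shift (do not use for actual secrets, military-grade or otherwise):

```python
# Shift letters of `text` within `alphabet`; anything not in the
# alphabet (spaces, punctuation) passes through unchanged.
def caesar(text: str, shift: int,
           alphabet: str = "ABCDEFGHIJKLMNOPQRSTUVWXYZ") -> str:
    table = {c: alphabet[(i + shift) % len(alphabet)]
             for i, c in enumerate(alphabet)}
    return "".join(table.get(c, c) for c in text.upper())

# With 26 letters, 13 + 13 = 26 ≡ 0, so ROT13 undoes itself:
caesar(caesar("ATTACK AT DAWN", 13), 13)  # "ATTACK AT DAWN"

# With the 23-letter Roman alphabet, 13 + 13 = 26 ≡ 3 (mod 23),
# so applying it twice leaves a net shift of 3 instead of the identity:
roman = "ABCDEFGHIKLMNOPQRSTVXYZ"
caesar(caesar("ATTACK", 13, roman), 13, roman)  # not "ATTACK"
```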

A bit offtopic, but does anybody know a micro server the size of an Intel NUC or Mac Mini, but with 100+W desktop/server (real, not cloud) hardware for use as a CI server?

Check out ASRock's DeskMini series [1]. While slightly larger than a NUC, you can throw in a 65W CPU. Especially the DeskMini A300 [2] for AMD APUs seems to be pretty popular.

[1]: https://www.asrock.com/nettop/index.asp#DeskMini

[2]: https://www.asrock.com/nettop/AMD/DeskMini%20A300%20Series/i...

I also wanted a small, energy-efficient server for CI at home. I was looking at SBCs (single-board computers) like the Raspberry Pi and others; most are ARM-based and max out at 2GB or 4GB of RAM. I ended up buying an Odroid H2, which is x86 and takes up to 32GB of RAM. I just received it, so I haven't put it together yet. It also depends on what you want to do with it and how much you want to spend: I bought 32GB of RAM, a 1TB NVMe and a 6TB SATA HDD, and plan to run everything in containers (Docker). Also, instead of say GitLab, Jenkins and Artifactory, I think I'll use Gitea and self-hosted Drone.io, and I was looking at Strongbox instead of Artifactory but am not sure if it's production-ready.

What kind of enclosure are you using for it?

I currently use a server I built inside a Fractal Node 304 case for that same purpose. While compact for a regular PC type server, it's still a little bulky for my liking.

Odroid-H2 Case Type 1

Any chance you're going to post updates on this? I would love to hear how this treats you!

I was considering writing a blog post

Good question. I too wonder. To extend your question with my own; What would be a cost effective solution to setting up home servers for CI/storage/whatever?

I could of course just buy random PC parts, but I'm curious to hear what people are liking. Are you liking your home server's form factor? Price tag? etc.

I have long been a fan of Synology NASes for this type of thing (I have been using one since 2007). They offer CPUs these days that should be beefy enough for a CI use case, unless you have a very large project. QNAP offers similar products as well.

I bought one as I just wanted a low power always-on server that Just Worked with a software ecosystem around it. You can now run docker containers on them as well easily.

I can access at home or via the web- they provide a dyndns service so I can just go to http://quickconnect.to/$username and access from anywhere.

A Raspberry Pi 4 with 4GB RAM or an Intel NUC Skull Canyon

A Pi is too weak for my workloads, I'm afraid ;) Actually, I considered a Hades Canyon (successor to Skull Canyon) with an early Intel 10th-gen part, but it has its iGPU disabled and comes with a Radeon (!) GPU, which just calls for trouble and additional cooling, though it's still much better than Nvidia for Linux, I guess. But then I'd rather go all-in on AMD CPUs, which are winning in the bang-for-buck and multi-threaded department anyway. Plus, I don't want laser-cut skull-shaped grilles and similar gaming traits if I can help it ;)

Edit: also, it's already been out well over a year, and given Intel's atypical immaturity wrt their 10nm parts, it might be better to skip this one

After the latest update the Mac Mini is actually a very fine server. They can be configured to hold a 6-core i7 with 64GB RAM.

Not bad I guess, though I failed to find a decent review telling even what CPU and wattage the thing has, let alone how to run it with Linux. Haven't considered a Mac Mini for the job since mine is so dog-slow I'd rather run CI on my notebook.

Seems the new Mac Mini will only run Mac OS or Windows: https://www.techrepublic.com/article/2018-mac-mini-blocks-li...

So how does this thing work? Does the private cloud only exist when the storage device is plugged in?

I see "smart TV" as a host option, does that mean the storage devices can use unaware USB hosts to be a communication mechanism?

Interesting idea, I'll have to try it @home. Whitepaper on how the tech works would be nice.

A white paper is necessary to understand this; it doesn't make sense otherwise. How would a smart TV, or any TV for that matter, be configured to export its USB devices as a network mount? I mean, there's absolutely no way to reach a USB drive attached to your TV except from within the TV itself.

"You have one repository folder and one folder Duple on each device where you can access your cloud. This Duple folder works like a Dropbox folder, and everything is synchronized in multidirectional way between all the devices (all the Duple folders) and the repository folder which contains the totality of the private cloud."

So, here I am, exposing my 36TB NAS using this new Duple thing. Because every client needs the repository folder which contains the totality of the private cloud, how is this going to work?

I finally ditched SpiderOak for SyncThing about a month ago - and I haven't looked back. It solves all of these issues, I fully host it myself, and I can access everything, exactly the way I want to based on shares.

I have a machine in my office, a shared folder on my mobile, two machines at home, and my wife has her office, and her work laptop. It's everything these things should be, other than the lack of an iOS app for her.

I have a lot of devices as well - syncthing can handle em all.

3x phones for photo upload, music sync, TWRP zips etc.

2x personal laptops

2x work devices + a personal VM

2x close friend party shares for easy linux iso sharing overnight

And most recently, added one volunteer device for instant/eventually consistent overnight poor-man's offsite data backup.

I can mix n match folders for this on the fly and feel no loss of functionality with how little I do arbitrary internet file sharing.

I need more from private cloud than just storage

Apart from all the other stuff, it seems suspicious that they would use Serpent for encryption, rather than using AES or another more well-known cipher suite (also, no talk about AEAD).


The article you link perhaps hints at why the developers felt like this: the NIST report apparently suggested Serpent was actually a bit more secure, but Rijndael was chosen for AES because it allowed for a more efficient software implementation. The developers may feel that trade-off wasn't worth it, although obviously, going with a less common strategy for encryption is generally discouraged.

Which is crazy. AES is good enough for TOP SECRET, it's the most widely supported & vetted modern cipher, and all modern hardware has instruction sets dedicated to it that prevent timing side channels (which Serpent can't claim). Not using AES here is a product design red flag.

Does anyone remember younity? Same exact product, same exact tag line "personal cloud" etc. (ex-engineer here)

I see https://www.duple.io/en/blocked.html when I click Try Duple. What countries do they intend not to sell their software in, and why? A FAQ entry would be nice.

I found the FAQ[0] to be the most informative.

[0] https://doc.duple.io/faq/

So I read the front page, the download/try page, and the installation page, and still have absolutely no idea what this project is about.

Somewhat related- I've gotten a lot of mileage out of using Cryptomator (no affiliation) with iCloud to have access to encrypted documents across devices without having to worry about the pain of self-hosting. Here's a blog post I wrote about it https://karlshouler.com/posts/2019-05-31-secure-cloud-storag...

I clicked for a "Private Cloud", and all I saw was storage of files. Am I missing more?

Home cloud is an oxymoron, no? At what point is it just "I have some servers"?

Blocked in Venezuela. Not entirely surprised, but very disheartening.


It's an easier to install and use Nextcloud alternative, with open source components (to be released in the future) but closed source core.

I thought of nextcloud. Seems very similar:

For those unfamiliar, Nextcloud is kind of an open-source Dropbox.


Blocked in South Africa.

how does this differ from owncloud/nextcloud?

>How does it work?

>Just turn it on!

That’s not an explanation of how it works, that’s an explanation of how to use it.

(I mean, you just know something stupid like that is coming the moment you see a loading progress bar for a static page. It’s not surprising, just disappointing.)

> you just know something stupid like that is coming

Please don't be a jerk in HN comments, especially when commenting on someone's work. Your comment would be fine without that last bit.



I mean, we need this sort of technology to take hold with non-technical users, so I applaud it from that aspect... but we technical users want to know what's actually happening here. And the fact that the source is "coming soon" doesn't help either.

Oh, here's what we're looking for: https://doc.duple.io/faq/

Thanks for that. That intro was anemic, even for laypeople.

Sounds like you still need a server unless you use local storage.

>we need this sort of technology to take hold with non-technical users,

I'm pretty sure non-technical users are not looking for a private cloud.

Are you sure? Eventually people are going to realize that the SaaS companies have sold them out. And they may not understand what a private cloud is, but that's what they're going to want: The convenience they're accustomed to in a device only they control which can't be shut down or taken away from them.

ISPs (at least in the US) have been selling their customers out for years. Nobody cares as long as Netflix works and their photos get to the people they want to see them.

>Eventually people are going to realize that the SaaS companies have sold them out.

In what way? And which SaaS companies?

Terrible website design. Absolutely no reason to block all content just because I have js turned off.

While it would be nice if they provided HTML endpoints, there are valid reasons for using SPAs, even for text-heavy sites. That said, the loading bar is a bit mind-blowing IMO.

What are those valid reasons?
