
Ask HN: What are some fun projects to try out on a spare Linux file server? - lordleft
I installed Kubuntu on an old PC and am currently using it as a remote dev environment & FTP server. Any other cool use-cases for a spare Linux box?
======
whalesalad
Check these out:

\- reddit.com/r/homelab

\- reddit.com/r/homeserver

I'd say the content on /r/homelab is about 33/33/33 distributed between

1\. Masturbatory massive rigs at your home that consume thousands of watts of
power but look cool, and it's fun to say you have a 10-node vSAN cluster next
to your dehumidifier.

2\. People who like to pirate content like it's their job - setting up Plex and
all the newsgroup/torrent apps to auto-pirate shows and movies. For a 13 year
old, that can be cool. For an adult who clearly has the cash to spend on non-
essential items and in a world where time is money: I do not get it.

3\. People with an old PC running Kubuntu for a remote dev / FTP environment
who want to do more cool shit with their gear!

So as long as you enter the homelab reddit knowing this, you will have a
better experience there.

I definitely second Pi-hole. I might also suggest experimenting with
virtualization, so you can turn that single computer into 5 or 6: tools like
Proxmox, or even just installing KVM and managing it with virt-manager
([https://virt-manager.org/](https://virt-manager.org/)). You could also look
into running your own installation of GitLab (although, given the footprint of
the install, I might suggest keeping that isolated to a virtual machine).
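Before going the KVM route, it's worth a quick sanity check that the CPU exposes hardware virtualization. A minimal sketch (the package names in the comment are the current Ubuntu-family ones; double-check for your release):

```shell
# Does the CPU expose hardware virtualization (vmx = Intel, svm = AMD)?
vm_flags=$( (grep -cE 'vmx|svm' /proc/cpuinfo || true) 2>/dev/null )
vm_flags=${vm_flags:-0}
if [ "$vm_flags" -gt 0 ]; then
    echo "KVM-capable CPU ($vm_flags logical CPUs with VT flags)"
else
    echo "no hardware virtualization flags found"
fi
# The install itself is roughly:
#   sudo apt install qemu-kvm libvirt-daemon-system libvirt-clients virt-manager
#   sudo adduser "$USER" libvirt
```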

~~~
oauea
> 2\. People who like to pirate content like it's their job - setting up Plex
> and all the newsgroup/torrent apps to auto-pirate shows and movies. For a 13
> year old, that can be cool. For an adult who clearly has the cash to spend
> on non-essential items and in a world where time is money: I do not get it.

It took me about an afternoon or two to set up, was fun to do, and now my
favorite TV shows just appear in Plex for me to watch whenever I want to. Half
of the time they're not available on Netflix either, so I'd have to figure out
what streaming provider they're on, deal with their special websites, not get
notifications when a new episode airs, etc. It's just a better experience.

~~~
8fingerlouie
Or the case of discovering a great show on Netflix/HBO, binging all available
seasons, only to discover that 4 more seasons have aired, and Netflix/HBO just
didn’t bother enough to renew the seasons.

Or the show is only available on the Nth $9.99/mo streaming platform, and I
already pay for 3 others. Normally I sign up for a trial with a fake email
address (getnada is great!), but some require a (valid) credit card, and I
don't have an unlimited supply of those.

Or the show simply isn’t available in your geographic region.

~~~
tudorconstantin
Revolut has "virtual cards" which are invalid after the first charge.

~~~
deanclatworthy
Only for paying customers. It used to be three free disposable cards per
customer on the free tier, but I woke up to find only one virtual card left
and no disposable ones. It's a shame these kinds of features are limited to
these somewhat-sketchy online banks. None of the banks in my motherland offer
digital disposable card numbers yet.

~~~
groovybits
privacy.com

~~~
deanclatworthy
US only.

------
moviuro
Along with all other suggestions:

\- Syncthing node [https://syncthing.org](https://syncthing.org) ;

\- WireGuard server [https://wireguard.com](https://wireguard.com) ;

\- Pi-hole or just a lying DNS server [https://pi-hole.net](https://pi-hole.net), [https://gitlab.com/moviuro/moviuro.bin/blob/master/lie-to-me](https://gitlab.com/moviuro/moviuro.bin/blob/master/lie-to-me) ;

\- Nextcloud [https://nextcloud.com/](https://nextcloud.com/) ;

\- Your own little blog/site with https (LetsEncrypt), nginx, etc.

~~~
greggyb
Last item (blog/site) could run afoul of ISP terms. I would double check
before hosting anything public.

~~~
henryfjordan
I have Spectrum and I just checked; it is prohibited to:

> Running any type of server on the system that is not consistent with
> personal, residential use. This includes but is not limited to FTP, IRC,
> SMTP, POP, HTTP, SOCS, SQUID, NTP, DNS or any multi-user forums.

That said, I run a small stack of personal stuff, including a public
resume-style site, and I've never heard a peep from them. Because there's
nothing commercial, I feel like I could make an argument that my site is
"residential".

I did have my server pwned once though (I came home and it was very clear from
the fan intensity that it was mining bitcoin) so I'm not sure I'd recommend it
to anyone.

~~~
sucrose
My Synology luckily was never pwned, but there were millions of attempts. I
noticed higher-than-usual traffic one day, opened the access logs, and found
someone attempting to brute-force the SSH credentials... unfortunately for
them, I don't use passwords, only certs for authentication.
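The usual sshd hardening that makes this kind of brute-forcing pointless is only a few directives (these are standard `sshd_config(5)` options; the sketch writes to a scratch file rather than touching /etc/ssh):

```shell
# Key-only SSH: no passwords, no root login.
cat > /tmp/sshd_hardening.conf <<'EOF'
PasswordAuthentication no
ChallengeResponseAuthentication no
PermitRootLogin no
PubkeyAuthentication yes
EOF
wc -l < /tmp/sshd_hardening.conf
```

On a real box you'd merge these into /etc/ssh/sshd_config and reload sshd — after confirming key login works in a second session.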

------
bluGill
I have a FreeNAS NAS at home. I mostly use it so that I can have one home
directory (via NFS) on multiple computers. (Firefox sucks when doing this - if
anyone works for Mozilla - help). I want to extend this with OpenLDAP so that
I can use one login on all computers (not so much for me, but the family).

I have CUPS installed so that my printer needs to be set up only once to work
for all my machines. I'm thinking about putting my scanner on it as well
(ideally I'd figure out how to use the buttons on the scanner, making it a
copier), but since I never use my scanner this isn't really likely.

As a file server this is the obvious place to put all your shared media, then
use kodi [https://kodi.tv/](https://kodi.tv/) or volumio
[https://volumio.org/](https://volumio.org/) or some other media center to
share your media to all rooms. Setting this up is next on my list of projects,
but so far I haven't got it working. Actually I had it working at my last
house, but it was too hard to use so we didn't - let this serve as a lesson:
make sure not only that it works, but that it is convenient to use.

Using it as a central hub for your home automation is an idea. I've resisted
so far because [https://www.home-assistant.io/blog/2016/01/19/perfect-
home-a...](https://www.home-assistant.io/blog/2016/01/19/perfect-home-
automation/) makes a good point: I can't come up with a use for home
automation that isn't worse than what I have without it.

My fundamental idea is I have one powerful, reliable, always-on machine.
Anything that I want to run all the time should be on it. My other computers
are only turned on when I want to do something on them.

~~~
whatshisface
To solve your problem with Firefox, you might want to find their directory and
symlink it to somewhere on a local drive.

~~~
bluGill
That makes setting up new systems a pain, and is user-hostile.

Users need to create a new profile for each new computer I buy. Firefox Sync
sort of works, but I shouldn't need to store this data externally if I don't
want to - and if I do want to, it should still set itself up automatically.

~~~
dijit
Look for a cache directory in ~/.firefox - symlink it to tmp or mount a
ramdisk over it.

I used to do this at my previous work. It’s frustrating that browser vendors
keep caches like that.
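A sketch of that suggestion — the path is an assumption (on modern Linux, Firefox keeps its disk cache under ~/.cache/mozilla): replace the NFS-hosted cache directory with a symlink onto fast local storage.

```shell
# Redirect the Firefox disk cache to local /tmp for this user.
local_dir="${TMPDIR:-/tmp}/firefox-cache-${USER:-me}"
cache="$HOME/.cache/mozilla"
mkdir -p "$local_dir" "$(dirname "$cache")"
rm -rf "$cache"
ln -s "$local_dir" "$cache"
ls -ld "$cache"
```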

~~~
bluGill
I have a dozen users (my family and some friends who visit often enough...).
While nothing compared to any corporate system, it is still enough that I
don't know how to do this for everybody...

------
zelon88
Low power 'buntu home servers are my jam. I've got a few open-source projects
that you can plug right into a LAMP stack that you might be interested in...

[1] [https://github.com/zelon88/HRCloud2](https://github.com/zelon88/HRCloud2)
[2]
[https://github.com/zelon88/HRConvert2](https://github.com/zelon88/HRConvert2)
[3] [https://github.com/zelon88/HRScan2](https://github.com/zelon88/HRScan2)

------
jasonjayr
Running home servers, I'm always worried about power use:

300 W (average) / 1000 × 24 hr = 7.2 kWh per day

7.2 × 30 × $0.14/kWh (roughly tax, generation + distribution in my area) ≈
$30/month to run the machine.
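The same arithmetic, spelled out (inputs from the comment: 300 W average draw, $0.14/kWh all-in):

```shell
awk 'BEGIN {
    watts = 300; rate = 0.14
    kwh_per_day = watts / 1000 * 24       # 7.2 kWh per day
    monthly = kwh_per_day * 30 * rate     # dollars per month
    printf "%.1f kWh/day -> $%.2f/month\n", kwh_per_day, monthly
}'
# prints: 7.2 kWh/day -> $30.24/month
```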

~~~
cr0sh
$30 a month should not be a hardship for most; I think I spend more for fast
food every week (I certainly spend way more than that for random ebay and
amazon purchases each month - and let's not get into what I spend on my Jeep).

I can understand it being a concern if you are trying to run multiple machines
like this (Beowulf cluster?) - or if you just don't want to be wasting the
power by leaving the machines running at idle...

...but really, at idle without anything happening, I doubt a system even pulls
half that. The best way to find out would be to plug the machine into some
kind of "Kill A Watt" monitor and check what it uses at idle vs under load.

I would guess that a relatively modern machine, with a basic-level CPU,
running headless and using SSDs or laptop hard drives - or just one or two
standard desktop drives - likely wouldn't pull more than 150 watts (or could
be made to do so).

~~~
ozim
If you can get a VPS for $5 a month, I kind of don't feel like running some old
hardware at home is worth $30. Of course it depends on what you use it for.
For NAS duties I would rather buy an actual NAS, which would have Plex and
other stuff, would use a lot less power than an old computer, and would be a
lot better for storing data.

~~~
ihattendorf
I've got a 5-year-old server with two 8-core Sandy Bridge Xeons, 32GB RAM (I
had 128GB total but went down to 32GB since it wasn't being used), an SSD for
the OS, and 8 hard drives from 2-8TB each (~30TB total).

I idle at 100W ±10W, which is ~$10/month. Sure, I could get a VPS for
$5/month (with much lower specs), but the massive amount of local storage plus
the ability to easily host multiple VMs really outweighs the extra $5/month
cost of having everything local.

I run a handful of VMs and docker containers, such as pfSense, emby (similar
to plex), HAProxy, samba shares, snapraid+mergerfs for storage, etc. Worth it
to me, might not be for others unless they're interested in server
administration.

When this dies I'll probably downgrade to something smaller and shoot for <50
watts, but it's not worth it until it actually fails.

------
jolmg
Just to see what happens, it might be cool to find a way to safely (as in
without compromising your local network) provide root read, write and execute
access to the whole anonymous, public internet while still being able to fully
inspect everything that people (and bots) do. Maybe you can set up an
"intranet" of virtual machines to also see how people/bots move around the
network. Kind of like:

[https://www.xkcd.com/350/](https://www.xkcd.com/350/)

~~~
somepig
expect a bricked firmware and/or child porn repository within 72 hours

~~~
jolmg
Ha. Would people really do that? I honestly don't know what to expect. Maybe
bitcoin mining?

Anyway, if we're talking about fun, I can't think of anything more exciting
than seeing what happens in this scenario.

I imagine the way you expose it makes a difference, too, in what kind of
people you attract.

~~~
snazz
I don’t think you need to expose it—a root account with “password” as the
password and open SSH on port 22 would do the trick.

~~~
dddddaviddddd
With adequate traffic filtering/limitations and VM snapshots, could be
interesting.

~~~
theossuary
Buy a second internet connection from your ISP if you're going to do that.

------
rocky1138
I've got an old HP laptop running a pihole-alike, boinc, syncthing, and a
webcam capture from my window: [https://johnrockefeller.net/guelph-
webcam](https://johnrockefeller.net/guelph-webcam).

In addition, I also host local files in a temporary way so other machines on
my home network can pick them up easily with `php -S` (PHP's built-in web
server).

Running a home server on an old laptop is very useful. If you're worried about
power usage, just set the CPU to the lowest MHz like I did.
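What "set the CPU to the lowest MHz" looks like through the kernel's cpufreq interface — the sysfs paths are standard, but which governors exist depends on your driver:

```shell
# Read the current frequency governor for CPU 0 (if the box exposes cpufreq).
gov=$(cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor 2>/dev/null \
      || echo "no cpufreq interface")
echo "current governor: $gov"
# To actually pin it low:
#   echo powersave | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
# or, with the cpupower tool:
#   sudo cpupower frequency-set -g powersave
```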

If you're doing it, please don't forget to look at boinc, because they need
all the help they can get with computing power.

~~~
emilburzo
How do you get such a glare-free image behind a window?

~~~
rocky1138
I hot glued the camera to the glass so there is no air gap.

------
aetherith
I've had a ton of fun setting up FreeIPA [1] to do LDAP/Kerberos for
single sign-on. It's primarily a Red Hat project, but I think they have a
working server implementation for Ubuntu and derivatives.

[1][https://www.freeipa.org/page/Main_Page](https://www.freeipa.org/page/Main_Page)

~~~
dundercoder
I set this up 4 or 5 years ago, has it gotten easier since then? Specifically
TOTP was rough.

~~~
elcritch
Try out the free version of the Univention server. It's pretty easy, with LDAP
and Active Directory support running out of the box, but you have to run it in
a VM (KVM).

1:
[https://www.univention.com/products/ucs/](https://www.univention.com/products/ucs/)

------
wdroz
DNS tunnel "server" using iodine
[https://github.com/yarrick/iodine](https://github.com/yarrick/iodine)

"Free" Wifi in the whole world after that as most AP allow DNS requests.

~~~
sandinmyjoints
I tried to do this about a year ago on an Ubuntu DO Droplet with a static
public IP to point the NS record to and I never could get it to work. Even
tried this fork which has a few more commits than upstream:
[https://github.com/advoryanskiy/iodine](https://github.com/advoryanskiy/iodine)
So are you running this successfully? Mind sharing your stack and hosting
environment?

~~~
wdroz
With a fresh Ubuntu 18.04, I just installed some packages to be able to run
`make` without issues.

I did exactly what's stated in the README:

        t1    IN NS t1ns.mydomain.com.  ; note the dot!
        t1ns  IN A 10.15.213.99

I configured these entries from the domain provider directly.

Then I use proxychains with a SOCKS tunnel (ssh -D ...) to run whatever I want
through the tunnel.
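A hedged sketch of the matching commands (the domain, password, and tunnel IP are placeholders; the flags are iodine's documented ones: -f foreground, -P password, -c on the server to skip client IP checks):

```shell
# On the server behind t1ns (the A record above):
#   sudo iodined -f -c -P secretpw 10.53.0.1 t1.mydomain.com
#
# On the client stuck behind a captive portal:
#   sudo iodine -f -P secretpw t1.mydomain.com
#
# Then the SOCKS tunnel through it, as described:
#   ssh -D 1080 user@10.53.0.1
```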

------
NickBusey
Check out HomelabOS to help install a bunch of services that can be useful to
have on a file server. Many of them are mentioned elsewhere in this thread. It
currently has over 50 services you can optionally enable.

The idea is you download it, you run `make`, it asks you a few questions, and
then you have a working Dockerized deployment configured via Ansible, with
versionable configuration.

[https://gitlab.com/NickBusey/HomelabOS](https://gitlab.com/NickBusey/HomelabOS)

------
ChuckMcM
I used an old VAX for a while connected to full frontal internet with a Telnet
port available. One of the cool things VMS had was a way to mirror the session
from one terminal on another. It would watch while script kiddies would try to
break in, only to be confused about what it was they had broken into when they
succeeded.

I'm sure you could do something like this by running VMS on simh with a
vlan/pty and using your modem's port-forwarding NAT feature to send it telnet
traffic.

My go-to "spare Linux server" environment has been RasPis for a while, though,
since headless they are pretty cheap. With an actual x86 box there are perhaps
some interesting PCIe, PCI, or ISA cards (depending on how old it is) that are
fun. Setting it up as a media transfer station to move things off 5.25"
floppies or 3.5" floppies or Zip disks can be a fun use as well.

------
LinuxBender
It's a little time consuming to set up correctly and manage, but there is
Ampache [1], for creating a scalable streaming music server that can be joined
to others, so that catalogs are merged. It was created initially by a bunch of
college kids (ok, they were college kids back then) at OSU and paid for with
beer.

[1] -
[https://github.com/ampache/ampache/](https://github.com/ampache/ampache/)

~~~
kuzimoto
It's really not too bad to set up, I promise! Windows is a pain right now, but
Linux is pretty decent. I'm in the process of writing a setup guide for
Windows for the wiki, and will also do one for OS X and a generic one for
Linux. The Arch Wiki has a good guide already.

It relies on a web server, MySQL/MariaDB, and PHP, so it's a great project for
getting something useful running. Version 4 is going to be released soon and
brings a lot of nice changes. I've been trying to update the wiki to make sure
content is up to date and isn't missing important information.

~~~
LinuxBender
In full disclosure, I only mentioned that because when I had last set it up,
it took some tinkering to get it fully functional. That was a few years ago.
I'm sure a lot of improvements have been made since then. It probably doesn't
help that I tend to harden servers a bit.

~~~
kuzimoto
No worries, I just wanted to make sure people wouldn't get scared off too
easily ;)

------
teekert
NextCloud FTW! Caldav, Carddav, Webdav, dropbox-replacement. What's not to
like? Installed in a snap: sudo snap install nextcloud

------
cr0sh
For a while I ran a box as a combo LAMP and ZoneMinder server; I ended up
shutting it down as I wasn't using the LAMP server much any longer, and
ZoneMinder was overkill for my purposes...

...but if you've never played around with a security camera system like
ZoneMinder allows for, it can be an interesting experience. Like I said,
complete overkill for my home security camera system - but it would be perfect
if you set it up like an appliance, on a beefy system - because it needs a lot
of CPU to process frames and such - and you have a lot of cameras to work
with.

You can hook up virtually any kind of camera; from a simple USB webcam (heck,
you could probably even get a parallel port black and white quickcam to work
if you were crazy enough), to a set of cameras on a multi-input video
digitizer card, to a full-on professional level PTZ (pan-tilt-zoom) camera
(with PTZ control!).

It is fairly complex to set up properly; one could easily create a business
around it selling turnkey camera security appliances if one wanted to, I
believe - though I don't think I've seen any such business like that (unlike
how there are/were tons of companies doing VOIP using custom Asterisk
systems).

The whole application is written (IIRC) mostly in Perl and PHP, with C++
handling capture; I don't recall what it uses for the video processing
(whether OpenCV or some other library). It uses
multiple processes to handle everything (it isn't a monolithic system), so it
is fairly robust. But as I mentioned, the more cameras/video streams you have
(and the higher resolution those cameras are), the more CPU you need to
process everything. I'm not sure if there is a hard limit, or a way to
segregate things across multiple systems (ie - some kind of clustering?); I
think you can specify a separate DB server...

Anyhow - it's something different; I had it at one time running on a PII-350
with 64 MB of RAM years ago - the last machine, IIRC, was a P4-933 with 256 MB
of RAM - and I could process a couple of streams of 640x480 color video from a
couple of IP cameras (all I needed for my home system). But that was the limit
for that machine. A more modern machine would probably work much better.

------
rafaquintanilha
There was a recent thread with Raspberry Pi ideas that you might find useful:
[https://news.ycombinator.com/item?id=20264911](https://news.ycombinator.com/item?id=20264911)

------
rhinoceraptor
A couple things I find useful:

\- ZFS fileshare

\- Plex Server

\- Time Machine server for Macs

\- CUPS server, my printer is old and requires drivers, so now I can print
from my phone and computers without drivers

\- Run your own IRC bouncer if you're into that kind of thing

\- Run a seedbox, just for Linux ISOs of course :)

~~~
callmeal
>\- Plex Server

I had to stop using plex after they insisted on knowing everything I was
watching. Mine runs pihole, a print-server, and a fileserver for kodi now.

------
tombert
My latest project has been to start ripping my entire blu-ray and DVD
collection (which is massive), and using Emby to serve them. Since I didn't
want my laptop to overheat by compressing the video, I wrote a script to
transcode files on my server to a smaller size once the ripping is done.

It's a little different, since my "server" is actually six ODroid XU4's glued
together with Docker Swarm and NFS, but it wouldn't be hard to adapt for your
thing.

I'd be happy to share that script if you'd like but it's not terribly
elaborate.

~~~
beagle3
What do you use to rip those? I am about to embark on a similar project,
planning to use vobcopy so that I can retain all the DVD features (multiple
subtitles, multiple soundtracks, menu, etc.), but I suspect there's a better
option - I don't want to handbrake as this seems to lose all of those
features.

~~~
aidenn0
As tombert mentioned, ddrescue to generate an iso is probably fine for DVDs.
Note that you will need to use _something_ to initialize the DVD or many
drives will just refuse to read out encrypted sectors; something like
"dvdbackup -I" will work fine.

As a note, do not use slim dvd or bluray drives for this; I went through
several and found that they would have many read errors for DVDs that play
fine in actual players and full-height drives.

I am not aware of a good way of getting the full menu experience on bluray
discs. For many discs, stripping the menus saves 1-4 minutes of time when
watching the movie though, so I haven't worried about it too much.

[edit]

Also, if your DVDs have any dust on them, clean them with a microfiber cloth
and spray for cleaning glasses before inserting. ddrescue will usually
complete eventually, but it can literally be days with a dirty or scratched
disc.

For everyone else: if you don't care about menus &c. makemkv is great. It's
not libre software, but the linux version is perpetually in beta so it's free-
as-in-beer (though I purchase a license because it's just so darn useful).

~~~
tombert
I think MakeMKV is slightly better than "free as in beer", because you _do_
have the source to audit, you just can't distribute it willy-nilly like with
something with a true FLOSS license.

I too actually bought a license, because it's one of my favorite pieces of
software that perpetually "just works".

~~~
aidenn0
I thought the bulk of makemkv was closed source, with only a porting layer
that is built and linked as part of the install process?

[edit]

I looked at a recent download and a lot more of it delivered as source than I
thought (even under a GPL or LGPL in many cases). makemkvcon is shipped as
binary only though.

------
thom
I like to leave spare machines running chess engines to analyze openings etc.
You could also leave it churning for days on end on various Kaggle
competitions (especially if it had a relevant GPU). Could also get the thing
scraping interesting datasets at a slow and respectful pace.

~~~
vidar
Interested in the chess aspect, what do you do with the analysis when it
completes?

~~~
thom
Often I'm just looking for some vaguely novel line in an opening that might be
interesting to pursue in a game (I am not a strong player but love studying).
I sometimes like seeing the difference in analysis between Stockfish and Leela
on real games (Magnus Carlsen attributes much of his recent form to trying to
play more like AlphaZero, with all sorts of positional pawn sacrifices etc).
I've also tried to do some programmatic stuff around the analyses to automate
things (i.e. find a move in a common line that nobody in my database has ever
played but the computer says is best, or find lines where it looks like the
opponent only has a single path through to stay in the game etc).

------
PopeDotNinja
If you're interested in learning about systemd, you could write a service that
runs in the background. Here's an example of a Slack alert hack I wrote...

A couple years ago & at the last minute, I needed to cover for a sys admin on
a new, complicated system that didn't yet have much automation or monitoring.
I only got 2 hours of instruction before he needed to disappear for two weeks.
We had a status page that showed the status of services, but it wasn't wired
to email alerts, PagerDuty, etc.

I really didn't want to keep staring at the status page. So I wrote a simple
bash script that curled the status page once per minute. The script then
grepped through the HTML looking for a line that indicated there was a
critical failure. Finally, it sent a curl POST request to a Slack channel that
said "the servers are on fire" if there was something to report, and
occasionally posted a "the servers are not on fire" if everything was fine. It
was a great hack that made life easier while I learned how to properly manage
the system and helped with wiring up the monitoring. It was pretty quick to
write, too!
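A minimal sketch of that hack — the status URL, the webhook, and "critical" as the failure marker are all stand-ins for the real ones:

```shell
# Reads the status page HTML on stdin, echoes the alert text.
status_message() {
    if grep -qi 'critical'; then
        echo "the servers are on fire"
    else
        echo "the servers are not on fire"
    fi
}

# The once-a-minute loop would be roughly:
#   while true; do
#     msg=$(curl -fsS "$STATUS_URL" | status_message)
#     curl -fsS -X POST -H 'Content-type: application/json' \
#          --data "{\"text\":\"$msg\"}" "$SLACK_WEBHOOK_URL"
#     sleep 60
#   done
echo '<td class="status">CRITICAL FAILURE</td>' | status_message
```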

------
smivan
I wrote an article about this exact question:
[https://blog.ivansmirnov.name/what-can-you-do-with-a-
persona...](https://blog.ivansmirnov.name/what-can-you-do-with-a-personal-
server/)

Some new tech that hasn't made it on the list yet:

\- [https://github.com/issmirnov/zap](https://github.com/issmirnov/zap) \-
recursive URL expander, can be run at DNS level. I.e., "n/" ->
"https://news.ycombinator.com/" or "f/g/homelab" -> "facebook.com/groups/homelab"

\- netdata

\- [https://concourse-ci.org/](https://concourse-ci.org/)

\- [https://varnish-cache.org/intro/](https://varnish-cache.org/intro/)

\- AFP/NFS/samba for file sharing

\- [https://github.com/janeczku/calibre-
web](https://github.com/janeczku/calibre-web) \- web UI for calibre ebooks

\- [https://jupyter.org/](https://jupyter.org/) for python scripts and web
scraping

\- [https://github.com/cdr/code-server](https://github.com/cdr/code-server)
for an IDE in a tab.

\- [https://www.portainer.io/](https://www.portainer.io/) if you use docker

\- [https://yourls.org/](https://yourls.org/) for a private URL shortener in
your home

------
moduspwnens14
If you haven't learned Kubernetes yet, it's probably a good use case for that.

Then use any of the other ideas here, except deploy them through Kubernetes.

------
tenebrisalietum
\- FreshRSS; get all your RSS feeds remotely in one spot.

\- OnlyOffice; this is basically an open-source version of Google Docs. It's
rather heavyweight - you will need 2 hosts for this to run and the install
uses many varied components, so this is a great opportunity to work with
virtualization or containerization.

\- EtherCalc; a shared spreadsheet, much easier to set up than OnlyOffice.

\- WARCproxy; record your web browsing to a WARC archive and play it back
later.

\- Install nginx, set up a 64GB VM, install a graphical Linux desktop on it,
and use noVNC and nginx WebSockets to allow external access to it.

\- Play with Yacy, a search engine.

\- Seed some Debian torrents and help out the community.

------
gzu
If it's built out of a conventional PC, wipe it and install Proxmox on an SSD.
With that you can virtualize an OpenMediaVault NAS instance and still have
room for spare Linux/Windows VM instances, a Pi-hole VM, Nextcloud, etc.

------
giomasce
If you are close to a window and willing to buy a receiver, install dump1090
and feed the data to FlightAware and Flightradar24.

------
malceore1
I would suggest:

1\. CUPS print server

2\. Pi-hole

3\. Automated backup storage

4\. Network storage with Samba

At home I use Mozilla WebThings Gateway, but running on Lubuntu Server instead
of the suggested Raspberry Pi. I even host a local-only voice assistant that
leverages the Gateway to control device states.
[https://hacks.mozilla.org/2019/04/introducing-mozilla-
webthi...](https://hacks.mozilla.org/2019/04/introducing-mozilla-webthings/)

------
sirwitti
Gitlab and NextCloud are pretty cool. With NC you can quite easily host your
own calendar and contacts.

------
akeck
One of my favorite recent projects was making a full WPA2 WiFi access point
with firewall out of a Raspberry Pi 3B. (Yes, I know the limitations of the
hardware. This was a "hold my beer" project.) If your old PC isn't too old and
has virtualization extensions, you can have a lot of fun with KVM for VMs and
iSCSI for a SAN.

~~~
nibalizer
I agree with trying to move more of the network functionality out of the ISP
provided device and into home servers. A simple and cheap access point[1] can
provide all the L1/L2 you need. Then you can run dns and dhcp inside Linux,
configure firewall rules, segment your network, etc. The sky is the limit!

Note: I recommend a dedicated access point over hostapd because I had a lot of
problems with it circa 3 years ago.

[1] [https://www.dell.com/en-us/work/shop/tp-link-tl-
wa801nd-300m...](https://www.dell.com/en-us/work/shop/tp-link-tl-
wa801nd-300mbps-access-point-wireless-access-
point-80211b-g-n-24-ghz/apd/a8859068/networking)

------
bristleworm
If you plan to keep the server running you could run pihole on it.

~~~
true_tuna
I run mine on a raspberry pi zero w. I wouldn’t waste the power of dedicating
a whole server to it.

~~~
bristleworm
Me neither. My thoughts were: if the server already is running 24/7 for some
other reason, you could put pihole on it as well.

------
ohiovr
You can install BIND on it to serve your own DNS, or a WireGuard VPN so you
don't get pwned in a coffee shop. You could get dnsmasq going so you can give
your LAN computers proper domain names, though it might not play nice with
BIND if both are on the same server.
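A minimal dnsmasq sketch for giving LAN machines proper names — the domain, range, and addresses are made up, but the directives are standard dnsmasq.conf ones. Written to a scratch file here instead of /etc/dnsmasq.conf:

```shell
cat > /tmp/dnsmasq-lan.conf <<'EOF'
# answer *.lan ourselves, never forward those queries upstream
local=/lan/
# append .lan to plain hostnames from /etc/hosts
domain=lan
expand-hosts
# hand out leases too, if it's replacing the router's DHCP
dhcp-range=192.168.1.100,192.168.1.200,12h
EOF
grep -v '^#' /tmp/dnsmasq-lan.conf
```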

------
godshatter
Run a Freenet node, or set up a YaCy node, or some other network node. An
always-on network node can really benefit projects like those.

------
gsliepen
Run a mirror for your favorite projects. This can be for individual programs,
or for whole repositories of data like the Debian archive or Wikipedia. Please
do check with whatever project you are mirroring if they are OK with it (in
case bandwidth will be significant) and what the best practices are that they
recommend.

------
saxonww
Familiarize yourself with other distributions/operating systems: NixOS, one of
the BSDs, etc.

------
cr0sh
It's been mentioned in passing a few times, but if the machine is decent
enough (CPU and RAM), you might consider setting it up to run and learn Docker
or some other virtualization/container system.

------
gigama
Running a Folding@Home client is a helpful way to contribute spare CPU cycles
to science...

[https://foldingathome.org/](https://foldingathome.org/)

------
lawrencegs
If you have tons of space, how about running a blockchain node? Something like
a Filecoin node is interesting because you can theoretically make some tokens
while you're doing it.

------
jedberg

        rm -rf /bin
    

Then try to recover from it.

------
true_tuna
I set up a GitLab CI/CD server. It was fun, just the right amount of
challenging. Although GitLab has free hosted accounts, so maybe not too useful.

------
peterbmarks
Ever since Google Reader was killed I've run a [https://tt-
rss.org](https://tt-rss.org) server.

------
decentralizer
Host a Tor relay. Also, if you contribute to the network with at least 500
KBytes/s of bandwidth, you get a Tor T-shirt after 60 days.

------
eswat
I have my Pi set up with Pi-hole and an IRC bouncer.

------
BossingAround
Deploy a single-node Kubernetes instance and learn some container/DevOps
skills to put on your CV, that'd be my advice.

~~~
true_tuna
Don’t expose it to the internet please. Or if you do make sure you know how to
secure it first.

------
grahamburger
Matrix and Mastodon are kinda fun, and federated so you can join a community
even if it's just you on your server.

------
larrysalibra
plex, tor relay, bitcoin + lightning nodes

------
EamonnMR
[https://www.plex.tv/](https://www.plex.tv/)

------
jamieweb
Rosetta@Home - perform medical research using your spare CPU cycles.

------
quickthrower2
A Kubernetes Node.

An Elasticsearch instance for whatever you fancy searching

------
hestefisk
Install Asterisk and run your own home PBX using SIP.

------
crobertsbmw
Fork bomb. Seriously, it's a fun learning exercise.
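For context: the classic bash fork bomb is a function that pipes into itself and backgrounds the copy — only detonate it in a throwaway VM. The usual safety net is the per-user process cap:

```shell
# The bomb itself (do NOT run outside a sacrificial VM):
#   :(){ :|:& };:
# Inspect the per-user process limit that contains it:
nproc_cap=$(ulimit -u 2>/dev/null || echo unknown)
echo "max user processes: $nproc_cap"
# Lowering it first (e.g. `ulimit -u 256` in the sacrificial shell) keeps
# the machine recoverable.
```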

~~~
scottydelta
Many years ago I was in a course which had a lab. We would log onto Linux
machines with network-based login (Kerberos, IIRC). Once logged in, you could
also ssh into a server available to us for running code (limited access).

One day while the lab was in session, I was just playing with stuff and tried
executing a fork bomb on the server. The moment I executed it, all 120
machines in the lab became unresponsive, including mine. I didn't realize at
the time that they were using the same server for code execution as well as
for auth. No one knew what happened and the class was dismissed.

------
josteink
Get to know ZFS.

------
asoltys83
Run a Sia node: [https://sia.tech/](https://sia.tech/)

